I successfully used Cython to convert .py to .pyd, but have been running into difficulties trying to import cythonized modules afterwards.
setup.py
from setuptools import setup
from Cython.Build import cythonize
setup(
    ext_modules=cythonize("world.py")
)
world.py
print('Hello World')
When I try to import using import world, it gives me the following error:
ImportError: dynamic module does not define module export function (PyInit_world)
I am using 64-bit Windows 8.1 and 32-bit Python 3.7.7.
I have been trying different strategies for hours. Any help will be appreciated.
I am trying to build a library using Numpy and Cython. While compiling the .pyx file went smoothly, I can't test out the files in a test file.
It just says:
File "...", line 1, in <module>
    import blank_cy  # the name of the .pyd
ImportError: DLL load failed: The specified module could not be found.
I have tried looking at other similar problems, but I still can't figure it out. Also, I am not sure what information is needed here, so please ask. I'll just list off a few things:
The .pyx file imports numpy as np and math and cimports numpy as np.
The compiling process does not produce any errors.
I renamed the file to match my import
Without imports it works fine.
Thank you so much.
Here's an example.
This would be the test.pyx
import numpy
cimport numpy
print("Hello World");
The setup.py:
from setuptools import setup
from Cython.Build import cythonize
import numpy
setup(ext_modules=cythonize("test.pyx"), include_dirs=[numpy.get_include()])
The test file that imports test.pyd:
import test
I renamed the file to match my import
Don't do this! This is your problem.
When Python imports an extension module named my_module, it looks for a function called PyInit_my_module as the module initialisation function (the function name is slightly different for Python 2, or if the module name has non-ASCII characters, but the same basic idea applies).
Since you've renamed your module, the initialisation function that Cython generated no longer matches the module name, and so the import breaks.
Just ensure that your .pyx files have the module name that you ultimately want to use.
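If you do need a different importable name, rename the .pyx source itself (or set the Extension name to match it) and rebuild, rather than renaming the compiled .pyd. A minimal sketch, assuming the module should be importable as blank_cy:
from setuptools import setup, Extension
from Cython.Build import cythonize
import numpy

# The Extension name determines the generated PyInit_<name> function,
# so it must match what you later write in "import blank_cy".
ext = Extension("blank_cy", sources=["blank_cy.pyx"],
                include_dirs=[numpy.get_include()])
setup(ext_modules=cythonize([ext]))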
I'm trying to run a Python script (test.py) which uses an external module written in C (swapped). Here is the directory structure:
test.py
setup.py
swapped.so
ext_src/
c_swapped.c
c_swapped.h
swapped.c
In order to use the swapped module, I first run setup.py, which is configured as follows:
from distutils.core import setup
from distutils.extension import Extension
from distutils.command import build_ext
import numpy as np
import sys
sys.argv[1:] = ['build_ext', '--inplace']
ext_modules = [Extension(
    name="swapped",
    sources=["ext_src/swapped.c", "ext_src/c_swapped.c"],
    language="c",
    include_dirs=[np.get_include()])
]
setup(
    name='RLScore',
    ext_modules=ext_modules,
)
And in test.py, swapped is imported as:
import swapped
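For reference, since sys.argv is overridden inside setup.py, building and running reduce to:
python setup.py    # behaves like: python setup.py build_ext --inplace
python test.py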
So when I run these on python 2.7.6, test.py works just fine, but on python 3.4.5, I get the following error:
import swapped
File "swapped.pyx", line 2, in init rlscore.utilities.swapped (ext_src/swapped.c:3533)
SystemError: Parent module '' not loaded, cannot perform relative import
I wish to run this code on Python 3.4.5, so is there a way I can make the swapped module work?
Thanks a lot!
Edit: I set up a virtual environment with python=2.7, so it works now. But still, if there is a way to run the package without using virtualenv, I'd like to learn it.
I have a project which has a C extension which requires numpy. Ideally, I'd like whoever downloads my project to just be able to run python setup.py install or use one call to pip. The problem I have is that in my setup.py I need to import numpy to get the location of the headers, but I'd like numpy to be just a regular requirement in install_requires so that it will automatically be downloaded from the Python Package Index.
Here is a sample of what I'm trying to do:
from setuptools import setup, Extension
import numpy as np
ext_modules = [Extension('vme', ['vme.c'], extra_link_args=['-lvme'],
                         include_dirs=[np.get_include()])]
setup(name='vme',
      version='0.1',
      description='Module for communicating over VME with CAEN digitizers.',
      ext_modules=ext_modules,
      install_requires=['numpy', 'pyzmq', 'Sphinx'])
Obviously, I can't import numpy at the top before it's installed. I've seen a setup_requires argument passed to setup() but can't find any documentation on what it is for.
Is this possible?
The following works at least with numpy 1.8 and Python 2.6, 2.7 and 3.3:
from setuptools import setup
from setuptools.command.build_ext import build_ext as _build_ext

class build_ext(_build_ext):
    def finalize_options(self):
        _build_ext.finalize_options(self)
        # Prevent numpy from thinking it is still in its setup process:
        __builtins__.__NUMPY_SETUP__ = False
        import numpy
        self.include_dirs.append(numpy.get_include())

setup(
    ...
    cmdclass={'build_ext': build_ext},
    setup_requires=['numpy'],
    ...
)
For a short explanation of why it fails without this "hack", see this answer.
Note that using setup_requires has a subtle downside: numpy will be compiled not only before building extensions, but also when doing python setup.py --help, for example. To avoid this, you could check for command-line options, as suggested in https://github.com/scipy/scipy/blob/master/setup.py#L205, but I don't really think it's worth the effort.
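Applied to the vme extension from the question, the whole setup.py would look roughly like this (a sketch; note that include_dirs is now injected by the custom build_ext, so numpy is never imported at module level):
from setuptools import setup, Extension
from setuptools.command.build_ext import build_ext as _build_ext

class build_ext(_build_ext):
    def finalize_options(self):
        _build_ext.finalize_options(self)
        # Prevent numpy from thinking it is still in its setup process:
        __builtins__.__NUMPY_SETUP__ = False
        import numpy
        self.include_dirs.append(numpy.get_include())

# no include_dirs here -- the custom build_ext adds numpy's headers later
ext_modules = [Extension('vme', ['vme.c'], extra_link_args=['-lvme'])]

setup(name='vme',
      version='0.1',
      ext_modules=ext_modules,
      cmdclass={'build_ext': build_ext},
      setup_requires=['numpy'],
      install_requires=['numpy', 'pyzmq', 'Sphinx'])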
I found a very easy solution in this post (see also https://github.com/pypa/pip/issues/5761): you install Cython and numpy using setuptools.dist before the actual setup:
from setuptools import dist
dist.Distribution().fetch_build_eggs(['Cython>=0.15.1', 'numpy>=1.10'])
Works well for me!
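Combined with the extension from the question, a minimal sketch could look like this:
from setuptools import dist, setup, Extension

# fetch the build-time dependency before importing it below
dist.Distribution().fetch_build_eggs(['numpy>=1.10'])

import numpy

ext_modules = [Extension('vme', ['vme.c'], extra_link_args=['-lvme'],
                         include_dirs=[numpy.get_include()])]

setup(name='vme',
      version='0.1',
      ext_modules=ext_modules,
      install_requires=['numpy', 'pyzmq', 'Sphinx'])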
This is a fundamental problem with packages that need numpy (for distutils or get_include) at build time. I do not know of a way to "bootstrap" it using pip or easy_install.
However, it is easy to make a conda package for your module and provide the list of dependencies so that someone can just do a conda install pkg-name which will download and install everything needed.
Conda is available in Anaconda or in Miniconda (python + just conda).
See this website: http://docs.continuum.io/conda/index.html
or this slide-deck for more information: https://speakerdeck.com/teoliphant/packaging-and-deployment-with-conda
The key is to defer importing numpy until after it has been installed. A trick I learned from this pybind11 example is to import numpy in the __str__ method of a helper class (get_numpy_include below).
from setuptools import setup, Extension

class get_numpy_include(object):
    """Defer numpy.get_include() until after numpy is installed."""
    def __str__(self):
        import numpy
        return numpy.get_include()

ext_modules = [Extension('vme', ['vme.c'], extra_link_args=['-lvme'],
                         include_dirs=[get_numpy_include()])]

setup(name='vme',
      version='0.1',
      description='Module for communicating over VME with CAEN digitizers.',
      ext_modules=ext_modules,
      install_requires=['numpy', 'pyzmq', 'Sphinx'])
To get pip to work, you can do something similar to what SciPy does: https://github.com/scipy/scipy/blob/master/setup.py#L205
Namely, the egg_info command needs to be handled by standard setuptools/distutils, while other commands can use numpy.distutils.
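Roughly, the dispatch looks something like this (a simplified sketch, not SciPy's actual code):
import sys

if 'egg_info' in sys.argv[1:]:
    # metadata-only command: plain setuptools is enough, numpy not needed yet
    from setuptools import setup
else:
    # real build commands can use numpy.distutils (numpy is available by then)
    from numpy.distutils.core import setup

# the extension configuration would go here for the build case
setup(name='vme', version='0.1')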
Perhaps a more practical solution is to just require numpy to be installed beforehand and to import numpy only inside a function scope. #coldfix's solution works, but compiling numpy takes forever. It is much faster to pip install it first as a wheel, especially now that we have wheels for most systems thanks to efforts like manylinux.
from __future__ import print_function
import sys
import textwrap
import pkg_resources
from setuptools import setup, Extension

def is_installed(requirement):
    try:
        pkg_resources.require(requirement)
    except pkg_resources.ResolutionError:
        return False
    else:
        return True

if not is_installed('numpy>=1.11.0'):
    print(textwrap.dedent("""
            Error: numpy needs to be installed first. You can install it via:
            $ pip install numpy
            """), file=sys.stderr)
    sys.exit(1)

def ext_modules():
    # numpy is imported only here, after the check above has succeeded
    import numpy as np
    some_extension = Extension(..., include_dirs=[np.get_include()])
    return [some_extension]

setup(
    ext_modules=ext_modules(),
)
This should now (since 2018 or so) be solved by adding numpy as a build-system dependency in pyproject.toml, so that pip install makes numpy available before it runs setup.py.
The pyproject.toml file should also specify that you're using Setuptools to build the project. It should look something like this:
[build-system]
requires = ["setuptools", "wheel", "numpy"]
build-backend = "setuptools.build_meta"
See Setuptools' Build System Support docs for more details.
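With numpy declared under [build-system] requires, pip builds (or downloads) numpy before executing setup.py, so the import at the top of the setup.py from the question simply works; a minimal sketch:
# setup.py -- numpy can be imported unconditionally, because
# pyproject.toml lists it under [build-system] requires
from setuptools import setup, Extension
import numpy as np

ext_modules = [Extension('vme', ['vme.c'], extra_link_args=['-lvme'],
                         include_dirs=[np.get_include()])]

setup(name='vme',
      version='0.1',
      ext_modules=ext_modules,
      install_requires=['numpy', 'pyzmq', 'Sphinx'])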
This doesn't cover uses of setup.py other than install, but as those are mainly for you (and other developers of your project), an error message telling them to install numpy first might be good enough.
#coldfix's solution doesn't work for Cython extensions if Cython isn't pre-installed on the target machine; it fails with the error
error: unknown file type '.pyx' (from 'xxxxx/yyyyyy.pyx')
The reason for the failure is the premature import of setuptools.command.build_ext, because when imported it tries to use Cython's build_ext functionality:
try:
    # Attempt to use Cython for building extensions, if available
    from Cython.Distutils.build_ext import build_ext as _build_ext
    # Additionally, assert that the compiler module will load
    # also. Ref #1229.
    __import__('Cython.Compiler.Main')
except ImportError:
    _build_ext = _du_build_ext
Normally setuptools succeeds, because this import happens only after the setup_requires dependencies are fulfilled. By importing setuptools.command.build_ext at the top of setup.py, however, only the fallback solution can be used, which doesn't know anything about Cython.
One possibility to bootstrap Cython alongside numpy is to postpone the import of setuptools.command.build_ext with the help of an indirection/proxy:
# factory function
def my_build_ext(pars):
    # import delayed:
    from setuptools.command.build_ext import build_ext as _build_ext

    # include_dirs adjusted:
    class build_ext(_build_ext):
        def finalize_options(self):
            _build_ext.finalize_options(self)
            # Prevent numpy from thinking it is still in its setup process:
            __builtins__.__NUMPY_SETUP__ = False
            import numpy
            self.include_dirs.append(numpy.get_include())

    # object returned:
    return build_ext(pars)

...

setup(
    ...
    cmdclass={'build_ext': my_build_ext},
    ...
)
There are other possibilities, discussed for example in this SO-question.
You can simply add numpy to your pyproject.toml file. This works for me.
[build-system]
requires = [
    "setuptools>=42",
    "wheel",
    "Cython",
    "numpy"
]
build-backend = "setuptools.build_meta"
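Since both Cython and numpy are then available at build time, setup.py can import them directly; a minimal sketch with placeholder file names:
from setuptools import setup, Extension
from Cython.Build import cythonize
import numpy

# "my_module" / my_module.pyx are placeholder names
ext = Extension("my_module", ["my_module.pyx"],
                include_dirs=[numpy.get_include()])
setup(ext_modules=cythonize([ext]))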
I'm trying to run Hadoopy, which has a file _main.pyx, and the import _main statement in __init__.py fails with a "module not found" error.
I'm trying to run this on OS X with the standard Python 2.7.
Add this code before you try to import _main:
import pyximport
pyximport.install()
Note that pyximport is part of Cython, so you'll have to install that if it isn't already.
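For example, a hypothetical sketch of what the top of the package's __init__.py could look like:
# hypothetical sketch: register pyximport before the failing import
import pyximport
pyximport.install()

import _main  # _main.pyx is now compiled on the fly on first import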
You need to make sure you have followed all steps:
Install the Cython package using pip
pip install Cython
Create a Cython file bbox.pyx
cimport cython
import numpy as np
cimport numpy as np

DTYPE = np.float32
ctypedef np.float32_t DTYPE_t

@cython.boundscheck(False)
def compare_bboxes(
        np.ndarray[DTYPE_t, ndim=2] boxes1,
        np.ndarray[DTYPE_t, ndim=2] boxes2):
    ...
Create setup.py in the same directory
from distutils.core import setup, Extension
from Cython.Build import cythonize
import numpy
package = Extension('bbox', ['bbox.pyx'], include_dirs=[numpy.get_include()])
setup(ext_modules=cythonize([package]))
Build the Cython extension
python3 setup.py build_ext --inplace
Create your main python script run.py in the same directory
import pyximport
pyximport.install(setup_args={"script_args" : ["--verbose"]})
from bbox import compare_bboxes
def main(args):
    boxes1 = args.boxes1
    boxes2 = args.boxes2
    result = compare_bboxes(boxes1, boxes2)
Run your main script in the same directory
python run.py