I have a Cython project containing several .pyx files. To distribute my project I would like to provide my generated .c files as recommended in the Cython documentation, to minimize problems with different Cython versions.
My current work flow is to build the project using:
me@machine$ python setup.py build_ext --inplace
This cythonizes all .pyx files (i.e. translates them to .c / .cpp files using Cython) and then compiles them.
For a release I do not need the compiled (.so) files, so I basically ignore them. The problem is, I waste a lot of time with the useless compilation.
Is there a way to cythonize all .pyx files in the folder using the setup.py, without compiling them?
Edit: Why not just use $ cython my_file.pyx?
I have about 20 .pyx files which have different compiler directives. Calling cython on the command line for each file would be slower than just waiting for it to compile.
On the other hand, creating a shell script to cythonize them would leave me with a second list of compiler directives that needs to be kept up to date. I would rather avoid this, to keep my possible points of failure to a minimum.
One way to create the C files without compiling them is to remove the setup(...) line in setup.py and replace it with just the cythonize("*.pyx") part. Then run:
me@machine$ python setup.py
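A minimal sketch of such a setup.py (the directive values here are examples, not your actual settings; directives can also live in a # cython: header comment at the top of each .pyx file):

```python
# setup.py -- generates .c/.cpp files only; with no setup() call, nothing is compiled.
from Cython.Build import cythonize

cythonize(
    "*.pyx",
    # compiler_directives applies to every matched file, so the directives
    # stay in one place instead of a separate shell script:
    compiler_directives={"boundscheck": False, "language_level": "3"},
)
```

Running python setup.py then writes the .c files next to their .pyx sources without ever invoking the C compiler.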
I'm trying to build the setup.py package for my Python 3.4 app which contains a Cython module. My plan is to have a setup.py file which requires Cython and compiles the .pyx file with the idea that typically I would just run that on Win32 (x86 & x64) and Mac and then generate platform wheels and upload them to PyPI, so regular users wouldn't need Cython or to compile anything.
Currently I can get the setup.py script to build the .pyd file, but it doesn't put the built .pyd file in the same location as the original .pyx file which is where I need it to be.
I tried manually copying the .pyd file, but then when I generate the wheel via bdist_wheel I get a pure Python wheel, which I think is wrong since there are different versions of my built .pyx file depending on the platform. I also tried subclassing the Distribution class and forcing is_pure() to return False (as described here in the Building Wheels section), but that doesn't work. (It still generates a pure Python wheel.)
I assume that if I can get the setup.py script to put the compiled file in the right location, then the wheel will not be pure Python and everything will be fine.
So my questions are:
Is there some kind of setting in setup.py to tell it to put the compiled Cython files into the same locations as the source files?
Or is there a way to copy the compiled files to the location I need them but for the wheel to still be built as non-pure?
Or am I going about this in the wrong way? Should I change my Python code to pull the compiled packages from somewhere else? (That doesn't seem right to me.)
[EDIT] Or should I put the compiled binary .pyd / .so / .dylib into the package, add logic to get the right .pyd based on architecture at runtime, and then have a pure Python wheel? (That also doesn't seem right to me.)
More information on my setup:
I'm using setuptools (not distutils) in my setup.py script. A snippet of my setup.py is here (with a lot of stuff removed to keep it to the stuff that's relevant):
from setuptools import setup, Extension
from Cython.Build import cythonize
from Cython.Distutils import build_ext
...
extensions = [
    Extension(name='audio_interface',
              sources=['mpf/mc/core/audio/audio_interface.pyx'],
              include_dirs=include_dirs,
              library_dirs=library_dirs,
              libraries=libraries,
              extra_objects=extra_objects,
              extra_compile_args=extra_compile_args,
              extra_link_args=extra_link_args),
...
setup(
    ...
    ext_modules=cythonize(extensions),
    cmdclass={'build_ext': build_ext},
    ...
Thanks!
Brian
I had a similar issue, and I have found a solution that could work also for you.
From the official Python documentation for distutils.core.Extension, the name argument is:
the full name of the extension, including any packages — ie. not a
filename or pathname, but Python dotted name
So, if you modify your setup.py to:
...
extensions = [
    Extension(name='mpf.mc.core.audio.audio_interface',  # using dots!
              sources=['mpf/mc/core/audio/audio_interface.pyx'],
              include_dirs=include_dirs,
              library_dirs=library_dirs,
              libraries=libraries,
              extra_objects=extra_objects,
              extra_compile_args=extra_compile_args,
              extra_link_args=extra_link_args),
...
you will have the compiled binary in the location that you want.
I found a workaround which solves the immediate problem which is to add --plat-name to the python setup.py bdist_wheel command. (This was added in wheel 0.27.0. No idea why it didn't come up in all my searches before.)
So this solves my problem via Item #2 on my list: I compile the .pyx to a .pyd or .so, copy it back to the proper location, and then use --plat-name to create a platform-specific wheel.
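For reference, that two-step workflow looks something like this (the platform tag is just an example; use the one matching your target):

```shell
# Step 1: build the extension in place (produces the .pyd/.so next to the .pyx)
python setup.py build_ext --inplace

# Step 2: build a wheel explicitly tagged for the target platform
python setup.py bdist_wheel --plat-name win_amd64
```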
However my question still somewhat remains: if we add more Cython modules, do we compile and then just copy them all back to their original locations manually, or is there a way for them to be compiled into their original locations automatically? Also, the way I'm doing it involves two different setup.py files: one to compile and a second to install the package.
So I'm adding this here for the benefit of future searchers but my original questions still remain.
My goal is to have the ability to call functions in C++ with meaningful arguments.
I can't do that with just subprocess.call because then I go into main(int argc,char** argv) and I have a bunch of strings to deal with. I do not want to have to parse matrices out of strings.
I'm trying to use Cython because that seems like the reasonable thing to do. But although there are a good number of guides for getting Cython running, most of them are for 2.7, and it's rare to see two of them advise the same thing.
My question basically is does anybody here know how to get Cython running on Py3.5? Or know of a guide or something? I'm lost.
Okay, so I had made a pretty silly mistake: I spent so much time trying to get MinGW to work, but forget that; 'msvc' does the trick. For any passersby: if you're on 3.5+ you should be using Visual Studio 2015. After installing Cython with 'pip3 install cython', create a setup.py file where you put this:
from distutils.core import setup
from Cython.Build import cythonize

setup(ext_modules=cythonize(
    "TestCython.pyx",             # our Cython source
    # sources=["Rectangle.cpp"],  # additional source file(s)
    language="c++",               # generate C++ code
))
Create a .pyx file (let's say 'TestCython.pyx') where you write whatever you want, for example 'print("Hello World")'. The sources argument for cythonize is optional.
Then cd into the directory where your .pyx and setup.py files are and run
'python setup.py build_ext --inplace --compiler=msvc'
This should compile to a .cpp and then a .pyd file (the latter is the one you will use). If you just had the hello world, it will be printed as soon as you import TestCython.
Refer to the docs and google for anything else. ;)
Python setuptools can create a source distribution:
python setup.py sdist # create a source distribution (tarball, zip file, etc.)
Or a binary distribution:
python setup.py bdist # create a built (binary) distribution
As far as I understand, there should not be any performance difference:
bdist installs the already-compiled .pyc files from the binary package.
sdist compiles the .py files to .pyc files, and installs them.
When executed, it should not matter how the .pyc files were compiled - they should have the same performance.
Is there any performance difference between bdist and sdist python packages?
If you have pure Python code, the difference in deployment time will be slim. Note that there is no difference in performance between .py and .pyc files, except that the latter are read slightly faster the first time. The so-called optimised .pyo files only strip asserts and, optionally, docstrings, so they are not very optimised at all.
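As a quick sanity check of that claim, using a throwaway module written just for this demonstration, you can compile a .py to a .pyc with the standard library and load the bytecode directly; it behaves identically to the source:

```python
import importlib.util
import os
import py_compile
import tempfile

# A throwaway module used only for this demonstration.
src = "def add(a, b):\n    return a + b\n"

with tempfile.TemporaryDirectory() as d:
    py_path = os.path.join(d, "demo.py")
    with open(py_path, "w") as f:
        f.write(src)

    # Compile the .py to a .pyc, as an install step would.
    pyc_path = py_compile.compile(py_path, cfile=os.path.join(d, "demo.pyc"))

    # Load the compiled bytecode directly, with no .py consulted at import time.
    spec = importlib.util.spec_from_file_location("demo", pyc_path)
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)
    result = mod.add(2, 3)

print(result)  # 5
```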
The big difference comes when you have C files. sdist will include them if properly referenced, but the user will need a working and appropriate compiler, the Python header files, and so on. You will also pay the build time on every client. On the other hand, the same source distribution will be valid for any platform you deploy to.
On the other hand, bdist compiles the code once. Installing in the client is immediate, as they don't need to build anything, and easier as they don't require a compiler installed. The downside is that you have to build for that platform. Setuptools is capable of doing cross-compilation, provided you have installed and configured the right tools.
I downloaded a .pyx and a .c file from the internet and I am just wondering how I can incorporate them into Python. The documentation on Cython is fairly vague and mainly focuses on generating .pyx/.c from a .py file. It would be great if you could give me some solid examples of how to do this properly. Many thanks
The Cython executable turns a .pyx file into a .c file. You then use your favorite C build tool to compile it into a shared library (e.g. an .so file on Linux). Then poof, you have an extension module. Note that you'll need to figure out all the extra arguments to your C compiler like the header paths for Python and numpy. These all depend very heavily on not only your OS but also the particulars of how you've installed Python and SciPy on it.
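For the header-path part, the standard library can at least tell you where your interpreter's headers live (numpy's headers, if needed, come from numpy.get_include(), assuming numpy is installed):

```python
import sysconfig

# Directory containing Python.h for the running interpreter.
include_dir = sysconfig.get_paths()["include"]
print(include_dir)

# A manual compile line on Linux would then look roughly like (illustrative only):
#   gcc -shared -fPIC -I<include_dir> mymodule.c -o mymodule.so
```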
If this all sounds a bit scary, see if the .pyx file is simple enough that you can use pyximport to handle all the messy compilation for you. If external libraries are needed, you'll probably need to construct a .pyxbld file.
Cython is a compiler: it converts a .pyx into a .c, which you then build to a .so or .pyd. Take a look at the Cython docs on compilation. You will probably want to use the pyximport module technique while you modify the code and experiment, and then use a setup.py once you're done and need a final version of your Cython .pyx module.
I have a .pyc in hand, but not the corresponding .py or .pyx.
Is it possible to cythonize a .pyc?
Should it be decompiled to .py first, and how?
Cython is a superset of Python, and hence you can cythonize a .py file. I say this because .pyc files can be decompiled to .py files. There are several libraries that can do this; however, I would suggest you have a look at this question asked previously.
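The reason decompilation works at all is that .pyc files keep the original names and bytecode structure. A small standard-library illustration, using a made-up function:

```python
import dis
import io

# Compile a made-up snippet the same way CPython does before writing a .pyc.
code = compile("def square(x):\n    return x * x\n", "demo.py", "exec")

# Disassemble it: names, constants, and control flow are all still visible,
# which is what decompilers reconstruct source code from.
buf = io.StringIO()
dis.dis(code, file=buf)
listing = buf.getvalue()
print("square" in listing)  # True
```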
Although this can be done, there are no real benefits: your (untyped) Python code can at most gain about a 20% speed boost.