Import external file using Cython - python

I downloaded a .pyx and a .c file from the internet, and I am just wondering how I can incorporate them into Python? The Cython documentation is fairly vague and mainly focuses on generating .pyx/.c from a .py file. It would be great if you could give me some solid examples of how to do this properly. Many thanks.

The Cython executable turns a .pyx file into a .c file. You then use your favorite C build tool to compile it into a shared library (e.g. an .so file on Linux). Then poof, you have an extension module. Note that you'll need to figure out all the extra arguments to your C compiler like the header paths for Python and numpy. These all depend very heavily on not only your OS but also the particulars of how you've installed Python and SciPy on it.
If this all sounds a bit scary, see if the .pyx file is simple enough that you can use pyximport to handle all the messy compilation for you. If external libraries are needed, you'll probably need to construct a .pyxbld file.
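For example, here is a minimal sketch of the pyximport route, assuming the downloaded file is named fastmath.pyx (a placeholder name) and needs no external C libraries:

import pyximport
pyximport.install()   # compiles .pyx files on import, caching the build

import fastmath       # pyximport builds fastmath.pyx into an extension module here

If fastmath.pyx did need extra include paths or libraries, that is where a fastmath.pyxbld file next to it would come in.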

Cython is a compiler: it converts a .pyx into a .c, which you then build into a .so or .pyd. Take a look at the Cython docs on compilation. You will probably want to use the pyximport technique while you are modifying the code and experimenting, and then switch to a setup.py once you are done and need a final version of your Cython .pyx module.
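For the setup.py route, a minimal sketch (assuming the downloaded file is called downloaded_module.pyx, a placeholder name, with no special compile flags) looks like this:

from setuptools import setup
from Cython.Build import cythonize

setup(ext_modules=cythonize("downloaded_module.pyx"))

Running python setup.py build_ext --inplace then produces the .so/.pyd next to your sources, ready to import.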

Related

Extending Python 3.5 (Windows) with C++

My goal is to have the ability to call functions in C++ with meaningful arguments.
I can't do that with just subprocess.call because then I go into main(int argc,char** argv) and I have a bunch of strings to deal with. I do not want to have to parse matrices out of strings.
I'm trying to use Cython because that seems like the reasonable thing to do. But although there are a good number of guides for getting Cython running, most of them are for 2.7, and it's rare to see two of them advise the same thing.
My question basically is: does anybody here know how to get Cython running on Python 3.5? Or know of a guide or something? I'm lost.
Okay, so I had made a pretty silly mistake with the compiler: I spent so much time trying to get MinGW to work, but forget that, 'msvc' does the trick. For any passersby: if you're on 3.5+ you should be using Visual Studio 2015. After installing Cython with 'pip3 install cython', create a setup.py file where you put this:
from distutils.core import setup
from Cython.Build import cythonize

setup(ext_modules=cythonize(
    "TestCython.pyx",            # our Cython source
    # sources=["Rectangle.cpp"], # additional source file(s)
    language="c++",              # generate C++ code
))
Create a .pyx file (let's say 'TestCython.pyx') where you write whatever you want, for example 'print("Hello World")'. The sources argument for cythonize is optional.
Then cd into the directory where your .pyx and setup.py files are and run
'python setup.py build_ext --inplace --compiler=msvc'
This should compile to a .cpp and then to a .pyd file (the latter is the one you will use). If you just had the hello world, it will get printed as soon as you import TestCython.
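Assuming the build succeeded, the compiled module imports like any other; a tiny sketch:

import TestCython   # the hello-world example prints as soon as the module is imported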
Refer to the docs and google for anything else. ;)

How to auto compile python/c extension?

I wrote a python/c extension file lda_model.c
and I added setup.py:
from setuptools import setup, Extension
modules = [Extension('c_lda_model', sources=["lda_model.c"])]
setup(ext_modules=modules)
Now I have to compile the C code by
python setup.py build
before running the Python code that calls the C code.
Is there any way to automatically compile the invoked C extension while running the Python code?
Not with the standard way of writing extensions, which is what you are doing.
However, there are a couple of other approaches to writing native-code extensions for Python that do compilation at execution time.
One such example is Weave, which comes with SciPy; there are other ways if you look for them.
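As a rough sketch of the Weave approach (note that scipy.weave only shipped with the old Python 2-era SciPy and has since been removed), the C snippet below is compiled the first time the line runs and cached afterwards:

from scipy import weave   # only available in old SciPy releases

def multiply(a, b):
    code = "return_val = a * b;"   # C code compiled at execution time
    return weave.inline(code, ['a', 'b'])

print(multiply(6, 7))   # 42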

How do I get the source code of various functions used in python library ta-lib?

I want to cross-verify the functions used in the module and also want a brief understanding of how they are implemented.
The library can be found here: https://github.com/mrjbq7/ta-lib
I have checked in my Python lib directory and all the functions are given in .pyd format, so I don't know how to access them.
In short: you can't, not like this.
Why? .pyd files are Python "extension" modules written in C or C++. Apparently, for your lib, the code is written in C and then wrapped for Python using Cython (the .pyx files). So if you want to have a look at the source, look at the C source code in the GitHub repo (if you understand C, of course).

Building a ctypes-"based" C library with distutils

Following this recommendation, I have written a native C extension library to optimise part of a Python module via ctypes. I chose ctypes over writing a CPython-native library because it was quicker and easier (just a few functions with all tight loops inside).
I've now hit a snag. If I want my work to be easily installable using distutils via python setup.py install, then distutils needs to be able to build my shared library and install it (presumably into /usr/lib/myproject). However, this is not a Python extension module, and so as far as I can tell, distutils cannot do this.
I've found a few references to other people with this problem:
Someone on numpy-discussion with a hack back in 2006.
Somebody asking on distutils-sig and not getting an answer.
Somebody asking on the main python list and being pointed to the innards of an existing project.
I am aware that I can do something native and not use distutils for the shared library, or indeed use my distribution's packaging system. My concern is that this will limit usability as not everyone will be able to install it easily.
So my question is: what is the current best way of distributing a shared library with distutils that will be used by ctypes but otherwise is OS-native and not a Python extension module?
Feel free to answer with one of the hacks linked to above if you can expand on it and justify why that is the best way. If there is nothing better, at least all the information will be in one place.
The distutils documentation here states that:
A C extension for CPython is a shared library (e.g. a .so file on Linux, .pyd on Windows), which exports an initialization function.
So the only difference from a plain shared library seems to be the initialization function (besides a sensible file naming convention, which I don't think you have any problem with). Now, if you take a look at distutils.command.build_ext you will see it defines a get_export_symbols() method that:
Return the list of symbols that a shared extension has to export. This either uses 'ext.export_symbols' or, if it's not provided, "PyInit_" + module_name. Only relevant on Windows, where the .pyd file (DLL) must export the module "PyInit_" function.
So using it for plain shared libraries should work out of the box, except on Windows. But it's easy to fix that too. The return value of get_export_symbols() is passed to distutils.ccompiler.CCompiler.link(), whose documentation states:
'export_symbols' is a list of symbols that the shared library will export. (This appears to be relevant only on Windows.)
So not adding the initialization function to the export symbols will do the trick. For that you just need to trivially override build_ext.get_export_symbols().
Also, you might want to simplify the module name. Here is a complete example of a build_ext subclass that can build ctypes modules as well as extension modules:
from distutils.core import setup, Extension
from distutils.command.build_ext import build_ext


class build_ext(build_ext):

    def build_extension(self, ext):
        self._ctypes = isinstance(ext, CTypes)
        return super().build_extension(ext)

    def get_export_symbols(self, ext):
        if self._ctypes:
            return ext.export_symbols
        return super().get_export_symbols(ext)

    def get_ext_filename(self, ext_name):
        if self._ctypes:
            return ext_name + '.so'
        return super().get_ext_filename(ext_name)


class CTypes(Extension):
    pass


setup(name='testct', version='1.0',
      ext_modules=[CTypes('ct', sources=['testct/ct.c']),
                   Extension('ext', sources=['testct/ext.c'])],
      cmdclass={'build_ext': build_ext})
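Once built, the library is loaded with ctypes rather than imported. A rough sketch, assuming ct.c exports a hypothetical function int add(int, int):

import ctypes

lib = ctypes.CDLL('./ct.so')   # path may differ with --inplace vs. an installed package
lib.add.argtypes = (ctypes.c_int, ctypes.c_int)
lib.add.restype = ctypes.c_int

print(lib.add(2, 3))   # -> 5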
I have set up a minimal working Python package with a ctypes extension here:
https://github.com/himbeles/ctypes-example
which works on Windows, Mac, and Linux.
It takes the approach from memeplex above of overriding build_ext.get_export_symbols() and forcing the library extension to be the same (.so) for all operating systems.
Additionally, a compiler directive in the C/C++ source code ensures proper export of the shared library symbols on Windows vs. Unix.
As a bonus, the binary wheels are automatically compiled by a GitHub Action for all operating systems :-)
Some clarifications here:
It's not a "ctypes based" library. It's just a standard C library, and you want to install it with distutils. If you use a C-extension, ctypes or cython to wrap that library is irrelevant for the question.
Since the library apparently isn't generic, but just contains optimizations for your application, the recommendation you link to doesn't apply to you, in your case it is probably easier to write a C-extension or to use Cython, in which case your problem is avoided.
For the actual question, you can always use your own custom distutils command, and in fact one of the discussions linked to just such a command, the OOF2 build_shlib command, that does what you want. In this case though you want to install a custom library that really isn't shared, and then I think you don't need to install it in /usr/lib/yourproject, but you can install it into the package directory in /usr/lib/python-x.x/site-packages/yourmodule, together with your python files. But I'm not 100% sure of that so you'll have to try.

Any python "compiler" that can statically link the python2x.dll dependency?

It's my understanding that py2exe can only dynamically link a python2x.dll file. Are there any Python "compilers" out there that can package it all into one standalone .exe file for easier portability?
If so or if not, which is the best compiler z0mg!
If you check the bottom of the py2exe SingleFileExecutable wiki page you'll see that it can create one-file executables. They do include the DLL inside, but you shouldn't notice that. I believe it works with a freakish hack that intercepts the LoadLibrary calls to allow them to read from elsewhere in the .exe file, but again you shouldn't notice that. We've used it before... it works.
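For reference, the single-file recipe from that wiki page boils down to a setup.py along these lines (a sketch, assuming your entry point is a script called myscript.py):

from distutils.core import setup
import py2exe

setup(
    console=['myscript.py'],
    options={'py2exe': {'bundle_files': 1, 'compressed': True}},
    zipfile=None,   # fold the library archive into the .exe as well
)

Run it with python setup.py py2exe and a single dist\myscript.exe comes out.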
PyInstaller claims to be able to create a single-executable that's user-friendly. Perhaps that would meet your needs. I've never used it.
py2exe can package it all into a single executable, without needing any Python installation on the target system. It may include python2x.dll inside, but how does that matter for the end user?
From what I understand, it is possible to statically link Python into an executable, but then you lose the ability to load other dynamic modules (.pyd files) like os and zlib and math, unless you are able to statically compile those into your main program as well.
And as far as I know, the only compiler that can do this is the C compiler that is compiling python from source. :)
I'm not sure it's worth the effort at all.
Better to just use py2exe and create a directory of files that can be zipped and shipped.
