Nuitka - compiling module including precompiled .so libraries - python

I need to use both Nuitka (for code obfuscation) and Numba (for code optimization), but Nuitka crashes when executing @njit-decorated code (understandably).
My idea is to precompile some code with Numba using its AOT compilation feature, so that Nuitka can pick it up later when doing its thing.
Issues I encountered:
- When I leave only the precompiled .so file, Nuitka ignores it => ImportError.
- When I leave the source .py file next to the .so file, Nuitka uses the .py file despite seeing both (it displays a warning about finding both files).
- I tried adding --no-prefer-source-code (which should be the default), but it still uses the .py file instead of the precompiled .so.
Has anyone ever managed to import external .so modules in a Nuitka process?
NB: I know I can just copy the precompiled Numba .so file next to the compiled Nuitka .so and have it imported "normally", but that does not scale if I add more precompiled Numba code in the future.
The idea would be to include the precompiled Numba .so inside Nuitka's .so (is that even possible?).
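For the AOT side of this, Numba's documented workflow goes through numba.pycc (note that this module is deprecated in recent Numba releases). Below is a minimal sketch, assuming Numba with pycc support is installed; the module name fast_ops and the function square are hypothetical. Running the script produces a regular extension module (fast_ops.<abi>.so, or .pyd on Windows) that no longer needs Numba at run time:

```python
# compile_aot.py -- sketch of Numba's AOT workflow via numba.pycc.
# Assumes a Numba version that still ships pycc; names are hypothetical.
from numba.pycc import CC

cc = CC('fast_ops')  # output module name: fast_ops.<abi>.so / .pyd

@cc.export('square', 'f8(f8)')  # exported symbol and its type signature
def square(x):
    return x * x

if __name__ == '__main__':
    cc.compile()  # writes the extension module next to this script
```

The resulting module is imported like any other extension module (from fast_ops import square), which is exactly the kind of file the question is trying to get Nuitka to include.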

Related

Does Cython compile imported modules as part of the binary?

I'm just now reading up on Cython, and I'm wondering whether Cython compiles imported modules into the executable, or whether you still need to have the modules installed on the target machine to run the Cython binary.
The "interface" of a Cython module remains at the Python level. When you import a module in Cython, the module becomes available only at the Python level of the code and uses the regular Python import mechanism.
So:
Cython does not "compile in" the dependencies.
You need to install the dependencies on the target machine.
For "Cython level" code, including "cimporting" modules, Cython uses the equivalent of C headers (the .pxd declaration files) plus dynamically loaded libraries to access external code. The shared libraries (.so on Linux, .dll on Windows, .dylib on macOS) need to be present on the target machine.
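To make the first point concrete: a compiled Cython module still resolves plain import statements through the ordinary import machinery at run time, so you can check on the target machine whether a dependency will resolve. A minimal sketch using only the standard library:

```python
import importlib.util

def dependency_available(name):
    """Return True if `name` would resolve via the normal import machinery."""
    return importlib.util.find_spec(name) is not None

print(dependency_available("json"))               # stdlib module: True
print(dependency_available("surely_missing_mod"))  # not installed: False
```

This is the same lookup a compiled module performs when it hits an import, which is why the dependency must be installed rather than "compiled in".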

PyPy + Cpython extension written with Rust (rust-cpython)

I was going to try using PyPy. But an extension (.so file) I wrote with rust-cpython can't be loaded when executed with pypy3:
ImportError: No module named 'pkg.lib'
where lib is my lib.so file.
CPython(3.5) loads it fine. I thought PyPy had support for loading CPython extensions.
If not, what do I need to do to load a .so file compiled with Rust (rust-cpython)?
PyPy is only source-compatible with CPython's C extension modules: you need to recompile from the .c sources. Normally this is done by running setup.py with PyPy instead of CPython.

Python Import error for f2py modules compiled with OpenMP

I'm currently experiencing an issue wrapping some Fortran subroutines for use in a python3 script. The issue has only come up since I attempted to use OpenMP in the subroutines.
For example, if I compile a module 'test.pyd' using f2py -c -m --fcompiler=gfortran --compiler=mingw32 --f90flags='-fopenmp' test test.f90 -lgomp, in which 'test.f90' is a Fortran subroutine which contains a parallelized loop, upon attempting to import this module into my script, I encounter ImportError: DLL load failed: A dynamic link library (DLL) initialization routine failed..
Removing the -fopenmp flag when compiling, or the !$omp comments in the Fortran subroutine, removes this error.
If I change the subroutine into a roughly equivalent Fortran program, it compiles to a .exe and runs correctly in parallel.
I'm on a Windows 10 platform with an AMD64 processor, using the GNU Fortran and C compilers from TDM-GCC.
I just tried your build command, and it looks perfectly fine. I am able to run a parallel subroutine from a Python module compiled exactly the way you are doing it.
How are you executing the Python code that uses your module? I think the problem is that you don't have the OpenMP DLL (named libgomp-1.dll) in your path.
I would advise you to run (from a bash shell):
where libgomp-1.dll
If the command can't find it, then you should probably add the directory containing your OpenMP DLL (usually "C:\tools\mingw64\bin\") to your PATH.
In order to do this, you can use:
export PATH=$PATH:C:\tools\mingw64\bin\ && python script_using_module.py
There is a good chance the way you are executing your python code doesn't account properly for the path, since you can run the parallel executable without a problem.
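One more Windows-specific wrinkle worth checking: since Python 3.8 on Windows, PATH is no longer searched when loading the dependencies of a compiled extension; directories must be registered with os.add_dll_directory instead. A hedged sketch (the mingw64 path is only the usual default from the answer above; adjust it to wherever libgomp-1.dll actually lives):

```python
import os
import sys

def add_dll_dir(path, platform=None):
    """Register a directory for extension-module DLL lookup (Windows only).

    On Python 3.8+ under Windows, os.add_dll_directory controls where the
    dependencies of a compiled extension are searched for. Returns the
    handle (keep it alive) on Windows, None elsewhere.
    """
    platform = platform or sys.platform
    if platform == "win32" and hasattr(os, "add_dll_directory"):
        return os.add_dll_directory(path)
    return None  # no-op on Linux/macOS, where the loader uses rpath etc.

# Hypothetical usage, before importing the f2py-compiled module:
# add_dll_dir(r"C:\tools\mingw64\bin")
# import test  # the compiled f2py module
```

If exporting PATH does not help on a recent Python, this registration step is the likely missing piece.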

How to make a module using ctypes cross-platform?

I'm writing a python module that uses ctypes to call C functions. I'm developing on Windows, so I've compiled the C code to a .dll file. What's the best way to make my module work smoothly on other platforms?
Would it suffice to compile an additional .so file for Unix systems, include both of the shared object files in my module, and have the Python code load one of the two files depending on which system it's running on? If so, how can I compile a Unix-compatible .so file on Windows?
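A common pattern for the first part is to pick the platform-specific file name at import time and load it with ctypes. The sketch below assumes a hypothetical library stem "mylib" shipped next to the module (as mylib.dll, libmylib.so, or libmylib.dylib); note that producing a Unix .so normally requires building on (or cross-compiling for) the target platform, so it cannot simply be generated from Windows tooling:

```python
import ctypes
import sys
from pathlib import Path

def shared_lib_name(stem, platform=None):
    """Map a library stem to its platform-specific shared-library file name."""
    platform = platform or sys.platform
    if platform == "win32":
        return f"{stem}.dll"       # Windows
    if platform == "darwin":
        return f"lib{stem}.dylib"  # macOS
    return f"lib{stem}.so"         # Linux and other Unixes

# Hypothetical usage: load the library sitting next to this module.
# lib = ctypes.CDLL(str(Path(__file__).parent / shared_lib_name("mylib")))
```

Shipping one compiled artifact per platform and selecting among them this way is how many ctypes-based packages handle portability.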

Cythonize but not compile .pyx files using setup.py

I have a Cython project containing several .pyx files. To distribute my project, I would like to provide the generated .c files, as recommended in the Cython documentation, to minimize problems with different Cython versions.
My current work flow is to build the project using:
me@machine$ python setup.py build_ext --inplace
This cythonizes (i.e. translates .pyx files to .c / .cpp files using Cython) all .pyx files and then compiles them.
For a release I do not need the compiled (.so) files, so I basically ignore them. The problem is that I waste a lot of time on the useless compilation step.
Is there a way to cythonize all .pyx files in the folder using the setup.py, without compiling them?
Edit: Why not just use $ cython my_file.pyx?
I have about 20 .pyx files which have different compiler directives. Calling cython on the command line for each file would be slower than just waiting for it to compile.
On the other hand, creating a shell script to cythonize them would leave me with a second list of compiler directives that needs to be kept up to date. I would rather avoid this, to keep my possible points of failure to a minimum.
One way to create the C files without compiling them is to remove the setup(...) call from setup.py and keep only the cythonize("*.pyx") part. Then run:
me@machine$ python setup.py
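The stripped-down setup.py this describes might look like the following sketch (it assumes the Cython package is installed; the compiler directive shown is only a placeholder for whatever directives your project declares per module):

```python
# setup.py -- regenerate the .c files only, without compiling extensions.
# Sketch: cythonize() translates every matching .pyx to .c; no setup()
# call means no build_ext step ever runs.
from Cython.Build import cythonize

cythonize("*.pyx", compiler_directives={"language_level": "3"})
```

Because the per-file directives live in the .pyx sources (or in this one call), there is no second list of directives to keep in sync.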
