Renamed .c files to .cpp, importing Cython library fails - python

I had a working Cython program which wrapped some C libraries and custom C code. Recently, I had to switch my project to C++, so I renamed all my C code to *.cpp. Cython compiled fine and produced the .so file. However, when I try to import my library in Python, I get the following error.
File "example.py", line 1, in <module>
from tag36h11_detector import detect
ImportError: dlopen(/Users/ajay/Documents/Code/calibration/apriltag_detector/examples/tag36h11_detector.cpython-36m-darwin.so, 2): Symbol not found: _free_detection_payload
Referenced from: /Users/ajay/Documents/Code/calibration/apriltag_detector/examples/tag36h11_detector.cpython-36m-darwin.so
Expected in: flat namespace
in /Users/ajay/Documents/Code/calibration/apriltag_detector/examples/tag36h11_detector.cpython-36m-darwin.so
Because I'm not sure about the source of the error, I'm not sure what relevant information to provide.
Here's my setup.py
from distutils.core import setup, Extension
from Cython.Build import cythonize
import numpy
setup(ext_modules=cythonize(Extension(
    name='tag36h11_detector',
    sources=["tag36h11_detector.pyx",
             "tag36h11_detector/tag36h11_detector.cpp"],
    include_dirs=["/usr/local/include/apriltag", numpy.get_include()],
    libraries=["apriltag"])))
I compile it with python setup.py build_ext --inplace
Thanks for any assistance!

Add language=c++ to your setup:
setup(ext_modules=cythonize(Extension(
    name='XXX',
    ....
    language="c++",
)))
You probably use gcc. The frontend of gcc (and many other compilers) decides whether the file is compiled as C (cc1 is used) or C++ (cc1plus is used) depending on its extension: ".c" is C, ".cpp" is C++.
If you add extra_compile_args=["-v"] to your setup, you can see exactly which compiler is used:
Cython creates "tag36h11_detector.c" and because of its extension the C-compiler (cc1) is used.
For the file "tag36h11_detector/tag36h11_detector.cpp" the C++ compiler (cc1plus) is used.
One of the differences between C and C++ is name mangling: C expects that the names of the symbols in the object files are not mangled, but C++ mangles them.
For example, for a function with signature int test(int), C tells the linker to search for a symbol called test, but C++ creates a symbol called _Z4testi instead, so the unmangled name cannot be found in the linkage step.
Now, what happens during the linkage? On Linux, the default behavior when linking a shared object is to allow undefined symbols. It is implicitly assumed that those symbols will be available at run time, when the shared library is loaded. That means the program fails only when the shared object is loaded and a symbol cannot be found, i.e. when you import your module.
You could add extra_link_args=["-Wl,--no-undefined"] to ensure that the build fails if there are undefined symbols, in order to avoid surprises at run time.
One way to fix it would be to tell the C++ compiler to emit unmangled names by using extern "C" in your code, as pointed out in the comments.
A less intrusive approach is to make clear to the compiler that the C++ convention is used, by adding language="c++" to the setup.
With language="c++", Cython creates "XXX.cpp" (instead of "XXX.c") from "XXX.pyx", and thus gcc chooses the C++ compiler for the cythonized file, which is aware of the correct name mangling.

Related

"Undefined reference to" errors using numpy array in C++

I'm trying to compile some C++ code into a dll to import into Python for the first time. I want to be able to return a Numpy array from one of the functions, with an example line that looks like lNumpyArray = PyArray_SimpleNewFromData( 2, lDimensions, NPY_UINT8, (void*)lImage->GetDataPointer() );
At the start of the code I have included Python.h and arrayobject.h.
Using GCC (working on Windows) I have been able to compile the code to a .o file without errors. However, when trying to build a DLL, I'm getting a lot of errors like undefined reference to '__imp__Py_Dealloc'. From my limited understanding, it might be because I'm missing a library somewhere in the linker step. Is there some other library I need to include for using NumPy arrays in C, or should I be looking elsewhere? The gcc command I've been using is included below.
gcc -Wall -shared Pipeline.cpp -I"C:/Python38/include" -I"C:/Program Files/Pleora Technologies Inc/eBUS SDK/Includes" -I "C:/Python38/Lib/site-packages/numpy/core/include" -L"C:/Python38/libs" -L"C:/Program Files/Pleora Technologies Inc/eBUS SDK/Libraries" -o lib.dll -lPvBuffer64 -lPvDevice64 -lPvStream64 -lPvAppUtils64 -lPvSystem64 -lSimpleImagingLib64 -lPvGenICam64 -lPvSerial64 -lPvBase64 -lPtUtilsLib64
I would avoid using the C Python interface to NumPy and use a library like xtensor (found here).

How do I use the correct dll files to enable 3rd party C libraries in a Cython C extension?

I have a C function that involves decompressing data using zstd. I am attempting to call that function using Cython.
Using this page from the docs as a guide I can compile and run the code below with no problem.
(I don't actually use the zstd lib here)
// hello.c
#include <stdio.h>
#include <stdlib.h>  /* for malloc */
#include <zstd.h>
int hello() {
    printf("Hello, World!\n");
    void *next_in = malloc(0);
    void *next_out = malloc(0);
    return 0;
}
# Hello.pyx
cdef extern from "hello.c":
    int hello()

cpdef int callHello():
    return hello()
# hello_wrapper.setup.py
from setuptools import setup, Extension
from Cython.Build import cythonize
ext_modules = [
    Extension(
        "hello_wrapper",
        ["hello_wrapper.pyx"],
        libraries=["zstd"],
        library_dirs=["path/to/zstd/lib"],
        include_dirs=['path/to/zstd/include'],
    )
]
setup(
    ext_modules = cythonize(ext_modules, gdb_debug=True)
)
Using the commands as follows I get the expected output:
>py hello_wrapper.setup.py build_ext --inplace
>py
Python 3.8.3 (tags/v3.8.3:6f8c832, May 13 2020, 22:20:19) [MSC v.1925 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import hello_wrapper
>>> hello_wrapper.callHello()
Hello, World!
0
However when I modify hello.c to actually use the zstd library:
// hello.c
#include <stdio.h>
#include <stdlib.h>  /* for malloc */
#include <zstd.h>
int hello() {
    printf("Hello, World!\n");
    void *next_in = malloc(0);
    void *next_out = malloc(0);
    size_t const dSize = ZSTD_decompress(next_out, 0, next_in, 0); // the added line
    return 0;
}
While hello_wrapper.setup.py compiles fine, when I get to the import statement, I get the following error:
>>> import hello_wrapper
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: DLL load failed while importing hello_wrapper: The specified module could not be found.
From reading this SO article, I gather that this error means I'm not correctly pointing to (or perhaps not creating in the first place) the required DLL files for zstd.lib to work its magic. Is this correct? If so, how might I do that? If not, what is the problem?
We link our Cython extension against a Windows DLL, which means:
the *.lib file (i.e. zstd.lib) is needed in "path/to/zstd/lib" at compile time
the *.dll file (i.e. zstd.dll) is needed somewhere Windows can find it when the module is imported.
Normally, Windows will not look in "path/to/zstd/lib", and so we get a somewhat cryptic error message:
ImportError: DLL load failed: The specified module could not be found.
This doesn't mean there is something wrong with the module - it just happens to depend on a DLL which cannot be found.
While Linux has the -rpath option for the linker, with which "path/to/zstd/lib" could be passed (it can be set via the runtime_library_dirs argument to Extension), there is no such option on Windows.
The DLL search algorithm for Windows can be found here. In a nutshell, the DLL is searched for in (possibly in a different order than presented here):
The directory from which the module is loaded.
The current working directory.
system-directory (e.g. C:\Windows\System32)
windows-directory(e.g. C:\Windows)
directories that are listed in the PATH-variable
others
However, since Python 3.8 the above default algorithm isn't used by CPython: the current working directory and the PATH variable are no longer searched, but there is os.add_dll_directory, which can be used to add paths that will be searched when resolving dependencies.
Putting the DLL into the system or Windows directory doesn't sound too appealing, which leaves us with the following options:
(the easiest?) copy zstd.dll next to the compiled extension
use os.add_dll_directory to add the location to the search, for Python >= 3.8
add the zstd path to the PATH variable, e.g. set PATH="path/to/zstd/lib";%PATH% (for Python < 3.8)
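The os.add_dll_directory route can be wrapped so it degrades gracefully on older Pythons and other platforms (a sketch; the helper name is my own):

```python
import os

def add_dll_search_path(path):
    # On Windows with Python >= 3.8, register an extra directory that is
    # searched when resolving the DLL dependencies of extension modules.
    # Must be called *before* importing the extension. No-op elsewhere.
    if os.name == "nt" and hasattr(os, "add_dll_directory"):
        return os.add_dll_directory(path)
    return None

# Hypothetical usage ("path/to/zstd/lib" is the directory holding zstd.dll):
# add_dll_search_path(r"path/to/zstd/lib")
# import hello_wrapper  # its zstd.dll dependency can now be resolved
```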
Another option is somewhat more tricky: Given that
If a DLL with the same module name is already loaded in memory, the system checks only for redirection and a manifest before resolving to the loaded DLL, no matter which directory it is in. The system does not search for the DLL.
we can use ctypes to "preload" the right DLL, which will then be used (without the need to search for it on disk) when the wrapper module is imported, i.e.:
import ctypes
ctypes.CDLL("path/to/zstd/lib/zstd.dll")  # we preload with the full path
import hello_wrapper  # works now!
The above applies if the extension is built and used on the same system (e.g. via build_ext --inplace). Installation/distribution is somewhat more cumbersome (this is covered by this SO post); one idea would be:
put the *.h, *.lib and *.dll files into package_data (it seems to happen automatically anyway)
set the right relative library_dirs (or programmatically the absolute path) in setup.py, so the *.lib is found by the linker
the DLL will then be put next to the compiled *.pyd file in the installation.
An example could be the following more or less minimal setup.py, where everything (pyx-file, h-files, lib-file, dll-file) are put into a package/folder src/zstd:
from setuptools import setup, Extension, find_packages
from Cython.Build import cythonize
ext_modules = [
    Extension(
        "zstd.zstdwrapper",
        ["src/zstd/zstdwrapper.pyx"],
        libraries=["zstd"],
        library_dirs=["src/zstd"],
        include_dirs=[],  # set automatically to src/zstd during the build
    )
]

print(find_packages(where='src'))

setup(
    name = 'zstdwrapper',
    ext_modules = cythonize(ext_modules),
    packages = find_packages(where='src'),
    package_dir = {"": "src"},
)
And now it can be installed with python setup.py install, or used to create e.g. a source distribution via python setup.py sdist, which can then be installed via pip.

Cython UserWarning: Unknown Extension options: 'cython_compile_time_env'

Using Cython version 0.11.2, I would like to conditionally compile some parts of code in a .pyx file.
Below is the code in setup.py:
CythonExtension('sslip2', sources = ['sslip2.pyx'],
                cython_compile_time_env=dict(CONCOM=1),
                libraries = ['ssl'],
),
and in the sslip2.pyx file:
IF CONCOM == 1:
    def something():
but this returns the warning below, which is not helpful:
UserWarning: Unknown Extension options: 'cython_compile_time_env'
Is it possible to define a variable in the Cython Extension and use it in the .pyx file for conditional compilation? I am new to Cython and I think something is missing here; I would really appreciate any pointers on conditionally compiling the code.
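For reference: cython_compile_time_env is not a recognized Extension option, which is exactly why distutils emits the "Unknown Extension options" warning. In later Cython versions the compile-time environment is passed to cythonize() instead; a sketch under that assumption:

```python
from setuptools import setup, Extension
from Cython.Build import cythonize

extensions = [Extension('sslip2', sources=['sslip2.pyx'], libraries=['ssl'])]

setup(
    ext_modules=cythonize(
        extensions,
        # values visible to IF/DEF blocks in sslip2.pyx at compile time
        compile_time_env=dict(CONCOM=1),
    ),
)
```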

Issue when using the shared object created with Python C Extension

I am writing a Python C extension on Linux (RHEL 6.3) with Python 2.6.6.
There is a shared library lib_common.so, and I have written C code (a Python C extension) to call the methods in the library lib_common.so.
I have created a setup.py which includes the library and the C code.
It was able to create the module mymod.so (mymod) successfully.
I copied this .so to the /usr/lib64/python2.6/site-packages/ directory, and I also copied lib_common.so to the same directory.
Now when I invoke the Python interpreter and import the module (mymod), I get an error which says that the function present in lib_common.so is undefined:
ImportError: /usr/lib64/python2.6/site-packages/mymod.so: undefined symbol: My_Fun
Am I missing a step here that is causing this error?
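Since the setup.py isn't shown, one likely cause is that the extension was compiled against the headers but never linked against lib_common.so, leaving the reference to My_Fun unresolved until import. A sketch of a setup.py that does link it (paths and names are assumptions based on the question):

```python
from distutils.core import setup, Extension

# Assumed location of lib_common.so, per the question
site_packages = '/usr/lib64/python2.6/site-packages'

setup(
    name='mymod',
    ext_modules=[Extension(
        'mymod',
        sources=['mymod.c'],              # hypothetical source file name
        libraries=['_common'],            # -l_common resolves to lib_common.so
        library_dirs=[site_packages],     # where lib_common.so lives at link time
        runtime_library_dirs=[site_packages],  # embedded rpath for import time
    )],
)
```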

Cython built extension fails to export data types and functions

I have just managed to build my first C extension for Python, using Cython to call into an existing C library.
I declared and defined my data types and functions in logical components (following the logical structure of the C library), and I combined them into one pyx file after errors occurred when I tried to add the files individually (IIRC I got an error along the lines of "init already defined"; after researching the issue on Google, I found that I had to combine all the pyx files into one pyx file) - see this link.
This is a copy of the contents of my foo.pyx file:
#include "myarray.pyx"
#include "myset.pyx"
#include "mycalc.pyx"
and this is a copy of my setup file:
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext
setup(
    cmdclass = {'build_ext': build_ext},
    ext_modules = [Extension("foo", ["foo.pyx"],
                             libraries=["foo_core"])
    ]
)
The extension gets built successfully into foo.so.
I can then type "import foo" at the Python CLI. That also works. However, when I try to access any of the classes I declared/defined in myarray.pxd, myarray.pyx, etc., I get the error message:
AttributeError: 'module' object has no attribute 'myArray'
I then tried dir() to see what the foo module was exporting. To my surprise, this is what it listed:
>>> dir(foo)
['__builtins__', '__doc__', '__file__', '__name__', '__package__', '__test__']
Why is Cython failing to export the structs, classes and functions I have declared and defined? I don't think there is anything wrong with my pxd and pyx files because, like I said, it compiles successfully and the shared lib (Python extension) is produced.
I am using Cython 0.15.1 and Python 2.6.5 on Ubuntu.
# marks the start of a comment line in Cython (there is no C preprocessor here), so your foo.pyx is effectively empty.
Even Cython's own include statement is a blunt instrument. Use *.pxd declaration files and cimport instead.
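A minimal sketch of the layout this suggests (all names are illustrative, modeled on the question):

```cython
# myarray.pxd -- declarations that other modules can cimport:
cdef class myArray:
    cdef int length

# myarray.pyx -- the matching implementation, compiled as its own module:
cdef class myArray:
    def __cinit__(self, int length):
        self.length = length

# foo.pyx -- instead of the C preprocessor's #include, either use Cython's
# include statement (textual inclusion, the blunt instrument):
#     include "myarray.pyx"
# or, preferably, keep myarray as a separate extension module and write:
#     from myarray cimport myArray
```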
