Advice on creating Python libraries from C++?

I recently created a 3D and 2D force-layout diagram visualizer in C++ using the OpenGL library (it uses some physics, sort of). Can someone give me some introductory pointers on making this usable as a Python library (considerations and potential pitfalls I might run into)?

If I understand your question correctly, you want to know how to write C extensions for use in Python. Here's a trivial example of how (note that Py_InitModule3 and the build output below are Python 2 specific; Python 3 initializes extension modules with a PyModuleDef structure and a PyInit_<name> function instead, as in the Armadillo question further down):
hello.c:
#include <Python.h>

static PyObject* helloworld(PyObject* self)
{
    return Py_BuildValue("s", "Hello, Python extensions!!");
}

static char helloworld_docs[] =
    "helloworld( ): Any message you want to put here!!\n";

static PyMethodDef helloworld_funcs[] = {
    {"helloworld", (PyCFunction)helloworld,
     METH_NOARGS, helloworld_docs},
    {NULL}
};

void inithelloworld(void)
{
    Py_InitModule3("helloworld", helloworld_funcs,
                   "Extension module example!");
}
setup.py:
#!/usr/bin/env python
from setuptools import setup, Extension

setup(
    name='helloworld',
    version='1.0',
    ext_modules=[
        Extension('helloworld', ['hello.c'])
    ]
)
Build:
$ python setup.py develop
running develop
running egg_info
creating helloworld.egg-info
writing helloworld.egg-info/PKG-INFO
writing top-level names to helloworld.egg-info/top_level.txt
writing dependency_links to helloworld.egg-info/dependency_links.txt
writing manifest file 'helloworld.egg-info/SOURCES.txt'
reading manifest file 'helloworld.egg-info/SOURCES.txt'
writing manifest file 'helloworld.egg-info/SOURCES.txt'
running build_ext
building 'helloworld' extension
creating build
creating build/temp.linux-x86_64-2.7
x86_64-linux-gnu-gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/python2.7 -c hello.c -o build/temp.linux-x86_64-2.7/hello.o
creating build/lib.linux-x86_64-2.7
x86_64-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -D_FORTIFY_SOURCE=2 -g -fstack-protector --param=ssp-buffer-size=4 -Wformat -Werror=format-security build/temp.linux-x86_64-2.7/hello.o -o build/lib.linux-x86_64-2.7/helloworld.so
copying build/lib.linux-x86_64-2.7/helloworld.so ->
Creating /home/prologic/.virtualenvs/hellopyc/lib/python2.7/site-packages/helloworld.egg-link (link to .)
Adding helloworld 1.0 to easy-install.pth file
Installed /home/prologic/tmp/hello-py-c
Processing dependencies for helloworld==1.0
Finished processing dependencies for helloworld==1.0
Test:
$ python
Python 2.7.6 (default, Mar 22 2014, 22:59:56)
[GCC 4.8.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import helloworld
>>> helloworld.helloworld()
'Hello, Python extensions!!'
>>>
Good luck!

While James Mills's answer is fine, I would consider another option: use the Boost.Python library to write Python extensions. Here is a basic example. It may look like Boost.Python is not very actively developed, but it still ships with every Boost release and is a very good option. Right now I'm using it in one project, and I have to say that after a slightly difficult start it's really useful and quite easy. Most important for me is that it handles reference counting (which is painful in the Python/C API), allows you to modify Python objects inside C++ code, and lets you register converters for custom data types. So instead of writing:
MyClass test = convertPythonObjectToMyClass(pythonObject);
MyOtherClass test2 = convertPythonObjectMyOtherClass(pythonObject);
you can register your converters and then use this:
MyClass test = boost::python::extract<MyClass>(pythonObject);
MyOtherClass test2 = boost::python::extract<MyOtherClass>(pythonObject);
Of course you can consider other options as well - see this question for more information about other solutions.
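On the build side, a Boost.Python module still goes through the same kind of setup.py as a plain C extension. Here is a minimal sketch, assuming a hypothetical C++ source file bindings.cpp that contains a BOOST_PYTHON_MODULE(mymodule) block; the Boost.Python library name on the link line varies by Boost version and distribution (e.g. boost_python38):
# setup.py sketch for building a Boost.Python extension (file and library names are assumptions)
from setuptools import setup, Extension

setup(
    name='mymodule',
    version='1.0',
    ext_modules=[
        Extension(
            'mymodule',
            sources=['bindings.cpp'],     # contains BOOST_PYTHON_MODULE(mymodule)
            libraries=['boost_python'],   # exact name varies, e.g. boost_python38
            language='c++',
        )
    ],
)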

Related

SWIG: how to wrap *.a library file for Python?

I have a big library file, libcore.a, generated from a big C-language project. My goal is to wrap this library into a package for Python projects. SWIG is the wrapping tool I am going to use. Here is what I did, but it failed.
I have made following interface file, coreapi.i:
//file: coreapi.i
%module coreapi
%{
#include "coreapi.h"
%}
%include "coreapi.h"
and the header file, coreapi.h:
//file: coreapi.h
#ifndef coreAPI_H
#define coreAPI_H
extern int coreapi(void* data);
#endif //coreAPI_H
and a helper C file, coreapi.c:
//file: coreapi.c
extern int core(void* data); // function defined in libcore.a

int coreapi(void* data)
{
    return core(data);
}
and here is what I tried on the command line (note that in the last command, -L. -lcore is there to link libcore.a explicitly):
$ swig -python coreapi.i    # generates coreapi.py and coreapi_wrap.c
$ gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/Users/me/anaconda3/include -arch x86_64 -I/Users/me/anaconda3/include -arch x86_64 -I/Users/me/anaconda3/include/python3.7m -c coreapi_wrap.c
$ gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/Users/me/anaconda3/include -arch x86_64 -I/Users/me/anaconda3/include -arch x86_64 -I/Users/me/anaconda3/include/python3.7m -c coreapi.c
$ gcc -bundle -undefined dynamic_lookup -L/Users/me/anaconda3/lib -arch x86_64 -L/Users/me/anaconda3/lib -arch x86_64 -L. -lcore -arch x86_64 coreapi.o coreapi_wrap.o -o _coreapi.cpython-37m-darwin.so
Everything seems good, but the generated .so file is only about 57 KB, while libcore.a is about 47 MB. And when I tried to import it in IPython I got:
$ ipython
Python 3.7.9 (default, Aug 31 2020, 07:22:35)
Type 'copyright', 'credits' or 'license' for more information
IPython 7.19.0 -- An enhanced Interactive Python. Type '?' for help.
In [1]: import coreapi
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-1-5427a8938720> in <module>
----> 1 import coreapi
~/coreapi/coreapi.py in <module>
13 from . import _coreapi
14 else:
---> 15 import _coreapi
16
17 try:
ImportError: dlopen(/Users/me/coreapi/_coreapi.cpython-37m-darwin.so, 2): Symbol not found: _core
Referenced from: /Users/me/coreapi/_coreapi.cpython-37m-darwin.so
Expected in: flat namespace
in /Users/me/coreapi/_coreapi.cpython-37m-darwin.so
Any advice on how to achieve this goal is appreciated. My development environment is macOS Big Sur. Thanks.
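For reference, the swig/gcc sequence above can also be expressed as a single setuptools build. A minimal sketch, using the same file names as in the question; this only reproduces the build and does not by itself explain the missing _core symbol, which libcore.a still has to provide for the target architecture:
# setup.py sketch of the same SWIG build (names and paths taken from the question)
from setuptools import setup, Extension

setup(
    name='coreapi',
    version='1.0',
    ext_modules=[
        Extension(
            '_coreapi',
            sources=['coreapi.i', 'coreapi.c'],  # build_ext runs SWIG (-python) on the .i file
            extra_objects=['./libcore.a'],       # hand the static archive to the linker
        )
    ],
)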

Python/C++: can import Armadillo (arma::) but not subroutine arma::arma_rng::randn

QUESTION
When creating a Python extension in C++ that uses Armadillo, I get the errors:
A) In Mac OS Mojave 10.14.4, Python 3.7.5:
Traceback (most recent call last):
File "./py_program.py", line 5, in <module>
import cmodule
ImportError: dlopen(/Users/angel/.pyenv/versions/3.7.5/lib/python3.7/site-packages/cmodule.cpython-37m-darwin.so, 2): Symbol not found: __ZTWN4arma23arma_rng_cxx11_instanceE
Referenced from: /Users/angel/.pyenv/versions/3.7.5/lib/python3.7/site-packages/cmodule.cpython-37m-darwin.so
Expected in: flat namespace in /Users/angel/.pyenv/versions/3.7.5/lib/python3.7/site-packages/cmodule.cpython-37m-darwin.so
B) In Ubuntu 20, Python 3.8.2:
Traceback (most recent call last):
File "./py_program.py", line 5, in <module>
import cmodule
ImportError: /usr/local/lib/python3.8/dist-packages/cmodule.cpython-38-x86_64-linux-gnu.so: undefined symbol: _ZN4arma23arma_rng_cxx11_instanceE
Both of them are due to the use of arma::arma_rng::randn<double>(); see below.
How can I fix it?
DETAILS
I want py_program.py to import the C++ module (extension) defined in cmodule.cpp.
Following the documentation at https://docs.python.org/3/extending/extending.html, I have the files py_program.py, setup.py and cmodule.cpp.
Their contents are:
py_program.py
#!/usr/bin/env python3
"""Import and use cmodule."""
import cmodule
cmodule.printvol(3.)
setup.py
"""To install the module defined in cmodule.cpp."""
from distutils.core import setup, Extension
setup(name='cmodule', version='1.0', \
ext_modules=[
Extension(
'cmodule', ['cmodule.cpp'],
extra_compile_args=['-std=c++11'],
language='c++')],
)
cmodule.cpp
/* Module to be used in Python.
   All Python stuff follows Sec. 1.8 of
   https://docs.python.org/3/extending/extending.html */
#define PY_SSIZE_T_CLEAN
#include <Python.h>
#include <armadillo>

double f()
// Fails at using Armadillo.
// The module works if I delete this function.
{
    double rn_y = arma::arma_rng::randn<double>();
    return rn_y;
}

arma::cx_double g()
// Succeeds at using Armadillo.
{
    arma::cx_double value(0., 1.);
    return value;
}

static PyObject *
cmodule_printvol(PyObject *self, PyObject *args)
// A method of the module.
{
    double voltage;
    if (!PyArg_ParseTuple(args, "d", &voltage))
        return NULL;
    printf("voltage is %f.\n", voltage);
    Py_RETURN_NONE;
}

static PyMethodDef cmodule_methods[] = {
    // Declare the module's methods.
    {"printvol", cmodule_printvol, METH_VARARGS, "Print voltage."},
    {NULL, NULL, 0, NULL} /* sentinel */
};

static struct PyModuleDef cmodule = {
    // Create the module.
    PyModuleDef_HEAD_INIT,
    "diff",
    NULL,
    -1,
    cmodule_methods
};

PyMODINIT_FUNC
PyInit_cmodule(void)
// Initialize the module.
{
    return PyModule_Create(&cmodule);
}
I run them as follows:
python setup.py install
python py_program.py
In Ubuntu, the output of python3 setup.py install is:
running install
running build
running build_ext
building 'cmodule' extension
creating build
creating build/temp.linux-x86_64-3.8
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c cmodule.cpp -o build/temp.linux-x86_64-3.8/cmodule.o -std=c++11
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cmodule.o -o build/lib.linux-x86_64-3.8/cmodule.cpython-38-x86_64-linux-gnu.so
running install_lib
copying build/lib.linux-x86_64-3.8/cmodule.cpython-38-x86_64-linux-gnu.so -> /usr/local/lib/python3.8/dist-packages
running install_egg_info
Removing /usr/local/lib/python3.8/dist-packages/cmodule-1.0.egg-info
Writing /usr/local/lib/python3.8/dist-packages/cmodule-1.0.egg-info
What causes the problem is
double rn_y = arma::arma_rng::randn<double>();
Actually, if I delete the function f(), I get no error.
Notice that Armadillo is loaded successfully, since g() uses it no problem.
What is happening?
In setup.py, the argument libraries=['armadillo'] to Extension() fixes the problem:
"""To install the module defined in cmodule.cpp."""
from distutils.core import setup, Extension
setup(name='cmodule', version='1.0', \
ext_modules=[
Extension(
'cmodule', ['cmodule.cpp'],
extra_compile_args=['-std=c++11'],
libraries=['armadillo'], // this solves the problem
language='c++')],
)
Mysteriously, without it, most of arma:: can be used correctly, but not 'submodules' like arma::arma_rng. The likely reason is that most of Armadillo is header-only template code that gets compiled directly into the extension, whereas arma::arma_rng refers to a global instance (arma_rng_cxx11_instance) defined in the Armadillo runtime library, so the extension has to be linked against that library explicitly.
This solution is general: the same problem happens with other libraries. Actually, I reproduced the same issue (and made it work) with the GNU Scientific Library (libraries=['gsl']).
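A quick way to confirm that the missing symbol really is exported by the Armadillo runtime library (and therefore only needs to be linked in) is to dlopen that library from Python. A small sketch, with the library name assumed for a typical Linux install:
# sketch: check that the arma_rng instance symbol lives in the Armadillo runtime library
import ctypes

lib = ctypes.CDLL("libarmadillo.so")   # name/path assumed; e.g. libarmadillo.dylib on macOS
print(hasattr(lib, "_ZN4arma23arma_rng_cxx11_instanceE"))   # True if the symbol is exported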

Compiling a wrapper for a C library using Cython - Linker can't find .dylib of external C library on OSX

I have written a wrapper in Cython for an integration function from the NAG C library (https://www.nag.co.uk/content/nag-library-c).
It compiles using python setup.py build_ext --inplace, where the setup file is:
from setuptools import Extension, setup
from Cython.Build import cythonize
import re

def loadMacros(headerFile):
    """ Given a .h file, return dict of tuples with macros """
    regex = re.compile("#define +(\w+) *(\w*)")
    with open(headerFile) as f:
        macros = dict(map(lambda x: re.match(regex, x).groups(),
                          [l for l in f if re.match(regex, l)]))
    # Remove recursive entries - Note this is not foolproof..
    # while not set(macros.keys()).isdisjoint(macros.values()):
    #     for k, v in macros.items():
    #         if v in macros:
    #             macros[k] = macros[v]
    return macros

nagHome = "/Users/hfmw1m17/NAG/nlmi627dbl"  # "/opt/NAG/cll6a23dhl"
macros = loadMacros(nagHome + "/lp64/include/nag.h")
macros = list(macros.items())

e = Extension("nag_integrate",
              define_macros=macros,
              sources=["nag_integrate.pyx"],
              include_dirs=[nagHome + "/lp64/include", nagHome + "/lp64/lib"],
              library_dirs=[nagHome + "/lp64/lib"],
              libraries=["nag_nag"],
              extra_objects=[nagHome + "/lp64/lib/libnag_nag.dylib"],
              runtime_library_dirs=[nagHome + "/lp64/lib/"],
              extra_link_args=['-Wl,-rpath'])

setup(ext_modules=cythonize(e, annotate=True, language_level=3))
with output:
/Users/hfmw1m17/anaconda3/envs/TowingTankAcoustics/bin/python3.7 setup.py build_ext --inplace
Compiling nag_integrate.pyx because it changed.
[1/1] Cythonizing nag_integrate.pyx
running build_ext
building 'nag_integrate' extension
gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/Users/hfmw1m17/anaconda3/envs/TowingTankAcoustics/include -arch x86_64 -I/Users/hfmw1m17/anaconda3/envs/TowingTankAcoustics/include -arch x86_64 -DNAG_H= -DNAG_MICROSOFT_THREAD_SAFE= -DNAG_THREAD_SAFE= -DNULLFN=0 -DNULLDFN=0 -DNAGERR_DEFAULT= -DNAGUSER_DEFAULT= -DNAGCOMM_NULL= -DNAGMESG_DEFAULT= -DE01_DEFAULT= -DE04_DEFAULT= -DG13_DEFAULT= -DH02_DEFAULT= -DINIT_FAIL= -DSET_FAIL= -DINIT_MESG= -DINIT_STREAM= -DRDUMMY= -DIDUMMY= -DINIT2DUMMY= -DVprintf= -DVfprintf= -DVsprintf= -DVscanf= -DVfscanf= -DVstrcpy= -DABS= -DFABS= -DSIGN= -DMAX= -DMIN= -DDROUND= -DROUND= -DSQZABS= -DCONJ= -DVCONJ= -DZMULT= -D_nag_expand= -Dnag_stringize= -I/Users/hfmw1m17/NAG/nlmi627dbl/lp64/include -I/Users/hfmw1m17/NAG/nlmi627dbl/lp64/lib -I/Users/hfmw1m17/anaconda3/envs/TowingTankAcoustics/include/python3.7m -c nag_integrate.c -o build/temp.macosx-10.9-x86_64-3.7/nag_integrate.o
gcc -bundle -undefined dynamic_lookup -L/Users/hfmw1m17/anaconda3/envs/TowingTankAcoustics/lib -arch x86_64 -L/Users/hfmw1m17/anaconda3/envs/TowingTankAcoustics/lib -arch x86_64 -arch x86_64 build/temp.macosx-10.9-x86_64-3.7/nag_integrate.o /Users/hfmw1m17/NAG/nlmi627dbl/lp64/lib/libnag_nag.dylib -L/Users/hfmw1m17/NAG/nlmi627dbl/lp64/lib -L/Users/hfmw1m17/NAG/nlmi627dbl/lp64/lib/ -lnag_nag -o build/lib.macosx-10.9-x86_64-3.7/nag_integrate.cpython-37m-darwin.so -Wl,-rpath
copying build/lib.macosx-10.9-x86_64-3.7/nag_integrate.cpython-37m-darwin.so ->
Process finished with exit code 0
However, when I import a function from the .so object created, I get the following error:
ImportError: dlopen(/Users/hfmw1m17/WaterTankISM/WaterTankISM/nag_integration/nag_integrate.cpython-37m-darwin.so, 2): Library not loaded: libnag_nag.dylib
Referenced from: /Users/hfmw1m17/WaterTankISM/WaterTankISM/nag_integration/nag_integrate.cpython-37m-darwin.so
Reason: image not found
libnag_nag.dylib is a dynamic library produced by NAG.
Using otool -L on the shared object of my wrapper results in:
libnag_nag.dylib (compatibility version 0.0.0, current version 27.0.0)
/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1281.100.1)
I think it's an issue with the loader not being able to find the dynamic library when the extension is imported. Any suggestions on how to solve this problem?
many thanks
Solved using:
install_name_tool -add_rpath path_to_dylib_directory nag_integrate.cpython-37m-darwin.so
after running python setup.py build_ext --inplace.
Would like a way of doing this in the setup.py file if anyone can figure that out.
many thanks
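One way to do this from setup.py is to embed the rpath at link time, which amounts to the same thing as running install_name_tool -add_rpath on the finished .so. A sketch, reusing the nagHome and macros variables from the setup file above; the important detail is that the directory is attached to the -Wl,-rpath option rather than passed as a bare '-Wl,-rpath':
# sketch: embed the rpath at link time instead of patching the .so afterwards
nag_lib_dir = nagHome + "/lp64/lib"

e = Extension("nag_integrate",
              define_macros=macros,
              sources=["nag_integrate.pyx"],
              include_dirs=[nagHome + "/lp64/include"],
              library_dirs=[nag_lib_dir],
              libraries=["nag_nag"],
              extra_link_args=["-Wl,-rpath," + nag_lib_dir])   # note the trailing ',<dir>'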

Disable link step of distutils Extension

Is it possible to disable the creation of shared objects with distutils.core.Extension? I want to stop the compiler before linking (i.e. g++ -c ...).
I am swigging a native file, which creates an object file and a python file. I have other code to compile that I'll later link with this object file, so I don't want this to proceed after the .o compilation.
$ python setup.py build
running build
....
building 'foo' extension
swigging src/foobar.i to src/foobar.cpp
swig -python -c++ -o src/foobar.cpp src/foobar.i
I want to stop here, but it continues.
creating build/temp.linux-x86_64-2.7
creating build/temp.linux-x86_64-2.7/src
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -Isrc -I/usr/include/python2.7 -c src/foobar.cpp -o build/temp.linux-x86_64-2.7/src/foobar.o
g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro build/temp.linux-x86_64-2.7/src/foobar.o -o build/lib.linux-x86_64-2.7/foobar.so
Do I need to use the CCompiler class directly? Or is there a way to wrangle the Extension class?
ext_modules=[
    # Swig
    Extension(
        name='foobar',
        sources=['src/foobar.i'],
        include_dirs=['src'],
        swig_opts=['-c++'],
    ),
]
It is not possible to stop the linking step without modifying the underlying ccompiler object. One could theoretically override the link_shared_object function of the underlying ccompiler to do nothing (see the build_ext source).
However, to address the original intent behind this question: the C/C++ files can be passed to the Extension alongside the SWIG interface file, so there is no need to compile them independently and link them later. It is not necessary to separate the SWIG wrapper generation from the library compilation.
You could do something like this:
from distutils.core import setup
from distutils.command.build_ext import build_ext

def cmd_ex(command_subclass):
    # Replace build_extension so that only the SWIG step runs;
    # the compile and link steps are skipped entirely.
    orig_build_extension = command_subclass.build_extension  # kept for reference, never called

    def build_extension(self, ext):
        self.swig_sources(list(ext.sources), ext)

    command_subclass.build_extension = build_extension
    return command_subclass

@cmd_ex
class build_ext_ex(build_ext):
    pass

setup(
    name=...,
    cmdclass={'build_ext': build_ext_ex},
    ext_modules=...,
)
to override the default behavior of the distutils build_ext command.
See also: Setuptools – run custom code in setup.py.

Python, ImportError: undefined symbol: g_utf8_skip

There are tens of similar questions on Stack Overflow, but after several hours of searching I finally gave up.
So I'm trying to write a C extension for Python. Let's call it mylib. Here is the header file:
mylib.h
#ifndef mylib_H
#define mylib_H
#include <Python.h>
< ... >
#include <glib.h>
< ... >
and setup.py:
from distutils.core import setup, Extension

include_list = [
    "/usr/include/glib-2.0", "-lglib-2.0",
    "/usr/lib/x86_64-linux-gnu/glib-2.0/include"
]

module = Extension('mylib', ['mylib.c'])

setup(name='mylib', version='1.0',
      include_dirs=include_list,
      ext_modules=[module])
If I run python setup.py install, I get the following (which I take as successful installation):
running install
running build
running build_ext
building 'mylib' extension
creating build
creating build/temp.linux-x86_64-2.7
x86_64-linux-gnu-gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/glib-2.0 -I-lglib-2.0 -I/usr/lib/x86_64-linux-gnu/glib-2.0/include -I/usr/include/python2.7 -c mylib.c -o build/temp.linux-x86_64-2.7/mylib.o
mylib.c: In function ‘c_sound_utf8’:
mylib.c:117:5: warning: ‘g_unicode_canonical_decomposition’ is deprecated (declared at /usr/include/glib-2.0/glib/gunicode.h:627) [-Wdeprecated-declarations]
decomposition = g_unicode_canonical_decomposition(c_composed, &decomposition_len);
^
creating build/lib.linux-x86_64-2.7
x86_64-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -D_FORTIFY_SOURCE=2 -g -fstack-protector --param=ssp-buffer-size=4 -Wformat -Werror=format-security build/temp.linux-x86_64-2.7/mylib.o -o build/lib.linux-x86_64-2.7/mylib.so
running install_lib
copying build/lib.linux-x86_64-2.7/mylib.so -> /usr/local/lib/python2.7/dist-packages
running install_egg_info
Removing /usr/local/lib/python2.7/dist-packages/mylib-1.0.egg-info
Writing /usr/local/lib/python2.7/dist-packages/mylib-1.0.egg-info
But when I try to use mylib from inside Python, I get the following:
>>> import mylib
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: /usr/local/lib/python2.7/dist-packages/mylib.so: undefined symbol: g_utf8_skip
After rambling around Stack Overflow for some time I got the idea that I should either 1. rebuild the needed library, or 2. put all the linker flags for the needed library after all the generated object files.
Rebuilding didn't work (or I did it the wrong way). As for placing the flags for the needed library after everything else - well, I didn't find a way to make distutils change the order of flags in its link command. Is there a way?
I also tried providing extra_link_args/extra_compile_args to my extension (without any effect):
module = Extension('mylib', ['mylib.c'],
extra_link_args=["-Xlinker", "-export-dynamic"])
I felt pretty miserable and kept googling. Then I found out about SWIG. I decided to try it by making another library, (uppercase) MYLIB (I changed the filenames and all text occurrences of mylib to MYLIB). I wrote a shell script:
#!/bin/bash
GLIB_IMPORT_OPTS="-I/usr/include/glib-2.0 -I/usr/lib/x86_64-linux-gnu/glib-2.0/include -lglib-2.0"
PY_IMPORT_OPTS="-I/usr/include/python2.7/ -lpython2.7"
swig -Wall -python MYLIB.i
gcc -fPIC -Wall -c MYLIB.c $GLIB_IMPORT_OPTS
gcc -fPIC -Wall -shared MYLIB.o MYLIB_wrap.c -o _MYLIB.so $GLIB_IMPORT_OPTS -L. $PY_IMPORT_OPTS $GLIB_IMPORT_OPTS
When I ran this thing, everything worked fine (I could import the library and do stuff with it). Here, as you can see, the linker flags come at the very end of the command line. So now I'm trying to understand: what did I miss with the distutils way? How can I make it work?
Well, actually I found the solution. I had to add the library flags to extra_link_args:
extra_link_args=["-I", "/usr/include/glib-2.0", "-l", "glib-2.0", "-I", "/usr/lib/x86_64-linux-gnu/glib-2.0/include"]
which appends them to the end of the link command.
I found that adding -fPIC to extra_compile_args in the Extension constructor also helped. Like so:
my_module = Extension('modulename',
                      ...
                      extra_compile_args=["-fPIC"],
                      sources=['mycode.c'])
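For completeness, the more conventional way to get the same link line is to use the dedicated Extension keywords rather than smuggling -l flags into include_dirs or extra_link_args; distutils then places the -I and -l options in the right order by itself. A sketch reusing the glib paths from the question:
# sketch: express the glib dependency through the dedicated Extension keywords
from distutils.core import setup, Extension

module = Extension(
    'mylib',
    sources=['mylib.c'],
    include_dirs=[
        '/usr/include/glib-2.0',
        '/usr/lib/x86_64-linux-gnu/glib-2.0/include',
    ],
    libraries=['glib-2.0'],   # emitted as -lglib-2.0 after the object files
)

setup(name='mylib', version='1.0', ext_modules=[module])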
