Create a DLL exposing Python code - python

Can I use Cython to create a shared library with exported C functions that have Python code as their core? Something like wrapping Python in C?
It is to be used in plugins.

Using Cython, you can write functions declared as C functions with the cdef keyword (and public, which is important!), whose body is Python code:
yourext.pyx
cdef public int func1(unsigned long l, float f):
    print(f)  # some Python code
Note: in what follows, it is assumed that we are working in the root of drive D:\
Building (setup.py)
from distutils.core import setup
from Cython.Distutils import build_ext
from Cython.Build import cythonize  # needed for cythonize() below

setup(
    cmdclass={'build_ext': build_ext},
    name='My app',
    ext_modules=cythonize("yourext.pyx"),
)
Then run python setup.py build_ext --inplace
After running setup.py (if you are using distutils), you'll get two files of interest:
yourext.h
yourext.c
Looking into the .c file will show you that, in the end, func1 is a plain C function.
Those two files are all we need for the rest.
C main program for testing
// test.c
#include "Python.h"
#include "yourext.h"

int main(void)
{
    Py_Initialize();   // start the embedded Python interpreter
    inityourext();     // initialize the yourext module (Python 2 init function)
    func1(12, 3.0);    // call the function exported from the shared library
    Py_Finalize();
    return 0;
}
As we don't use the extension (.pyd) by itself, we need a little trick/hack in the header file to disable the "DLL behavior". Add the following at the beginning of yourext.h:
#undef DL_IMPORT           /* undefine the DL_IMPORT macro */
#define DL_IMPORT(t) t     /* redefine it to do nothing */
__PYX_EXTERN_C DL_IMPORT(int) func1(unsigned long, float);
Compiling "yourext" as a shared library
gcc -shared yourext.c -IC:\Python27\include -LC:\Python27\libs -lpython27 -o libyourext.dll
Then compiling our test program (linking to the DLL)
gcc test.c -IC:\Python27\include -LC:\Python27\libs -LD:\ -lpython27 -lyourext -o test.exe
Finally, run the program
$ test
3.0
This is not obvious, and there are many other ways to achieve the same thing, but this works (have a look at boost::python and others; another solution may better fit your needs).
I hope this answers your question a little bit or, at least, gives you an idea...
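Note that inityourext() is the Python 2 form of the init call; under Python 3, Cython generates PyInit_yourext() instead. A sketch of the same test program, assuming the module is built for Python 3:
// test.c (Python 3 variant)
#include "Python.h"
#include "yourext.h"

int main(void)
{
    Py_Initialize();
    PyInit_yourext();   // Python 3 init function generated by Cython
    func1(12, 3.0);
    Py_Finalize();
    return 0;
}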

Related

How do I properly link python extension modules in C++?

I have a simple example Python extension I want to use from C/C++. The code is as follows:
example.pyx:
from numpy import random
cdef public void somefunc():
    print(random.randint(500))
setup.py:
from setuptools import setup
from Cython.Build import cythonize
import numpy

setup(
    ext_modules=cythonize("example.pyx"),
    zip_safe=False,
    include_dirs=[numpy.get_include()]
)
Running python3 setup.py build_ext --inplace --compiler="mingw32" -DMS_WIN64 then creates example.c, example.h and example.cp310-win_amd64.pyd. The C++ code I am using to call somefunc is:
example.cpp:
#include <Python.h>
#include "example.h"
int main()
{
    Py_Initialize();
    somefunc();
    return 0;
}
This I compile using g++ example.cpp -DMS_WIN64. But that command seems to be incomplete: there are still objects left that need to be linked, namely the ones from example.pyx. How do I do this? I do not see any generated .dll or .lib or similar.
Additionally, if I use #include "example.c" in example.cpp, I get a very long list of missing symbols from the linker. The symbols are all named __imp_Py*.
I am using MINGW64 on Windows 10. The python installation I am trying to link against is a regular python installation from the system. I have an environment variable CPLUS_INCLUDE_PATH=C:\Program Files\Python310\include.
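Following the pattern from the first answer above, the missing step is likely to compile the generated example.c together with example.cpp and link against the Python import library. A minimal sketch; the paths and the -lpython310 name are assumptions based on the question's setup:
g++ example.cpp example.c -DMS_WIN64 -I"C:/Program Files/Python310/include" -L"C:/Program Files/Python310/libs" -lpython310 -o example.exe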

Calling CPython extension from Go

I currently have source files written in C++ and wrapped into a Python module (using Boost and parser libs).
I ported the C++ folder under go/src, with the .so files alongside main.go.
Program Structure
src
  /main
    main.go
    network.so
  /network
    file1.cpp (this has a function object DBdata::getTable())
    file1.hpp (#define FILE1_H_)
main.go
package main

// #cgo pkg-config: python3
// #cgo CFLAGS: -I./ -I/usr/include/python3.6
// #cgo LDFLAGS: -L/usr/lib/python3.6/config-3.6m-x86_64-linux-gnu -L/usr/lib -lpython3.6 -lpthread -ldl -lutil -lm
// #include <Python.h>
import "C"

import (
    "fmt"
    "os/exec"
)

func main() {
    cmd := exec.Command("python", "-c", "import network; network.getTable()")
    cmd.Dir = "/home/username/go/src/network" // directory where my Python code is
    out, err := cmd.CombinedOutput()
    fmt.Println(string(out), err)
}
After building this main.go, I get the error:

/usr/include/boost/config/no_tr1/memory.hpp: fatal error: memory: No such file or directory
 #  include <memory>
            ^~~~~~~~
compilation terminated.
How do I import a .so as a Python module in Go?
Can SWIG be used here?
What is a better approach to expose a Python module to Go?
Issue fixed: the .so was placed in /bin, and the Go program was built and could access the functions under network.
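For reference, instead of shelling out to python, the extension can also be imported through the C API directly from cgo. A minimal sketch (assumption: network.so is reachable through PYTHONPATH):
package main

// #cgo pkg-config: python3
// #include <Python.h>
import "C"

func main() {
    C.Py_Initialize()
    defer C.Py_Finalize()

    // equivalent of "import network"; nil means the import failed
    mod := C.PyImport_ImportModule(C.CString("network"))
    if mod == nil {
        C.PyErr_Print() // print the Python traceback for the failed import
    }
}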

How can I build a setup.py to compile C++ extension using Python, pybind11 and Mingw-w64?

I'm currently trying to write a setup.py script that automatically compiles my C++ extension, bound with pybind11, when the user installs the Python package. On Windows I have no problem making this happen with the VS19 MSVC compiler, but I'm trying to make it work when the user has MinGW-w64 installed instead.
These are the package files:
main.cpp
#include <pybind11/pybind11.h>

int add(int i, int j) {
    return i + j;
}

namespace py = pybind11;

PYBIND11_MODULE(pybind11_example, m) {
    m.def("add", &add);
}
setup.py
from setuptools import setup, Extension
import pybind11

ext_modules = [
    Extension(
        'pybind11_example',
        sources=['main.cpp'],
        include_dirs=[pybind11.get_include()],
        language='c++'
    ),
]

setup(
    name='pybind11_example',
    ext_modules=ext_modules
)
Having the two files in the same folder and running from the command prompt:
python setup.py build
If the user has the VS19 MSVC compiler installed, this successfully generates pybind11_example.pyd, which can be tested by running in Python:
import pybind11_example as m
print(m.add(1, 2))
But if the user has a MinGW-w64 compiler installed, it raises an error saying that Visual Studio 2015 is required.
Note that I can easily compile main.cpp into pybind11_example.pyd manually with MinGW-w64 by running:
g++ -static -shared -std=c++11 -DMS_WIN64 -fPIC -I C:\...\Python\Python38\Lib\site-packages\pybind11\include -I C:\ ... \Python\Python38\include -L C:\ ... \Python\Python38\libs main.cpp -o pybind11_example.pyd -lPython38
Is there a way to write setup.py so that, if the user is on Windows with a MinGW-w64 compiler, it automatically compiles main.cpp into pybind11_example.pyd when installing the package, without needing to do it manually?
Check the answer to this question. They try to solve the opposite case, forcing MSVC instead of MinGW, but the approach with setup.cfg might help you.
And here the answer demonstrates how to specify command-line parameters depending on the compiler chosen by setuptools: if it is MSVC then one set of parameters, and another set for MinGW.
I believe the second approach should suit your needs: whichever compiler is installed, you have the proper command line to build your module.
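Along those lines, a minimal sketch of a setup.py that branches on the compiler chosen by setuptools (the class name and the exact flag sets are assumptions; the MinGW flags mirror the manual g++ command from the question):
from setuptools import setup, Extension
from setuptools.command.build_ext import build_ext
import pybind11

class BuildExt(build_ext):
    def build_extensions(self):
        # distutils reports 'msvc' for Visual Studio and 'mingw32' for MinGW
        if self.compiler.compiler_type == 'msvc':
            extra = ['/EHsc']
        else:
            extra = ['-std=c++11', '-DMS_WIN64', '-static']
        for ext in self.extensions:
            ext.extra_compile_args = extra
        super().build_extensions()

setup(
    name='pybind11_example',
    cmdclass={'build_ext': BuildExt},
    ext_modules=[
        Extension('pybind11_example',
                  sources=['main.cpp'],
                  include_dirs=[pybind11.get_include()],
                  language='c++'),
    ],
)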

Using SWIG to build Python Module using 3rd-Party Drivers

I'm using a 3rd-party vendor who provides a Windows driver (DLL) and C header files. What I'm trying to do is use SWIG to compile the header file into a Python module.
Here are my files:
- BTICard.i
- BTICard.h
- BTICARD64.dll
- BTICARD64.lib
SWIG interface source
%module BTICard
%include <windows.i>
%{
#define SWIG_FILE_WITH_INIT
#include "BTICard.H"
#define BTICardAPI
%}
In Cygwin, I ran the following command:
swig -python -py3 BTICard.i
Which then generated the following files:
- BTICard.py
- BTICard_wrap.c
In Cygwin, I compiled the wrapper for the Python module:
gcc -c -fpic BTICARD.H BTICard_wrap.c -I/usr/include/python3.8
Which now allows BTICard to be imported in Python
import BTICard
import ctypes
BTICarddll = ctypes.WinDLL('BTICARD64')
pRec1553 = SEQRECORD1553() # Doesn't initialize
BTICard.H contains the following (a hypothetical sketch is given below):
- typedef structs - used to initialize various field structures
- enums - constant declarations
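For illustration only, a sketch of the kind of declarations BTICard.H contains; every name below except SEQRECORD1553 is made up:
/* hypothetical sketch, not the real vendor header */
typedef struct {
    unsigned int   ulTimeTag;    /* made-up field */
    unsigned short awData[32];   /* made-up field */
} SEQRECORD1553;

enum {
    ERR_NONE = 0                 /* made-up constant */
};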
According to the SWIG documentation, the typedef structs are supposed to be converted into Python classes. When I tried initializing the class, I got a NameError. I suspect my interface file does not recognize these types, so it failed to convert them.
Upon further investigation, I tried using the distutils approach and created the setup.py
#!/usr/bin/env python3.8
"""
setup.py file for SWIG
"""
from distutils.core import setup, Extension

example_module = Extension('_BTICard',
                           sources=['BTICard_wrap.c', 'BTICard.h'],)

setup(name='BTICard',
      version='0.1',
      author="TESTER",
      description="""BTICard API""",
      ext_modules=[example_module],
      py_modules=["BTICard"],
      )
To build the package:
$ python3.8 setup.py build_ext --inplace
running build_ext
building '_BTICard' extension
error: unknown file type '.h' (from 'BTICard.h')
What's the issue here?
Is there a way I can access the Python source file after creating the object from gcc?
All I am trying to do is validate a separate Python wrapper that seems to have issues (a completely separate topic). Is there another way to create this Python module?
The .i file isn't including the interface to export. It should look like:
%module BTICard
%{
#include "BTICard.H" // this just makes the interface available to the wrapper.
%}
%include <windows.i>
%include "BTICard.h" // This wraps the interface defined in the header.
setup.py knows about SWIG interfaces, so include the .i file directly as a source. Headers are included by the sources and aren't listed as sources. You may need other options, but this should get you on the right track. You'll likely need the DLL's export library (BTICARD64.lib) and to link against that as well:
example_module = Extension('_BTICard',
                           sources=['BTICard.i'],
                           libraries=['BTICARD64'])  # library name without the .lib suffix
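Putting it together, a sketch of a complete setup.py for the SWIG module (the swig_opts mirror the flags used manually above; the library search path is an assumption):
from distutils.core import setup, Extension

example_module = Extension(
    '_BTICard',
    sources=['BTICard.i'],          # setup.py runs SWIG on the .i itself
    swig_opts=['-python', '-py3'],  # match the manual swig invocation
    libraries=['BTICARD64'],        # vendor import library, name without .lib
    library_dirs=['.'],             # assumption: the .lib sits next to setup.py
)

setup(name='BTICard',
      version='0.1',
      ext_modules=[example_module],
      py_modules=['BTICard'])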

Cython: Segmentation Fault Using API Embedding Cython to C

I'm trying to embed Cython code into C, following chapter 8 of the O'Reilly Cython book. I found this paragraph in Cython's documentation but still don't know what I should do:
If the C code wanting to use these functions is part of more than one shared library or executable, then import_modulename() function needs to be called in each of the shared libraries which use these functions. If you crash with a segmentation fault (SIGSEGV on linux) when calling into one of these api calls, this is likely an indication that the shared library which contains the api call which is generating the segmentation fault does not call the import_modulename() function before the api call which crashes.
I'm running Python 3.4, Cython 0.23 and GCC 5 on OS X. The source files are transcendentals.pyx and main.c:
main.c
#include "transcendentals_api.h"
#include <math.h>
#include <stdio.h>
int main(int argc, char **argv)
{
    Py_SetPythonHome(L"/Users/spacegoing/anaconda");
    Py_Initialize();
    import_transcendentals();
    printf("pi**e: %f\n", pow(get_pi(), get_e()));
    Py_Finalize();
    return 0;
}
transcendentals.pyx
cdef api double get_pi():
    return 3.1415926

cdef api double get_e():
    print("calling get_e()")
    return 2.718281828
I'm compiling those files using setup.py and Makefile:
setup.py:
from distutils.core import setup
from distutils.extension import Extension
from Cython.Build import cythonize

setup(
    ext_modules=cythonize([
        Extension("transcendentals", ["transcendentals.pyx"])
    ])
)
Makefile
python-config=/Users/spacegoing/anaconda/bin/python3-config
ldflags:=$(shell $(python-config) --ldflags)
cflags:=$(shell $(python-config) --cflags)

a.out: main.c transcendentals.so
	gcc-5 $(cflags) $(ldflags) transcendentals.c main.c

transcendentals.so: setup.py transcendentals.pyx
	python setup.py build_ext --inplace
	cython transcendentals.pyx

clean:
	rm -r a.out a.out.dSYM build transcendentals.[ch] transcendentals.so transcendentals_api.h
However, I get the error Segmentation fault: 11. Any ideas? Thanks!
In that Makefile there is
transcendentals.so: setup.py transcendentals.pyx
	python setup.py build_ext --inplace
Unless python refers to /Users/spacegoing/anaconda/bin/python3, it should be replaced, since the module may otherwise be compiled for the wrong Python version and thus cannot be loaded.
In main.c there is a call to import_transcendentals() that does not check the return value, i.e. whether the import fails or succeeds. In case of failure, get_pi() and get_e() point to invalid memory locations, and trying to call them causes a segmentation fault.
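A minimal sketch of such a check (assumption: the generated import function returns a negative value and sets a Python exception on failure):
if (import_transcendentals() < 0) {
    PyErr_Print();   /* show why the import failed */
    Py_Finalize();
    return 1;
}
printf("pi**e: %f\n", pow(get_pi(), get_e()));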
Also, the module has to be located somewhere where it can be found. It seems that when embedding, the current directory is not searched for python modules. PYTHONPATH environment variable could be changed to include the directory where transcendentals.so is located.
The following is an alternative way of embedding the code in the C program; it sidesteps the import issues since the module code is linked into the executable.
Essentially, a call to PyInit_transcendentals() is missing.
The file transcendentals.h will be generated when the Cython functions are declared public, i.e.:
cdef public api double get_pi():
    ...

cdef public api double get_e():
    ...
Your main.c should have the include directives
#include <Python.h>
#include "transcendentals.h"
and then in main
Py_Initialize();
PyInit_transcendentals();
There should be no #include "transcendentals_api.h" and no import_transcendentals() call.
The first reason is that, according to the documentation:
However, note that you should include either modulename.h or
modulename_api.h in a given C file, not both, otherwise you may get
conflicting dual definitions.
The second reason is that, since transcendentals.c is linked into the program in
gcc $(cflags) $(ldflags) transcendentals.c main.c
there is no reason to import the transcendentals module. The module has to be initialized, though; PyInit_transcendentals() does that for Python 3.
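Putting the pieces of this second approach together, a sketch of the adjusted main.c (same names as in the question; error handling kept minimal):
#include <Python.h>
#include "transcendentals.h"
#include <math.h>
#include <stdio.h>

int main(void)
{
    Py_SetPythonHome(L"/Users/spacegoing/anaconda");
    Py_Initialize();
    PyInit_transcendentals();   /* initialize the linked-in module (Python 3) */
    printf("pi**e: %f\n", pow(get_pi(), get_e()));
    Py_Finalize();
    return 0;
}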
