I want to build a simple app with pybind11. pybind11 is already installed on my Ubuntu system via cmake (and make install). I use this simple CMake file:
cmake_minimum_required(VERSION 3.0 FATAL_ERROR)
project(trt_cpp_loader)
find_package(pybind11 REQUIRED)
add_executable(trt_cpp_loader main.cpp)
set_property(TARGET trt_cpp_loader PROPERTY CXX_STANDARD 11)
This is main.cpp:
#include <iostream>
#include <pybind11/embed.h>
namespace py = pybind11;
using namespace std;
int main(){return 0;}
When I build it, I get:
In file included from /usr/local/include/pybind11/pytypes.h:12:0,
                 from /usr/local/include/pybind11/cast.h:13,
                 from /usr/local/include/pybind11/attr.h:13,
                 from /usr/local/include/pybind11/pybind11.h:44,
                 from /usr/local/include/pybind11/embed.h:12,
                 from /home/stiv/lpr/trt_cpp_loader/main.cpp:2:
/usr/local/include/pybind11/detail/common.h:112:10: fatal error: Python.h: No such file or directory
#include <Python.h>
^~~~~~~~~~
compilation terminated.
How can I fix this problem? (python-dev and python3-dev are already installed, and Python.h is available.)
You'll want to use the pybind11_add_module command (see https://pybind11.readthedocs.io/en/stable/compiling.html#building-with-cmake) for the default case of creating an extension module.
If the goal is indeed to embed Python in an executable, it is your responsibility to explicitly add the Python headers and libraries to the compiler/linker commands in CMake (see https://pybind11.readthedocs.io/en/stable/compiling.html#embedding-the-python-interpreter on how to do that).
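For the CMakeLists.txt in the question, a minimal sketch following the embedding docs could look like this; the pybind11::embed target carries the Python include directories and link libraries, assuming a reasonably recent CMake and pybind11:

cmake_minimum_required(VERSION 3.4 FATAL_ERROR)
project(trt_cpp_loader)

find_package(pybind11 REQUIRED)

add_executable(trt_cpp_loader main.cpp)
set_property(TARGET trt_cpp_loader PROPERTY CXX_STANDARD 11)

# pybind11::embed pulls in the Python headers and links libpython
target_link_libraries(trt_cpp_loader PRIVATE pybind11::embed)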
Following Wenzel Jakob's answer, here is an example CMakeLists.txt for compiling the example provided in this tutorial:
// example.cpp
#include <pybind11/pybind11.h>
int add(int i, int j) {
    return i + j;
}

PYBIND11_MODULE(example, m) {
    m.doc() = "pybind11 example plugin"; // optional module docstring
    m.def("add", &add, "A function which adds two numbers");
}
and
# example.py
import example
print(example.add(1, 2))
and
# CMakeLists.txt
cmake_minimum_required(VERSION 2.8.12)
project(example)
find_package(pybind11 REQUIRED)
pybind11_add_module(example example.cpp)
Now, in the project root, run:
cmake .
make
Then run the Python code with:
python3 example.py
P.S. I have also written some instructions here for compiling/installing pybind11.
Maybe just install the Python headers? For example, on Ubuntu you can install the python-dev package (or python3-dev, or pythonX.Y-dev for a specific version) with sudo apt-get install python-dev. That could resolve this.
Related
This is a simple question, but it has been bothering me for 3 months now. When I use the setuptools/setup.py method to compile C++ code into a Python package on my Windows OS, it always defaults to MSVC, but part of the code uses libstdc++, which is only available with the GNU toolchain. Is there some way to specify MinGW, or somehow change the default behavior? I have looked into other methods: cppimport does not support Windows, and the cmake method seems very complex to me.
For reference, a simple test to check whether the compiler is MSVC:
check_compiler.cpp
#include <iostream>
#include <pybind11/pybind11.h>
#include <pybind11/numpy.h>
#include <pybind11/stl.h>
namespace py = pybind11;
void test(){
#ifdef _MSVC_LANG
    py::print("Compiled with MSVC. ");
#else
    py::print("Not compiled with MSVC. ");
#endif
}

PYBIND11_MODULE(check_compiler, m) {
    m.def("test", &test);
}
setup.py
"""
python setup.py install
"""
from pybind11.setup_helpers import Pybind11Extension
from setuptools import setup
import os
import shutil
ext_modules = [
    Pybind11Extension(
        'check_compiler',
        sources=['check_compiler.cpp'],
        language='c++',
        cxx_std=11
    ),
]

setup(
    name='check_compiler',
    author='XXX',
    ext_modules=ext_modules
)

# copy the package
for filename in os.listdir("dist"):
    if filename.endswith(".egg"):
        shutil.copy(os.path.join("dist", filename), ".")
Then run python setup.py install; an .egg file will be copied from the dist subfolder to the current directory. Finally, run the following:
main.py
import check_compiler
check_compiler.test()
Similar question, but no accepted answer: How can I build a setup.py to compile C++ extension using Python, pybind11 and Mingw-w64?
Update: I was able to specify MinGW with cmake by adding -G "MinGW Makefiles" to the cmake command. I would still welcome an answer on how to do this with setuptools, as it is the most convenient method.
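For reference, the invocation looked roughly like this (the separate build directory is just a convention, not required):

mkdir build && cd build
cmake -G "MinGW Makefiles" ..
cmake --build .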
I would like to call Python functions from C++. I am using Ubuntu 20.04 and have Python 3.8 installed. The path where Python.h is located is /usr/include/python3.8/Python.h, which I found using the command sudo find / -iname python.h in the terminal. To get started, I tried to run the simplest code in Code::Blocks:
#include <Python.h>
#include <iostream>
using namespace std;
int main()
{
    cout << "starting interpreter." << endl;
    Py_Initialize();
    PyRun_SimpleString("print('Hello from Python')");
    Py_Finalize();
    return 0;
}
I get the error message:
fatal error: Python.h: No such file or directory|
My question is: which headers and libraries do I have to include, and how do I add them in Code::Blocks? I would be very thankful for a detailed description.
Greetings,
Ben
I'm trying to embed Cython code into C, following chapter 8 of the O'Reilly Cython book. I found this paragraph in Cython's documentation but still don't know what I should do:
If the C code wanting to use these functions is part of more than one shared library or executable, then import_modulename() function needs to be called in each of the shared libraries which use these functions. If you crash with a segmentation fault (SIGSEGV on linux) when calling into one of these api calls, this is likely an indication that the shared library which contains the api call which is generating the segmentation fault does not call the import_modulename() function before the api call which crashes.
I'm running Python 3.4, Cython 0.23, and GCC 5 on OS X. The source files are transcendentals.pyx and main.c:
main.c
#include "transcendentals_api.h"
#include <math.h>
#include <stdio.h>
int main(int argc, char **argv)
{
    Py_SetPythonHome(L"/Users/spacegoing/anaconda");
    Py_Initialize();
    import_transcendentals();
    printf("pi**e: %f\n", pow(get_pi(), get_e()));
    Py_Finalize();
    return 0;
}
transcendentals.pyx
cdef api double get_pi():
    return 3.1415926

cdef api double get_e():
    print("calling get_e()")
    return 2.718281828
I'm compiling those files using setup.py and Makefile:
setup.py:
from distutils.core import setup
from distutils.extension import Extension
from Cython.Build import cythonize
setup(
    ext_modules=cythonize([
        Extension("transcendentals", ["transcendentals.pyx"])
    ])
)
Makefile
python-config=/Users/spacegoing/anaconda/bin/python3-config
ldflags:=$(shell $(python-config) --ldflags)
cflags:=$(shell $(python-config) --cflags)

a.out: main.c transcendentals.so
	gcc-5 $(cflags) $(ldflags) transcendentals.c main.c

transcendentals.so: setup.py transcendentals.pyx
	python setup.py build_ext --inplace
	cython transcendentals.pyx

clean:
	rm -r a.out a.out.dSYM build transcendentals.[ch] transcendentals.so transcendentals_api.h
However, I get the error Segmentation fault: 11. Any ideas that could help with this? Thanks!
In that Makefile there is
transcendentals.so: setup.py transcendentals.pyx
	python setup.py build_ext --inplace
Unless python refers to /Users/spacegoing/anaconda/bin/python3, it should be replaced, since otherwise the module may be compiled for the wrong Python version and thus cannot be loaded.
In main.c there is a call to import_transcendentals() that does not check the return value, i.e. whether the import failed or succeeded. In case of failure, get_pi() and get_e() point to invalid memory locations, and trying to call them causes a segmentation fault. The return value should therefore be checked, for example:
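A minimal sketch of that check (assuming, as is usual for Cython-generated api headers, that the import function returns a negative value on failure):

if (import_transcendentals() < 0) {
    PyErr_Print();   /* show the underlying Python exception */
    fprintf(stderr, "failed to import transcendentals\n");
    Py_Finalize();
    return 1;
}
printf("pi**e: %f\n", pow(get_pi(), get_e()));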
Also, the module has to be located somewhere it can be found. It seems that when embedding, the current directory is not searched for Python modules. The PYTHONPATH environment variable can be changed to include the directory where transcendentals.so is located, for example:
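Something along these lines, assuming transcendentals.so was built in the current directory:

export PYTHONPATH=$(pwd):$PYTHONPATH
./a.out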
The following is an alternative way of embedding the code in the C program, and it sidesteps the import issues since the module code is linked into the executable.
Essentially, a call to PyInit_transcendentals() is missing.
The file transcendentals.h will be generated when the Cython functions are declared public, i.e.:
cdef public api double get_pi():
...
cdef public api double get_e():
Your main.c should have the include directives
#include <Python.h>
#include "transcendentals.h"
and then in main
Py_Initialize();
PyInit_transcendentals();
There should be no #include "transcendentals_api.h" and no import_transcendentals()
The first reason is that according to the documentation
However, note that you should include either modulename.h or modulename_api.h in a given C file, not both, otherwise you may get conflicting dual definitions.
The second reason is that, since transcendentals.c is linked into the program in
gcc $(cflags) $(ldflags) transcendentals.c main.c
there is no reason to import the transcendentals module. The module has to be initialized, though; PyInit_transcendentals() does that for Python 3.
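Putting the pieces together, a minimal sketch of how main.c could look with this approach (Py_SetPythonHome omitted for brevity):

#include <Python.h>
#include "transcendentals.h"
#include <math.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    Py_Initialize();

    /* initialize the statically linked module (Python 3) */
    PyInit_transcendentals();

    printf("pi**e: %f\n", pow(get_pi(), get_e()));

    Py_Finalize();
    return 0;
}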
I'm exposing some simple C++ code to Python through the Boost.Python library:
#include <boost/python/detail/wrap_python.hpp>
#include <boost/python.hpp>
using namespace boost::python;
bool test_api( void ){
    return true;
}

BOOST_PYTHON_MODULE(materials) {
    def( "test_api", test_api );
}
When I try to import this module, the Python interpreter returns the error:
ImportError: ./example.so: undefined symbol: _Py_RefTotal
I've linked the module statically against the Boost.Python library, and the Python dynamic libraries libpython3.2m.so and libpython3.2m.so.1.0 are present in the working directory.
Any suggestions on where to find the missing symbol?
The Boost libraries were not consistent with the Python installation.
cd boost_source
./bootstrap.sh --with-libraries=python --prefix=../boost_target
To configure Boost to point to the correct Python installation:
vim tools/build/v2/user-config.jam
Edit the line that points to your Python installation:
using python : version_number
             : path_to_python_executable
             : path_to_python_include_directory
             : path_to_python_library_directory
             ;
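For instance, a concrete entry could look like this (the version and paths here are illustrative only and must match your own installation):

# illustrative only -- adjust the version and paths to your system
using python : 3.2
             : /usr/bin/python3.2
             : /usr/include/python3.2mu
             : /usr/lib ;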
Then, run the build system:
./b2
_Py_RefTotal is defined in object.h under a preprocessor guard:
$ less include/python3.6m/object.h
#ifdef Py_REF_DEBUG
PyAPI_DATA(Py_ssize_t) _Py_RefTotal;
...
...
#endif /* Py_REF_DEBUG */
I was linking against python3.6m but including the headers from include/python3.6dm. Fixed the issue by including the headers from python3.6m.
My C code:
#include <stdio.h>
#include "Python.h"

int main()
{
    printf("Hello World");
    return 0;
}
I have python-dev installed for python2.7. Moreover, Python.h is available in /usr/include/python2.7.
gcc myfile.c # Python.h: No such file or directory
I even tried:
gcc -L/usr/include/python2.7/ myfile.c # Python.h: No such file or directory
I tried building a Python C module, ujson, with pip; it uses Python.h and it was able to compile.
What am I missing / doing wrong ?
It should be -I, not -L:
gcc -I/usr/include/python2.7 myfile.c
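If the program also needs to link against libpython (for example when embedding the interpreter), python2.7-config can supply both sets of flags, assuming it is installed on the system:

gcc myfile.c $(python2.7-config --cflags) $(python2.7-config --ldflags)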
Use
#include <Python.h>
instead of
#include "Python.h"
to include the header file.
The Python.h file should be the first file which is included.
See Extending Python with C or C++ (Section 1.1, Note):
Since Python may define some pre-processor definitions which affect the standard headers on some systems, you must include Python.h before any standard headers are included.
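So, for the code above, the include order would simply be:

#include <Python.h>   /* must come before any standard headers */
#include <stdio.h>

int main(void)
{
    printf("Hello World");
    return 0;
}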