C++ Python binding: failed to import cv2

I'm using Ubuntu 20.04 with Python 3.8, and I'm developing a C++ project in Qt Creator. I'm trying to call a Python file from C++ code. I created a conda environment and am calling the .py file from there. A segmentation fault occurs when I attempt to import the cv2 module; other modules like sys and numpy can be imported. cv2 is installed in the environment and can be imported when I run Python from the command prompt.
I also tried running the Python commands directly from the C++ file, like:
PyRun_SimpleString("import cv2");
but this also produces a segmentation fault.
I tried adding the path of the site-packages directory in CMakeLists.txt, but that resulted in the same error.
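A segfault on import cv2 in an embedded interpreter very often means the process initialized a different Python installation than the conda environment cv2 was installed into. As a generic diagnostic sketch (not specific to this project), the same snippet can be run from the conda prompt and from the embedded interpreter and the outputs compared:

```python
# Diagnostic sketch: run this both from the conda prompt and from the
# embedded interpreter (e.g. via PyRun_SimpleString, joined into one
# string). If sys.prefix differs between the two, the embedded
# interpreter is a different installation, and a cv2 built against the
# conda Python cannot be loaded safely.
import sys

print(sys.executable)  # interpreter binary (may be empty when embedded)
print(sys.prefix)      # installation / environment root
print(sys.path)        # module search path
```

If the prefixes differ, pointing the embedded interpreter at the conda environment before initialization (for example by setting PYTHONHOME, or calling Py_SetPythonHome before Py_InitializeEx) is the usual fix.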
mainwindow.cpp:
#include "mainwindow.h"
#include "./ui_mainwindow.h"
#include <stdio.h>
#include <pyhelper.hpp>
#include <string>
#include <iostream>
MainWindow::MainWindow(QWidget *parent)
    : QMainWindow(parent)
    , ui(new Ui::MainWindow)
{
    ui->setupUi(this);
    Py_InitializeEx(0);
    PyRun_SimpleString("import cv2");
}

MainWindow::~MainWindow()
{
    delete ui;
}
Qt Creator debugger stack trace is as below:

Related

Specify C++ compiler in setup.py for Pybind11

This is a simple question, but it has been bothering me for three months now. When I use the setuptools/setup.py method to compile C++ code into a Python package on my Windows machine, it always defaults to MSVC, but part of the code uses libstdc++, which is only available with the GNU toolchain. Is there some way to specify MinGW, or somehow change the default behavior? I have looked into other methods: cppimport does not support Windows, and the CMake method seems very complex to me.
For reference, a simple test to check whether the compiler is MSVC:
check_compiler.cpp
#include <iostream>
#include <pybind11/pybind11.h>
#include <pybind11/numpy.h>
#include <pybind11/stl.h>
namespace py = pybind11;
void test() {
#ifdef _MSVC_LANG
    py::print("Compiled with MSVC.");
#else
    py::print("Not compiled with MSVC.");
#endif
}

PYBIND11_MODULE(check_compiler, m) {
    m.def("test", &test);
}
setup.py
"""
python setup.py install
"""
from pybind11.setup_helpers import Pybind11Extension
from setuptools import setup
import os
import shutil
ext_modules = [
    Pybind11Extension(
        'check_compiler',
        sources=['check_compiler.cpp'],
        language='c++',
        cxx_std=11
    ),
]

setup(
    name='check_compiler',
    author='XXX',
    ext_modules=ext_modules
)

# copy the package
for filename in os.listdir("dist"):
    if filename.endswith(".egg"):
        shutil.copy(os.path.join("dist", filename), ".")
Then run python setup.py install; an .egg file will be copied from the dist subfolder to the current directory. Finally, run the following:
main.py
import check_compiler
check_compiler.test()
Similar question, but no accepted answer: How can I build a setup.py to compile C++ extension using Python, pybind11 and Mingw-w64?
Update: I was able to specify MinGW with CMake by adding -G "MinGW Makefiles" to the cmake command. I would still welcome an answer on how to do this with setuptools, as it is the most convenient method.
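The update above covers CMake; for setuptools, one hedged option (assuming distutils' MinGW support, selected by the name "mingw32", accepts your Mingw-w64 toolchain) is to pin the compiler in a build_ext subclass, equivalent to passing --compiler=mingw32 on the command line:

```python
# Sketch: pin the distutils/setuptools compiler class to MinGW.
# "mingw32" is the distutils name for the MinGW toolchain (also used
# with 64-bit MinGW-w64); whether it accepts a given toolchain install
# is an assumption to verify.
from setuptools.command.build_ext import build_ext


class ForceMinGW(build_ext):
    def initialize_options(self):
        super().initialize_options()
        self.compiler = "mingw32"  # same effect as --compiler=mingw32

# hook into the existing setup() call:
# setup(..., ext_modules=ext_modules, cmdclass={"build_ext": ForceMinGW})
```

A setup.cfg next to setup.py with a [build_ext] section containing compiler = mingw32 achieves the same without code, since distutils commands read their options from config files.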

How do I properly link python extension modules in C++?

I have a simple example Python extension I want to use from C/C++. The code is as follows:
example.pyx:
from numpy import random

cdef public void somefunc():
    print(random.randint(500))
setup.py:
from setuptools import setup
from Cython.Build import cythonize
import numpy

setup(
    ext_modules=cythonize("example.pyx"),
    zip_safe=False,
    include_dirs=[numpy.get_include()]
)
Running python3 setup.py build_ext --inplace --compiler="mingw32" -DMS_WIN64 then creates example.c, example.h and example.cp310-win_amd64.pyd. The C++ code I am using to call somefunc is:
example.cpp:
#include <Python.h>
#include "example.h"

int main()
{
    Py_Initialize();
    PyInit_example();  /* initialize the Cython module before calling into it */
    somefunc();
    return 0;
}
I compile this using g++ example.cpp -DMS_WIN64, but that command seems to be incomplete: there are still objects left that need to be linked, namely the ones from example.pyx. How do I do this? I do not see any generated .dll or .lib or similar.
Additionally, if I use #include "example.c" in example.cpp, I get a very long list of missing symbols from the linker. The symbols are all named __imp_Py*.
I am using MINGW64 on Windows 10. The Python installation I am trying to link against is a regular system-wide Python installation. I have an environment variable CPLUS_INCLUDE_PATH=C:\Program Files\Python310\include.
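Linking needs the generated example.c plus the Python import library, and both locations can be queried from the target interpreter. A sketch (the python310 name, the libs directory, and the exact g++ line are assumptions to adapt to the installation described above):

```python
# Sketch: ask the target interpreter where its headers and import
# library live, and print a plausible g++ link line for embedding.
# On Windows the import library (e.g. libpython310.a or python310.lib)
# sits in <prefix>\libs; on POSIX it is in LIBDIR.
import sys
import sysconfig

include_dir = sysconfig.get_paths()["include"]
if sys.platform == "win32":
    lib_dir = sys.base_prefix + r"\libs"
    lib = "python%d%d" % sys.version_info[:2]
else:
    lib_dir = sysconfig.get_config_var("LIBDIR")
    lib = "python" + sysconfig.get_config_var("LDVERSION")

print('g++ example.cpp example.c -I"%s" -L"%s" -l%s -DMS_WIN64'
      % (include_dir, lib_dir, lib))
```

Both example.cpp and the generated example.c go on the link line; the .pyd is not what the executable links against, and the __imp_Py* symbols reported by the linker are exactly the imports that the Python import library resolves.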

Include Python.h in Linux (Ubuntu) using Code::Blocks

I would like to call Python functions from C++. I am using Ubuntu 20.04 and have Python 3.8 installed. Python.h is located at /usr/include/python3.8/Python.h, which I found using sudo find / -iname python.h in the terminal. To get started, I tried to run the simplest code in Code::Blocks:
#include <Python.h>
#include <iostream>
using namespace std;
int main()
{
    cout << "starting interpreter." << endl;
    Py_Initialize();
    PyRun_SimpleString("print('Hello from Python')");
    Py_Finalize();
    return 0;
}
I get the error message:
fatal error: Python.h: No such file or directory
My question is: which libraries do I have to include, and how do I set them up in Code::Blocks? I would be very thankful for a detailed description.
Greetings,
Ben
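The settings Code::Blocks asks for correspond to what python3-config --includes --embed --ldflags reports on Ubuntu (the --embed flag exists since Python 3.8); as a sketch, the same values can be printed from Python itself:

```python
# Sketch: print the include directory, library directory, and library
# name needed to compile and link an embedding program on Ubuntu.
import sysconfig

print("include dir:", sysconfig.get_paths()["include"])    # e.g. /usr/include/python3.8
print("library dir:", sysconfig.get_config_var("LIBDIR"))  # e.g. /usr/lib/x86_64-linux-gnu
print("link with:  ", "-lpython" + sysconfig.get_config_var("LDVERSION"))
```

In Code::Blocks these go into the project build options: the include directory under the compiler search directories, the library directory under the linker search directories, and pythonX.Y in the linker's link libraries.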

Calling OpenCV Python code for image display from C in Visual Studio

I am trying to call Python image-display code from C using Cython.
I followed the procedure for creating the .c and .h files from the .pyx and added them to the C code in Visual Studio.
I checked the Python version at the command prompt; it is Python 3.6.3 |Anaconda custom (64-bit), and I am able to import cv2 there.
But when I call the .c and .h files from the C code, I get the error:
NameError: name 'cv2' is not defined
Exception ignored in: 'read.readImage'
NameError: name 'cv2' is not defined
I checked that the Python path is set in the environment, but I am still getting the error.
The code for read.pyx is:
import numpy as np
import cv2

cdef public void readImage():
    img = cv2.imread('dog.jpeg')
    print('reading')
    cv2.imshow('image', img)
    cv2.waitKey(0)
    cv2.destroyAllWindows()
The code for source.cpp in Visual Studio is:
#include "Python.h"
#include "read.h"

int main(void) {
    Py_Initialize();
    PyInit_read();
    readImage();
    Py_Finalize();
    return 0;
}
The same Python version was installed twice at different locations; the issue was the Python path.

Exposing C++ to Python error from BoostPython

I'm exposing a simple C++ function to Python through the Boost.Python library:
#include <boost/python/detail/wrap_python.hpp>
#include <boost/python.hpp>
using namespace boost::python;

bool test_api(void) {
    return true;
}

BOOST_PYTHON_MODULE(materials) {
    def("test_api", test_api);
}
When I try to import this module, the Python interpreter returns the error:
ImportError: ./example.so: undefined symbol: _Py_RefTotal
I've linked the module statically against the Boost Python library, and the Python dynamic libraries libpython3.2m.so and libpython3.2m.so.1.0 are present in the working directory.
Any suggestions on where to find the missing symbol?
The Boost libraries were not consistent with the Python installation.
cd boost_source
./bootstrap.sh --with-libraries=python --prefix=../boost_target
To configure Boost to point to the correct Python installation:
vim tools/build/v2/user-config.jam
Edit the line that points to the Python:
using python : version_number
             : path_to_python_executable
             : path_to_python_include_directory
             : path_to_python_library_directory
             ;
Then, run the build system:
./b2
_Py_RefTotal is defined in object.h under a preprocessor guard:
$ less include/python3.6m/object.h
#ifdef Py_REF_DEBUG
PyAPI_DATA(Py_ssize_t) _Py_RefTotal;
...
...
#endif /* Py_REF_DEBUG */
I was linking against python3.6m but including headers from include/python3.6dm. The issue was fixed by consistently using python3.6m.
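Whether a given interpreter is such a debug build (the d in 3.6dm) can be checked from Python itself; a debug build is compiled with Py_DEBUG, which turns on Py_REF_DEBUG and therefore expects _Py_RefTotal to exist:

```python
# Sketch: check whether this interpreter is a debug build, i.e. one
# whose headers define Py_REF_DEBUG and reference _Py_RefTotal.
import sys
import sysconfig

print("abiflags:   ", getattr(sys, "abiflags", ""))  # contains "d" on POSIX debug builds
print("debug build:", bool(sysconfig.get_config_var("Py_DEBUG")))
```

Running this against the interpreter whose headers and library you pass to Boost makes the 3.6m-versus-3.6dm mismatch visible before linking.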
