I am having some problems building PyQt5 on Ubuntu 14.04. I am working with some code that has a hard dependency on Python 2.7, so I am unable to use the python3 packages from Ubuntu's repository. Further searches of Ubuntu's packages reveal that there are dev and doc packages for Python 2 PyQt5, but nothing that installs the libraries necessary to actually write code.
This has led me to create a custom build of PyQt5. I obtained the source for version 5.5 from here: https://www.riverbankcomputing.com/software/pyqt/download5 and I am using sip as provided by the Ubuntu repos (installing kubuntu-desktop requires sip).
I read that it's easy to have mismatched versions of sip, so I did the following check:
Python 2.7.6 (default, Jun 22 2015, 17:58:13)
[GCC 4.8.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import sip
>>> print(sip, sip.SIP_VERSION_STR)
(<module 'sip' from '/usr/lib/python2.7/dist-packages/sip.so'>, '4.16.9')
And:
$ sip -V
4.16.9
Also I am using the Qt5 tools provided by the Ubuntu repos. This included installing qtdeclarative5-* (probably overkill) and qt5-default. Here is some information about qmake:
qmake --version
QMake version 3.0
Using Qt version 5.2.1 in /usr/lib/x86_64-linux-gnu
I currently have PyQt4 installed and read in the installation notes that this would be fine as long as both were compiled against the same version of sip.
After downloading, I unpacked the tarball and attempted a build as follows:
sudo ln -s /usr/include/python2.7 /usr/local/include/python2.7
python configure.py --sip-incdir=/usr/include/python2.7/
make
The configuration output appeared to identify the correct version of sip, and I get the following (seemingly) sip-related compile errors from make:
make[1]: Entering directory `~/Downloads/PyQt-gpl-5.5/QtWebKit'
g++ -c -m64 -pipe -fno-exceptions -O2 -Wall -W -D_REENTRANT -fPIC -DSIP_PROTECTED_IS_PUBLIC -Dprotected=public -DQT_NO_DEBUG -DQT_PLUGIN -DQT_WEBKIT_LIB -DQT_NETWORK_LIB -DQT_GUI_LIB -DQT_CORE_LIB -I/usr/lib/x86_64-linux-gnu/qt5/mkspecs/linux-g++-64 -I. -I. -I/usr/include/python2.7 -I/usr/include/qt5 -I/usr/include/qt5/QtWebKit -I/usr/include/qt5/QtNetwork -I/usr/include/qt5/QtGui -I/usr/include/qt5/QtCore -I. -o sipQtWebKitQWebSecurityOrigin.o sipQtWebKitQWebSecurityOrigin.cpp
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp: In function ‘PyObject* meth_QWebSecurityOrigin_addAccessWhitelistEntry(PyObject*, PyObject*)’:
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp:384:9: error: ‘SubdomainSetting’ is not a member of ‘QWebSecurityOrigin’
QWebSecurityOrigin::SubdomainSetting a2;
^
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp:384:46: error: expected ‘;’ before ‘a2’
QWebSecurityOrigin::SubdomainSetting a2;
^
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp:387:214: error: ‘a2’ was not declared in this scope
if (sipParseArgs(&sipParseErr, sipArgs, "BJ1J1E", &sipSelf, sipType_QWebSecurityOrigin, &sipCpp, sipType_QString,&a0, &a0State, sipType_QString,&a1, &a1State, sipType_QWebSecurityOrigin_SubdomainSetting, &a2))
^
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp:389:21: error: ‘class QWebSecurityOrigin’ has no member named ‘addAccessWhitelistEntry’
sipCpp->addAccessWhitelistEntry(*a0,*a1,a2);
^
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp: In function ‘PyObject* meth_QWebSecurityOrigin_removeAccessWhitelistEntry(PyObject*, PyObject*)’:
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp:417:9: error: ‘SubdomainSetting’ is not a member of ‘QWebSecurityOrigin’
QWebSecurityOrigin::SubdomainSetting a2;
^
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp:417:46: error: expected ‘;’ before ‘a2’
QWebSecurityOrigin::SubdomainSetting a2;
^
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp:420:214: error: ‘a2’ was not declared in this scope
if (sipParseArgs(&sipParseErr, sipArgs, "BJ1J1E", &sipSelf, sipType_QWebSecurityOrigin, &sipCpp, sipType_QString,&a0, &a0State, sipType_QString,&a1, &a1State, sipType_QWebSecurityOrigin_SubdomainSetting, &a2))
^
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp:422:21: error: ‘class QWebSecurityOrigin’ has no member named ‘removeAccessWhitelistEntry’
sipCpp->removeAccessWhitelistEntry(*a0,*a1,a2);
^
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp: In function ‘void* init_type_QWebSecurityOrigin(sipSimpleWrapper*, PyObject*, PyObject*, PyObject**, PyObject**, PyObject**)’:
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp:477:48: error: no matching function for call to ‘QWebSecurityOrigin::QWebSecurityOrigin(const QUrl&)’
sipCpp = new QWebSecurityOrigin(*a0);
^
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp:477:48: note: candidates are:
In file included from ~/Downloads/PyQt-gpl-5.5/sip/QtWebKit/qwebsecurityorigin.sip:26:0:
/usr/include/qt5/QtWebKit/qwebsecurityorigin.h:64:5: note: QWebSecurityOrigin::QWebSecurityOrigin(QWebSecurityOriginPrivate*)
QWebSecurityOrigin(QWebSecurityOriginPrivate* priv);
^
/usr/include/qt5/QtWebKit/qwebsecurityorigin.h:64:5: note: no known conversion for argument 1 from ‘const QUrl’ to ‘QWebSecurityOriginPrivate*’
/usr/include/qt5/QtWebKit/qwebsecurityorigin.h:58:5: note: QWebSecurityOrigin::QWebSecurityOrigin(const QWebSecurityOrigin&)
QWebSecurityOrigin(const QWebSecurityOrigin& other);
^
/usr/include/qt5/QtWebKit/qwebsecurityorigin.h:58:5: note: no known conversion for argument 1 from ‘const QUrl’ to ‘const QWebSecurityOrigin&’
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp: At global scope:
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp:516:48: error: ‘AllowSubdomains’ is not a member of ‘QWebSecurityOrigin’
{sipName_AllowSubdomains, static_cast<int>(QWebSecurityOrigin::AllowSubdomains), 21},
^
~/Downloads/PyQt-gpl-5.5/QtWebKit/sipQtWebKitQWebSecurityOrigin.cpp:517:51: error: ‘DisallowSubdomains’ is not a member of ‘QWebSecurityOrigin’
{sipName_DisallowSubdomains, static_cast<int>(QWebSecurityOrigin::DisallowSubdomains), 21},
^
make[1]: *** [sipQtWebKitQWebSecurityOrigin.o] Error 1
make[1]: Leaving directory `~/Downloads/PyQt-gpl-5.5/QtWebKit'
make: *** [sub-QtWebKit-make_first-ordered] Error 2
The outcome is that I can run make install and get some of the functionality that I'd expect; however, I am missing some required functionality from the WebKit widgets. I hope that I have supplied enough information to describe where I'm stuck. I am just shy of digging into the code; however, I'm assuming that the answer is actually much simpler.
Thanks in advance!
So, I started digging through the source package for the file that is failing to compile. In the sip directory there is a sip file for QWebSecurityOrigin that contains the following:
%If (Qt_5_2_0 -)
enum SubdomainSetting
{
    AllowSubdomains,
    DisallowSubdomains,
};
%End
I can reasonably expect this code to be included, as qmake tells me the following:
qmake --version
QMake version 3.0
Using Qt version 5.2.1 in /usr/lib/x86_64-linux-gnu
Next I wanted to look at the qwebsecurityorigin.h provided by Qt to see if the error could come from there. Mine is installed at /usr/include/qt5/QtWebKit/qwebsecurityorigin.h:
#ifndef _WEBSECURITYORIGIN_H_
#define _WEBSECURITYORIGIN_H_

#include <QtCore/qurl.h>
#include <QtCore/qshareddata.h>

#include "qwebkitglobal.h"

namespace WebCore {
    class SecurityOrigin;
    class ChromeClientQt;
}

class QWebSecurityOriginPrivate;
class QWebDatabase;
class QWebFrame;

class QWEBKIT_EXPORT QWebSecurityOrigin {
public:
    static QList<QWebSecurityOrigin> allOrigins();
    static void addLocalScheme(const QString& scheme);
    static void removeLocalScheme(const QString& scheme);
    static QStringList localSchemes();

    ~QWebSecurityOrigin();

    QString scheme() const;
    QString host() const;
    int port() const;

    qint64 databaseUsage() const;
    qint64 databaseQuota() const;

    void setDatabaseQuota(qint64 quota);
    void setApplicationCacheQuota(qint64 quota);

    QList<QWebDatabase> databases() const;

    QWebSecurityOrigin(const QWebSecurityOrigin& other);
    QWebSecurityOrigin &operator=(const QWebSecurityOrigin& other);

private:
    friend class QWebDatabase;
    friend class QWebFrameAdapter;
    friend class WebCore::ChromeClientQt;

    QWebSecurityOrigin(QWebSecurityOriginPrivate* priv);

private:
    QExplicitlySharedDataPointer<QWebSecurityOriginPrivate> d;
};
Note that no enum is defined. The Qt 5.5 documentation suggests that the enum should be there: http://doc.qt.io/qt-5/qwebsecurityorigin.html#SubdomainSetting-enum
Finally, I recalled that I had installed libqt5webkit5 separately from the bulk of the Qt libraries, so I did a version check on the package:
dpkg -s libqt5webkit5
Package: libqt5webkit5
Status: install ok installed
Priority: optional
Section: libs
Installed-Size: 34225
Maintainer: Ubuntu Developers <ubuntu-devel-discuss@lists.ubuntu.com>
Architecture: amd64
Multi-Arch: same
Source: qtwebkit-opensource-src
Version: 5.1.1-1ubuntu8
This output is almost identical for the dev package. So it appears that the bulk of the Qt5 distribution in the repos is on a different version than QtWebKit. Furthermore, if QtWebKit is at 5.1.1, that would explain why the enum is missing, as the sip file indicates it was added in Qt 5.2.0.
So my solution was to download and install Qt 5.5 from the Qt website using the automated installer (run with sudo, using the defaults). I then started fresh with the PyQt5 source by blowing away the build directory and unpacking the source again:
python configure.py --sip-incdir=/usr/include/python2.7/ --qmake=/opt/Qt/5.5/gcc_64/bin/qmake
make
sudo make install
The licenses are not compatible; however, a quick search through the PyQt5 configure.py script, using the error output, may give some insight into getting the code configured and compiling.
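To confirm that the rebuilt bindings picked up the newer QtWebKit, a quick check (assuming make install put the modules on the default import path) is something like:

from PyQt5.QtCore import QT_VERSION_STR
from PyQt5.QtWebKit import QWebSecurityOrigin

print(QT_VERSION_STR)  # should now report 5.5.x rather than 5.2.1
print(hasattr(QWebSecurityOrigin, 'AllowSubdomains'))  # the enum member that was previously missing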
I'm currently trying to write a 'setup.py' script so that, when the user installs the Python package, it automatically compiles my C++ extension bound with 'pybind11'. On Windows I haven't had any problem making it happen with the 'VS19 MSVC' compiler, but I'm trying to make it work if the user has 'MinGW-w64' installed instead.
These are the package files:
**main.cpp**
#include <pybind11/pybind11.h>

int add(int i, int j) {
    return i + j;
}

namespace py = pybind11;

PYBIND11_MODULE(pybind11_example, m) {
    m.def("add", &add);
}
**setup.py**
from setuptools import setup, Extension
import pybind11

ext_modules = [
    Extension(
        'pybind11_example',
        sources=['main.cpp'],
        include_dirs=[pybind11.get_include()],
        language='c++'
    ),
]

setup(
    name='pybind11_example',
    ext_modules=ext_modules
)
With the two files in the same folder, I run from the command prompt:
python setup.py build
If the user has the VS19 MSVC compiler installed, it successfully generates **pybind11_example.pyd**, which can be verified to work by running with Python:
import pybind11_example as m
print(m.add(1, 2))
But if the user has a MinGW-w64 compiler installed instead, it raises an error saying that Visual Studio 2015 is required.
Note that I can easily compile **main.cpp** into **pybind11_example.pyd** manually with MinGW-w64 by running:
g++ -static -shared -std=c++11 -DMS_WIN64 -fPIC -I C:\...\Python\Python38\Lib\site-packages\pybind11\include -I C:\ ... \Python\Python38\include -L C:\ ... \Python\Python38\libs main.cpp -o pybind11_example.pyd -lPython38
Is there a way to write **setup.py** so that, if the user is on Windows with a MinGW-w64 compiler, it automatically compiles **main.cpp** into **pybind11_example.pyd** when installing the package, without needing to do it manually?
Check the answer to this question. They try to solve the opposite case, forcing MSVC instead of MinGW, but the approach with setup.cfg might help you.
And here the answer demonstrates how to specify command-line parameters depending on the choice made by setuptools: if it is MSVC then one set of parameters, and another set for MinGW.
I believe the second approach should suit your needs: whichever compiler is installed, you have the proper command line to build your module. A rough sketch of that idea is below.
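This is only a sketch of the per-compiler approach, not a drop-in solution: the flag sets are assumptions you would adjust for your toolchain, and it relies on distutils reporting the compiler type ('msvc', 'mingw32', 'unix') at build time. You can also force the choice with a setup.cfg next to setup.py containing a [build] section with compiler = mingw32.

from setuptools import setup, Extension
from setuptools.command.build_ext import build_ext
import pybind11

class BuildExt(build_ext):
    # Hypothetical per-compiler flag sets; adjust to your environment.
    compile_args = {
        'msvc': ['/EHsc'],
        'mingw32': ['-std=c++11', '-DMS_WIN64'],
        'unix': ['-std=c++11', '-fPIC'],
    }

    def build_extensions(self):
        # Pick the flags matching whichever compiler distutils selected.
        args = self.compile_args.get(self.compiler.compiler_type, [])
        for ext in self.extensions:
            ext.extra_compile_args = args
        super().build_extensions()

ext_modules = [
    Extension(
        'pybind11_example',
        sources=['main.cpp'],
        include_dirs=[pybind11.get_include()],
        language='c++',
    ),
]

setup(
    name='pybind11_example',
    ext_modules=ext_modules,
    cmdclass={'build_ext': BuildExt},
)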
I have the following sample program:
// src/main.cpp
#include <boost/python.hpp>

char const* func()
{
    return "String";
}

BOOST_PYTHON_MODULE(bridge)
{
    boost::python::def("func", func);
}
When built using the following CMakeLists.txt, no compiler errors are given:
project(bridge)
cmake_minimum_required(VERSION 3.5)
set(PROJECT_SOURCE_DIR ${CMAKE_SOURCE_DIR}/src)
set(CMAKE_BINARY_DIR ${CMAKE_SOURCE_DIR}/bin)
set(EXECUTABLE_OUTPUT_PATH ${CMAKE_BINARY_DIR})
set(LIBRARY_OUTPUT_PATH ${CMAKE_BINARY_DIR})
set(SOURCE_FILES
    ${PROJECT_SOURCE_DIR}/main.cpp
)

# Include Python
#set(Python_ADDITIONAL_VERSIONS 3.5)
find_package(PythonLibs)
if (${PYTHONLIBS_FOUND})
    include_directories(${PYTHON_INCLUDE_DIRS})
    link_directories(${PYTHON_LIBRARIES})
endif()

# Include Boost
find_package(Boost 1.61.0 COMPONENTS python REQUIRED)
if (${Boost_FOUND})
    include_directories(${Boost_INCLUDE_DIRS})
    link_directories(${Boost_LIBRARY_DIR})
endif()
# Enable C++ 11
add_compile_options(-std=c++11)
add_compile_options("-lboost_python")
add_library(bridge SHARED ${SOURCE_FILES})
target_link_libraries(bridge ${PYTHON_LIBRARIES})
target_link_libraries(bridge ${Boost_LIBRARIES})
However, importing the shared library (libbridge.so) gives the following error:
/bin$ python
Python 2.7.11+ (default, Apr 17 2016, 14:00:29)
[GCC 5.3.1 20160413] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import libbridge
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: ./libbridge.so: undefined symbol: _ZN5boost6python6detail11init_moduleER11PyModuleDefPFvvE
I have compiled boost and boost_python without any problem, and other boost libraries are fully functional. What is wrong here?
Edit: In another post a solution was given by making the filename the same as the argument fed into BOOST_PYTHON_MODULE. After these modifications, the following error is now given by import libbridge:
ImportError: ./libbridge.so: invalid ELF header
Exporting the environment variable $LD_LIBRARY_PATH=$BOOST_ROOT/stage/lib does not seem to make a difference.
I have found a solution. The problem is due to a Python version mismatch inside Boost. I decided to compile everything against Python 3, which solved the problem. I proceeded as follows:
I uncommented the following line in the auto-generated user-config.jam located in $BOOST_ROOT/tools/build/example/:
using python : 3.5 : /usr/bin/python3 : /usr/include/python3.5 : /usr/lib;
Boost.Python was built from scratch using the following commands (executed with sudo to gain permission to write to /usr/local):
$BOOST_ROOT : ./b2 --with-python --clean
$BOOST_ROOT : ./b2 --with-python --install
I verified that the libraries are indeed built against Python 3 using:
$BOOST_ROOT : nm -D stage/lib/libboost_python-3.so | grep PyClass_Type
No output should be given. If the library was compiled with Python 2, then U PyClass_Type would show up.
The CMakeLists.txt file in the sample project was slightly modified:
set(Python_ADDITIONAL_VERSIONS 3.5)                     # uncommented
find_package(Boost 1.61.0 COMPONENTS python3 REQUIRED)  # python3 instead of python
add_compile_options("-lboost_python")                   # removed
Now python3 (not python) should be able to load the compiled libbridge.so library.
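As a quick sanity check (assuming the module file has been renamed or symlinked to bridge.so so that its name matches the BOOST_PYTHON_MODULE argument, as noted in the edit above; alternatively the lib prefix can be dropped in CMake with set_target_properties), the import should now work under python3:

import bridge
print(bridge.func())  # expected output: String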
I am running Python 3.4.4 :: Anaconda 4.0.0 (x86_64) on OS X Yosemite. My Cython version is 0.23.4. I'm trying to embed some very trivial Cython code, test.pyx, into C code, testcc.c. The problem is that if I use python2.7-config then everything works well (Python 2.7 is the built-in version on OS X). However, if I use python3.4-config, the following errors are raised:
Undefined symbols for architecture x86_64:
"_inittest", referenced from:
_main in testcc-b22dcf.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
I have to use Python 3 since all my other code is written in it. Please help me solve this problem.
The following are my source files:
test.pyx:
cdef public void pythonAdd(int[] a):
    a[1] = 5
    a[0] = 4
testcc.c:
#include "Python.h"
#include "test.h"
#include <math.h>
#include <stdio.h>
int main(int argc, char **argv) {
Py_Initialize();
inittest();
int a [2] = {0 , 0};
pythonAdd(a);
printf("fist: %d, second: %d", a[0], a[1]);
Py_Finalize();
return 0;
}
And I compile those two files using the following setup.py:
from distutils.core import setup, Extension
from Cython.Build import cythonize
ext = Extension("testc", sources=["test.pyx"])
setup(name="testc", ext_modules=cythonize(ext))
The following are the commands I use to compile the C files:
ldflags:=$(shell $(python3.4-config) --ldflags)
cflags:=$(shell $(python3.4-config) --cflags)
python setup.py build_ext --inplace
cython test.pyx
gcc $(cflags) $(ldflags) test.c testcc.c -o cysvm.out
Update:
I changed inittest() to PyInit_test() as Jim suggested. The code compiles successfully. However, when I run ./cysvm.out, the following errors occur:
./cysvm.out
Could not find platform independent libraries <prefix>
Could not find platform dependent libraries <exec_prefix>
Consider setting $PYTHONHOME to <prefix>[:<exec_prefix>]
Fatal Python error: Py_Initialize: unable to load the file system codec
ImportError: No module named 'encodings'
Current thread 0x00007fff772f5300 (most recent call first):
Update
I solved this by adding the following line before Py_Initialize(); in my C code:
Py_SetPythonHome(L"/PATH/to/python3");
This is probably due to the fact that in Python 3.x initialization of modules is not performed by calling init<module_name> but rather with PyInit_<module_name> (See PEP 3121). So, if you are linking with Python 3.x and executing via 3.x you need to change the initialization call.
In short, changing the call that initializes the module from:
inittest();
To:
PyInit_test();
and recompiling, should do the trick.
As for your second problem, an alternative to using Py_SetPythonHome is setting the PYTHONHOME environment variable to the output of python3.4-config --exec-prefix (or sys.exec_prefix) before running the executable.
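If you want to look the right value up programmatically, a small snippet (a sketch; the exact paths depend on your installation) prints the prefix to pass to Py_SetPythonHome or to export as PYTHONHOME, plus the library directory you link against:

import sys
import sysconfig

print(sys.exec_prefix)                     # value for PYTHONHOME / Py_SetPythonHome
print(sysconfig.get_config_var('LIBDIR'))  # directory containing libpython3.x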
I am trying to cross-compile a simple SWIG Python extension on Linux for Windows (mingw32), using the distutils module.
The ultimate goal is to compile a Python wrapper for some library and being able to use it on Windows. Obviously I started with the most basic example and unfortunately it fails.
Here are the files I am using:
example.c
/* File : example.c */

/* A global variable */
double Foo = 3.0;

/* Compute the greatest common divisor of positive integers */
int gcd(int x, int y) {
    int g;
    g = y;
    while (x > 0) {
        g = x;
        x = y % x;
        y = g;
    }
    return g;
}
example.i - SWIG interface file
/* File : example.i */
%module example
%inline %{
extern int gcd(int x, int y);
extern double Foo;
%}
setup.py
# setup.py
import distutils
from distutils.core import setup, Extension

setup(name = "SWIG example",
      version = "1.0",
      ext_modules = [Extension("_example", ["example.i", "example.c"])])
In order to compile using the native (Linux) gcc compiler, I am invoking:
python setup.py build
Everything works like a charm! Unfortunately when trying to specify the Windows target:
python setup.py build --compiler=mingw32
I get the error saying that gcc can't recognize -mdll switch:
running build
running build_ext
building '_example' extension
swigging example.i to example_wrap.c
swig -python -o example_wrap.c example.i
creating build
creating build/temp.linux-x86_64-2.7
gcc -mdll -O -Wall -I/home/jojek/anaconda/include/python2.7 -c example_wrap.c -o build/temp.linux-x86_64-2.7/example_wrap.o
gcc: error: unrecognized command line option ‘-mdll’
error: command 'gcc' failed with exit status 1
Fair enough, it makes perfect sense, since the toolchain is not valid. I made sure that mingw32 is installed on my machine. By calling dpkg -L mingw32 I know that the compiler is located at /usr/bin/i586-mingw32msvc-gcc.
My next step was to override the CC environment variable with the actual path to my compiler. When I try to compile again, I get the following error about a missing sys/select.h header file:
running build
running build_ext
building '_example' extension
swigging example.i to example_wrap.c
swig -python -o example_wrap.c example.i
creating build
creating build/temp.linux-x86_64-2.7
/usr/bin/i586-mingw32msvc-gcc -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/jojek/anaconda/include/python2.7 -c example_wrap.c -o build/temp.linux-x86_64-2.7/example_wrap.o
example_wrap.c:1: warning: -fPIC ignored for target (all code is position independent)
In file included from /home/jojek/anaconda/include/python2.7/Python.h:58,
from example_wrap.c:125:
/home/jojek/anaconda/include/python2.7/pyport.h:351:24: error: sys/select.h: No such file or directory
error: command '/usr/bin/i586-mingw32msvc-gcc' failed with exit status 1
Does anyone have an idea how to manage that task?
There's quite a bit going on behind the scenes when you compile Python modules using distutils. You're getting closer with each try in your question, however the problem you've now encountered is that you're using Linux header files with a Windows (cross) compiler. (sys/select.h isn't supported with mingw32, cygwin might be a different story though). In reality it's the lack of the configuration header file that's causing your cross compile to try and use the POSIX interfaces instead of Win32 alternatives.
My answer rewinds a few steps and starts out simply building the module by hand, using mingw32 on Linux and then we'll look at using distutils once we've proven that we have all that's required.
I'm also assuming that you don't have a Windows build box (or even a VM) available to simply build your extension on Windows natively since that's far simpler than cross compiling. If you're reading this and have the option to use a Windows box to build your Windows Python extensions, do that instead and save time and effort. That said it is possible to build Windows Python modules using only a Linux box.
Starting with mingw32 already installed and working on your Linux box (e.g. using the Debian/Ubuntu packages) the first step is to get the Windows header files (or configuration to be more specific). I'm assuming you're targeting the build that most people get when they type "python windows" into a search engine so I downloaded the Windows MSI installers from python.org and extracted them from there.
There are two things we want to get from the Python distribution:
python27.dll (Usually gets placed in c:\windows\system32 or c:\windows\syswow64)
The 'include' directory (Usually gets placed in c:\python27\include)
Under Linux there are a few different ways you can extract this. You could use Wine to install the MSI file. I used both cabextract and 7z with success in my testing though, for example with cabextract:
cabextract /tmp/python-2.7.10.msi -F '*.h'
cabextract /tmp/python-2.7.10.msi -F 'python27.dll'
(Note: if you use 7z you'll find the files you really want inside a second, inner archive named 'python').
At this point you could also extract the file 'libpython27.a', which usually lives inside c:\python27\libs\; however, this file isn't sufficient or even useful for linking using mingw32.
Given the header files we've now got enough to compile our extension, although as noted above to get mingw32 to link against python27.dll we need to do a bit more work first. We're going to need a tool called pexports to list all the exported symbols in the Python DLL and let dlltool generate a stub library for mingw32 to link against. I downloaded pexports directly and then extracted it with:
tar xvf ~/Downloads/pexports-0.47-mingw32-bin.tar.xz
Once that's extracted we get a single Windows executable. I used Wine in my example here to run it directly; alternatively, you could extract the source, and build it as a tool to run natively on the Linux host:
tar xvf ~/Downloads/pexports-0.47-mingw32-src.tar.xz
(cd pexports-0.47 && ./configure && make)
or you could have duplicated the functionality of the tool using the Python module pefile (which runs fine cross platform) to extract the exports that we care about if you were looking to avoid using Wine as well.
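For reference, a rough sketch of that pefile route (an assumption on my part that you have the pefile package installed; the output mimics what pexports produces) could look like:

import pefile

# Write a minimal .def file listing the symbols exported by python27.dll,
# suitable as input for dlltool in the next step.
pe = pefile.PE('python27.dll')
with open('python27.def', 'w') as out:
    out.write('LIBRARY python27.dll\nEXPORTS\n')
    for exp in pe.DIRECTORY_ENTRY_EXPORT.symbols:
        if exp.name:
            out.write(exp.name.decode('ascii') + '\n')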
Anyway, with pexports you can generate a .def file that contains the information we need for dlltool:
wine bin/pexports.exe -v python27.dll > python27.def
or, (if you've built pexports as a native tool), simply:
./pexports-0.47/pexports -v python27.dll > python27.def
where python27.dll is what we extracted from the .msi file earlier.
(This was my pexports reference)
Once you've got the .def file you can use the mingw32 dlltool to generate a .a file that we'll use later to link our Python module against:
i586-mingw32msvc-dlltool -A --dllname python27.dll --def python27.def --output-lib libpython27.a
Now we've reached a point where we can think about running SWIG itself to generate the code for us to compile. I simplified your example interface even further to be just:
%module test

%inline %{
int gcd(int x, int y) {
    int g;
    g = y;
    while (x > 0) {
        g = x;
        x = y % x;
        y = g;
    }
    return g;
}
%}
And then ran SWIG on my Linux box as:
swig -Wall -python test.i
This generated test_wrap.c which I compiled with:
i586-mingw32msvc-gcc test_wrap.c -I../include -Wall -Wextra -shared -o _test.pyd ./libpython27.a
And there we have a Windows Python module built using just Linux.
To check it really runs I copied test.py and _test.pyd to a Windows box and then did:
Python 2.7.10 (default, May 23 2015, 09:40:32) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import test
>>> test.gcd(1024, 512)
512
>>>
Now all that remains is to make sure distutils can find the right include files and libraries to link against by manipulating its paths.
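As an untested sketch of that last step (the paths are illustrative, and distutils' cross-compiling support may still need the CC override and --compiler=mingw32 from earlier in the thread), the setup.py could point at the extracted headers and the dlltool-generated import library like this:

from distutils.core import setup, Extension

example = Extension(
    "_example",
    sources=["example.i", "example.c"],
    include_dirs=["./win-python27/include"],  # headers extracted from the MSI (illustrative path)
    library_dirs=["."],                       # directory holding the generated libpython27.a
    libraries=["python27"],
)

setup(name="SWIG example",
      version="1.0",
      ext_modules=[example])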
For 64 bits, I couldn't make it work with pexports, so I used gendef to generate the python27.def file.
gendef is a tool which generates .def files from DLLs. A .def file is a list of symbols exported by a DLL. The primary use of this tool is to allow the creation of an import library for DLLs created by non-GCC compilers. It can handle both x86 (win32) and amd64 (win64) executables.
https://sourceforge.net/p/mingw-w64/wiki2/gendef/
Hope it helps!
What am I missing in my Boost.Python configuration/installation?
I'm trying to compile the tutorial example, and I get an error that libboost_python cannot be found:
cd /usr/share/doc/libboost1.42-doc/examples/libs/python/example/tutorial
bjam
error: Unable to find file or target named
error: 'libboost_python'
error: referred from project at
error: '.'
But the library is there, and ldconfig.real has been run:
/usr/lib/libboost_python.a -> libboost_python-py27.a
/usr/lib/libboost_python-mt-py26.a -> libboost_python-py26.a
/usr/lib/libboost_python-mt-py26.so -> libboost_python-py26.so.1.42.0
/usr/lib/libboost_python-mt-py27.a -> libboost_python-py27.a
/usr/lib/libboost_python-mt-py27.so -> libboost_python-py27.so.1.42.0
/usr/lib/libboost_python-py26.a
/usr/lib/libboost_python-py26.so -> libboost_python-py26.so.1.42.0
/usr/lib/libboost_python-py26.so.1.42.0
/usr/lib/libboost_python-py27.a
/usr/lib/libboost_python-py27.so -> libboost_python-py27.so.1.42.0
/usr/lib/libboost_python-py27.so.1.42.0
/usr/lib/libboost_python.so -> libboost_python-py27.so
I'm using the default libboost packages from Ubuntu 11.04.
My user-config.jam is:
using python : 2.7 ;
I had a similar problem on Ubuntu 12.04, where I installed all the Boost libraries as packages. I found the solution here:
http://jayrambhia.wordpress.com/2012/06/25/configuring-boostpython-and-hello-boost/
It turns out that you do not need to use bjam at all. A makefile suffices. I will repeat the solution from the above link here:
1.) Install the libboost-python package
2.) Create a hello world source file called 'hello_ext.c':
char const* greet()
{
    return "hello, world";
}

#include <boost/python.hpp>

BOOST_PYTHON_MODULE(hello_ext)
{
    using namespace boost::python;
    def("greet", greet);
}
3.) Create a makefile:
PYTHON_VERSION = 2.7
PYTHON_INCLUDE = /usr/include/python$(PYTHON_VERSION)

# location of the Boost Python include files and library
BOOST_INC = /usr/include
BOOST_LIB = /usr/lib

# compile mesh classes
TARGET = hello_ext

$(TARGET).so: $(TARGET).o
	g++ -shared -Wl,--export-dynamic $(TARGET).o -L$(BOOST_LIB) -lboost_python -L/usr/lib/python$(PYTHON_VERSION)/config -lpython$(PYTHON_VERSION) -o $(TARGET).so

$(TARGET).o: $(TARGET).c
	g++ -I$(PYTHON_INCLUDE) -I$(BOOST_INC) -fPIC -c $(TARGET).c
4.) make
make
5.) Ready to use. In python:
import hello_ext
print hello_ext.greet()
Still not sure if that's the proper way (it seems a little hackish), but the following helped:
In the Jamroot file, I replaced
project
    : requirements <library>libboost_python ;
with
project
    : requirements <library>/usr/lib/libboost_python.so ;
You could have a site-config file with something like the following:
using boost : 1.48 : <include>/usr/include/boost-1_48 <library>/usr/lib ;
(you need the <library> bit, not sure why)
Then you can do stuff like:
project foo
    : <library>/boost//python
Makes things easier in the long run, as you will inevitably have to change the Boost version at some point.