I hit a problem when converting Python code to a shared object with Cython.
Here is my setup file:
from distutils.core import setup
from Cython.Build import cythonize
setup(
    ext_modules = cythonize("hello.py")
)
Everything works fine on my Ubuntu desktop, but when the shared object is transferred to a CentOS machine I get this error:
undefined symbol: PyUnicodeUCS4_DecodeUTF8
I googled and found many questions about this, but almost all of them only say that the root cause is a Python built with UCS2 vs. UCS4, which I understand; I didn't find one that shows how to solve it.
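(For what it's worth, a quick way to confirm the mismatch is to compare sys.maxunicode on both machines; it reports 65535 on a UCS2 (narrow) build and 1114111 on a UCS4 (wide) build.)
import sys
print(sys.maxunicode)  # 65535 -> UCS2 (narrow) build, 1114111 -> UCS4 (wide) build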
As far as I can see, the ways to solve it are:
Rebuild Python with the right mode via --enable-unicode=ucs4/ucs2, but then I would need to reinstall all my packages.
Compile the code on another desktop whose Python has the matching UCS mode.
Now, I want to know if there is a way to set Cython to compile for a specified UCS mode.
Any suggestions are greatly appreciated.
Thanks.
First, to answer your actual question:
I want to know if there is a way to set Cython to compile for a specified UCS mode.
You can build a separate python installation from source and link Cython against its headers. To find the headers, you can use the python-config tool (or python3-config for Python 3). It is usually located in the bin directory where the python executable is:
$ # system python on my machine (macos):
$ which python-config
/usr/bin/python-config
$ # python 3 installation
$ which python3-config
/Library/Frameworks/Python.framework/Versions/3.6/bin/python3-config
$ python-config --cflags
-I/System/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7 -I/System/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7 -fno-strict-aliasing -fno-common -dynamic -arch x86_64 -arch i386 -g -Os -pipe -fno-common -fno-strict-aliasing -fwrapv -DENABLE_DTRACE -DMACOSX -DNDEBUG -Wall -Wstrict-prototypes -Wshorten-64-to-32 -DNDEBUG -g -fwrapv -Os -Wall -Wstrict-prototypes -DENABLE_DTRACE
$ python-config --ldflags
-L/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/config -lpython2.7 -ldl -framework CoreFoundation
Copy the output to the setup.py:
from setuptools import setup
from setuptools.extension import Extension
from Cython.Build import cythonize
cflags_ucs4 = [
    '-I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m',
    '-I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m',
    ...
]
ldflags_ucs4 = [
    '-L/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/config-3.6m-darwin',
    '-lpython3.6m',
    ...
]
cflags_ucs2 = [...]
ldflags_ucs2 = [...]

should_build_ucs2 = False  # i.e. could be passed via sys.argv

if should_build_ucs2:
    cflags = cflags_ucs2
    ldflags = ldflags_ucs2
else:
    cflags = cflags_ucs4
    ldflags = ldflags_ucs4

extensions = [
    Extension('hello', sources=['hello.py'],
              extra_compile_args=cflags, extra_link_args=ldflags),
]

setup(
    ext_modules = cythonize(extensions)
)
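If you do want to drive the should_build_ucs2 switch from the command line, a minimal sketch could look like the following (the --ucs2 flag name is my own invention, not a setuptools option, so it has to be stripped before setup() parses the arguments):
import sys

# hypothetical custom flag: remove it so setuptools does not see it
should_build_ucs2 = '--ucs2' in sys.argv
if should_build_ucs2:
    sys.argv.remove('--ucs2')
You would then invoke it as, e.g., python setup.py build_ext --ucs2.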
However, I do not recommend this, as you won't gain anything by it: you will still need to build and distribute two separate packages (one for UCS2, another for UCS4), which is messy to maintain.
Instead, if you are building a wheel that should be installable on a wide range of Linux distros (which is most probably your actual goal), I suggest making your build compliant with PEP 513 (manylinux1 packages). It is worth reading through; it was very helpful for me when I faced the problem of distributing portable Linux wheels.
Now, one way to get a manylinux1-compliant wheel is to build the wheel on your machine, then run auditwheel to check for platform-specific issues and try to resolve them:
$ pip install auditwheel
$ python setup.py bdist_wheel
$ # there should be now a mypkg-myver-cp36-cp36m-linux_x86_64.whl file in your dist directory
$ auditwheel show dist/mypkg-myver-cp36-cp36m-linux_x86_64.whl
$ # check what warnings auditwheel produced
$ # if there are warnings, try to repair them:
$ auditwheel repair dist/mypkg-myver-cp36-cp36m-linux_x86_64.whl
This should generate a wheel file named mypkg-myver-cp36-cp36m-manylinux1_x86_64.whl in a wheelhouse directory. Check again that everything is fine now by running auditwheel show wheelhouse/mypkg-myver-cp36-cp36m-manylinux1_x86_64.whl. If the wheel is now consistent with manylinux1, you can distribute it and it should work on most Linux distros (at least those with glibc; distros with musl like Alpine won't work, you will need to build a separate wheel if you want to support it).
What should you do if auditwheel can't repair your wheel? The best way is to pull a special docker container provided by PyPA for building manylinux1-compliant wheels (this is what I'm using myself):
$ docker pull quay.io/pypa/manylinux1_x86_64
A wheel built inside this container will work on most of the Linux distros (excluding some exotic ones like Alpine).
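For completeness, a typical invocation (a sketch based on the PyPA example scripts; cp36-cp36m is just one of several interpreters bundled in the image) mounts your project into the container, builds the wheel with one of the bundled interpreters, and then you run auditwheel repair on the result as shown above:
$ docker run --rm -v $(pwd):/io quay.io/pypa/manylinux1_x86_64 \
      /opt/python/cp36-cp36m/bin/pip wheel /io/ -w /io/wheelhouse/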
Related
I built libyaml and installed it into a local prefix:
yaml-0.1.5 $ ./configure --prefix=/usr/local/sqlminus
yaml-0.1.5 $ make install
yaml-0.1.5 $ ls -l /usr/local/sqlminus/include/yaml.h
-rw-r--r--# 1 mh admin 54225 Jan 5 09:05 /usr/local/sqlminus/include/yaml.h
But when I build PyYAML, it cannot find yaml.h.
PyYAML-3.11 $ /usr/local/sqlminus/bin/python setup.py build
checking if libyaml is compilable
gcc -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall
-Wstrict-prototypes -I/usr/local/sqlminus/include/python2.7
-c build/temp.macosx-10.4-x86_64-2.7/check_libyaml.c
-o build/temp.macosx-10.4-x86_64-2.7/check_libyaml.o
build/temp.macosx-10.4-x86_64-2.7/check_libyaml.c:2:10:
fatal error: 'yaml.h'
file not found
#include <yaml.h>
^
1 error generated.
How can I tell PyYAML where I've installed libyaml?
(update) Based on dotslash's comment below, editing setup.cfg and adding these two lines made everything work smoothly.
include_dirs=/usr/local/sqlminus/include
library_dirs=/usr/local/sqlminus/lib
(end update)
I think you need to install the dependencies first.
If you are using an Ubuntu or Debian based system, you can search for them like this:
apt-cache search libyaml
You will find several related packages; I suggest installing the development one: apt-get install libyaml-dev -y
If you are using Mac OS, you can edit check_libyaml.c and give it the absolute path of yaml.h.
Or just specify the include path when configuring:
python setup.py config --with-includepath=/path/to/your/install/of/python/includes/
Then compile.
More info can be found here.
Hope this helps.
Based on dotslash's comment, editing setup.cfg and adding these two lines made everything work smoothly:
include_dirs=/usr/local/sqlminus/include
library_dirs=/usr/local/sqlminus/lib
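If you prefer not to edit setup.cfg, the same two settings can be passed directly to the build_ext command (they feed the same options as the [build_ext] section; I have only verified the setup.cfg route myself, so treat this as a sketch):
$ /usr/local/sqlminus/bin/python setup.py build_ext \
      --include-dirs=/usr/local/sqlminus/include \
      --library-dirs=/usr/local/sqlminus/lib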
I have installed python-2.7-macosx10.5.dmg from python.org on Mac OS X 10.5.8.
I'm running python setup.py build for pyPortMidi-0.0.3 and getting:
Found darwin (OS X) platform
running build
running build_ext
pyrexc pypm.pyx --> pypm.c
/Users/baz/Downloads/pyPortMidi-0.0.3/pypm.pyx:357:21: Type 'PmError' not acceptable as a boolean
building 'pypm' extension
creating build/temp.macosx-10.5-intel-2.7
gcc-4.0 -fno-strict-aliasing -fno-common -dynamic -g -O2 -DNDEBUG -g -O3 -arch i386 -arch x86_64 -I/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c pypm.c -o build/temp.macosx-10.5-intel-2.7/pypm.o
pypm.c:1:2: error: #error Do not use this file, it is the result of a failed Pyrex compilation.
pypm.c:1:2: error: #error Do not use this file, it is the result of a failed Pyrex compilation.
lipo: can't figure out the architecture type of: /var/folders/oO/oO1flrWgHAC8u6KdoO0Wq++++TI/-Tmp-//ccTcgy0s.out
error: command 'gcc-4.0' failed with exit status 1
Can anyone help me to resolve this?
I found the easiest way was to build the version of pyPortMidi included in pygame, which has some fixes applied.
You can use the following pattern to import it at the top of your file, preferring the standard version, but falling back to the pygame bundled version.
try:
    import pypm
except ImportError:
    from pygame import pypm
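Once the import succeeds, a quick smoke test (assuming the usual pyPortMidi API names: Initialize, CountDevices, GetDeviceInfo, Terminate) is to list the MIDI devices PortMidi can see:
pypm.Initialize()
for i in range(pypm.CountDevices()):
    print(pypm.GetDeviceInfo(i))  # (interface, name, is_input, is_output, is_opened)
pypm.Terminate()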
Using MacPorts, it was easy to install using:
port install py27-game +portmidi
I don't know if you need something special to include portmidi in the build if you build by other methods.
Looking at the link below, it seems that things are broken and missing. I don't know if they are going to fix it anytime soon; it has been bothering me for some time too.
https://groups.google.com/forum/#!topic/pygame-mirror-on-google-groups/sf3I8Q-wYQA
I'm trying to install PyCurl in my local environment which has python 2.7 and gcc-4.2 on OS X 10.7 Lion. I've tried doing this based on this answer Error installing PyCurl:
sudo env ARCHFLAGS="-arch x86_64" pip install pycurl
This fails, apparently because I have gcc-4.2 (installed via Xcode) rather than gcc-4.0:
error: command 'gcc-4.0' failed with exit status 1
I've also tried downloading the source and building with a setup.py I modified based on Problem trying to install PyCurl on Mac Snow Leopard:
gcc-4.2 -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -Os -Wall -Wstrict-prototypes -DENABLE_DTRACE -arch x86_64 -pipe -DHAVE_CURL_SSL=1 -I/System/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c src/pycurl.c
This results in the same error as above. I have verified that I do indeed have gcc-4.2 and that it is linked to my /usr/bin.
I'm thinking it will work if I can get the build to use gcc-4.2 instead of gcc-4.0 when compiling, but I don't know how to do this and haven't found anything explaining how to pass an argument selecting a particular gcc. I want to avoid overriding system defaults if possible.
Chances are you have a 32-bit-only Python 2.7 installed on your system (possibly downloaded from python.org) which was built with gcc-4.0 and includes a PPC universal variant. Building C extension modules with these Pythons is very problematic with Xcode 4 installed (the default for 10.7 and optional for 10.6) because gcc-4.0 and PPC support have both been removed. The easiest and best long-term solution is to install a 64-bit/32-bit Python build (see the python.org download page for current releases) or simply use the Apple-supplied Python 2.7.1 (/usr/bin/python2.7) in 10.7.
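To check which kind of interpreter you are actually running, a quick test using only the standard library:
import sys, platform
print(platform.architecture())  # ('64bit', ...) on a 64-bit build, ('32bit', ...) otherwise
print(sys.maxsize > 2**32)      # True on a 64-bit Python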
I have a distutils setup script with an Extension section, which looks something like this:
from distutils.core import setup, Extension

my_module = Extension('my_module',
                      sources = ['my_file.c', 'my_other_file.c'])

setup (name = 'my_module',
       version = '1.0',
       description = 'My module',
       ext_modules = [my_module])
Running setup.py build works fine on my Mac. When I move to a Debian machine, it fails:
error: Python/Python.h: No such file or directory
I have python2.6 and python2.6-dev installed, and the file is present at /usr/include/Python2.6.
The command it executes for the problem file:
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/python2.6 -c my_module.c -o build/XYZ/my_module.o
So it is passing in the location of the header file.
The only obvious difference between the Mac vs Linux environment is gcc-4.2 vs gcc-4.4 and Python 2.7 vs Python 2.6
Ideas?
EDIT:
In the C file in question:
#include <Python/Python.h>
#include <Python/structmember.h>
Maybe in your module you need to include "Python.h" instead of "Python/Python.h"?
Or you can try exporting the include path and compiling again with gcc or g++:
export C_INCLUDE_PATH=/usr/include/python2.6:$C_INCLUDE_PATH
export CPLUS_INCLUDE_PATH=/usr/include/python2.6:$CPLUS_INCLUDE_PATH
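Alternatively, the include path can be baked into the setup script itself rather than the environment; a sketch against the Debian header location used above:
# same Extension as in the question, but with the header directory added explicitly
my_module = Extension('my_module',
                      sources = ['my_file.c', 'my_other_file.c'],
                      include_dirs = ['/usr/include/python2.6'])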
In my case, I was missing python3-dev, sudo apt-get install python3-dev fixed it.
I'm an extremely amateur programmer; I've done some recreational algorithmics programming, but I honestly have no idea how libraries and programming languages really fit together. I'm supposed to work on a project that requires some image processing, so I've been trying to install PIL for a while, but I haven't been able to.
I went to http://www.pythonware.com/products/pil/ and downloaded "Python Imaging Library 1.1.6 Source Kit (all platforms) (440k TAR GZ) (December 3, 2006)". Then I opened the folder in my command prompt and ran
$ python setup.py build_ext -i .
This was the output I got:
running build_ext
--- using frameworks at /System/Library/Frameworks
building '_imaging' extension
gcc -fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -fno-common -dynamic -DNDEBUG -g -Os -Wall -Wstrict-prototypes -DMACOSX -I/usr/include/ffi -DENABLE_DTRACE -arch i386 -arch ppc -pipe -DHAVE_LIBZ -IlibImaging -I/opt/local/include -I/System/Library/Frameworks/Python.framework/Versions/2.5/include -I/usr/local/include -I/usr/include -I/System/Library/Frameworks/Python.framework/Versions/2.5/include/python2.5 -c _imaging.c -o build/temp.macosx-10.5-i386-2.5/_imaging.o
unable to execute gcc: No such file or directory
error: command 'gcc' failed with exit status 1
"import Image" produced an error when I tried it.
Do you guys have any idea what's going on? I'm using a MacBook Pro with a Core 2 Duo.
And I'm honestly sorry if this is ridiculously stupid.
Actually, assuming you're still using the default 2.5.x Python that comes with OS X (at least as of 10.5.6), there's a pre-built installer package for it (download the dmg for PIL).
Otherwise, you'll need to either build it from source (which does require the Mac dev tools) or install it with MacPorts or Fink.
Edit: mono makes a good point; you'll still need the dev tools unless you use the pre-built installer.
You need to install the developer tools that come on your Mac OS X install DVD.
GCC is the GNU compiler. It's a very useful thing to have. You just need to install it in whatever mac-friendly way exists.
http://www.tech-recipes.com/rx/726/mac-os-x-install-gcc-compiler/
So this is from a while ago, but I just ran into the problem.
The issue lies with
/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/distutils/sysconfig.py
(or wherever your Python install is).
There is a line that sets the compile flags:
archflags = '-arch i386 -arch ppc -arch x86_64'
I just removed the offending arch flags from that line and went on my merry way. Now, there is obviously a way to configure this via the environment variable that file checks:
os.environ['ARCHFLAGS']
but I don't know about that, and didn't want to mess with it.
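If you would rather not edit sysconfig.py at all, a less invasive option is to override ARCHFLAGS for just that one build (the same trick as the pip invocation earlier in this thread):
$ ARCHFLAGS="-arch x86_64" python setup.py build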