Cross-compiled Python can't find basic modules (math, operator, etc.)

I can't seem to import any of the basic modules located in the "lib-dynload" directory. They are all there, but I get the error: "ImportError: No module named X" when trying to import them.
I checked my sys.path, and it includes the directory where all of these modules are located; my PYTHONHOME environment variable is also set correctly. I'm at a bit of a loss as to what the problem could be. Some background info: this is Python 2.6.6, cross-compiled from source and installed onto an ARM embedded Linux board running Angstrom.
The board did have Python on it before: I had tried to bitbake it into the image, but it was missing a lot of stuff. I ended up doing my best to clean the directory tree of anything to do with the previous Python before loading my cross-compiled version.
An strace of a simple script that just attempts to import math: http://pastebin.com/3XgJ3nPR

I see no checks in that trace for filenames like math.so or mathmodule.so, which suggests that shared-object modules are turned off entirely: the version of Python you have compiled cannot load binary modules dynamically.
More: looking over the config.out from my most recent Python build, I see several lines where Python is investigating whether the platform will let it dynamically load binary modules that end in .so:
checking for dlopen... yes
checking DYNLOADFILE... dynload_shlib.o
checking MACHDEP_OBJS... MACHDEP_OBJS
What do these lines say on your cross-compile?
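If you want to check from the target itself, a quick diagnostic (assuming a Python 2.x interpreter on the board) is to ask the import machinery which suffixes it accepts; if no '.so' entry shows up, dynamic loading was compiled out:

import imp, sys
print(sys.builtin_module_names)  # modules compiled into the interpreter itself
print(imp.get_suffixes())        # no ('.so', 'rb', 3) entry means no dynamic loading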

I have recently run across a similar issue building Python 2.7.13, and I believe it is this bug, which is being fixed for Python 3 but not ported back to 2. The build process (setup.py) generates a list of modules to build, and then subtracts the list of built-in modules (sys.builtin_module_names). However, setup.py is run (from the Makefile) using python2.7, which in my case picked up the system (Ubuntu) binary rather than the one just built. It therefore subtracts off modules that are built-in for the system Python (including operator and collections) but not for the one being built, so those modules end up neither built-in nor built as external modules.
I was able to use a suggestion from the bug and prepend the built python, in the source directory, to the path (adding a symlink from python2.7 -> python). This worked because I was building an x86 Python on a multi-arch x64 machine; if you are building for another system like ARM, you may need to apply the patch from the bug so that the list of built-in modules comes from earlier in the build process rather than from the host Python.
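To make the failure mode concrete, here is a minimal sketch of the logic described above (not the actual setup.py code; the module list is made up):

import sys

modules_to_build = ['math', 'operator', 'collections', '_struct']

# When the Makefile picks up the *host* python2.7, this reflects the
# host interpreter's built-ins, not the cross-compiled target's:
host_builtins = set(sys.builtin_module_names)

# Anything built into the host is silently skipped, even though the
# target interpreter has no such built-in module:
external = [m for m in modules_to_build if m not in host_builtins]
print(external)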

Related

Importing cython generated *.so-module with another python-version or on another OS

How should a file myModule.cpython-35m-x86_64-linux-gnu.so be imported in python? Is it possible?
I tried the regular way:
import myModule
and the interpreter says:
`ModuleNotFoundError: No module named 'myModule'`
This is software that I can't install on the cluster I am working on, so I just extracted the .deb package; it does not have a wheel file or any structure to install.
It is problematic to use a C-extension built for one Python version in another Python version. Normally (at least for Python3) there is a mechanism in place to differentiate C-extensions for different Python versions, so they can co-exist in the same directory.
In your example, the suffix is cpython-35m-x86_64-linux-gnu, so this C-extension will be picked up by a CPython3.5 on an x86_64 Linux. If you try to import this extension with another Python version or on another platform, the module isn't visible and ModuleNotFoundError is raised.
It is possible to see which suffixes are accepted by the current Python version, e.g. via:
>>> import _imp
>>> _imp.extension_suffixes()
['.cpython-36m-x86_64-linux-gnu.so', '.abi3.so', '.so']
One possibility is to use the stable C-API, which can be used with multiple Python versions without recompilation. Cython started to support it in version 3.0 (see this PR); see also this SO-post about setuptools and the stable C-API.
One might want to be clever and rename the extension to a simple .so, so it can be picked up by the finder. This can/does work for some Python-version combinations on some platforms for some extensions, yet this approach cannot be sustained in the long run and is not the right thing to do.
The right thing to do is to build the C-extension for/with the right Python version on the right OS/platform, or to use the right wheel (or the stable C-API).
In general, a C-extension built for one Python version (let's say PythonA.B) cannot be used by another Python version (let's say PythonC.D), because those extensions/modules are linked against a specific Python library, and the needed functionality might no longer (or not yet) be present in the library of another version.
This is different from *.py-files, and more similar to *.pyc-files, which cannot be used with a different version either.
While PEP-3147 regulates the suffixes of *.pyc-files, PEP-3149 does the same for C-extensions. PEP-3149 is however not the state of the art, as some of the problems were fixed only in Python3.5; the whole discussion can be found here.
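For illustration, a minimal setuptools sketch of the stable-C-API route mentioned above (the module name and source file are hypothetical):

from setuptools import setup, Extension

setup(
    name="myModule",
    ext_modules=[
        Extension(
            "myModule",
            sources=["myModule.c"],
            define_macros=[("Py_LIMITED_API", "0x03050000")],  # oldest CPython targeted: 3.5
            py_limited_api=True,  # tags the result myModule.abi3.so
        )
    ],
)

The resulting myModule.abi3.so matches the '.abi3.so' entry in the suffix list shown above, so one build can be imported by several CPython 3.x versions.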

How to safely create and import .so files in Python 3?

I am trying to use Cython to create .so binary files from our .py files and share them with our team.
However, even though we all use Python3, most of the time it has to be exactly the same revision (let us say 3.7.8), otherwise we get an error when importing them.
Is this behavior expected?
Some revisions are compatible. For example, if we make a .so with Python 3.5.2 and import it in 3.6.8, it works, but it does not work in 3.7.8.
Where does this mess come from, and what is the safest way to do this?
To follow up on my comments:
On the same platform, extension modules should work within a "minor version" (i.e. modules built with 3.7.2 and 3.7.3 should be compatible), although I'm struggling to find a source for this. Beyond that, some effort has been made in the past to ensure compatibility between releases, but not so much any more, so it's possible you may be lucky and things will work.
distutils/setuptools and other similar build mechanisms tag extension modules with a suffix indicating the version and some other details. For example, an extension would be called foo.cpython-37m.so instead of just foo.so. These tags prevent the module from being used with other Python versions and are a good thing. If you are removing these tags, then this mess is entirely on you.
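You can inspect the tag your own interpreter expects; a quick check, assuming CPython 3.7 on Linux:

import sysconfig
print(sysconfig.get_config_var("EXT_SUFFIX"))
# e.g. '.cpython-37m-x86_64-linux-gnu.so'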
Python now defines a more limited stable ABI that should be compatible across Python versions. Cython is working towards supporting that but at the moment it is not in a usable state. In a year or so it should be a good solution.
In summary, .so files are not expected to be portable between different Python versions. You should either standardise on a Python version or build the .so files locally.

Embed Python3 without standard library

EDIT: I have asked the opposite question here: How to embed Python3 with the standard library
A solution for Python2 is provided here: Is it possible to embed python without the standard library?
However, Python3 fails on Py_Initialize(); with:
Fatal Python error: Py_Initialize: unable to load the file system codec
ImportError: No module named 'encodings'
This makes sense because py3 source files are utf-8 by default. So it seems that it requires an external binary just in order to parse py3 source files.
So what to do?
It looks as though I need to locate the encodings module in my system Python installation, copy it into my project tree, and maybe set some environment variable, PYTHONPATH(?), so that my libpython.dylib can find it.
Is it possible to avoid this? And if not, can anyone clarify the steps I need to take? Are there going to be any more hiccups?
NOTE: For posterity, this is how I got a stand-alone libpython.dylib linking into my project on OSX:
First I locate my system Python's library: /usr/local/Frameworks/Python.framework/Versions/3.4/Python (in my case it was installed with homebrew).
Now I:
Copy the .dylib into my project folder, creating ./Libs/libpython3.4.1_OSX.dylib
Go into build settings -> linking and set other linker flags to -lpython3.4.1_OSX
At this point it will appear to work. However, if you now try building it on a fresh OSX installation, it will fail. This is because:
$ otool -D ./libpython3.4.1_OSX.dylib
./libpython3.4.1_OSX.dylib:
/usr/local/Frameworks/Python.framework/Versions/3.4/Python
The .dylib is still holding onto its old location. It's really weird to me that the .dylib contains a link to its own location, as anything that uses it must know where it is in order to invoke it in the first place!
We can correct this with:
$ install_name_tool -id @rpath/libpython3.4.1_OSX.dylib libpython3.4.1_OSX.dylib
But then also in our Xcode project we must:
Go into build phases. Add a copy files step that copies libpython3.4.1_OSX.dylib to Frameworks (that's the right place to put it).
Go into build settings -> linking and set runpath search paths to @executable_path/../Frameworks
Finally I need to go into edit scheme -> run -> arguments -> environment variables and add PYTHONHOME with value ../Frameworks
I suspect that to get this working I will also need to add a PYTHONPATH
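As a quick sanity check once the interpreter comes up, a two-liner like the following (run via PyRun_SimpleString() or a test script) shows where the encodings package was actually found:

import encodings
print(encodings.__file__)  # should point into your bundled Frameworks directory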
Links:
https://mikeash.com/pyblog/friday-qa-2009-11-06-linking-and-install-names.html
http://qin.laya.com/tech_coding_help/dylib_linking.html
https://github.com/conda/conda-build/issues/279#issuecomment-67241554
Can you please help me understand how Mach-O libraries work in Mac Os X?
http://nshipster.com/launch-arguments-and-environment-variables/
I have attempted this, and it would take more time than you want to spend to embed Python 3 without the standard library. Some modules in the library are necessary for Python 3 to run, and it would take a lot of modifications for it to work without them.

What is Building and Installing?

This is probably a question that has a very easy and straightforward answer; however, despite having a few years of programming experience, for some reason I still don't quite get the exact concepts of what it means to "build" and then to "install". I know how to use them and have used them a lot, but have no idea about the exact processes which happen in the background...
I have looked across the web, Wikipedia, etc., but there is no one simple answer to it, nor can I find one here.
A good example, which I tried to understand, is adding new modules to python:
http://docs.python.org/2/install/index.html#how-installation-works
It says that "the build command is responsible for putting the files to install into a build directory"
And then for the install command: "After the build command runs (whether you run it explicitly, or the install command does it for you), the work of the install command is relatively simple: all it has to do is copy everything under build/lib (or build/lib.plat) to your chosen installation directory."
So essentially what this is saying is:
1. Copy everything to the build directory and then...
2. Copy everything to the installation directory
There must be a process missing somewhere in the explanation... compilation?
I would appreciate a straightforward, not-too-techy answer, but in as much detail as possible :)
Hopefully I am not the only one who doesn't know the detailed answer to this...
Thanks!
Aivoric
Building means compiling the source code to binary in a sandbox location where it won't affect your system if something goes wrong, like a build subdirectory inside the source code directory.
Install means copying the built binaries from the build subdirectory to a place in your system path, where they become easily accessible. This is rarely done by a straight copy command; it's more often done by some package manager that can track the files created and easily uninstall them later.
Usually, a build command does all the compiling and linking needed, but Python is an interpreted language, so if there are only pure Python files in the library, there's no compiling step in the build. Indeed, everything is copied to a build directory, and then copied again to a final location. Only if the library depends on code written in other languages that needs to be compiled will there be a compiling step.
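For a pure-Python library, a minimal setup.py looks something like this (the names are hypothetical):

from distutils.core import setup

setup(
    name="mylib",
    version="0.1",
    py_modules=["mylib"],  # pure Python: "build" just copies mylib.py to build/lib
)

Running python setup.py build copies mylib.py into build/lib, and python setup.py install copies it from there into your site-packages directory.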
You want a new chair for your living room and you want to make it yourself. You browse through a catalog and order a pile of parts. When they arrive at your door, you can't immediately use them. You have to build the chair at your workshop. After a bit of elbow grease, you can sit down in it. Afterwards, you install the chair in your living room, in a convenient place to sit down.
The chair is a program you want to use. It arrives at your house as source code. You build it by compiling it into a runnable program. You install it by making it easier to use.
The build and install commands you are referring to come from a setup.py file, right?
Setup.py (http://docs.python.org/2/distutils/setupscript.html)
This file is created by 3rd party applications / extensions of Python. They are not part of:
Python source code (bunch of c files, etc)
Python libraries that come bundled with Python
When a developer makes a library for Python that he wants to share with the world, he creates a setup.py file so the library can be installed on any computer that has Python. Maybe this is the MISSING STEP.
Setup.py sdist
This creates a source distribution (the .tar.gz file). What this does is copy all the files used by the library into a folder, together with the setup.py file, and archive everything so the library can be built somewhere else.
Setup.py build
This builds the python module back into a library (SPECIFICALLY FOR THIS OS).
As you may know, the computer that the python library originally came from will be different from the computer that you are installing it on.
It might have a different version of python
It might have a different operating system
It might have a different processor / motherboard / etc
For all the reasons listed above, the code will not necessarily work on another computer. So setup.py sdist creates an archive with only the source files needed to rebuild the library on another computer.
What setup.py build does is similar to what a makefile would do: it compiles sources, creates libraries, all that stuff.
Now we have a copy of all the files we need in the library and they will work on our computer / operating system.
Setup.py install
Great, we have all the files needed. But they won't work yet. Why? Well, they have to be added to Python, that's why. This is where install comes in. Now that we have a local copy of the library, we need to install it into Python so you can use it like so:
import mycustomlibrary
In order to do this we need to do several things, including:
Copy the files to their library folders in our version of Python.
Make sure the library can be imported using the import command.
Run any special install instructions for this library (setting up paths, etc.).
This is the most complicated part of the task. What if our library uses BeautifulSoup? BeautifulSoup is not part of the Python standard library, so we'd have to install it in a way such that our library and any others can use it without interfering with each other.
Also what if python was installed someplace else? What if it was installed on a server with many users?
Install handles all these problems transparently. What it does is make the library that we just built able to run. All you have to do is use the import command; install handles the rest.
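If you're curious where "into Python" actually is, you can ask the interpreter; a quick check, assuming a standard (non-virtualenv) CPython installation:

import site
print(site.getsitepackages())  # directories where installed libraries end up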

How do I embed a python library in a C++ app?

I've embedded python on a mobile device successfully, but now how do I include a python library such as urllib?
Additionally, how can I include my own python scripts without a PYTHONPATH?
(please note: python is not installed on this system)
The easiest way is to create a .zip file containing all the python code you need and add it to your process's PYTHONPATH environment variable (via setenv()) prior to initializing the embedded Python interpreter. .pyd libraries can be handled similarly by placing them in the same directory as the .zip and including that directory in the PYTHONPATH as well.
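You can verify the zip trick on a desktop Python first; a quick sketch, assuming a stdlib.zip containing the pure-Python modules you need (the path is hypothetical):

import sys
sys.path.insert(0, "/path/to/stdlib.zip")  # the built-in zipimport handles this
import urllib  # now resolved from inside the zip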
Usage of the setenv() call can cause trouble on Windows if you're mixing C-runtime versions. I spent many aggravating hours learning that setenv() only sets the environment variables for the version of the C runtime your compiler ships with. So if, for example, Python was built with VC++ 2005 and your compiler is VC++ 2008, you'll need to use an alternative mechanism. Browsing the sources for py2exe and/or PyInstaller may provide you with a better solution (since you're doing essentially the same thing as these tools), but a simple alternative is to "cheat" by using PyRun_SimpleString() to set the module search path from within Python itself:
char buff[256];
snprintf(buff, sizeof(buff), "import sys\nsys.path.append(\"%s\")\n", py_zip_filename);
PyRun_SimpleString(buff);
