How to get Python module install directory in CMake?

I have created a small Python module using C++ and Boost, built with CMake. Now I need to install it into the standard Python modules directory, but the destination has to be a path under CMAKE_INSTALL_PREFIX so the result can be packaged with the standard Debian packaging tools.
Now I have this, where the last line is obviously wrong, as it is an absolute path to a system directory.
cmake_minimum_required(VERSION 3.12)
project(foo)

# Not Shown: Extract ${PYTHON_VERSION} from interpreter
find_package(Python ${PYTHON_VERSION} REQUIRED COMPONENTS Interpreter Development)
find_package(Boost 1.55.0 REQUIRED COMPONENTS system python${Python_VERSION_MAJOR}${Python_VERSION_MINOR})

Python_add_library(foo MODULE
  src/foo.cc src/python_interface.cc
)
set_target_properties(foo PROPERTIES PREFIX "")
target_link_libraries(foo
  PUBLIC
    Boost::python${Python_VERSION_MAJOR}${Python_VERSION_MINOR}
)

# This is wrong
install(TARGETS foo DESTINATION ${Python_SITEARCH}/foo)
I also had a look at Python's distutils and setuptools, however they don't seem suited to standalone use.
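One possible approach (a sketch, not from the original question) is to ask the interpreter itself for the site-packages layout rebased onto an arbitrary prefix, using sysconfig's scheme variables; `base` and `platbase` are the standard variables the `platlib` path is built from:

```python
import sysconfig

def platlib_under_prefix(prefix: str) -> str:
    """Return the platform site-packages path rebased onto `prefix`."""
    return sysconfig.get_path(
        "platlib", vars={"base": prefix, "platbase": prefix}
    )

path = platlib_under_prefix("/usr")  # something like /usr/lib/python3.X/...
```

From CMake this could be run via execute_process with ${CMAKE_INSTALL_PREFIX} substituted for the prefix and the result used as the install DESTINATION; note that on Debian the interpreter's default scheme may put modules in dist-packages rather than site-packages, so check the output on the target system.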

Related

Shared library distributed in python wheel not found

I have a python module which contains a C++ extension as well as a shared library on which the C++ extension depends. The shared library is picked up by setuptools as an extra_object on the extension. After running python setup.py bdist_wheel, the wheel gets properly generated and has the following directory structure:
+-pymodule.whl
| +-pymodule
| +-pymodule-version.dist-info
| +-extension.so
| +-shared_lib.so
To install this wheel, in my python environment I call pip install pymodule.whl which copies the python sources as well as the .so files into the environment's site-packages directory.
After installing the module, one can then attempt to import the module by calling import pymodule in a terminal for the environment. This triggers an exception to be thrown:
ImportError: shared_lib.so: cannot open shared object file: No such file or directory
This exception can be resolved by appending the appropriate site-packages directory to the LD_LIBRARY_PATH variable; however, it seems that this should work out of the box, especially considering that python is clearly able to locate extension.so.
Is there a way to force python to locate this shared library without having to explicitly point LD_LIBRARY_PATH at the installation location (i.e. site-packages)?
This question works around a similar problem by using package data and explicitly specifying an install location for the shared library. The issue I have with that approach is that the shared object is decoupled from the extension. In my case, the shared library and the extension are both targets built by the same CMake build. I had previously attempted to use skbuild to build CMake-based extensions; however, as per this issue, skbuild has a similar problem with including other libraries generated as part of the extension build.
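A common workaround for this class of problem (a hedged sketch, not from the question itself; the module and file names are hypothetical, mirroring the layout above) is to embed an rpath of $ORIGIN when linking the extension, so the dynamic loader searches the extension's own directory for shared_lib.so at import time:

```python
from setuptools import Extension

# Hypothetical names mirroring the wheel layout in the question; the key
# part is the $ORIGIN rpath, which makes the loader look for shared
# libraries next to extension.so itself.
ext = Extension(
    "extension",
    sources=["src/extension.cc"],
    libraries=["shared_lib"],
    library_dirs=["build/lib"],
    extra_link_args=["-Wl,-rpath,$ORIGIN"],  # Linux; macOS uses @loader_path
)
```

Depending on how the link command is invoked, $ORIGIN may need shell escaping (e.g. \$ORIGIN) to survive unexpanded into the binary.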

Python Installing data files and then finding them again

I have a python package targeted at linux machines that needs to install its locale files to an accessible location. Right now, I have them being installed to sys.prefix + "/share/locale/".
However, I found a small caveat with Ubuntu and pip. Under default conditions, Ubuntu installs packages installed with pip to /usr/local and sets sys.prefix to that during installation. However, after installation, when the package is run, the prefix is /usr, meaning my code can't find the locale files installed at /usr/local.
I could simply hardcode the location, but I would prefer not to do this, as it makes the package less portable and would require the user to install it as root. These are added as data_files in setup.py and won't be discoverable as a python package.
How else can I ensure my package can find the locale files after installation?
I thought about adding a line to the package's __init__.py during installation, which would create a variable pointing to the locale dir's location. However, editing files as they are installed, without changing the source files, did not seem trivial.
This is a python 3 only package.
Maybe use the resource functions available in pkg_resources to find the files?
from pkg_resources import resource_stream, resource_filename

with resource_stream('my_package', 'locale/foo.dat') as infp:
    ...  # read from the stream

# ... or ...
foo_location = resource_filename('my_package', 'locale/foo.dat')
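On Python 3.9+, the stdlib importlib.resources offers the same kind of lookup without the pkg_resources dependency (a sketch; my_package/locale/foo.dat is the hypothetical layout from the answer above, and the data files must be shipped inside the package for this to work):

```python
from importlib import resources  # Python 3.9+ files() API

def read_data(package: str, relpath: str) -> bytes:
    """Read a packaged data file regardless of where the package landed."""
    return resources.files(package).joinpath(relpath).read_bytes()

# hypothetical usage for the package in the question:
# data = read_data("my_package", "locale/foo.dat")
```

This sidesteps the prefix problem entirely, because the file is located relative to the installed package rather than relative to sys.prefix.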

How to resolve a CMake libz.so.1 conflict with libraries in Anaconda Python's implicit directories?

I'm trying to build an example using CMake which needs Python and MPI. I have several Python installations: pvpython, python, ipython and Anaconda Python. I set the normal python in my PATH variable (I'm working on Ubuntu Linux).
I'm new to the CMake stuff. Some people stated I have to change toolchainfile.cmake, but I cannot locate it in my example files. Any lead on how to solve this? Thanks in advance!
Following is the error I get while running ccmake.
CMake Warning at CMakeLists.txt:14 (ADD_EXECUTABLE):
  Cannot generate a safe runtime search path for target Fortran90FullExample
  because files in some directories may conflict with libraries in implicit
  directories:

    runtime library [libz.so.1] in /usr/lib/x86_64-linux-gnu may be hidden by
    files in:
      /home/xxx/anaconda/lib
    runtime library [libpython2.7.so.1.0] in /usr/lib/x86_64-linux-gnu may be
    hidden by files in:
      /home/xxx/anaconda/lib

  Some of these libraries may not be found correctly.
I searched for the file libz.so.1 in the /usr/lib/x86_64-linux-gnu directory and it was there. So I set the paths specifically to this directory and not the Anaconda directories.
This time I used ccmake instead of cmake and I was able to easily give the paths on /usr/lib/x86_64-linux-gnu instead of the anaconda paths.
Also I changed my python path from anaconda python to the usual python location which was /usr/bin for me.
I added this to the path
PATH=/usr/bin:$PATH
This adds it to the front of the PATH variable and does not affect what is already there.
Also I had to set PYTHONHOME=$PYTHONPATH to get rid of all the related errors.

Python runtime_library_dirs doesn't work on Mac

I have a Python extension module that needs to link against some dynamic libraries at runtime, so I need to tell it where to look for them. I'm doing this by specifying runtime_library_dirs in my setup.py. This works fine on Linux, but seems to have no effect on Mac. I get an ImportError when I try to import my module, and the only way I've found to make it go away is to add the library directory to DYLD_LIBRARY_PATH before starting python. What do I need to do to make this work?
I finally figured this out. The solution has two parts. First, setup.py needs to use extra_link_args to tell the linker to add a correct rpath to the compiled module:
import platform

if platform.system() == 'Darwin':
    extra_link_args.append('-Wl,-rpath,' + lib_path)
where lib_path is the directory where the libraries are installed. Second, all of the libraries you're linking against must have install names that begin with "@rpath/". For example, if a library is called "libFoo.dylib", its install name should be "@rpath/libFoo.dylib". You can use "install_name_tool -id @rpath/libFoo.dylib libFoo.dylib" to change the install name of a library.
You can tell what libraries an extension links against with
otool -L pyext.so
I had a problem where an extension was linking to the wrong version of a library on my system. In that case I used install_name_tool to change the path to the library directly. For example,
install_name_tool -change /wrong/libfoo.dylib /right/libfoo.dylib pyext.so

Can I somehow "compile" a python script to work on PC without Python installed?

So I have a Python script:
myscript.py
I am executing it like this:
python D:\myscript.py
However, I must have Python installed and included in the PATH environment variable for that to work.
Is it somehow possible to "bundle" Python executable with a Python script so other people will be able to run it on their PCs without Python?
It is ok if it will work only in Windows.
EDIT:
After trying compile.py I get this error:
Traceback (most recent call last):
  File "D:\stuff\compile.py", line 4, in <module>
    import py2exe
ImportError: No module named py2exe
Here is one way to do it (for Windows, using py2exe).
First, install the py2exe on your Windows box.
Then create a python script named compile.py, like this:
import sys
from distutils.core import setup
import py2exe

# Take the target script off the command line, then re-run setup() as if
# it had been invoked as "python compile.py py2exe -q".
entry_point = sys.argv[1]
sys.argv.pop()
sys.argv.append('py2exe')
sys.argv.append('-q')

opts = {
    'py2exe': {
        'compressed': 1,
        'optimize': 2,
        'bundle_files': 1,
    }
}

setup(console=[entry_point], options=opts, zipfile=None)
To compile your Python script into a Windows executable, run this script with your program as its argument:
$ python compile.py myscript.py
It will spit out a binary executable (EXE) with a Python interpreter compiled inside. You can then just distribute this executable file.
PyInstaller has worked well for me, generating reasonably small packages due to its use of upx. Its dependency detection was better than py2exe at the time as well. It seems not to have a lot of recent development and probably doesn't work with 3.x, however.
The source in the repository is a better starting point than the 1.4 package.
Also see the wiki page about working with Python 2.6+.
From the features list:
Packaging of Python programs into standard executables, that work on computers without Python installed.
Multiplatform: works under Windows (32-bit and 64-bit), Linux (32-bit and 64-bit) and Mac OS X (32-bit only for now, see MacOsCompatibility).
Multiversion: works under any version of Python from 1.5 up to 2.7. NOTE: If you're using Python 2.6+ on Windows, see Python26Win.
Flexible packaging mode:
Single directory: build a directory containing an executable plus all the external binary modules (.dll, .pyd, .so) used by the program.
Single file: build a single executable file, totally self-contained, which runs without any external dependency.
Custom: you can automate PyInstaller to do whatever packaging mode you want through a simple script file in Python.
Explicit intelligent support for many 3rd-party packages (for hidden imports, external data files, etc.), to make them work with PyInstaller out-of-the-box (see SupportedPackages).
Full single-file EGG support: required .egg files are automatically inspected for dependencies and bundled, and all the egg-specific features are supported at runtime as well (entry points, etc.).
Partial directory EGG support: required .egg directories are automatically inspected for dependencies and bundled, but egg-specific features will not work at runtime.
Automatic support for binary libraries used through ctypes (see CtypesDependencySupport for details).
Support for automatic binary packing through the well-known UPX compressor.
Optional console mode (see standard output and standard error at runtime).
Windows-specific features:
Support for code-signing executables.
Full automatic support for CRTs: no need to manually distribute MSVCR*.DLL, redist installers, manifests, or anything else; true one-file applications that work everywhere!
Selectable executable icon.
Fully configurable version resource section and manifests in executable.
Support for building COM servers.
Mac-specific features:
Preliminary support for bundles (see MacOsCompatibility).
You want something like py2exe.
There are multiple solutions like py2exe, cx_Freeze or (only for Mac OS X) py2app.
Here is a list of them.
