Library path for setup.py?

When trying to build numpy on a linux platform, I can't make the configure script look in the right place.
I use
python setup.py config --library-dirs=/software/intel/mkl/10.2.2.025/lib/em64t/
but then I get
mkl_info:
libraries mkl,vml,guide not found in /software/intel/mkl/10.2.2.025
libraries mkl,vml,guide not found in /software/intel/mkl/10.2.2.025/include
libraries mkl,vml,guide not found in /software/intel/mkl/10.2.2.025/lib
So it looks like it never actually looks into the subdirectory em64t/. The path I'm giving is also present in my LD_LIBRARY_PATH.
How can I give the script the right path?
Thanks in advance!

Had a similar problem with rpy2. Did not have root permissions and could not alter the existing R installation or add to its core library directory. R was not built as a shared object library, so I could not link the rpy2 build to a libR.so.
I had to cross compile libR.so on a separate machine (same R version, same Linux family) and copy it to a different directory. I wanted that directory to be seen by setup.py.
Couldn't get -L to work on the command line. It appeared that this argument was deactivated.
(FAIL) python setup.py -L${LD_LIBRARY_PATH} build install
What I ended up doing was editing setup.py and changing a line that accepts library directory entries.
(old) r_libs = []
(new) r_libs = [os.path.join('/root','path','to_my','install','R','lib')]
Reran it as: python setup.py build install
Success!

Perhaps
export PYTHONLIB="/software/intel/mkl/10.2.2.025/lib/em64t/"
python setup.py config
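If the command-line flags are ignored, numpy can also pick up library locations from a site.cfg file placed next to setup.py. A minimal sketch for the MKL layout in the question (section and key names follow numpy's bundled site.cfg.example; the right mkl_libs list depends on your MKL version):

```ini
[mkl]
library_dirs = /software/intel/mkl/10.2.2.025/lib/em64t
include_dirs = /software/intel/mkl/10.2.2.025/include
mkl_libs = mkl,vml,guide
```

This keeps the paths out of the command line entirely, so plain python setup.py config should then find the libraries.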

Shared library distributed in python wheel not found

I have a python module which contains a C++ extension as well as a shared library on which the C++ extension depends. The shared library is picked up by setuptools as an extra_object on the extension. After running python setup.py bdist_wheel, the module wheel gets properly generated and has a directory structure as follows:
+-pymodule.whl
| +-pymodule
| +-pymodule-version.dist-info
| +-extension.so
| +-shared_lib.so
To install this wheel, in my python environment I call pip install pymodule.whl which copies the python sources as well as the .so files into the environment's site-packages directory.
After installing the module, one can then attempt to import the module by calling import pymodule in a terminal for the environment. This triggers an exception to be thrown:
ImportError: shared_lib.so: cannot open shared object file: No such file or directory
This exception can be resolved by appending the appropriate site-packages directory to the LD_LIBRARY_PATH variable; however, it seems that this should work out of the box, especially considering that python is clearly able to locate extension.so.
Is there a way to force python to locate this shared library without having to explicitly point LD_LIBRARY_PATH at the installation location (i.e. site-packages)?
This question works around a similar problem by using package data and explicitly specifying an install location for the shared library. The issue I have with this approach is that the shared object is decoupled from the extension. In my case, the shared library and extension are both targets built by the same CMake build. I had previously attempted to use skbuild to build CMake-based extensions; however, as per this issue, skbuild has a similar problem with including other libraries generated as part of the extension build.
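One way to avoid touching LD_LIBRARY_PATH on Linux is to bake a relative rpath into the extension itself, so the dynamic loader searches the extension's own directory at import time. A sketch of what that could look like in setup.py (the extension name, source path, and build directory below are placeholders, not taken from the question):

```python
from setuptools import Extension

# $ORIGIN is expanded by the ELF loader to the directory containing the
# extension itself (site-packages after install), so a shared_lib.so
# sitting next to it is found without LD_LIBRARY_PATH.
ext = Extension(
    "extension",
    sources=["src/extension.cpp"],           # hypothetical source path
    libraries=["shared_lib"],                # link against libshared_lib.so
    library_dirs=["build/lib"],              # build-time location of the .so
    extra_link_args=["-Wl,-rpath,$ORIGIN"],  # runtime search path
)
# pass ext to setup(ext_modules=[ext]) as usual
```

On macOS the analogous value would be @loader_path rather than $ORIGIN, since $ORIGIN is an ELF convention.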

How to include python-dev in buildroot?

I'm making a buildroot for my Raspberry Pi 3 for a school project.
I've made a buildroot with everything from Python included because I want to use WebIOPi. The buildroot has been done and the image has been written to the SD card.
Now when I want to install it on the buildroot device, it asks for python-dev, which is not included by Buildroot. With further research I've only found this. That's dev 0.4.0, but I think there's a more recent version on my virtual Ubuntu 16 OS. (My main OS is Windows 10, so I need an image to use win32diskimager.)
But I don't know how to implement this in the Buildroot python packages. I've already read the Buildroot manuals; it's all kind of confusing to me...
I've already tried to make a directory named 'python-dev' in the buildroot/package directory (Ubuntu OS), but with no success.
This is what I've got so far:
buildroot/package/python-dev:
config.in
python-dev.mk
in the config.in file:
config BR2_PACKAGE_PYTHON_DEV
bool "python-dev"
help
in the python-dev.mk file (copied from libconfig):
################################################################################
#
# python-dev
#
################################################################################
PYTHON_DEV_VERSION = 0.4.0
PYTHON_DEV_SOURCE = dev-0.4.0.tar.gz
PYTHON_DEV_SITE = https://pypi.python.org/packages/53/34/e0d4da6c3e9ea8fdcc4657699f2ca62d5c4ac18763a897feb690c2fb0574/dev-0.4.0.tar.gz
PYTHON_DEV_LICENSE = Python software foundation license v2, others
PYTHON_DEV_LICENSE_FILES = README
PYTHON_DEV_SETUP_TYPE = setuptools
PYTHON_DEV_DEPENDENCIES = libconfig boost
$(eval $(python-package))
When I run make menuconfig and search for python-dev, it's not there...
I hope someone could help me with this.
If there's an easier way, it's pretty much welcome.
Thank you in advance.
The python-dev package that the WebIOPi setup script is checking for has nothing to do with the dev python package that you found at https://pypi.python.org/pypi/dev.
The python-dev package is a package on Linux distributions that contains the development files for the Python library that is installed together with the Python interpreter. It installs the necessary files to allow C/C++ programs to link against libpython.
Buildroot has already installed what you need in STAGING_DIR. However, you are probably trying to install WebIOPi directly on the target, which is not how Buildroot is intended to be used. Buildroot does not allow you to do development on the target: it provides neither a compiler on the target nor the files necessary for development.
Buildroot is intended to be used as a cross-compilation environment. So what you should do instead is create a Buildroot package for WebIOPi, and have it cross-compiled (from your host machine), so that it gets installed, ready to use, in your target filesystem.
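As a rough illustration of what such a package could look like, here is a sketch of a package/python-webiopi/python-webiopi.mk (the version and download site are made-up placeholders, not real values; check the Buildroot manual's python-package infrastructure section for the details):

```makefile
################################################################################
#
# python-webiopi
#
################################################################################

# NOTE: the version and site below are illustrative placeholders only.
PYTHON_WEBIOPI_VERSION = x.y.z
PYTHON_WEBIOPI_SOURCE = WebIOPi-$(PYTHON_WEBIOPI_VERSION).tar.gz
PYTHON_WEBIOPI_SITE = http://example.com/downloads
PYTHON_WEBIOPI_SETUP_TYPE = distutils
PYTHON_WEBIOPI_DEPENDENCIES = python

$(eval $(python-package))
```

You would also need a matching Config.in for the package and a source line for it in package/Config.in, otherwise it will not show up in make menuconfig.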

Compiling C shared library with distutils' setup.py, when the library depends on a second shared library

I'm on OSX, trying to compile a shared library in C with distutils' setup.py (to use in python using ctypes). I'm new to distutils, but I'm having problems when the shared library I want to compile (libreboundx.so) depends on another shared library (librebound.so). Explicitly, in modify_orbits_direct.c I have
#include "rebound.h"
rebound.h is in directory /Users/dt/rebound/src/, and all the functions in rebound.h are in the shared library librebound.so, which is in /Users/dt/rebound/.
The link step with cc would look like this:
cc -fPIC -shared reboundx.o -L/Users/dt/rebound -lrebound -o libreboundx.so
UPDATE: This situation looks exactly like the example at the end of Sec. 3 at https://docs.python.org/2/extending/building.html. I've updated my setup.py to mimic that one:
libreboundxmodule = Extension('libreboundx',
    sources=['src/reboundx.c',
             'src/modify_orbits_direct.c'],
    include_dirs=['src', '/Users/dt/rebound/src'],
    extra_compile_args=['-fstrict-aliasing', '-O3', '-std=c99', '-march=native', '-D_GNU_SOURCE', '-fPIC'],
    library_dirs=['/Users/dt/rebound'],
    libraries=['rebound'],
    )
This installs fine when I run
pip install -e ./
Build output:
You are using pip version 7.0.3, however version 7.1.2 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Obtaining file:///Users/dtamayo/Documents/workspace/reboundx
Installing collected packages: reboundx
Running setup.py develop for reboundx
Successfully installed reboundx-1.0
but when I try
import reboundx
in Python, I get an OSError: dlopen(libreboundx.so, 10): Symbol not found: _reb_boundary_particle_is_in_box, which is a function in the other library (librebound.so), which doesn't even get called in the code for libreboundx.so.
If I link the shared library with the cc command above, everything works, and I can use the shared library libreboundx.so perfectly fine in C. If I try to take the same libreboundx.so I compile with the cc command and stick it where setup.py would put it, then try to import reboundx in python, I instead get
OSError: dlopen(/Users/dtamayo/Documents/workspace/reboundx/reboundx/../libreboundx.so, 10): Library not loaded: librebound.so
Referenced from: /Users/dtamayo/Documents/workspace/reboundx/libreboundx.so
Reason: image not found
Could this be like an rpath issue, where at runtime libreboundx.so doesn't know where to look for librebound.so?
Thanks for all the suggestions. I should have specified in the question that in the end I want a solution that I could package for upload to PyPI so users can install with a single command. It should also run on both OSX and Linux, so I preferred solutions not involving install_name_tool.
I haven't been able to test it, but I think adding
runtime_library_dirs=['/Users/dt/rebound'],
next to library_dirs should fix the problem on Linux. Apparently this doesn't work on Mac, but you can instead use extra_link_args. Adding this below the libreboundxmodule definition posted above,
import platform
if platform.system() == 'Darwin':
    extra_link_args.append('-Wl,-rpath,' + '/Users/dtamayo/Documents/workspace/rebound')
fixed my problem. I found the answer here: Python runtime_library_dirs doesn't work on Mac
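Combining the two cases, a small helper that returns the right Extension keyword arguments for each platform might look like this (a sketch; rpath_kwargs is a made-up helper name, and the paths in the usage line are hypothetical):

```python
import platform


def rpath_kwargs(lib_dir, system=None):
    """Return Extension() kwargs that embed lib_dir as a runtime search path.

    `system` can be passed explicitly (handy for testing); by default the
    current platform is detected with platform.system().
    """
    system = system or platform.system()
    if system == "Darwin":
        # macOS ignores runtime_library_dirs, so pass the rpath to the linker.
        return {"extra_link_args": ["-Wl,-rpath," + lib_dir]}
    # On Linux, runtime_library_dirs works as advertised.
    return {"runtime_library_dirs": [lib_dir]}


# Hypothetical usage:
#   Extension('libreboundx', sources=[...], **rpath_kwargs('/Users/dt/rebound'))
```

Keeping the platform check in one place avoids sprinkling if platform.system() tests through setup.py.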

Python runtime_library_dirs doesn't work on Mac

I have a Python extension module that needs to link against some dynamic libraries at runtime, so I need to tell it where to look for them. I'm doing this by specifying runtime_library_dirs in my setup.py. This works fine on Linux, but seems to have no effect on Mac. I get an ImportError when I try to import my module, and the only way I've found to make it go away is to add the library directory to DYLD_LIBRARY_PATH before starting python. What do I need to do to make this work?
I finally figured this out. The solution has two parts. First, setup.py needs to use extra_link_args to tell the linker to add a correct rpath to the compiled module:
import platform
extra_link_args = []
if platform.system() == 'Darwin':
    extra_link_args.append('-Wl,-rpath,' + lib_path)
where lib_path is the directory where the libraries are installed. Second, all of the libraries you're linking against must have install names that begin with "@rpath/". For example, if a library is called "libFoo.dylib", its install name should be "@rpath/libFoo.dylib". You can use install_name_tool -id to change the install name of a library, e.g. install_name_tool -id @rpath/libFoo.dylib libFoo.dylib.
You can tell what libraries an extension links against with
otool -L pyext.so
I had a problem where an extension was linking to the wrong version of a library on my system. In that case I used install_name_tool to change the path to the library directly. For example,
install_name_tool -change /wrong/libfoo.dylib /right/libfoo.dylib pyext.so

Easy Install for Python and Eclipse Library Paths

Recently I found out about the tool easy_install, which helps me easily install additional Python modules. The problem is that for each module it creates an additional *.egg folder (sometimes there is only an egg file? no source?) and I don't know how to set up Eclipse paths.
By default I have included C:\Python26\Lib\site-packages and this is enough when I install Python modules from source... but not when I'm using easy_install.
For example, Django installed with easy_install is located in C:\Python26\Lib\site-packages\django-1.2.5-py2.6.egg\django, and installed from source it's located in C:\Python26\Lib\site-packages\django.
In fact, when I'm using easy_install all installed modules work without a problem; the only problem is that Eclipse can't locate the source and gives me false unresolved import errors.
Where am I wrong?
I'm assuming that Eclipse does not search the egg files for source. Eggs, like jar files in Java, are just zip files of Python code with some included metadata.
You'll also note that in site-packages you've got easy-install.pth and setuptools.pth files. Those files are parsed by Python and used to add other directories and egg files to your PYTHONPATH (import sys; sys.path) so that Python can find the code in those locations. Eclipse isn't seeing those imports as valid because it is most likely not set up to take pth files into account.
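For illustration, easy-install.pth is essentially a list of paths, one per line, that get appended to sys.path (relative entries resolve against site-packages; setuptools also wraps the list in some bookkeeping lines). With the Django egg from the question it would contain a line like:

```
./django-1.2.5-py2.6.egg
```

This is why the modules import fine in Python even though Eclipse, which doesn't read pth files, can't resolve them.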
To get Eclipse to recognize that Django is really installed you may want to try removing your easy_installed django package and reinstalling it with:
easy_install --always-unzip django
That way, rather than installing a compressed egg file, you'll have a normal package directory that Eclipse should have a fairly easy time opening.
Alternatively, in your screenshot above, it looks like you may just need to explicitly add each egg file you want Eclipse to use.
