Using Eclipse PyDev's search function with external libraries - python

I find PyDev's search function incredibly useful and use it regularly to navigate around my projects. I've got my interpreters set up correctly so PyDev knows about the external libraries that my code uses, and even lets me follow references into the library modules. This is great, obviously, but I also want to be able to search the external libraries like I can search my own code.
There's a similar question pertaining to Java development here: How do I search Libraries in eclipse?
Is there anything out there for PyDev?

I use two different approaches to allow searching in my library code:
When I am using virtualenv, I keep all my code under myproject/src and add both it and myproject/lib/python2.7/site-packages/ as PyDev source folders. (Be sure to point your Python interpreter at myproject/bin/python as well.)
In other cases, I use two different PyDev projects. The first (myproject) contains my code. The second one is called myproject-lib and includes the libraries as its source paths (.../site_packages). The first project references the second (and usually I keep both of them in one workspace). This works great with virtualenv, but I believe you can also create such a PyDev project for your system-wide Python. Make sure you use the same Python interpreter in both projects.
Now you can quickly and easily use Open Resource (CTRL+T) and the Globals Browser (CTRL+Shift+T) to look up your libs.

I'm afraid PyDev doesn't support this yet. I created a feature request for it at https://jira.appcelerator.org/browse/APSTUD-7405. In the meantime, you can link the folders of external libraries into your project.

Related

Multiple Python Version Env Setup

I'm working in the VFX industry and we deal with different software packages that ship with their own Python interpreter. We are running on Linux and use modules to handle our environments to make sure that people are using the correct version of all applications depending on the project they are working on.
For months we have been trying to set up an environment that supports multiple versions of Python. What is blocking us right now are the additional Python libraries we use in our in-house tools, like sqlalchemy, psycopg2, openpyxl, zmq, etc.
So far, for each project, we have a config file that defines the version of each package to be used, including the additional Python modules. To pick the correct build of each Python module, we look up the main Python interpreter defined in that same definition file. This works as long as the major and minor versions of all Python interpreters line up.
But now I would like to start an application that ships with a Python 3.7 interpreter, another application with a Python 3.9 interpreter, and so on. All applications use our in-house tools, which need the additional Python modules. Of course, this fails when trying to import any additional module.
For now, the only solution I see is to install the corresponding Python modules in the 'site-packages' of each application that comes with its own Python interpreter. That should work, but it means installing all necessary Python modules for each application version (ideally the same version of each, to avoid compatibility issues), and whenever we decide to update one of them, this has to be repeated for every third-party application.
That does not sound super-efficient to me.
Do you have similar experiences, and what did you come up with to handle this? I know that there are more advanced packages like rez to handle complex environment setups, but although I do not know the details of rez, I could imagine that the problem stays the same. I guess it is not possible to globally populate PYTHONPATH with additional modules so that it works across multiple Python interpreter versions.
Another solution I could imagine is to make sure that on startup of each application that needs additional Python modules, we do our own sys.path modification depending on the interpreter version. That would imply some coding, but we could keep global version handling without installing the modules everywhere (roughly as sketched below).
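For illustration, the kind of startup shim I have in mind would look roughly like this; the module repository root and the per-version layout are made-up assumptions:
import sys
import site

# Hypothetical shared module repository, laid out per interpreter version:
#   /tools/pymodules/3.7/site-packages
#   /tools/pymodules/3.9/site-packages
MODULE_ROOT = "/tools/pymodules"

def add_versioned_modules(root=MODULE_ROOT):
    """Add the site-packages directory matching the running interpreter to sys.path."""
    version = "{}.{}".format(*sys.version_info[:2])  # e.g. "3.9"
    # site.addsitedir also processes .pth files, unlike a bare sys.path manipulation
    site.addsitedir("{}/{}/site-packages".format(root, version))

add_versioned_modules()
import sqlalchemy  # now resolved against the version-matched directory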
Anyway, if you have any further hints, please let me know.
Greets,
Carlo

Difference between installing and importing modules

New to Python, so excuse my lack of specific technical jargon. Pretty simple question really, but I can't seem to grasp or understand the concept.
It seems that a lot of modules require using pip or easy_install and running setup.py to "install" into your Python installation or your virtualenv. What is the difference between installing a module and simply taking it and importing it into another script? It seems that you access the modules the same way.
Thanks!
It's like the difference between:
Uploading a photo to the internet
Linking the photo URL inside an HTML page
Installing puts the code somewhere python expects those kinds of things to be, and the import statement says "go look there for something named X now, and make the data available to me for use".
For a single module, it usually doesn't make any difference. For complicated webs of modules, though, an installation program may do many things that wouldn't be immediately obvious. For example, it may also copy data files into locations where the new modules can find them, put executables (binary libraries, or DLLs on Windows, for example) where the new modules can find them, do different things depending on which version of Python you have, and so on.
If deploying a web of modules were always easy, nobody would have written setup programs to begin with ;-)
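One quick way to see what "installed" means in practice: installed packages live in a directory that is already on Python's search path, so a plain import finds them from anywhere. This small snippet just prints those locations (nothing here is specific to any particular package):
import sys
import sysconfig

print(sysconfig.get_paths()["purelib"])  # where pip installs pure-Python packages
print(sys.path)                          # every directory that `import` will search
A module merely sitting next to your script is importable only because the script's own directory is added to sys.path at startup; move the script and the import breaks, which is exactly what installing avoids.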

Include python library with program distribution

Is there a way to distribute a python library with an application, so that it can be run with out any installation? The app is primarily going to be used in a computer lab, where users do not have permission to install global libraries. Ideally, users would simply be able to unzip a folder and run the app. The following can be assumed:
The python interpreter is present
Linux operating system
The specific library I need is matplotlib, but I would like to find a generic solution. I've looked at programs like PyInstaller, but they create very large programs that are slow to start. They also include a python interpreter, which is unnecessary.
Firstly, py2exe is Windows only.
In principle you can put all of your libraries into the ZIP file so they get expanded in place with the application. At most you may need to adjust the PYTHONPATH variable to point at the libraries' location.
There is no technical difference between modules installed on the path and in the system's python installation.
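Alternatively, the app's entry script can add the bundled library directory to sys.path itself, so unzipping really is all users have to do. A minimal sketch, assuming the unzipped layout is myapp/app.py with the bundled libraries under myapp/lib/ (the directory name is an assumption):
# app.py
import os
import sys

HERE = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(HERE, "lib"))  # make the bundled packages importable

import matplotlib  # resolved from myapp/lib if it was bundled there
Keep in mind that packages with compiled extensions (matplotlib among them) have to be built for the same Python version and architecture as the interpreter on the lab machines.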
Have you looked at cxfreeze?

Referencing an external library in a Python appengine project, using Pydev/Eclipse

It's been a couple of months since I started developing in Python, coming from a C# and Java background.
I'm currently working on 2 different python/appengine applications, and as often happens in those cases, both applications share common code - so I would like to refactor and move the common/generic code into a shared place.
In either Java or C# I'd just create a new library project, move the code into the new project and add a reference to the library from the main projects.
I tried the same in Python, but I am unable to make it work.
I am using Eclipse with Pydev plugin.
I've created a new Pydev project, moved the code, and attempted to:
reference the library project from the main projects (using Project Properties -> Project References)
add the library src folder into the main projects (in this case I get an error - I presume it's not possible to go outside the project boundaries when adding an existing source folder)
add as external library (pretty much the same as google libraries are defined, using Properties -> External libraries)
Import as link (from Import -> File System and enabling "Create links in workspace")
In all cases I am able to reference the library code while developing, but when I start debugging, the appengine development server throws an exception because it can't find what I have moved into a separate library project.
Of course I've searched for a solution a lot, but it looks like nobody has experienced the same problem - or maybe nobody else needs to do the same :)
The closest solution I've been able to find is to add an ant script to zip the library sources and copy them into the target project - but this way debugging is a pain, as I am unable to step into the library code.
Any suggestion?
Needless to say, the proposed solution must take into account that the library code has to be included in the upload process to appengine...
Thanks
The dev_appserver and the production environment don't have any concept of projects or libraries, so you need to structure your app so that all the necessary libraries are under the application's root. The easiest way to do this, usually, is to symlink them in as subdirectories, or worst-case, to copy them (or, using version control, make them sub-repositories).
How that maps to operations in your IDE depends on the IDE, but in general, it's probably easiest to get the app structured as you need it on disk, and work backwards from that to get your IDE set up how you like it.
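To make that concrete, one common arrangement is to symlink (or copy) the shared code into a subdirectory of the app root and make sure it is on sys.path before anything imports it. A rough sketch, where shared_lib and mycommon are made-up names for the vendored directory and the shared package:
# main.py (or whichever module the app's handlers live in)
import os
import sys

APP_ROOT = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(APP_ROOT, "shared_lib"))  # vendored shared code

from mycommon import helpers  # hypothetical package living under shared_lib/mycommon/
Because the shared code then sits under the app root, it is included when the app is deployed, which addresses the upload requirement mentioned in the question.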

alternatives to DYLD_LIBRARY_PATH/LD_LIBRARY_PATH

I'm developing python C++ extensions for use in both OSX and linux. Currently, I can run my code with a wrapper script wrapper.sh:
#!/bin/bash
trunk=`dirname $0`
trunk=`cd $trunk; pwd`
export DYLD_LIBRARY_PATH=$DYLD_LIBRARY_PATH:$trunk/lib
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$trunk/lib/:$trunk/src/hdf5/lib/:$trunk/src/python/lib
$trunk/src/python/bin/python "$@"
which is able to set up my run like this: wrapper.sh app.py
What I would like to do is to eliminate the need for wrapper.sh, so I need alternatives for DYLD_LIBRARY_PATH and LD_LIBRARY_PATH. I can not put my libraries in some standard location like /usr/local/lib because on my machine, I maintain several independent instances of my libraries. That is, my libraries need to be kept somewhere relative to my installation path. I can't put these environment variables in my login script for the same reason. Currently, I need to call one of my wrapper.sh scripts to use the associated libraries. My goal is to be able to run merely app.py, which if it lives in my installation path, should be able to find its associated python and libraries. The purpose is to simplify execution for users, and to simplify usage of external tools like nosetests.
One alternative seems to be using rpath when I build my version of python:
./configure --enable-shared --prefix=$(CURDIR)/$(PYTHON_DIR) LDFLAGS="-Wl,-rpath,$(CURDIR)/lib/ -Wl,-rpath,$(CURDIR)/src/hdf5/lib -Wl,-rpath,$(CURDIR)/src/python/lib"
This trick seems to work fine on Linux, even though one of my libraries ended up needing to be copied directly into trunk/src/python/lib/python2.6/lib-dynload for some reason unclear to me. However, this trick is not working on OSX; it looks like I need to run install_name_tool on all my dylibs.
The other alternative I came up with was to do something like this:
ln -s wrapper.sh python
so that my scripts could all use #! ../python, but I'm getting Unmatched ". errors. Same thing if I use #! ../wrapper.sh. I'm not really an expert in bash...
However, these all seem so unnecessarily complicated, and surely this is something that other people have solved?? Thanks for any advice!
For python extensions, consider using PYTHONPATH: the Python interpreter will search the PYTHONPATH for .py/.pyc/.pyo/.so modules, as well as packages. See docs for Python 2.x as well as docs for Python 3.x; specifically the section named "The Module Search Path" on both pages. This also references information that seems to indicate that it is possible to update the module search path at runtime, which, if true, means that you could add all that logic to your program and it can hunt for its libraries on its own (say if it installs a copy in /usr/libexec/pkgname/... somewhere or something).
For all but the most complex of cases, though, setting PYTHONPATH and using a shell-script or native-compiled binary wrapper to start the core program is an okay approach, and one that is also used in other language environments including Mono and Java.
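The runtime version of that idea could look roughly like this in app.py itself; the lib/ subdirectories below are assumptions mirroring the wrapper script's paths:
import os
import sys

TRUNK = os.path.dirname(os.path.abspath(__file__))
for rel in ("lib", "src/hdf5/lib", "src/python/lib"):
    sys.path.insert(0, os.path.join(TRUNK, rel))  # extend Python's module search path

import my_extension  # hypothetical C++ extension module now found under TRUNK
Note that this only affects where Python looks for the extension modules themselves; shared libraries that those extensions link against are still resolved by the dynamic linker, which is where the rpath approach from the question (or the LD_LIBRARY_PATH/DYLD_LIBRARY_PATH variables) still comes in.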
Not sure if this would be an acceptable (partial) solution in your circumstances, but another way to get libraries noticed by ld on Linux is to add the path to the libraries to /etc/ld.so.conf and then run ldconfig.
For the Mac I don't remember the details, but I think Apple provide some resources for distributing apps packaged as a .app which includes some default locations (relative to the root of the .app) for libraries, or "frameworks" as they call them. Would require some googling from there - sorry can't help further on that but hope you get some progress :-)
