Developing and using the same Python module on the same computer

I'm developing a Python utility module to help with file downloads, archives, etc. I have a project set up in a virtual environment along with my unit tests. When I want to use this module on the same computer (essentially as "Production"), I move the files to ~/dev/modules/mymodule.
I keep all 3rd-party modules under ~/dev/modules/contrib. This contrib path is on my PYTHONPATH, but mymodule is NOT because I've noticed that if mymodule is on my PYTHONPATH, my unit tests cannot distinguish between the "Development" version and the "Production" version. But now if I want to use this common utility module, I have to manually add it to the PYTHONPATH.
This works, but I'm sure there's a better, more automated way.
What is the best way to have a Development and Production module on the same computer? For instance, is there a way to set PYTHONPATH dynamically?

You can add or modify Python search paths at runtime through sys.path; just make sure that the first entry stays the current directory ".", because some third-party modules rely on importing from the directory of the current module.
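For example, a small sketch (the development path and module name are illustrative, not from the question):
import sys

# keep the current directory first, since some third-party modules
# rely on importing from the directory of the current module
if sys.path[0] not in ("", "."):
    sys.path.insert(0, ".")

# put the development copy ahead of anything installed elsewhere
sys.path.insert(1, "/home/me/dev/mymodule_dev")

import mymodule   # now resolves to the development copy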
More information on python paths:
http://djangotricks.blogspot.com/2008/09/note-on-python-paths.html

I'm guessing by virtual environment you mean the virtualenv package?
http://pypi.python.org/pypi/virtualenv
What I'd try (and apologies if I've not understood the question right) is:
Keep the source somewhere that isn't referenced by PYTHONPATH (e.g. ~/projects/myproject)
Write a simple setuptools or distutils script for installing it (see "Python distutils - does anyone know how to use it?"); a minimal sketch follows this list
Use the virtualenv package to create a dev virtual environment with the --no-site-packages option - this way your "dev" version won't see any packages installed in the default python installation.
(Also make sure your PYTHONPATH doesn't have any of your source directories)
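For reference, the install script mentioned above can be very small; a minimal setuptools sketch (the project name and version are placeholders, not taken from the question):
# setup.py -- minimal sketch; name/version are placeholders
from setuptools import setup, find_packages

setup(
    name="myproject",
    version="0.1",
    packages=find_packages(),  # picks up the package and any subpackages
)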
Then, for testing:
Activate dev virtual environment
Run the install script (usually something like python setup.py build install; a rough end-to-end sketch of both cycles appears below). Your package ends up in /path/to/dev_virtualenv/lib/python2.x/site-packages/
Test, break, fix, repeat
And, for production:
Make sure dev virtualenv isn't activated
Run install script
All good to go, the "dev" version is hidden away in a virtual environment that production can't see...
...And there's no (direct) messing around with PYTHONPATH
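Putting the steps above together, a rough shell session might look like this (paths are illustrative):
$ virtualenv --no-site-packages dev_virtualenv
$ source dev_virtualenv/bin/activate
(dev_virtualenv)$ cd ~/projects/myproject
(dev_virtualenv)$ python setup.py install      # lands in dev_virtualenv/lib/python2.x/site-packages/
(dev_virtualenv)$ python -m unittest discover  # or however you run your tests
(dev_virtualenv)$ deactivate
$ cd ~/projects/myproject
$ python setup.py install                      # production: installs into the system site-packages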
That said, I write this with the confidence of someone who's not actually tried using virtualenv in anger, and the hope that I've vaguely understood your question... ;)

You could set the PYTHONPATH as a global environment variable pointing to your Production code, and then in any shell in which you want to use the Development code, change the PYTHONPATH to point to that code.
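Concretely, that might look something like this (the development path is illustrative; PYTHONPATH should point at the directory containing the package):
# in ~/.bashrc -- every new shell sees the Production copy by default
export PYTHONPATH=~/dev/modules

# in a shell where you want the Development copy, override it for that shell only
export PYTHONPATH=~/dev/sandbox/mymodule_dev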
(Is that too simplistic? Have I missed something?)

Related

Python, site-packages partially on network file system?

CentOS 7 system, Python 2.7. The OS's installed python has directory
/usr/lib/python2.7/site-packages
and that is where a
python setup.py install
command would install a package. On a computing cluster I would like to install some packages so that they are referenced from that directory but which actually reside in an NFS served directory here:
/usr/common/lib/python2.7/site-packages
That is, I do not want to have to run setup.py on each of the cluster nodes to do a local install on each, duplicating the package on every machine. The packages already installed locally must not be affected; some of those are used by the OS's commands. The local ones must also keep working even if the network is down for some reason. I am not trying to set up a virtual environment; I am only trying to place a common set of packages in a different directory in such a way that the OS-supplied python sees them.
It isn't clear to me what is the best way to do this. It seems like such a common problem that there must be a standard or preferred way of doing this, and if possible, that is the method I would like to use.
This command
/usr/bin/python setup.py install --prefix=/usr/common
would probably install into the target directory. However the "python" command on the cluster nodes will not know this package is present, and there is no "network" python program that corresponds to the shared site-packages.
After the network install one could make symlinks from the local to the shared directory for each of the files created. That would be acceptable, assuming that is sufficient.
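If you go the symlink route, something along these lines should work (untested sketch; it will simply error out for any name that already exists locally):
cd /usr/lib/python2.7/site-packages
for f in /usr/common/lib/python2.7/site-packages/*; do
    ln -s "$f" .    # link each shared package/egg-info entry into the local site-packages
done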
It looks like the PYTHONPATH environment variable might also work here, although I'm unclear about what it expects for "path" (the full path to site-packages, or just the /usr/common part).
EDIT: This does seem to work as needed, at least for the test case. The software package in question was installed using --prefix, as above. PYTHONPATH was not previously defined.
export PYTHONPATH=/usr/common/lib/python2.7/site-packages
python $PATH_TO_START_SCRIPT/start.py
ran correctly.
Thanks.

Install Locally Developed Python Module

I am currently in the process of developing a module that I am using as a library to be imported in another project. I need a sane way to install this module in the Python site-packages such that I don't need to reinstall it every time I make changes to it. Currently, I am using sudo pip install --force-reinstall {BASE_FOLDER_FOR_MODULE}, but I need to re-run this command every time I make any changes to the module code.
The little bit of info I've been able to find on the subject seems to indicate that while I can symlink the base folder for the module in the site-packages folder for my other project, that this may not necessarily be a good way to do it. Is symlinking the folder bad, and if so why? Is there an alternate (better) option?
Thanks
If the library is still under active development then consider adding it to the PYTHONPATH environment variable. Directories in PYTHONPATH are added to sys.path ahead of the standard site-packages directories, so the development copy is found first when the module is imported. Using PYTHONPATH means you only have to make minimal changes (set it in a shell config file such as .bashrc) for things to work. Once the library is finalised you can install it to the site-packages directory and remove it from PYTHONPATH.
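For example, while the library is still changing (the path and library name are placeholders):
# point PYTHONPATH at the directory that contains the library's package
export PYTHONPATH=$HOME/work/mylib_checkout
# confirm that the development copy is the one being imported
python -c "import mylib; print mylib.__file__"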

Absolute imports and managing both a "live" and "development" version of a package

As my limited brain has come to understand it after much reading, relative imports bad, absolute imports good. My question is, how can one effectively manage a "live" and "development" version of a package? That is, if I use absolute imports, my live code and development code are going to be looking at the same thing.
Example
/admin/project1/__init__.py
/admin/project1/scripts/__init__.py
/admin/project1/scripts/main1.py
/admin/project1/scripts/main2.py
/admin/project1/modules/__init__.py
/admin/project1/modules/helper1.py
with "/admin" on my PYTHONPATH, the contents of project1 all use absolute imports. For example:
main1.py
import project1.modules.helper1
But I want to copy the contents of project1 to another location, and use that copy for development and testing. Because everything is absolute, and because "/admin" is on PYTHONPATH, my copied version is still going to be referencing the live code. I could add my new location to PYTHONPATH, change the names of all files by hand (i.e. add "dev" to the end of everything), do my changes/work, and then when I'm ready to go live, once again by hand, remove "dev" from everything. This will work, but is a huge hassle and prone to error.
Surely there must be some better way of handling "live" and "development" versions of a Python project.
You want to use virtualenv (or something like it).
$ virtualenv mydev
$ source mydev/bin/activate
This creates a local Python installation in the mydev directory and modifies several key environment variables to use mydev instead of the default Python directories.
Now, the interpreter's search path looks in mydev first for any imports, and anything you install (using pip, setup.py, etc.) will go in mydev. When you are finished using the mydev virtual environment, run
$ deactivate
to restore your environment to its previous state. mydev remains, so you can always reactivate it later.
@chepner's virtualenv suggestion is a good one. Another option, assuming your project is not installed on the machine as a python egg, is to just add your development path to the front of PYTHONPATH. Python will find your development project1 before the regular one and everyone is happy. Eggs can spoil the fun because they tend to get resolved before the PYTHONPATH paths.
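A quick way to check which copy actually wins (where /home/me/dev stands in for the directory containing your development copy of project1); if an egg is being resolved first, its path will show up here instead:
$ PYTHONPATH=/home/me/dev:$PYTHONPATH python -c "import project1; print project1.__file__"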

Arranging code in Python

I'm trying to create a development environment where modules are divided into libraries and applications. The application needs to import a module that is not installed as a package in the main Python site-packages. Both the application and the libraries are continuously modified.
This is the directory layout of the files. Files from one project may be reused by other projects, and thus cannot be in the same directory tree.
projA\lib\util.py
projA\lib\other.py
projB\lib\another.py
projC\src\app1\app1.py
So far, the best I could come up with is the following, which causes problems for IDE code completion because of the dynamic imports:
# app1.py
import sys
sys.path.append('../../../projA/lib')
import util
Is there a better way of doing this?
How about using virtualenv and installing your other projects as libraries in the virtual environment's path?
Most Python IDEs support virtualenv and have no problems with code completion.
It is also good practice, making it easier to distribute your project and manage dependencies.
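For instance, assuming each library project has (or is given) its own setup.py, you could install them into the virtualenv in development mode so that edits are picked up immediately:
$ virtualenv venv
$ source venv/bin/activate
(venv)$ (cd projA && python setup.py develop)   # or: pip install -e projA
(venv)$ (cd projB && python setup.py develop)
(venv)$ python projC/src/app1/app1.py           # app1 can now import the libraries without touching sys.path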
If you don't want to use virtualenv or dynamic imports, you can add your module paths to the PYTHONPATH environment variable.
Note: you might have to create this environment variable. Assuming you are using Windows, you can set it from the command line with:
setx PYTHONPATH "folder1;folder2;etc"

python egg development environment setup

I inherited a Python project which has been packaged as an egg. Upon checking it out through SVN, I see the package contents as:
__init__.py
scripts/
ptools/
setup.py
...
Here, ptools/ holds the source of various modules, and scripts/ is a bunch of end-user tools that make use of the modules provided by ptools. The package has been installed in this shared host environment through easy_install, but I want to modify both scripts/ and ptools/ and test them without going through the "make an egg, easy_install it" cycle, which would affect everyone else.
However, I am lost on how to change the environment so that, when invoked from my development tree, the tools in scripts/ use the "local" modules in ptools/ instead of searching the default installed .egg... any ideas?
Update: I should have added that I tried the PYTHONPATH approach by putting the module path of my dev tree there, but when I tried to verify with "import sys; print sys.path", there was no change in the module search path, which baffles me.
thanks
Oliver
I think I have found the solution to my problem; it has been answered in the post "PYTHONPATH vs. sys.path". Running "setup.py develop" seems to be the perfect solution.
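For reference, a development-mode install is run from the checkout directory that contains setup.py; it links the working copy into site-packages, so edits to ptools/ are picked up without rebuilding the egg:
$ cd /path/to/checkout         # the SVN working copy containing setup.py
$ python setup.py develop      # or: pip install -e .
$ python -c "import ptools; print ptools.__file__"   # should now point into the working copy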
You can use the PYTHONPATH environment variable to customize the locations Python searches for modules.
