I am developing a module that I use as a library imported by another project. I need a sane way to install this module into the Python site-packages such that I don't have to reinstall it every time I make changes to it. Currently I am using sudo pip install --force-reinstall {BASE_FOLDER_FOR_MODULE}, but I need to re-run this command every time I change the module code.
The little bit of info I've been able to find on the subject seems to indicate that while I can symlink the module's base folder into the site-packages folder for my other project, this may not necessarily be a good way to do it. Is symlinking the folder bad, and if so, why? Is there a better alternative?
Thanks
If the library is still under active development, consider adding it to the PYTHONPATH environment variable. Directories in PYTHONPATH are inserted into sys.path ahead of the standard site-packages directories, so your development copy is found first when importing a module. Using PYTHONPATH means you only have to make minimal changes (set it in a config file, a sourced shell file, or your .bashrc) for things to work. Once the library is finalised you can install it into the site-packages directory and remove it from PYTHONPATH.
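A quick way to see this in action (a minimal sketch; mylib and the ~/dev/mylib path are hypothetical, and PYTHONPATH would be set first, e.g. export PYTHONPATH="$HOME/dev/mylib:$PYTHONPATH" in .bashrc):
import sys
# PYTHONPATH entries show up near the front of sys.path, before site-packages:
print(sys.path[:5])
import mylib  # resolves to the development copy under ~/dev/mylib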
For testing my libraries on multiple Python versions I have a single virtual environment that I install them into, and I reference the interpreters by their complete name/version (e.g. python3.7). Recently I noticed that sys.path still references the source library instead of the copied library (i.e. /source/python/... instead of /source/virtualenv/lib/python3.7/...)
I've tried make install instead of make altinstall [1], and I've searched for answers -- so far nothing has helped.
How do I fix this?
[1] PSA: if you use make (alt)install and you don't want to clobber your system Python, make sure to use
./configure --prefix=/path/to/install_to/here
TL;DR: remove the pyvenv.cfg file in the virtualenv's root directory.
The issue is the interaction with the virtualenv, not with make. Somewhere in Python's startup it checks whether it is running in a virtualenv and, if it is, uses the libraries from its original installation (and I had created the virtualenv from my source copy).
The solution is to remove the pyvenv.cfg file in the root of the virtualenv. This completely isolates the virtualenv from the system (so no sharing of site-packages nor dist-packages), which is exactly what I wanted for my purposes.
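A quick check that the change took effect (a minimal sketch; run it with the virtualenv's interpreter):
import sys
# With pyvenv.cfg removed, sys.path should reference this installation's own
# lib directory rather than the original source tree:
print(sys.prefix)
print([p for p in sys.path if 'lib' in p])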
I'm new to Python, so I think my question is very fundamental and has probably been asked a few times before, but I cannot really find anything (maybe because I do not really know how to search for the problem).
I installed a module in Python (reportlab). Now I wanted to modify a Python script in that module, but it seems that the Python interpreter does not notice the updates to the script. Ironically, the import even succeeds, although Python should not find the package at all because I deleted it beforehand. Does Python use something like a cache or some other storage for modules? How can I edit modules and use the updated scripts?
From what you are saying, you downloaded a package and installed it using either a local pip or setup.py. When you do so, all the files are copied into your Python package directory. So after an install you can delete the source folder, because Python is not looking there.
If you want to be able to modify or edit something and see the changes, you have to install the package in editable mode. Inside the main folder, run:
python setup.py develop
or
pip install -e .
This creates a link from your Python package directory back to the source folder, so you will be able to modify the sources in place.
Careful: for the changes to take effect, you have to restart your Python interpreter. Just importing the module again is not enough.
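If restarting is inconvenient, importlib.reload can re-execute a module's code in the running session (a minimal sketch; mymodule is a hypothetical editable-installed module, and note that objects created from the old version keep their old behaviour):
import importlib
import mymodule

# ...edit mymodule's source on disk, then:
importlib.reload(mymodule)  # re-runs the module's code in place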
I'm trying to create a development environment where modules are divided into libraries and applications. The application needs to import a module that is not installed as a package into the main Python packages. Both the application and the libraries are continuously modified.
This is the directory layout of the files. Files from one project may be reused by other projects, and thus cannot be in the same directory tree.
projA\lib\util.py
projA\lib\other.py
projB\lib\another.py
projC\src\app1\app1.py
So far, the best I could come up with is the following, which causes problems for IDE code completion because of the dynamic imports:
# app1.py
import os
import sys

# Anchor the path to this file's location; a bare relative path would be
# resolved against the current working directory instead.
sys.path.append(os.path.join(os.path.dirname(__file__), '../../../projA/lib'))
import util
Is there a better way of doing this?
How about using virtualenv and installing your other projects as libraries into the virtual environment's path?
Most Python IDEs support virtualenv and have no problems with code completion.
It is also good practice, making it easy to distribute your project and manage dependencies.
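A minimal sketch of that setup (hedged; it assumes each library project gets a small setup.py at its root, and the paths are placeholders):
virtualenv venv
venv\Scripts\activate            # on Windows; use "source venv/bin/activate" elsewhere
pip install -e C:\path\to\projA  # editable installs, so ongoing edits are picked up
pip install -e C:\path\to\projB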
If you don't want to use virtualenv or dynamic imports, you can add your module paths to the PYTHONPATH environment variable.
Note: you might have to create this environment variable. Assuming you are using Windows, you can run (from the command line):
setx PYTHONPATH folder1;folder2;etc
I want to install Python on a flash drive in a virtual environment so that I can develop code wherever I am. Is this possible to do in such a way that I can use my flash drive on Windows/Mac/Linux computers?
For Windows, head to Portable Python (http://PortablePython.com) to see the various options you have.
For Linux and Mac you don't need to install Python on the USB drive, as those systems usually come with it pre-installed. If you need specific packages for those systems, bring them along on the USB drive together with a command-line script that can install them into a virtualenv with one call, and you are good to go!
Be aware that this is never 100% bulletproof, as you are depending on the Python version you are using and bringing packages for.
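The "one call" script could be as small as this (a hedged sketch for Linux/Mac; /media/usb is a hypothetical mount point and wheels/ holds pre-downloaded packages):
python3 -m venv /tmp/portable_env
. /tmp/portable_env/bin/activate
pip install --no-index --find-links=/media/usb/wheels -r /media/usb/requirements.txt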
You could try setting something up using virtualenv-type environments, with the various Python versions installed on your machines. I'm not sure how you'd get around the different paths on the different operating systems, though.
Virtualenv: http://pypi.python.org/pypi/virtualenv
As @millimoose pointed out, you could install three different versions of Python.
For each Python package you are working on, you can create a .pth file in the site-packages directory of each Python version that you would like to use the package from.
Note that, as described here:
"If you put a .pth file in the site-packages directory containing a path, Python searches this path for imports."
For example, if you have a package named my_package that you are working on and that resides at C:\Users\Me\Documents\dev_packages\my_package, you can add a file with the .pth extension (the name doesn't matter; specifically, it doesn't have to bear any relation to the package name) with the contents:
C:\Users\Me\Documents\dev_packages
This will add C:\Users\Me\Documents\dev_packages to the Python import search-path, causing the my_package package to be discovered. By placing this .pth file in the site-packages directory of each Python version, my_package will be available in all corresponding versions of Python.
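To verify the .pth file is being picked up, a quick check using the example names above:
import sys
print(r"C:\Users\Me\Documents\dev_packages" in sys.path)  # expect True
import my_package  # should now resolve from the dev_packages directory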
I'm developing a Python utility module to help with file downloads, archives, etc. I have the project set up in a virtual environment along with my unit tests. When I want to use this module on the same computer (essentially as "Production"), I move the files to ~/dev/modules/mymodule.
I keep all third-party modules under ~/dev/modules/contrib. This contrib path is on my PYTHONPATH, but mymodule is NOT, because I've noticed that if mymodule is on my PYTHONPATH, my unit tests cannot distinguish between the "Development" version and the "Production" version. But now, if I want to use this common utility module, I have to add it to the PYTHONPATH manually.
This works, but I'm sure there's a better, more automated way.
What is the best way to have a Development and Production module on the same computer? For instance, is there a way to set PYTHONPATH dynamically?
You can add or modify Python paths via sys.path; just make sure that the first entry is the current directory ".", because some third-party modules rely on importing from the directory of the current module.
More information on python paths:
http://djangotricks.blogspot.com/2008/09/note-on-python-paths.html
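A minimal sketch of that approach (~/dev/modules is the path from the question; mymodule is the package living under it):
import os
import sys

# Keep the current directory first, then add the modules directory:
sys.path.insert(1, os.path.expanduser('~/dev/modules'))
import mymodule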
I'm guessing by virtual environment you mean the virtualenv package?
http://pypi.python.org/pypi/virtualenv
What I'd try (and apologies if I've not understood the question right) is:
Keep the source somewhere that isn't referenced by PYTHONPATH (e.g. ~/projects/myproject)
Write a simple setuptools or distutils script for installing it (see Python distutils - does anyone know how to use it?) -- there's a sketch after this list
Use the virtualenv package to create a dev virtual environment with the --no-site-packages option - this way your "dev" version won't see any packages installed in the default python installation.
(Also make sure your PYTHONPATH doesn't have any of your source directories)
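A minimal sketch of such an install script (the names are hypothetical; adapt them to your package layout):
# setup.py
from setuptools import setup  # distutils.core.setup also works here

setup(
    name='myproject',        # hypothetical project name
    version='0.1',
    packages=['myproject'],  # the package directory next to this file
)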
Then, for testing:
Activate dev virtual environment
Run the install script (usually something like python setup.py build install). Your package ends up in /path/to/dev_virtualenv/lib/python2.x/site-packages/
Test, break, fix, repeat
And, for production:
Make sure dev virtualenv isn't activated
Run install script
All good to go, the "dev" version is hidden away in a virtual environment that production can't see...
...And there's no (direct) messing around with PYTHONPATH
That said, I write this with the confidence of someone who's not actually tried using virtualenv in anger, and the hope that I've vaguely understood your question... ;)
You could set the PYTHONPATH as a global environment variable pointing to your Production code, and then in any shell in which you want to use the Development code, change the PYTHONPATH to point to that code.
(Is that too simplistic? Have I missed something?)
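For what it's worth, a sketch of that workflow (both paths are placeholders; only the Production layout comes from the question):
# Global default, e.g. in ~/.bashrc -- makes the Production copy importable:
export PYTHONPATH=~/dev/modules

# In any shell where you want the Development copy instead (path hypothetical):
export PYTHONPATH=~/dev/projects/mymodule_dev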