I have a virtualenv in a structure like this:
venv/
src/
project_files
I want to run a makefile (which calls out to Python) in the project_files, but I want to run it from a virtual environment. Because of the way my deployment orchestration works, I can't simply do a source venv/bin/activate.
Instead, I've tried to export PYTHONPATH={project_path}/venv/bin/python2.7. When I try to run the makefile, however, the python scripts aren't finding the dependencies installed in the virtualenv. Am I missing something obvious?
The PYTHONPATH environment variable is not used to select the path of the Python executable; which executable runs depends, as in all other cases, on the shell's PATH environment variable. PYTHONPATH is used to augment the list of directories (sys.path in Python) that Python searches to satisfy imports.
The interpreter puts certain directories on sys.path before it processes PYTHONPATH, precisely to ensure that replacement modules with standard names do not shadow the standard library. So any standard-library module will be imported from the library associated with the interpreter you are running (unless you do some manual fiddling, which I wouldn't recommend).
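A quick way to see the distinction in action is to spawn a child interpreter with PYTHONPATH set and inspect its sys.path (the extra directory here is a hypothetical example, and need not even exist):

```python
import os
import subprocess
import sys

# Run a child interpreter with PYTHONPATH set and print its sys.path.
# "/tmp/extra_libs" is a hypothetical example directory.
env = dict(os.environ)
env["PYTHONPATH"] = "/tmp/extra_libs"

out = subprocess.check_output(
    [sys.executable, "-c", "import sys; print(sys.path)"],
    env=env,
    text=True,
)
print("/tmp/extra_libs" in out)  # True: PYTHONPATH augmented sys.path
# Which interpreter actually ran was decided by the path we invoked
# (sys.executable), not by PYTHONPATH.
```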
venv/bin/activate does a lot of stuff that needs to happen in the calling shell's namespace, which can make tailoring code rather difficult if you can't find a way to source the script.
You can actually just call the Python interpreter in your virtual environment. So, in your Makefile, instead of calling python, call venv/bin/python.
To run a command in a virtualenv, you could use the vex utility:
$ vex venv make
You could also check whether make PYTHON=venv/bin/python would be enough in your case.
PYTHONPATH adjusts the sys.path list; it doesn't change which Python binary runs. Don't use it here.
Related
I have a python project with multiple scripts (scriptA, scriptB, scriptC) that must be able to find packages located in a subpath of the project that is not a python package or module. The organization is like so:
|_project
|_scriptA.py
|_scriptB.py
|_scriptC.py
|_thrift_packages
|_gen-py
|_thriftA
|__init__.py
|_thriftA.py
On a per script basis I am adding the absolute path to this directory to sys.path
Is there a way that I can alter the PYTHONPATH or sys.path every time a script is executed within the project so that I do not have to append the path to this directory to sys.path on a per-script basis?
You have an XY problem, albeit an understandable one since the "proper" way to develop a Python project is not obvious, and there aren't a lot of good guides to getting started (there are also differing opinions on this, especially in the details, though I think what I'll describe here is fairly commonplace).
First, at the root of your project, you can create a setup.py. These days this file can just be a stub; eventually the need for it should go away entirely, but some tools still require it:
$ cat setup.py
#!/usr/bin/env python
from setuptools import setup
setup()
Then create a setup.cfg. For most Python projects this should be sufficient--you only need to put additional code in setup.py for special cases. Here's a template for a more sophisticated setup.cfg, but for your project you need at a minimum:
$ cat setup.cfg
[metadata]
name = project
version = 0.1.0
[options]
package_dir =
    = thrift_packages/gen-py
packages = find:

[options.packages.find]
where = thrift_packages/gen-py
Now create and activate a virtual environment for your project (going in-depth into virtual environments will be out of scope for this answer but I'm happy to answer follow-up questions).
$ mkdir -p ~/.virtualenvs
$ python3 -m venv ~/.virtualenvs/thrift
$ source ~/.virtualenvs/thrift/bin/activate
Now you should have a prompt that looks something like (thrift) $, indicating that you're in an isolated virtualenv. Here you can install any dependencies for your package, as well as the package itself, without interfering with your main Python installation.
Now install your package in "editable" mode, meaning that the path to the sources you're actively developing on will automatically be added to sys.path when you run Python (including your top-level scripts):
$ pip install -e .
If you then start Python and import your package you can see, for example, something like:
$ python -c 'import thriftA; print(thriftA)'
<module 'thriftA' from '/path/to/your/source/code/project/thrift_packages/gen-py/thriftA/__init__.py'>
If this seems like too much trouble, trust me, it isn't. Once you get the hang of this (and there are several project templates, e.g. made with cookiecutter, to take the thinking out of it) you'll see that it makes things like paths much less trouble. This is how I start any non-trivial project (anything more than a single file); if you set everything up properly you'll never have to worry about fussing with sys.path or $PYTHONPATH manually.
In this guide, although the first part is a bit application-specific, if you ignore the specific purpose of this code a lot of it is actually pretty generic, especially the section titled "Packaging our package", which repeats some of this advice in more detail.
As an aside, if you're already using conda you don't need to create a virtualenv, as conda environments are just fancy virtualenvs.
An advantage to doing this the "right" way is that when it comes time to install your package, whether by yourself or by users, if your setup.cfg and setup.py are set up properly, all users have to do is run pip install . (without the -e) and it should work the same way.
You should add an __init__.py to each package, and then call your scripts properly.
|_project
|_scriptA.py
|_scriptB.py
|_scriptC.py
|__init__.py <====== Here
|_thrift_packages
|__init__.py <====== Here
|_gen-py
|_thriftA
|__init__.py
|_thriftA.py
Assuming that project is on your PYTHONPATH, you can do:
from project.thrift_packages.thriftA import thriftA
Yes, you can.
However I think the other answers are more adapted to your current issue.
Here I just explain, how to call subprocesses with another environment, another current working directory.
This can come in handy in other situations.
You can get the current environment as a dict (os.environ), copy it to another dict, modify it, and pass it to a subprocess call (the subprocess functions have an env parameter):
import os
import subprocess

new_env = dict(os.environ)
new_env["PYTHONPATH"] = whateveryouwanthere  # add here the PYTHONPATH that you want
cmd = ["mycmd.py", "myarg1", "myarg2"]  # replace with your command
subprocess.call(cmd, env=new_env)

# or alternatively, if you also want to change the current directory:
# subprocess.call(cmd, env=new_env, cwd=path_of_your_choice)
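If the caller may itself already have PYTHONPATH set, a safer variant prepends to the existing value instead of overwriting it. This sketch (the extra directory name is hypothetical) also verifies in a child interpreter that the entry landed on sys.path:

```python
import os
import subprocess
import sys

# Prepend a directory to any existing PYTHONPATH instead of clobbering it.
# "/path/to/extra" is a hypothetical example directory.
extra = "/path/to/extra"
new_env = dict(os.environ)
old = new_env.get("PYTHONPATH")
new_env["PYTHONPATH"] = extra + os.pathsep + old if old else extra

# Verify in a child interpreter that the directory is on sys.path.
out = subprocess.check_output(
    [sys.executable, "-c", "import sys; print(sys.path)"],
    env=new_env,
    text=True,
)
print(extra in out)  # True
```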
I'm attempting to find a set of steps necessary to make a virtual environment of python 3.6 on windows relocatable.
1st I created a virtual environment on virtualenv 15.1.0 with the following command:
virtualenv
--always-copy
-a "path\to\project\dir"
-r "path\to\requirements.txt"
venv_name
After this, I run the following command to use the built-in "make paths relative" functionality of virtualenv:
virtualenv --relocatable venv_name
Part of my requirements.txt is the pypiwin32 library which, at least when installed via pip, won't work until the:
python Scripts/pywin32_postinstall.py -install
script is run (See here for details).
At this point, if I search the venv directory for clues of hardcoding, I see them in scripts\activate.bat, which I can make relative by changing this:
set "VIRTUAL_ENV=C:\path\to\venv"
into this:
pushd %~dp0..
set VIRTUAL_ENV=%CD%
popd
There are some other places where I had to make slight adjustments to make them relative (I used Sublime's search-in-folder feature with my username as the search parameter; it brought up all the path\to\username\then\some\more style lines in the directory).
There are 2 hardcoded paths which are not so simple:
1. "path\to\venv\Lib\orig-prefix.txt"
I understand that orig-prefix.txt is a record of the source Python installation on which the venv was based, and so cannot really be made relative, but it may need to be left blank when moving the venv to another machine (its absence may crash the Python launcher, but leaving it empty is fine).
2. "path\to\venv\Lib\site-packages\virtualenv_path_extensions.pth"
This is trickier. As it is a hard-coded path which is then added to sys.path as a location to look for modules, when I move the venv to another machine where this path doesn't exist, the module load will fail.
Is there a way I can add relative paths to configuration files such as virtualenv_path_extensions.pth?
Normally environments are tied to a specific path. That means that you cannot move an environment around or copy it to another computer. You can fix up an environment to make it relocatable with the command:
$ virtualenv --relocatable ENV
This will make some of the files created by setuptools use relative paths, and will change all the scripts to use activate_this.py instead of using the location of the Python interpreter to select the environment.
Note: scripts which have been made relocatable will only work if the virtualenv is activated, specifically the python executable from the virtualenv must be the first one on the system PATH. Also note that the activate scripts are not currently made relocatable by virtualenv --relocatable.
Note: you must run this after you’ve installed any packages into the environment. If you make an environment relocatable, then install a new package, you must run virtualenv --relocatable again.
Also, this does not make your packages cross-platform. You can move the directory around, but it can only be used on other similar computers. Some known environmental differences that can cause incompatibilities: a different version of Python, when one platform uses UCS2 for its internal unicode representation and another uses UCS4 (a compile-time option), obvious platform changes like Windows vs. Linux, or Intel vs. ARM, and if you have libraries that bind to C libraries on the system, if those C libraries are located somewhere different (either different versions, or a different filesystem layout).
I have a full python installation with files in /usr/local/, but also have one that I compiled from source sitting in ~/python_dist. If I look at sys.path on each interpreter I see that they indeed import from different libraries.
Currently I can run $ PYTHONPATH=~/other_py_libs ~/python_dist/bin/python to invoke the custom interpreter with some other modules available in the path. However, I don't want to permanently change the global PYTHONPATH variable.
How can I permanently change the python path for only one specific python install?
The easiest way to do this is to use a virtualenv (managed with virtualenvwrapper). With virtual environments you can set up different, isolated python environments (kind of like little python playgrounds). Switching between them (with the help of virtualenvwrapper) is as easy as typing workon envname. You don't have to worry about switching the PYTHONPATH around, and you can direct scripts to use a specific environment simply by running them with the python install in that environment, e.g. via a #!/home/myname/.virtualenvs/envname/bin/python shebang line.
The first entry of sys.path is the directory of the current script, according to the docs. In the following setup, I would like to change this default. Imagine the following directory structure:
src/
core/
stuff/
tools/
tool1.py
tool2.py
gui/
morestuff/
gui.py
The scripts tool*.py and gui.py are intended to be run as scripts, like the following:
python src/core/tools/tool2.py
python src/gui/gui.py
Now all tools import from src.core.stuff, and the GUI needs gui.morestuff. This means that sys.path[0] should point to src/, but it points to src/core/tools/ or src/gui/ by default.
I can adjust sys.path[0] in every script (with a construct like the following, e.g., at the beginning of gui.py):
import os
import sys

if __name__ == '__main__':
    if sys.path[0]:
        sys.path[0] = os.path.dirname(os.path.abspath(sys.path[0]))
However, this is sort of redundant, and it becomes tedious for a mature code base with thousands of scripts. I also know the -m switch:
python -m gui.gui
But this requires the current directory to be src/.
Is there a better way to achieve the desired result, e.g. by modifying the __init__.py files?
EDIT: This is for Python 2.7:
~$ python -V
Python 2.7.3
The only officially approved way to run a script that is in a package is by using the -m flag. While you could run a script directly and try to do sys.path manipulations yourself in each script, it's likely to be a big pain. If you move a script between folders, the logic for rewriting sys.path may also need to be changed to reflect the new location. Even if you get sys.path right, explicit relative imports will not work correctly.
Now, making python -m mypackage.mymodule work requires that either you be in the project's top level folder (src in your case), or for that top level folder to be on the Python search path. Requiring you to be in a specific folder is awkward, and you've said that you don't want that. Getting src into the search path is our goal then.
I think the best approach is to use the PYTHONPATH environment variable to point the interpreter to your project's src folder so that it can find your packages from anywhere.
This solution is simple to set up (the environment variable can be set automatically in your .profile, .bashrc or some other equivalent place), and will work for any number of scripts. If you move your project, just update your environment settings and you'll be all set, without needing to do any more work for each script.
You've got three basic options here. I've been through all three in both a production environment and personal projects. In many ways they build on each other. However, my advice is to just skip to the last one.
The fundamental problem is that you need your ./src directory to be in the python search path. This is really what python packaging is all about.
PYTHONPATH
The most straightforward, user defined way to adjust your python path is through the environment variable PYTHONPATH. You can set it at run time, doing something like:
PYTHONPATH=/src python src/gui/gui.py
You can of course also set this up in your global environment so hopefully all processes that need it will find the correct PYTHONPATH. But, just remember, you'll always forget one. Usually at 3 AM when your cron task finally runs.
Site Packages
To avoid needing an environment variable, your options are pretty much to include your software under an existing entry on the search path, or to find some additional way to add a new search path. So this can mean dropping the contents of your src directory into /usr/lib/python2.7/site-packages or wherever your system site-packages is located.
Since you may not want to actually include the code in site-packages, you can create a symlink for your two sub-packages.
This is of course less than ideal for a number of reasons. If you're not careful with naming, then suddenly every python program on the machine is exposed to potential name conflicts. You're exposing your software to every user on the machine. You might run into issues if python gets updated. And if you add a new sub-package, you have to create a new symlink.
A slightly better approach is to include a .pth file somewhere in your site-packages. When python encounters these files, it adds the contents (which is supposed to be the name of a directory) to the search path. This avoids the problem of having to remember to add a new symlink for each new sub-package.
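The mechanics of .pth files can be seen with site.addsitedir, which processes a directory the same way Python processes site-packages at startup (the paths below are temporary examples created just for the demonstration):

```python
import os
import site
import sys
import tempfile

# Create a site directory containing a .pth file that names another
# directory; each line of a .pth file naming an existing directory is
# appended to sys.path when the site directory is processed.
sitedir = tempfile.mkdtemp()
target = os.path.join(sitedir, "my_src")
os.mkdir(target)

with open(os.path.join(sitedir, "myproject.pth"), "w") as f:
    f.write(target + "\n")

site.addsitedir(sitedir)  # what Python does for site-packages at startup
print(target in sys.path)  # True: the .pth entry is now on the search path
```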
virtualenv and packaging
The best solution is to just bite the bullet and do real python packaging. This, combined with great tools like virtualenv and pip let you have an isolated (or semi-isolated) python environment.
Under virtualenv, you would have a custom site-packages for just your project where you can easily install your software into it, avoiding all the problems of the earlier solutions. virtualenv also makes it easy to maintain executable scripts so that the python environment it runs under is exactly as you expect.
The one downside is that you have to write and maintain a setup.py which will instruct pip (the python installer) to include your software in the virtualenv. The contents would be something like:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from distutils.core import setup
setup(
name='myproject',
package_dir={'myproject': 'src'},
scripts=['src/gui/gui.py', 'src/core/tools/tool1.py', 'src/core/tools/tool2.py']
)
So, to setup this environment, it's going to look something like this:
virtualenv env
env/bin/pip install -e .
To run your script, then you'd just do something like:
env/bin/tool1.py
I wanted to do this to avoid having to set PYTHONPATH in the first place
There are other places you can hook into Python's sys.path initialization, using the site module, which is (by default) automatically imported when Python initializes.
Based on this code in site.py...
# Prefixes for site-packages; add additional prefixes like /usr/local here
PREFIXES = [sys.prefix, sys.exec_prefix]
...it looks as if the intention was that this file was designed to be modified after installation, which is one option, although it also provides other ways you can influence sys.path, e.g. by placing a .pth file somewhere inside your site-packages directory.
Assuming the desired result is to make the code work 'out of the box', this would work, but only for all users on a single system.
If you need it to work on multiple systems, then you'd have to apply the same changes to all systems.
For deployment, this is no big deal. Indeed, many Python packages already do something like this. e.g. on Ubuntu...
~$ dpkg -L python-imaging | grep pth
/usr/share/pyshared/PIL.pth
/usr/lib/python2.7/dist-packages/PIL.pth
...but if your intention is to make it easy for multiple concurrent developers, each using their own system, you may be better off sticking with the current option of adding some 'boilerplate' code to every Python module which is intended to be run as a script.
There may be another option, but it depends on exactly what you're trying to achieve.
I'm new to virtualenv and not sure how to set up paths. My paths have been set to something like this:
PYTHONPATH=C:\Python27\
PYTHONSTARTUP=C:\Python27\Scripts\startup.py
PATH=%PYTHONPATH%;...;%PYTHONPATH%\Scripts
Should I remove those paths for virtualenv's activate script to work correctly? If I can keep my paths then how do I call scripts for an env when it has been activated? Do I call scripts by explicitly running them with python.exe instead of simply typing the script name alone?
python myscript.py
Not sure how to handle the paths and I would appreciate a little guidance.
First, you have your paths wrong. PYTHONPATH tells Python which folders to look in for Python modules, and normally you don't put Python's installation folder in it. For the installation folder of Python there's a different environment variable called PYTHONHOME. So instead of PYTHONPATH=C:\Python27\ you should have PYTHONHOME=C:\Python27\. You should change the PATH variable to use PYTHONHOME accordingly.
As to how to set environment variables when working with virtualenv: you don't need to do anything, because virtualenv stores the original values when it's activated, modifies the environment variables it needs to, and then restores the original values when it's deactivated.
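The save/modify/restore dance that activate and deactivate perform can be sketched in Python (the venv path is hypothetical; the real scripts do this in the shell's own namespace):

```python
import os

# Rough sketch of what activate does (hypothetical venv path): save the
# old values, point VIRTUAL_ENV at the venv, and prepend its Scripts
# directory to PATH so the venv's interpreter is found first.
venv = r"C:\path\to\venv"

saved_path = os.environ.get("PATH", "")
os.environ["VIRTUAL_ENV"] = venv
os.environ["PATH"] = os.path.join(venv, "Scripts") + os.pathsep + saved_path

# ... run tools here; the venv's python.exe and scripts win the lookup ...

# Rough sketch of what deactivate does: restore the original values.
os.environ["PATH"] = saved_path
del os.environ["VIRTUAL_ENV"]
```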
You can take a look at Using Python on Windows
I think you are fine; just get on with virtualenv (follow the docs), but remember you must use the cmd shell (no point-and-clicking!). It took me a while before I realized that...
Once you have activated the virtual env and installed what you want into it, you invoke scripts with "python scriptname".