I want to perform several operations while working on a specified virtualenv.
For example, the command
make install
would be equivalent to
source path/to/virtualenv/bin/activate
pip install -r requirements.txt
Is it possible?
I like using something that runs only when requirements.txt changes:
This assumes that source files are under project in your project's root directory and that tests are under project/test. (You should change project to match your actual project name.)
venv: venv/touchfile

venv/touchfile: requirements.txt
    test -d venv || virtualenv venv
    . venv/bin/activate; pip install -Ur requirements.txt
    touch venv/touchfile

test: venv
    . venv/bin/activate; nosetests project/test

clean:
    rm -rf venv
    find . -iname "*.pyc" -delete
Run make to install packages in requirements.txt.
Run make test to run your tests (you can update this command if your tests are somewhere else).
Run make clean to delete all artifacts.
In make you can run a shell as a command. In this shell you can do everything you can do in a shell started from the command line. Example:
install:
    ( \
      source path/to/virtualenv/bin/activate; \
      pip install -r requirements.txt; \
    )
Pay attention to the ; and the \. Everything between the opening and closing parentheses is executed in a single instance of a shell.
Normally make runs every command in a recipe in a different subshell. However, setting .ONESHELL: will run all the commands in a recipe in the same subshell, allowing you to activate a virtualenv and then run commands inside it.
Note that .ONESHELL: applies to the whole Makefile, not just a single recipe. It may change the behaviour of existing recipes; see the full documentation for details of the possible errors. It will not let you activate a virtualenv for use outside the Makefile, since the commands are still run inside a subshell.
Reference documentation: https://www.gnu.org/software/make/manual/html_node/One-Shell.html
Example:
.ONESHELL:
.PHONY: install
install:
    source path/to/virtualenv/bin/activate
    pip install -r requirements.txt
I have had luck with this.
install:
    source ./path/to/bin/activate; \
    pip install -r requirements.txt
This is an alternate way to run things that you want to run in virtualenv.
BIN=venv/bin/

install:
    $(BIN)pip install -r requirements.txt

run:
    $(BIN)python main.py
PS: This doesn't activate the virtualenv, but it gets things done. Hope you find it clean and useful.
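Why this works without activating: a venv's interpreter knows its own location, so invoking the venv's bin/ executables by path is enough. A small illustration, assuming python3 is on PATH (demo-venv is just a throwaway name for this sketch):

```shell
# Create a throwaway venv (--without-pip keeps the demo fast).
python3 -m venv --without-pip demo-venv

# Invoked by its path, the interpreter already reports the venv as its
# prefix -- no "activate" step required:
demo-venv/bin/python -c 'import sys; print(sys.prefix)'
```

Activation itself mostly just puts that bin/ directory first on PATH, which is why prefixing commands with $(BIN) is equivalent.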
I like to set my Makefile up so that it uses a .venv directory if one exists, but defaults to using the PATH.
For local development, I like to use a virtual environment, so I run:
# Running this:      # Actually runs this:
make venv            # /usr/bin/python3 -m venv .venv
make deps            # .venv/bin/python setup.py develop
make test            # .venv/bin/python -m tox
If I'm installing into a container though, or into my machine, I might bypass the virtual environment by skipping make venv:
# Running this:      # Actually runs this:
make deps            # /usr/bin/python3 setup.py develop
make test            # /usr/bin/python3 -m tox
Setup
At the top of your Makefile, define these variables:
VENV = .venv
VENV_PYTHON = $(VENV)/bin/python
SYSTEM_PYTHON = $(or $(shell which python3), $(shell which python))
# If the virtualenv exists, use it. If not, find python using PATH
PYTHON = $(or $(wildcard $(VENV_PYTHON)), $(SYSTEM_PYTHON))
If .venv exists, you get:
VENV = .venv
VENV_PYTHON = .venv/bin/python
SYSTEM_PYTHON = /usr/bin/python3
PYTHON = .venv/bin/python
If not, you get:
VENV = .venv
VENV_PYTHON = .venv/bin/python
SYSTEM_PYTHON = /usr/bin/python3
PYTHON = /usr/bin/python3
Note: /usr/bin/python3 could be something else on your system, depending on your PATH.
Then, where needed, run python stuff like this:
$(PYTHON) -m tox
$(PYTHON) -m pip ...
You might want to create a target called "venv" that creates the .venv directory:
$(VENV_PYTHON):
    rm -rf $(VENV)
    $(SYSTEM_PYTHON) -m venv $(VENV)

venv: $(VENV_PYTHON)
And a deps target to install dependencies:
deps:
    $(PYTHON) setup.py develop
    # or whatever you need:
    #$(PYTHON) -m pip install -r requirements.txt
Example
Here's my Makefile:
# Variables
VENV = .venv
VENV_PYTHON = $(VENV)/bin/python
SYSTEM_PYTHON = $(or $(shell which python3), $(shell which python))
PYTHON = $(or $(wildcard $(VENV_PYTHON)), $(SYSTEM_PYTHON))
## Dev/build environment
$(VENV_PYTHON):
    rm -rf $(VENV)
    $(SYSTEM_PYTHON) -m venv $(VENV)

venv: $(VENV_PYTHON)

deps:
    $(PYTHON) -m pip install --upgrade pip
    # Dev dependencies
    $(PYTHON) -m pip install tox pytest
    # Dependencies
    $(PYTHON) setup.py develop

.PHONY: venv deps

## Lint, test
test:
    $(PYTHON) -m tox

lint:
    $(PYTHON) -m tox -e lint

lintfix:

.PHONY: test lint lintfix

## Build source distribution, install
sdist:
    $(PYTHON) setup.py sdist

install:
    $(SYSTEM_PYTHON) -m pip install .

.PHONY: sdist install

## Clean
clean:
    rm -rf .tox *.egg-info dist

.PHONY: clean
Based on the answers above (thanks @Saurabh and @oneself!) I've written a reusable Makefile that takes care of creating the virtual environment and keeping it updated: https://github.com/sio/Makefile.venv
It works by referencing correct executables within virtualenv and does not rely on the "activate" shell script. Here is an example:
test: venv
    $(VENV)/python -m unittest

include Makefile.venv
Differences between Windows and other operating systems are taken into account; Makefile.venv should work fine on any OS that provides Python and make.
You also could use the environment variable called "VIRTUALENVWRAPPER_SCRIPT". Like this:
install:
    ( \
      source $$VIRTUALENVWRAPPER_SCRIPT; \
      pip install -r requirements.txt; \
    )
A bit late to the party but here's my usual setup:
# system python interpreter. used only to create virtual environment
PY = python3
VENV = venv
BIN=$(VENV)/bin
# make it work on windows too
ifeq ($(OS), Windows_NT)
BIN=$(VENV)/Scripts
PY=python
endif
all: lint test

$(VENV): requirements.txt requirements-dev.txt setup.py
    $(PY) -m venv $(VENV)
    $(BIN)/pip install --upgrade -r requirements.txt
    $(BIN)/pip install --upgrade -r requirements-dev.txt
    $(BIN)/pip install -e .
    touch $(VENV)

.PHONY: test
test: $(VENV)
    $(BIN)/pytest

.PHONY: lint
lint: $(VENV)
    $(BIN)/flake8

.PHONY: release
release: $(VENV)
    $(BIN)/python setup.py sdist bdist_wheel upload

clean:
    rm -rf $(VENV)
    find . -type f -name '*.pyc' -delete
    find . -type d -name __pycache__ -delete
I did a more detailed writeup on that, but basically the idea is to use the system's Python to create the virtual environment, and for the other targets to just prefix your command with the $(BIN) variable, which points to the bin or Scripts directory inside your venv. This is equivalent to the activate function.
I found prepending to $PATH and adding $VIRTUAL_ENV was the best route:
- No need to clutter up recipes with activate, or to constrain oneself to ; chaining
- Can simply use python as you would normally, and it will fall back on the system Python
- No need for third-party packages
- Compatible with both Windows (if using bash) and POSIX
# SYSTEM_PYTHON defaults to Python on the local machine
SYSTEM_PYTHON = $(shell which python)
REPO_ROOT = $(shell pwd)
# Specify with REPO_ROOT so recipes can safely change directories
export VIRTUAL_ENV := ${REPO_ROOT}/venv
# bin = POSIX, Scripts = Windows
export PATH := ${VIRTUAL_ENV}/bin:${VIRTUAL_ENV}/Scripts:${PATH}
And for those interested in example usages:
# SEE: http://redsymbol.net/articles/unofficial-bash-strict-mode/
SHELL=/bin/bash -euo pipefail
.DEFAULT_GOAL := fresh-install
show-python: ## Show path to python and version.
    @echo -n "python location: "
    @python -c "import sys; print(sys.executable, end='')"
    @echo -n ", version: "
    @python -c "import platform; print(platform.python_version())"

show-venv: show-python
show-venv: ## Show output of python -m pip list.
    python -m pip list

install: show-python
install: ## Install all dev dependencies into a local virtual environment.
    python -m pip install -r requirements-dev.txt --progress-bar off

fresh-install: ## Run a fresh install into a local virtual environment.
    -rm -rf venv
    $(SYSTEM_PYTHON) -m venv venv
    @$(MAKE) install
You should use this; it's working for me at the moment.
report.ipynb: merged.ipynb
    ( bash -c "source ${HOME}/anaconda3/bin/activate py27; which -a python; \
    jupyter nbconvert \
        --to notebook \
        --ExecutePreprocessor.kernel_name=python2 \
        --ExecutePreprocessor.timeout=3000 \
        --execute merged.ipynb \
        --output=$@ $<" )
I'm running some commands through a Makefile. I have to create a new Python virtualenv, activate it, and install requirements in one make recipe/goal, then reuse it in other recipes/goals. The problem is that I have to activate that env in every subsequent make goal, like so:
SHELL := bash
# For lack of a better mechanism, will have to activate the venv in every
# make recipe, because every step runs in its own shell
.ONESHELL:

install:
    python -m venv .venv
    source .venv/bin/activate
    pip install -r requirements.txt

test: install
    source .venv/bin/activate
    pytest

synth: install
    source .venv/bin/activate
    cdk synth --no-color

diff: install
    source .venv/bin/activate
    cdk diff --no-color

bootstrap: install
    source .venv/bin/activate
    cdk bootstrap --no-color

deploy: install
    source .venv/bin/activate
    cdk deploy --no-color

.PHONY: install test synth diff bootstrap deploy
Is there a better way to do this, i.e. one that saves me having to source .venv/bin/activate in every single goal?
In other words, can I run all make goals in a Makefile in the same shell?
You cannot run all goals in the same shell: the shell must exit after each recipe completes, otherwise make cannot know whether the commands in the recipe were successful, and hence whether the rule was successful.
You can of course put the sourcing in a variable so you don't have to type it out. You could also put the entire thing in a function, like this:
run = . .venv/bin/activate && $1
then:
install:
    python -m venv .venv
    $(call run,pip install -r requirements.txt)

test: install
    $(call run,pytest)
etc.
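The variable form mentioned above might look roughly like this (a sketch; the pytest target is just an example, and recipe lines need hard tabs):

```make
ACTIVATE = . .venv/bin/activate &&

install:
    python -m venv .venv
    $(ACTIVATE) pip install -r requirements.txt

test: install
    $(ACTIVATE) pytest
```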
If you don't want to do any of that, your only choice is to use recursive make invocations. However, this is going to be tricky, because you only want to recurse after the install rule has run. I'm not sure it will actually lead to a more understandable makefile.
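For completeness, the recursive variant might be sketched like this (hypothetical target names; instead of sourcing activate, it prepends the venv's bin directory to PATH for the child make, which has the same effect):

```make
test: install
    PATH=$(CURDIR)/.venv/bin:$$PATH $(MAKE) run-tests

run-tests:
    pytest   # found via the venv's bin/ on PATH
```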
I've been keeping all my python2.7 installs in my ~/.local/ directory that way I don't have to sudo every time I want to do a pip install. I also have $HOME/.local/lib/python2.7/site-packages on my $PYTHONPATH. This has worked well for years but now I find myself needing to run python3 programs more frequently. After much research it seems like virtualenv is the most recommended way to deal with python 2 and 3 on the same system. But I am running into troubles. I can spin up a python3 virtual environment but when I try to install new libs with pip, my old global path (i.e. ~/.local/) is still being searched by pip, which makes sense. However, this is even the case if I remove my ~/.local/bin/ directory from my $PATH and unset my $PYTHONPATH.
Here are the steps I took.
First, check the preliminaries before activating the virtualenv. (I'm on Ubuntu 16.04, btw.)
maddoxw@firefly:~$ echo $PATH
/usr/local/cuda-8.0/bin:/home/maddoxw/.node_modules_global/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/home/maddoxw/bin:/home/maddoxw/scripts
maddoxw@firefly:~$ echo $PYTHONPATH
maddoxw@firefly:~$ python --version
Python 2.7.12
maddoxw@firefly:~$ python3 --version
Python 3.5.2
maddoxw@firefly:~$ which pip
Since I removed my ~/.local/bin directory from my path, then I can be certain pip will not be found. Also, $PYTHONPATH is still empty. Now I create my virtualenv:
maddoxw@firefly:~$ mkdir test && cd test/
mkdir: created directory 'test'
maddoxw@firefly:~/test$ python3 -m venv .env
maddoxw@firefly:~/test$ source .env/bin/activate
(.env) maddoxw@firefly:~/test$ echo $PATH
/home/maddoxw/test/.env/bin:/usr/local/cuda-8.0/bin:/home/maddoxw/.node_modules_global/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/home/maddoxw/bin:/home/maddoxw/scripts
(.env) maddoxw@firefly:~/test$ echo $PYTHONPATH
(.env) maddoxw@firefly:~/test$ which python
/home/maddoxw/test/.env/bin/python
(.env) maddoxw@firefly:~/test$ python --version
Python 3.5.2
(.env) maddoxw@firefly:~/test$ which pip
/home/maddoxw/test/.env/bin/pip
Good. My ~/.local/ is still NOT on my $PATH, $PYTHONPATH is still empty, python points to the correct path and version, and pip is pointing to the correct location. Now let's try to pip install a fresh lib.
(.env) maddoxw@firefly:~/test$ pip install Cython
Requirement already satisfied: Cython in /home/maddoxw/.local/lib/python2.7/site-packages
Why is pip still looking in a non-$PATH path?
First, install pip3 to use with python3. You can install it with the following command and then use pip3 to install your packages.
sudo apt-get install python3-pip
[Solved]
When I initially set up my python2.7 environment way back when, I created for myself a handy little function wrapper around pip so that I wouldn't have to type out --user every time I wanted to pip install:
pip() {
    if [ "$1" = "install" -o "$1" = "bundle" ]; then
        cmd="$1"
        shift
        $HOME/.local/bin/pip $cmd --user "$@"
    else
        $HOME/.local/bin/pip "$@"
    fi
}
I put this function in ~/.bash.d/bash_functions, and in my ~/.bashrc I added the line:
[ -f ~/.bash.d/bash_functions ] && source ~/.bash.d/bash_functions
So although I removed $HOME/.local/ from my path, this wrapper function was still being called every time I fired up a new terminal. Whether or not I was in a virtualenv was irrelevant.
Solution?
Commenting out (or deleting completely) the function wrapper fixed it.
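A quick way to spot this kind of shadowing is the shell's type builtin, which reports whether a name resolves to a function, an alias, or a binary. A small demonstration (the wrapper below is just a stand-in for the one above):

```shell
# Define a function that shadows the pip binary, like the wrapper above:
pip() { echo "wrapper called"; }

# "type" shows what the shell will actually run:
type pip     # reports that pip is a shell function

# Removing the function restores normal PATH lookup:
unset -f pip
```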
If you should ever encounter the following error when creating a Python virtual environment using the pyvenv command:
user$ pyvenv my_venv_dir
Error: Command '['/home/user/my_venv_dir/bin/python', '-Im', 'ensurepip', '--upgrade', '--default-pip']' returned non-zero exit status 1
... then the answer below provides a simple way to work around it, without resorting to setuptools and its related acrobatics.
Here's an approach that is fairly O/S agnostic...
Both the pyvenv and python commands themselves include a --without-pip option that enables you to work around this issue, without resorting to setuptools or other headaches. Taking note of my inline comments below, here's how to do it, and it is very easy to understand:
user$ pyvenv --without-pip ./pyvenv.d # Create virtual environment this way;
user$ python -m venv --without-pip ./pyvenv.d # --OR-- this newer way. Both work.
user$ source ./pyvenv.d/bin/activate # Now activate this new virtual environment.
(pyvenv.d) user$
# Within it, invoke this well-known script to manually install pip(1) into /pyvenv.d:
(pyvenv.d) user$ curl https://bootstrap.pypa.io/get-pip.py | python
(pyvenv.d) user$ deactivate # Next, reactivate this virtual environment,
user$ source ./pyvenv.d/bin/activate # which will now include the pip(1) command.
(pyvenv.d) user$
(pyvenv.d) user$ which pip # Verify that pip(1) is indeed present.
/path/to/pyvenv.d/bin/pip
(pyvenv.d) user$ pip install --upgrade pip # And finally, upgrade pip(1) itself;
(pyvenv.d) user$ # although it will likely be the
# latest version already.
# And that's it!
I hope this helps. \(◠﹏◠)/
In 2020, on Python 3.8 under WSL2 (Ubuntu), the following solved this for me:
sudo apt install python3.8-venv
If you have a file named after a Python standard-library module in the folder where you invoke python -m venv venv, the command will fail without any hint about the cause.
For instance, suppose you name a file email.py.
To find the culprit, I wrote a bash script that moves the .py files out of the current directory one by one (to a holdspace/ subdirectory) and tries to create the venv after each move. If the python -m venv venv command exits with code 0, it succeeded, and the last file moved was the culprit.
#!/bin/bash
if [ ! -d ./holdspace ]
then
    mkdir holdspace/
fi
for file in *.py
do
    mv "$file" holdspace/
    python -m venv venv >/dev/null 2>&1
    if [ $? -eq 0 ]
    then
        echo "$file was the culprit."
        rm -rf venv/
        break
    else
        echo "$file isn't the culprit."
    fi
    rm -rf venv/
done
mv holdspace/*.py .
The following should fix it
brew update
brew upgrade
brew install zlib
I have successfully installed python 2.7.11 on a shared Bluehost server.
In the home directory I installed get-pip.py. When I run that now:
# python get-pip.py
Requirement already up-to-date: pip in ./python/lib/python2.7/site-packages
But when I try to run pip I get,
username@example.com [~]# pip
-bash: pip: command not found
Why is pip not running? How can I check what python packages are installed?
My ~/.bashrc looks like this,
# .bashrc
export PATH=$HOME/python/Python-2.7.11/:$PATH
# User specific aliases and functions
alias mv='mv -i'
alias rm='rm -i'
alias cp='cp -i'
# Source global definitions
if [ -f /etc/bashrc ]; then
. /etc/bashrc
fi
Additionally,
# echo $PATH
/usr/local/jdk/bin:/home2/username/python/Python-2.7.11/:/usr/lib64/qt-3.3/bin:/home2/username/perl5/bin:/ramdisk/php/54/bin:/usr/php/54/usr/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/bin:/usr/X11R6/bin:/home2/username/bin
EDIT:
My attempt to add the path to $PATH in ~/.bashrc
export PATH=$HOME/python/Python-2.7.11/:$HOME/python/lib/python2.7/site-packages/:$PATH
But it doesn't work, same error.
Thanks
You can run pip by invoking Python with the -m option. Suppose you want to use pip to install a Python package called package-of-interest from PyPI. You can install it by running:
python -m pip install package-of-interest
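The same -m trick also answers the second part of the question, checking which packages are installed; this assumes python is the interpreter you set up:

```shell
# List every package visible to this interpreter's pip:
python -m pip list

# Inspect a single package:
python -m pip show pip
```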
I've been trying to get up and running with the built-in "venv" module of Python 3.3 on my OS X machine. I've installed Python 3.3 using Homebrew.
As per the docs, creating and switching virtual environment works as you'd expect:
$ python3 -m venv myvenv
$ source myvenv/bin/activate
And I've tested something like this:
$ echo "YEAH = 'YEAH!'" > myvenv/lib/python3.3/site-packages/thingy.py
$ python
>>> import thingy
>>> print(thingy.YEAH)
'YEAH!'
But when I try to install distribute, it simply won't go in the proper place. For some reason, it insists on trying to install into /usr/local/lib/python3.3/site-packages/, which fails with the following messages:
No setuptools distribution found
running install
Checking .pth file support in /usr/local/lib/python3.3/site-packages/
/Users/victor/myvenv/bin/python -E -c pass
TEST FAILED: /usr/local/lib/python3.3/site-packages/ does NOT support .pth files
error: bad install directory or PYTHONPATH
You are attempting to install a package to a directory that is not
on PYTHONPATH and which Python does not read ".pth" files from. The
installation directory you specified (via --install-dir, --prefix, or
the distutils default setting) was:
/usr/local/lib/python3.3/site-packages/
and your PYTHONPATH environment variable currently contains:
''
This happens regardless if I try to install using distribute_setup.py or using the source distribution directly. I've even tried using --prefix=/Users/victor/myenv but it still tries to put everything in my "global" site-packages.
I can't figure out why this happens, but it's consistent on two of my machines. Note that sys.prefix reports the correct path (the virtual environment).
Is this a problem with Homebrew? OS X? Python 3.3? venv? Me?
This has been an issue with Homebrew, yes, but it is working now since https://github.com/mxcl/homebrew/commit/0b50110107ea2998e65011ec31ce45931b446dab.
$ brew update
$ brew rm python3 #if you have installed it before
$ brew install python3
$ cd /tmp
$ which python3
/usr/local/bin/python3
$ python3 -m venv myvenv
$ source myvenv/bin/activate
$ wget http://python-distribute.org/distribute_setup.py # may need brew install wget
$ python3 distribute_setup.py
...
Finished processing dependencies for distribute==0.6.45
After install bootstrap.
Creating /private/tmp/myvenv/lib/python3.3/site-packages/setuptools-0.6c11-py3.3.egg-info
Creating /private/tmp/myvenv/lib/python3.3/site-packages/setuptools.pth
You can see that distribute installed successfully into the /tmp dir.
This happens because Homebrew installs a distutils config file:
$ brew cat python3 | grep "Tell distutils" -A5
# Tell distutils-based installers where to put scripts
(prefix/"Frameworks/Python.framework/Versions/#{VER}/lib/python#{VER}/distutils/distutils.cfg").write <<-EOF.undent
[install]
install-scripts=#{scripts_folder}
install-lib=#{site_packages}
EOF
$ mv ~/.local/Frameworks/Python.framework/Versions/3.3/lib/python3.3/distutils/distutils.cfg ~/tmp/
$ cat ~/tmp/distutils.cfg
[install]
install-scripts=/Users/gatto/.local/share/python3
install-lib=/Users/gatto/.local/lib/python3.3/site-packages
$ . venv/bin/activate
(venv) $ python distribute-0.6.36/distribute_setup.py
(venv) $ ls venv/lib/python3.3/site-packages/
distribute-0.6.36-py3.3.egg easy-install.pth setuptools-0.6c11-py3.3.egg-info setuptools.pth
See "distutils.cfg Can Break venv" issue at bugs.python.org.