Record Python version to rebuild virtualenv

The convention for recording package requirements, so they can be installed later into a Python virtual environment, is to use pip freeze > requirements.txt and then install them with pip install -r requirements.txt
The Python interpreter version, however, is not recorded in the requirements.txt file.
Is there a similar convention for recording the Python version, so that the entire virtualenv, including the interpreter used, can be easily rebuilt?

You can use pipenv for virtualenv creation and package management needs, e.g.:
pipenv --python 3.6
pipenv --python 3
See the documentation:
https://pipenv.kennethreitz.org/basics/#specifying-versions-of-python
That way, when you recreate the env with pipenv, it will use the specified Python version and install the required packages.
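For reference, the version constraint ends up in the Pipfile that pipenv generates; its [requires] section looks roughly like this (the version number here is just an example):

```toml
[requires]
python_version = "3.6"
```

pipenv reads this section back when recreating the environment, which is exactly the "record the interpreter" convention the question asks about.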

Such a requirement cannot be written in requirements.txt or setup.py: when one runs
pip install -r requirements.txt
pip is already being run with some version of Python.
A Python version requirement could be enforced by a script that creates and populates the virtual env.
But above all, it should be written in the docs!
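Sketching that script idea (this is an ad-hoc convention, not a standard one): store the interpreter version next to requirements.txt and have the rebuild script bail out on a mismatch. The file name python-version.txt is made up for illustration, and python3 is assumed to be on PATH.

```shell
#!/bin/sh
# Hypothetical convention: major version recorded in python-version.txt
# alongside requirements.txt; hard-coded here so the sketch is self-contained.
required="3"   # normally: required=$(cat python-version.txt)
actual=$(python3 -c 'import sys; print(sys.version_info[0])')
if [ "$actual" = "$required" ]; then
    echo "python version ok: $actual"
else
    echo "python version mismatch: want $required, got $actual" >&2
    exit 1
fi
```

A real script would go on to create the env and run pip install -r requirements.txt once the check passes.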

Related

How can I upgrade Python version and packages in pyenv virtualenv?

I use pyenv and pyenv-virtualenv for managing Python virtual environments.
I have a project working in a Python 3.4 virtual environment, so none of the installed packages (pandas, numpy, etc.) are the newest versions.
What I want to do is upgrade the Python version from 3.4 to 3.6, as well as upgrade the other packages to higher versions.
How can I do this easily?
Here is how you can switch to 3.9.0 for a given virtual environment venv-name:
pip freeze > requirements-lock.txt
pyenv virtualenv-delete venv-name
pyenv virtualenv 3.9.0 venv-name
pip install -r requirements-lock.txt
Once everything works correctly you can safely remove the temporary requirements lock file:
rm requirements-lock.txt
Note that using pip freeze > requirements.txt is usually not a good idea, as this file is often used to describe your package requirements (which are not necessarily the pip freeze output). It's better to use a different (temporary) file just to be sure.
Use pip freeze > requirements.txt to save the list of installed packages.
Create a new venv with Python 3.6.
Install the saved packages with pip install -r requirements.txt. When pip finds a universal wheel in its cache, it installs the package from the cache; other packages are downloaded, cached, built, and installed.
The OP asked to upgrade the packages alongside Python. No other answer addresses upgrading the packages; lock files are not the answer here.
Save your packages to a requirements file without the versions.
pip freeze | cut -d"=" -f1 > requirements-to-upgrade.txt
Delete your environment, create a new one with the upgraded Python version, then install the requirements file.
pyenv virtualenv-delete venv-name
pyenv virtualenv 3.6.8 venv-name
pip install -r requirements-to-upgrade.txt
The dependency resolver in pip should try to find the latest version of each package. This assumes you have the upgraded Python version installed (e.g., pyenv install 3.6.8).
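To make the cut step above concrete, here it is run on canned pip freeze output (the package names and versions are invented for the demo):

```shell
# cut -d"=" -f1 keeps everything before the first "=", i.e. the bare package name
freeze_output="pandas==0.25.3
numpy==1.16.6
requests==2.22.0"
unpinned=$(printf '%s\n' "$freeze_output" | cut -d"=" -f1)
printf '%s\n' "$unpinned"   # pandas, numpy, requests -- one per line, unpinned
```

Installing from the unpinned list frees pip's resolver to pick the newest versions that work on the new interpreter.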
If you use Anaconda, just type:
conda install python==$pythonversion$

Equivalent in python of package.json and "npm install --save" command to easily save every new package [duplicate]

I've been looking around for a package manager that can be used with Python. I want to list the project dependencies in a file.
For example, Ruby uses a Gemfile, which you install with bundle install.
How can I achieve this in Python?
The pip tool is becoming the standard equivalent of Ruby's gems.
Like distribute, pip uses the PyPI package repository (by default) for resolving and downloading dependencies.
pip can install dependencies from a file listing project dependencies (called requirements.txt by convention):
pip install -r requirements.txt
You can "freeze" the current packages on the Python path using pip as well:
pip freeze > requirements.txt
When used in combination with the virtualenv package, you can reliably create project Python environments with a project's required dependencies.
Pipenv
(I know it's an old question and it already has an answer, but for anyone coming here looking for a different answer, like me.)
I've found a very good equivalent of npm; it's called pipenv. It handles both the virtualenv and the pip requirements at the same time, so it's more like npm.
Simple Use Case
pip install pipenv
Then you can make a new virtualenv with Python 3, along with a Pipfile that will be filled with your project's requirements and other stuff:
pipenv install --three
Use your created virtualenv:
pipenv shell
Install a new Python package:
pipenv install requests
Run your .py file with:
pipenv run python somefile.py
You can find its docs here.
Python uses pip for a package manager. The pip install command has a -r <file> option to install packages from the specified requirements file.
Install command:
pip install -r requirements.txt
Example requirements.txt contents:
Foo >= 1.2
PickyThing <1.6,>1.9,!=1.9.6,<2.0a0,==2.4c1
SomethingWhoseVersionIDontCareAbout
See the Requirements Parsing section of the docs for a full description of the format: https://pip.pypa.io/en/stable/user_guide/#requirements-files
This is how I restrict pip's scope to the current project. It feels like the opposite if you're coming from NodeJS's npm or PHP's composer where you explicitly specify global installations with -g or --global.
If you don't already have virtualenv installed, then install it globally with:
pip install virtualenv
Each Python project should have its own virtualenv installation. It's easy to set one up, just cd to your project's root and:
python3 -m virtualenv env # creates env folder with everything you need
Activate virtualenv:
source env/bin/activate
Now, any interaction with pip is contained within your project.
Run pip install package_name==version for each of your dependencies. They are installed in ./env/lib/python3.x/site-packages/
When you want to save your project's dependencies to a file, run:
pip freeze > requirements.txt
(You don't actually need pip freeze's -l or --local flag if you're in an activated project-specific virtualenv, which you should be.)
Now, when you want to install your dependencies from requirements.txt, set up your virtualenv, and run:
pip install -r requirements.txt
That's all.
This is an old question but things are constantly evolving.
Further to the other answer about pipenv: there is also a Python package manager called poetry.
There is a detailed comparison between pipenv and poetry here: Feature comparison between npm, pip, pipenv and poetry package managers. It also maps the features to common npm features.
Here is a comparison of pipenv vs poetry vs pdm: https://dev.to/frostming/a-review-pipenv-vs-poetry-vs-pdm-39b4
Its conclusion is that pdm is the winner.
But in my experience, poetry is easier than pdm to integrate with IDEs.

Create a virtualenv from another virtualenv

Can we create a virtualenv from an existing virtualenv in order to inherit the installed libraries?
In detail:
I first create a "reference" virtualenv, and add libraries (with versions fixed):
virtualenv ref
source ref/bin/activate
pip install -U pip==8.1.1 # <- I want to fix the version number
pip install -U wheel==0.29.0 # <- I want to fix the version number
Then:
virtualenv -p ref/bin/python myapp
source myapp/bin/activate
pip list
I get:
pip (1.4.1)
setuptools (0.9.8)
wsgiref (0.1.2)
How can I get my installed libraries?
Similar question
I saw a similar question: Can a virtualenv inherit from another?.
But I want an isolated virtualenv that doesn't use the referenced virtualenv except for installing the libraries, so adding the specified directories to the Python path of the currently-active virtualenv is not the solution.
Why do that?
Well, we have an integration server that builds the applications (for releases and continuous integration), and we want to keep control of the library versions and make the builds faster.
Create a relocatable virtualenv
I think I could use a relocatable virtualenv, that way:
create the ref virtualenv
make it relocatable: virtualenv --relocatable ref
For "myapp":
copy ref to myapp
What do you think of this solution? Is it reliable for a distributable release?
You can solve your problem by using .pth files. Basically you do this:
virtualenv -p ref/bin/python myapp
realpath ref/lib/python3.6/site-packages > myapp/lib/python3.6/site-packages/base_venv.pth
After doing this and activating myapp, if you run pip list you should see all the packages from ref as well. Note that any packages installed in myapp would hide the respective package from ref.
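This works because of Python's .pth mechanism: each path listed in a .pth file inside a site directory is appended to sys.path when that directory is processed. A toy demonstration, independent of virtualenv (assumes only python3 on PATH; all paths are temporary):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/pkgs" "$tmp/sitedir"
# sitedir stands in for myapp's site-packages; the .pth file points at "ref"
echo "$tmp/pkgs" > "$tmp/sitedir/extra.pth"
result=$(python3 - "$tmp/sitedir" "$tmp/pkgs" <<'EOF'
import site, sys
site.addsitedir(sys.argv[1])   # processes *.pth files found in the directory
print(sys.argv[2] in sys.path)
EOF
)
echo "$result"   # True
```

The realpath line in the answer writes exactly such a path entry into myapp's site-packages.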
You may freeze the list of packages from one env:
(ref) user@host:~/dir$ pip freeze > ref-packages.txt
Then install them in the other:
(use) user@host:~/dir$ pip install -r ref-packages.txt
When you create the second virtualenv, add the --system-site-packages flag:
virtualenv -p ref/bin/python myapp --system-site-packages
The pip version 1.4.1 was bundled with an old version of virtualenv, for example the one shipped with Ubuntu 14.04. You should remove that from your system and install the most recent version of virtualenv:
pip install virtualenv
This might require root permissions (sudo).
Then upgrade pip inside the virtual env (pip install -U pip) or recreate the env.
I think your problem can be solved differently, with the use of PYTHONPATH. First we create the ref virtualenv and install all the needed packages there:
$ virtualenv ref
$ source ref/bin/activate
$ pip install pep8
$ pip list
> pep8 (1.7.0)
> pip (8.1.2)
> setuptools (26.1.1)
> wheel (0.29.0)
Then we create a second virtualenv named use.
$ virtualenv use
$ source use/bin/activate
$ pip list
> pip (8.1.2)
> setuptools (26.1.1)
> wheel (0.29.0)
And now we can set our PYTHONPATH in this env to include ref's directories
$ export PYTHONPATH=$PYTHONPATH:/home/path_to/ref/lib/python2.7/site-packages:/home/path_to/ref/local/lib/python2.7/site-packages
$ pip list
> pep8 (1.7.0)
> pip (8.1.2)
> setuptools (26.1.1)
> wheel (0.29.0)
As you can see, this way you just reference the packages installed in ref's environment. Also note that we add these folders at the end, so they get lower priority.
NOTE: these are not all the folders that can appear on PYTHONPATH. I included these two because they are the main ones, but if you run into problems you can add others as well; just look up the needed paths with this method:
how to print contents of PYTHONPATH
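One property worth verifying when relying on this: directories from PYTHONPATH land on sys.path in the order listed, so the ref entries appended at the end of the variable are searched after whatever precedes them in it. A quick check (assumes python3 on PATH):

```shell
a=$(mktemp -d)
b=$(mktemp -d)
# Entries keep their listed order, so "a" wins import conflicts over "b"
ordered=$(PYTHONPATH="$a:$b" python3 -c "import sys; print(sys.path.index('$a') < sys.path.index('$b'))")
echo "$ordered"   # True
```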

python: how to install and use setuptools in virtual python

I do not have root privileges on a Linux server, so I want to create a virtual Python according to creating a "virtual" python.
After I run virtual-python.py, I do have Python in ~/bin/python:
Then, according to the setuptools PyPI page, I download ez_setup.py and run ~/bin/python ez_setup.py. An error occurs:
What should I do?
Looking at the linked website, it looks outdated. Use pip, not easy_install.
For installing development packages, I always take the following rules in account:
The system package manager is responsible for system-wide packages, so never use sudo pip. That doesn't just apply to this question; it's always a good idea.
The package manager packages are probably outdated. You'll want an up-to-date version for development tools.
I recommend the following way to install local development tools.
$ # Install pip and setuptools on a user level
$ curl https://bootstrap.pypa.io/get-pip.py | python - --user
$ # Add the executables to your path. Add this to your `.bashrc` or `.profile` as well
$ export PATH=$PATH:$HOME/.local/bin
At this point pip should be accessible from the command line and usable without sudo. Use it to install virtualenv, the most widely used tool for setting up virtual environments.
$ pip install virtualenv --user
Now simply use virtualenv to set up an environment to run your application in:
$ virtualenv myapp
Now activate the virtual environment and do whatever you would like to do with it. Note that after activating the virtual environment, pip refers to pip installed inside of the virtualenv, not the one installed on a user level.
$ source myapp/bin/activate
(myapp)$ pip install -r requirements.txt # This is just an example
You'll want to create a new virtual environment for each application you run on the server, so the dependencies can't conflict.

Creating python virtualenv in jenkins with shiningpanda and requirements.txt

I have some problems with jenkins and creating a virtualenv. I'm using the shiningpanda plugin and the "Virtualenv Builder" build step combined with pyenv.
I can install packages with "pip install package" but I cannot install requirements from a requirements file, because the subsequent packages cannot find the installed packages, e.g. numexpr cannot find/import numpy.
As I was typing my question, I found the answer to that problem: the current version (v0.21) of the ShiningPanda plugin does NOT support pip's requirements.txt in Virtualenv Builders.
https://wiki.jenkins-ci.org/display/JENKINS/ShiningPanda+Plugin
Current version (0.23) works in our setup like this (in Build-Virtualenv Builder, with Nature: Shell):
pushd %run_dir%
SET PYTHONPATH=%CD%
python -m pip install --upgrade -r configurations/requirements.txt
This has worked well even if libraries require each other.
