Place Pip and Yolk inside or outside a virtual environment?

I'm using virtualenv for sandboxing my Python environment, pip to install/uninstall packages and yolk to list the packages.
I can install packages into my virtual environment by using pip install <package name> -E=<environment name>, and I guess I don't need to have pip inside my virtual environment. Am I correct?
If I need to list all the installed packages in my virtual environment, can I use yolk -l to do so? I know I can do this by keeping yolk installed inside the environment, but is it also possible with yolk installed outside the environment, i.e. in the global Python installation?
Thanks.

Here is your workflow:
Add virtualenv and pip to your global environment.
Create virtualenvs
Within a virtualenv, add new packages
I recommend you look into virtualenvwrapper. It makes the maintenance of virtualenvs way easier.
Download and install virtualenvwrapper in your global environment
Create directory ~/.virtualenvs
Modify your ~/.bashrc with these statements:
export WORKON_HOME=$HOME/.virtualenvs
export VIRTUALENVWRAPPER_VIRTUALENV_ARGS='--no-site-packages --python=python2.6'
source /usr/local/bin/virtualenvwrapper.sh
Then you can create, delete, modify, and change between virtualenvs easily.
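For example, a typical day-to-day session looks like this (a sketch, assuming virtualenvwrapper is installed and the lines above are in your ~/.bashrc; the name myproject is just an example):
mkvirtualenv myproject      # create a new virtualenv under ~/.virtualenvs
workon myproject            # switch into it (activates the environment)
pip install requests        # packages now install inside ~/.virtualenvs/myproject
deactivate                  # leave the virtualenv
rmvirtualenv myproject      # delete it entirely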
So, for your questions:
Should I put pip inside my virtualenv?
No, do not do that.
Should I use yolk to list the packages?
I'm not familiar with yolk. I just use pip freeze, which gives me a requirements file that lists all the packages for recreating my environment.
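For example (run inside the activated virtualenv):
pip freeze > requirements.txt       # record the exact installed versions
pip install -r requirements.txt     # recreate the same set of packages elsewhere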

Related

how to make sure packages in my python venv are found first (before PYTHONPATH)

I'm working with a distributed package system (happens to be CVMFS but I don't think the particulars are relevant). The setup script for that environment adds some locations to PYTHONPATH.
Now, working in that environment, I want to install newer versions of some packages that are already found within that path. Installing the packages is easy; I can either do
python -m pip install --user --upgrade <packages>
or, my preferred approach, use a virtual environment
python -m venv myenv
source myenv/bin/activate
python -m pip install --upgrade <packages>
Installation works fine. But then when I try to run python, the directories in PYTHONPATH are searched before either my user-site directory or the virtual environment, and I get the old versions of packages.
Is there any way to force packages in my virtual environment or user-site path to have priority, without having to manually edit sys.path or PYTHONPATH?
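A quick way to see the ordering described above (a sketch; run it with the virtual environment activated):
python -c "import sys; print('\n'.join(sys.path))"
# entries injected via PYTHONPATH appear before the venv's site-packages,
# which is why the older package versions keep winning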

Change the python2.7 packages to my required python3.5 packages when using virtualenv

When I create the virtualenv without adding --no-site-packages as a parameter:
virtualenv venv
I get the packages: in venv/lib/ there is a python2.7 directory, and under python2.7 there is site-packages.
But I need python3.5 in venv/lib/ instead. How can I do this?
EDIT-1
I used the method from the answer below to create the venv. In venv/lib/ there is now a python3.5 directory, but venv/lib/python3.5/site-packages contains only a few packages, while my Mac's site-packages has many more. My requirement is to include those packages in the venv when it is created.
My original site-packages path is: /Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages
EDIT-2
Before this, I did not use virtualenv, and I have installed many site-packages on my Mac (/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages). I want to use a venv now, so I am looking for a way to create the venv so that the site-packages I have already installed are included in it.
First of all, you should install virtualenv using python3.5.
There is a trick I use: I add some lines to my .bashrc file like this:
export VIRTUALENVWRAPPER_PYTHON=/usr/local/bin/python3.5
alias v.activate="source /usr/local/bin/virtualenvwrapper.sh"
So whenever I want my virtualenvs to be based on python3.5, I run the v.activate command and then create my virtual environment with python3.5.
Keep in mind:
1. I use virtualenvwrapper instead of plain virtualenv.
2. Don't forget to run source ~/.bashrc after you add those lines to it.
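A minimal sketch of that workflow (assuming python3.5 lives at /usr/local/bin/python3.5 and virtualenvwrapper is installed; the name myenv35 is just an example):
source ~/.bashrc    # pick up the new VIRTUALENVWRAPPER_PYTHON setting
v.activate          # load virtualenvwrapper
mkvirtualenv --python=/usr/local/bin/python3.5 --system-site-packages myenv35
# --system-site-packages lets the new venv see the packages already installed in
# /Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages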

Install Adafruit_MPR121 Library in virtual Environment

I was trying to install the Adafruit_Python_MPR121 Library in my venv folder, but it always installed it into the global dist-packages and I cannot access them from my venv.
I cannot use --system-site-packages because I need some local packages in the env.
Does someone know a solution for it?
I'm sure you can and should use --system-site-packages. It doesn't do what you seem to think it does: it doesn't make pip install all packages globally. It makes python in the virtual env see the global site-packages, but pip still installs packages into the virtual env (after you activate it with . env/bin/activate).
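For example (a sketch; the PyPI name adafruit-mpr121 is an assumption, the library's actual package name may differ):
virtualenv --system-site-packages env   # create the env with access to the global site-packages
. env/bin/activate                      # activate it
pip install adafruit-mpr121             # still installs into env/lib/.../site-packages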

Python - Virtual Environment uses System Directories

I've created a Python virtual environment, and activate it by doing:
joe@joe-mint $ source ./venvs/deep-learning/bin/activate
Which turns the prompt into:
(deep-learning) joe@joe-mint $
Now whenever I run a python package or try to install one, the system seems to ignore the fact that it's in a virtual environment and does things system-wide:
(deep-learning) joe@joe-mint $ which pip
/usr/local/bin/pip
The same happens when I try to install new packages that aren't on my system; it installs them to the system files (i.e. /usr/bin) instead of the virtual environment.
What's wrong with my virtual environment? How do I get it to ignore system files and do everything inside the environment?
I've looked at this question which says to use an explicit flag when creating the virtual environment to make it use the local environment packages, but I used python3.5 -m venv to create the virtual environment, and this flag is removed in this version as it's now the default behaviour.
I've also looked at this question and can confirm that the VIRTUAL_ENV variable is set correctly in the activate file of the virtual environment.
Here was the problem:
It seems that if you run pip on a venv without a local pip installation, then it will default to the system's pip outside the venv. Even if you've activated a virtual environment, this seems to want to install packages on the system rather than in the venv.
Here was the solution:
First, I had to create the virtual environment without pip, due to a bug that has long remained unresolved.
Second, I installed pip in the virtual environment as per the instruction here. However, doing so required using some temporary folders that for some reason my user didn't have access to. So this failed, and the only way I could get it to work was to become root.
sudo su
source ..../venvs/deep-learning/bin/activate to activate the virtual environment.
curl --silent --show-error --retry 5 https://bootstrap.pypa.io/get-pip.py | python as per the answer linked above.
Although which pip now indicated the correct pip (inside the venv) was being used, running pip would use the system one! Deactivating (deactivate) and reactivating the venv solved this.
Now it took me a while to realise that having installed pip as root caused permission errors when trying to install more packages using pip inside the virtual environment, so I had to hand ownership of the venv back to my user:
chown <user>:<group> -R ..../venvs/deep-learning/*
And that was it. After these steps, I could activate the venv and run pip correctly. It would use the pip inside the venv, and install packages inside the venv.
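Put together, the sequence looks roughly like this (a sketch; ~/venvs/deep-learning stands in for the elided path above, and --without-pip is the venv option that skips bundling pip):
python3.5 -m venv --without-pip ~/venvs/deep-learning
source ~/venvs/deep-learning/bin/activate
curl --silent --show-error --retry 5 https://bootstrap.pypa.io/get-pip.py | python
deactivate && source ~/venvs/deep-learning/bin/activate   # re-activate so the shell picks up the new pip
which pip   # should now point inside the venv
chown <user>:<group> -R ~/venvs/deep-learning/*           # only needed if you bootstrapped pip as root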

How to import a globally installed package to virtualenv folder

So I have a virtualenv folder called venv for my python project.
I can run:
venv/bin/pip install -r requirements.txt
Which installs all requirements I need for the project except one, M2Crypto. The only way to install it is through apt-get:
apt-get install python-m2crypto
How can I then add this package, installed through apt, to the venv folder?
--system-site-packages
gives the virtual environment access to the global site-packages modules.
You could do:
$ sudo apt-get install python-m2crypto
$ virtualenv env --system-site-packages
... and you would then have access to m2crypto (along with all other system-wide installed packages) inside your virtualenv.
What I did in the end:
cp -R /usr/lib/python2.7/dist-packages/M2Crypto /home/richard/hello-project/venv/lib/python2.7/site-packages/
cp -R /usr/lib/python2.7/dist-packages/OpenSSL /home/richard/hello-project/venv/lib/python2.7/site-packages/
Real simple solution.
In the virtual environment directory, edit the file pyvenv.cfg. Set the parameter include-system-site-packages = true, and save the file.
The globally installed modules will appear the next time you activate (source venv/bin/activate) your environment.
It can be verified via pip list.
Enjoy!
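For reference, a pyvenv.cfg after that edit might look something like this (the home path and version shown are illustrative):
home = /usr/bin
include-system-site-packages = true
version = 3.5.2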
toggleglobalsitepackages will toggle access to the system-wide site-packages.
Note: You need to pip install virtualenvwrapper to get this command; the vanilla virtualenv doesn't include it. With virtualenvwrapper you also get the very useful mkvirtualenv and rmvirtualenv commands, among others.
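A quick usage sketch (assuming virtualenvwrapper is installed and myenv is an existing environment):
workon myenv                 # activate the environment
toggleglobalsitepackages     # flips access to the global site-packages and reports the new state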
venv/bin/pip install -I M2Crypto
The -I forces it to also be installed into the virtualenv, even if it's already globally installed.
Another way to transfer packages from one environment (or from the global installation) to a virtual environment is to copy the "Lib" folder, or the individual package folder with all its contents, from the source environment into the environment where you want the package to work.
If you don't know its location, search for it within the environment folder using your file explorer.
The Lib folder contains all of the environment's installed packages.
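As an illustration only (all paths and the package name somepackage are hypothetical):
cp -R /path/to/source-env/lib/python2.7/site-packages/somepackage /path/to/target-env/lib/python2.7/site-packages/
# on Windows the corresponding directory is <env>\Lib\site-packages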
