Python - Virtual Environment uses System Directories

I've created a Python virtual environment, and activate it by doing:
joe#joe-mint $ source ./venvs/deep-learning/bin/activate
Which turns the prompt into:
(deep-learning) joe#joe-mint $
Now whenever I run a python package or try to install one, the system seems to ignore the fact that it's in a virtual environment and does things system-wide:
(deep-learning) joe#joe-mint $ which pip
/usr/local/bin/pip
The same happens when I try to install new packages that aren't on my system; it installs them to the system files (i.e. /usr/bin) instead of the virtual environment.
What's wrong with my virtual environment? How do I get it to ignore system files and do everything inside the environment?
I've looked at this question, which says to use an explicit flag when creating the virtual environment so that it uses the local environment's packages, but I used python3.5 -m venv to create the virtual environment, and that flag has been removed in this version because its behaviour is now the default.
I've also looked at this question and can confirm that the VIRTUAL_ENV variable is set correctly in the activate file of the virtual environment.

Here was the problem:
It seems that if you run pip in a venv that has no pip installation of its own, it falls back to the system's pip outside the venv. Even with the virtual environment activated, it will then install packages on the system rather than in the venv.
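A quick way to confirm this situation is to look inside the environment's bin directory for its own pip executable (path taken from the question; just a sanity check):
ls ./venvs/deep-learning/bin/
# if no pip/pip3 appears in this listing, any "pip" you run resolves
# to the system-wide one (e.g. /usr/local/bin/pip), not the venv's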
Here was the solution:
First, I had to create the virtual environment without pip, due to a bug that has long remained unresolved.
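Assuming the same location as in the question, creating the environment without pip looks like this (--without-pip is a standard flag of the venv module):
python3.5 -m venv --without-pip ./venvs/deep-learning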
Second, I installed pip in the virtual environment as per the instructions here. However, doing so required using some temporary folders that, for some reason, my user didn't have access to. So this failed, and the only way I could get it to work was to become root.
sudo su
source ..../venvs/deep-learning/bin/activate to activate the virtual environment.
curl --silent --show-error --retry 5 https://bootstrap.pypa.io/get-pip.py | python as per the answer linked above.
Although which pip now indicated the correct pip (inside the venv) was being used, running pip would use the system one! Deactivating (deactivate) and reactivating the venv solved this.
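A likely explanation is that bash caches the locations of commands it has already run; if that is what is happening, clearing the cache has the same effect as deactivating and reactivating:
hash -r   # make bash forget cached command paths so "pip" is looked up again on $PATH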
It took me a while to realise that having installed this as root caused permission errors when trying to install more packages using pip inside the virtual environment.
chown <user>:<group> -R ..../venvs/deep-learning/*
And that was it. After these steps, I could activate the venv and run pip correctly. It would use the pip inside the venv, and install packages inside the venv.

Related

How to activate virtual environment while working on django project?

I have been through numerous similar questions and I still don't understand how we activate a virtual environment for Django projects. Please include an explanation of how each command works. One more question: why do we not need to install Python in Django's virtual environment? I am getting confused. Thank you in advance, please help this newbie.
Benefits
You can use any version of Python you want for a specific environment without having to worry about collisions (shoutout to my Python 2.7 Mac users!)
You can organize your packages much better and know exactly which packages you need to run your code, in case someone else needs to run it on their machine
Your main Python package directory does not get FLOODED with unnecessary Python packages
To create a virtual environment:
Step 1: install the virtualenv package using pip
pip install virtualenv
Step 2: create the virtual environment
virtualenv env_name #<- env_name is the virtual environment's name; you can pick any name
Step 3: activate the virtual environment (Windows; the Linux/macOS equivalent is shown after these steps)
env_name\Scripts\activate #<- for Windows
Step 4: install the packages you want in the virtual environment
cmd (env_name): pip install django
Note that Python is installed in your virtual environment automatically; the version is the same as the one on your local machine.
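For completeness, on Linux/macOS the equivalent of the activation step above is:
source env_name/bin/activate #<- for Linux/macOS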
There is no difference between activating a virtual environment for Django or for other purposes. Django on its own does not differ from any other Python library out there.
A Python virtual environment allows you to separate your system Python and its libraries, and create a self-contained directory tree that contains a Python installation for a particular version of Python, plus a number of additional packages.
On Linux, assuming you already have Python 3 and pip3 installed:
# install virtualenv package (skip if you have it already)
pip3 install virtualenv
# create virtual environment in directory "tutorial-env"
python3 -m venv tutorial-env
# activate virtual environment
source tutorial-env/bin/activate
Upon activation, the command below should give the path to the new Python binary:
which python3
Similarly with pip3
which pip3
As long as your environment is activated, you can run pip3 install $package_name and it will be installed inside the virtual environment.
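To tie this back to Django, a quick check that a package actually landed inside the environment might look like this (using Django's built-in version command):
pip3 install django
python3 -m django --version # prints the Django version installed inside tutorial-env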
To deactivate your virtual environment:
deactivate
For more info and commands for Windows:
https://packaging.python.org/en/latest/guides/installing-using-pip-and-virtual-environments/
https://docs.python.org/3/tutorial/venv.html

having issues activating pipenv in vscode [duplicate]

ERROR:: --system is intended to be used for pre-existing Pipfile installation, not installation of specific packages. Aborting.
I can't use pipenv on Ubuntu 18.04.
How can I fix it?
This is an open issue in the Pipenv repository: https://github.com/pypa/pipenv/issues/5052.
From the discussion in the thread, it seems to pop up when there is an existing virtualenv that was created at the same directory path. The solution mentioned in the thread is simply to remove this virtualenv, which fixes the issue.
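One way to apply that fix with pipenv's own commands (run from the project directory that contains the Pipfile) is:
pipenv --rm        # remove the existing virtualenv associated with this project
pipenv install     # recreate it from the Pipfile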
You can use Python's built-in virtual environment (venv):
python -m venv venv # creates the virtual environment
To activate the virtual environment, do source venv/bin/activate
Then you can install your packages using pip install lib
To deactivate the virtual environment, type deactivate

pip freeze showing all libraries, not the ones in my virtual env

pip freeze shows all the libraries I have on my computer, not just the libraries in the virtual environment.
I am trying to create a requirements.txt file for my virtual environment. I'm using an Anaconda distribution and I am creating a Flask app. I have navigated to my project folder, created the virtual environment, added Flask, and then when I run pip freeze it clearly shows items that are not in my virtual environment, like xlwings, pandas, and other stuff I use that has nothing to do with Flask.
Is there any way I can create a requirements file from just my virtual environment?
I can clearly see my virtual environment is active with (venv) to the left.
Edit: I created a short video showing that I get the same list of libraries whether I'm in my virtual environment or not. I'm also showing the site-packages in my virtual environment and that these libraries aren't there; I'm specifically pointing out xlwings.
https://youtu.be/xEFZ3dSaqoY
So I'm not sure why it was happening, but I deleted the virtual environment and re-created it (I had a previous requirements.txt that was correct). Then I ran pip freeze again and it all worked. Not sure what happened, but it works for me now.
I had this same problem, and it happened because I changed the name of my venv folder.
To solve this I used this command in my terminal: [yourVenv]\Scripts\python -m pip freeze
If you don't want to do it like this every time, try creating a new environment like this:
[yourVenv]\Scripts\python -m pip freeze > requirements.txt
python -m venv [yourVenv]
[yourVenv]\Scripts\activate
python -m pip install --upgrade pip
pip install -r requirements.txt
You need to activate your virtual environment in the console that you are trying to run pip freeze in. That way it uses the environment's pip and not your global pip.
So in your console, navigate to your virtual environment folder. From there go to the "Scripts" folder. Then enter the word "activate" into the console.
You should then see next to the console cursor the name of your virtual environment. At that point you can use the pip that's inside of your virtual environment and all the normal pip commands will point to it.
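On Windows, that flow might look roughly like this ("venv" is just an illustrative environment-folder name):
cd venv\Scripts
activate
where pip      # the environment's Scripts\pip.exe should now be listed first
pip freeze     # now lists only the packages installed in this environment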
In my case I was using the VS Code IDE, so when I created .venv it recommended using the workspace; I chose yes, and that's why it picked up everything from my system's libraries.
To fix this, you can delete the old .venv and create a new one; this time, don't accept the plugin's recommendations.
To create a virtual environment on Windows/Linux:
python -m venv .venv or python3 -m venv .venv
To activate .venv on Windows use .venv\Scripts\activate
and on Linux use source .venv/bin/activate
All commands like pip freeze, pip list, and pip freeze > requirements.txt will then work.

How can I properly use Pyenv and venv?

Articles read:
Python Virtual Environments: A Primer,
Pyenv – Install Multiple Python Versions for Specific Project,
How to manage multiple Python versions and virtual environments
Let's suppose we have these directories:
~/Projects/PyA: uses Python 3.4.3 with Django 2.0
~/Projects/PyB: uses Python 3.5.1 with Django 2.1
~/Projects/PyC: uses Python 3.5.6 with Django 2.2
~/Projects/PyD: uses Python 3.2 with python-igraph
The first thing to do is install the Python versions needed:
pyenv install 3.4.3
pyenv install 3.5.1
pyenv install 3.5.6
pyenv install 3.2
My questions start here:
Should I do this?
cd ~/Projects/PyA && pyenv local 3.4.3 && python3.4 -m venv proA
cd ~/Projects/PyB && pyenv local 3.5.1 && python3.5 -m venv proB
cd ~/Projects/PyC && pyenv local 3.5.6 && python3.5 -m venv proC
cd ~/Projects/PyD && pyenv local 3.2 && python3.2 -m venv proD
When is a unique directory for virtual environments used? Which option is recommended? Why?
How should I install the per-project packages listed above?
Should I use virtualenvwrapper?
How do I switch between projects (changing Python/virtual-environment in the process) easily or painlessly?
In Ruby, there is a file named Gemfile where I can set which gems (with their respective versions) are installed for the current project, which is a very good idea. Is there something similar for Python?
PS: I use Arch Linux as the guest in a Vagrant box.
When is a unique directory for virtual environments used? Which option is recommended? Why?
Every virtual environment "lives" in its own folder. All packages you install will go there, especially if every environment will have a different Python version.
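For reference, a freshly created venv (e.g. proA from the question) typically contains something like the following; exact names vary slightly with the Python version:
proA/
  bin/                          # activate script plus the environment's python and pip executables
  include/
  lib/python3.4/site-packages/  # packages installed with pip end up here
  pyvenv.cfg                    # records which base interpreter the environment was created from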
How should I install per-project packages listed above?
Switch to the project environment after you have created it (see my original answer below). All packages you install will then go exclusively into the virtual environment you are currently working in.
You can always check which Python interpreter is currently in use by typing
which python
in the terminal in which you currently have the project environment activated. In addition you can also check
which pip
to make sure if you are installing using pip install somepackage that you target the correct Python interpreter. If you want to pin the packages, you can do
pip freeze > requirements.txt
at any time, and the currently installed packages plus their versions will be written to the text file requirements.txt. You can then always recreate the environment using
pip install -r requirements.txt
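Put together, recreating one of the environments from its pinned requirements could look like this (paths and names follow the question's examples):
cd ~/Projects/PyA
python3.4 -m venv proA             # new, empty environment
source proA/bin/activate
pip install -r requirements.txt    # reinstall the pinned packages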
Should I use virtualenvwrapper?
I would always work in a per-project virtual environment, so other projects that may use some pinned version of a special package are not influenced.
How do I switch between projects (changing Python/virtual-environment in the process) easily or painlessly?
You could define an alias in your ~/.bashrc file or ~/.bash_aliases. In a terminal, open (in my example) the ~/.bashrc with a text editor, e.g., Vim/nano or one to your liking:
nano ~/.bashrc
and somewhere near the end you can add a line with an alias to switch to the project directory and activate the environment at the same time:
alias activate_proj1="cd ~/project_1 && pyenv activate venv_project_1"
so you only type activate_proj1 in the terminal (tab completion also works) and both commands are executed. Don't forget to source the bash-file again after you change something with source ~/.bashrc or just open a new terminal.
Original answer:
pyenv will handle everything you need:
My workflow (for one project to make it more readable) would be the following:
pyenv install 3.5.1
cd python_projects
mkdir myproject
cd myproject
pyenv virtualenv 3.5.1 venv_myproject
After that you can simply activate the virtualenv created by pyenv using
pyenv activate venv_myproject
which will open your distinct environment. Here you can do all the things you want, e.g., install your packages using pip etc.
After you completed setting up the environment, you can freeze the environment and create a requirements file:
pip freeze > requirements.txt
to be able to reconstruct the environment if needed. This way all the overhead that may be needed (setting a PATH etc.) will be handled by pyenv.
If you want to work on different projects, just activate the environment you need and off you go!
Note that you can make pyenv activate the virtualenv automatically when you cd into the folder in your terminal by putting its name into the project's .python-version file as well.
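Assuming the pyenv-virtualenv plugin is installed and initialised in your shell, writing that file is a one-liner:
cd python_projects/myproject
pyenv local venv_myproject    # writes "venv_myproject" into .python-version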
There's a lot going on in this question.
virtualenv workflows are usually pretty simple. You create a directory for your project, cd into it, and run virtualenv venv for a simple virtualenv, but you can also specify which Python executable you'd like in your virtual environment with a -p python3.5 for a Python 3.5 virtual environment, for instance.
There isn’t any magic going on here. You need Python 3.5 installed to create the Python 3.5 virtual environment. To activate this virtual environment, you simply source venv/bin/activate. Once activated, your shell should reflect which virtual environment you're operating in. You can even run which python to see that it's actually directed at the venv directory structure. Simple.
An analog to the Gemfile in Python would be similar to what most projects use as a requirements.txt. These can be generated trivially by running pip freeze > requirements.txt or installed by running pip install -r requirements.txt. Generally, this is done within the context of a virtual environment to avoid disrupting or clobbering your operating system's global Python packages.
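As an illustration, a requirements.txt pinning the packages mentioned in the question could look like this (the version numbers are only examples):
Django==2.2
python-igraph==0.8.3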
Kenneth Reitz released a tool that incorporates virtualenv called pipenv and it looks very nice, but I've had some trouble breaking my habits of using virtualenv, and the truth is, virtualenv hasn't presented me with enough problems to deeply explore this new project, but your mileage may vary. Mr. Reitz's contributions to the Python community are overwhelmingly positive, so it's definitely worth a look.

Why is virtualenvwrapper creating paths related to python2 instead of python3?

I updated the versions of mkvirtualenv and virtualenv
$ sudo pip install --upgrade virtualenv virtualenvwrapper
because until now I had only ever used Python 2, and I wanted to start using Python 3. virtualenvwrapper had some issues.
Then I tried creating a virtual environment for my python3 installation:
$ mkvirtualenv py3test -p /usr/bin/python3
The environment is created in ~/.virtualenvs/py3test. Once active, I want to install a package I made:
(py3test)$ pip install python-cantrips
(py3test)$ pip freeze
And the package is appropriately installed. Then I install ipython and run it:
(py3test)$ pip install ipython
(py3test)$ ipython
And I enter ipython appropriately. But then I...
import cantrips
And it explodes with an ImportError. Then I check sys.path, and the issue is here: sys.path includes a path like '/home/myuser/.virtualenvs/py3test/lib/python2.7/site-packages'. I don't remember whether the path is exact or not, since I am not at that computer right now. But I am sure of one thing: the environment was created with python3 (the directory in my virtualenv is not python2.7 but python3.5).
So: why is virtualenv creating an environment for python3 but adding paths as if it were a python2.7 environment instead?
Found it!
There was no issue with virtualenv or virtualenvwrapper. The issue was with ipython. Actually, there is no issue specifically with ipython but with the way the scripts are accessible inside a virtualenv.
Globally, I had ipython installed (which works with the global python2.7). When I installed ipython in my local python3 environment, the shell's lookup was not updated until I refreshed the environment (e.g. by deactivating and activating again). So when I tried again, ipython was the appropriate one (the local ipython in my 3.5 environment), and the generated path was the expected one.
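In bash this is the command-hash cache at work; a quick way to check and refresh without leaving the environment (sketch):
which ipython   # should point into ~/.virtualenvs/py3test/bin once the venv's ipython is picked up
hash -r         # make the shell forget cached command locations and re-resolve ipython from $PATH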
