I have activated the Python virtual environment, but when I run pip install *, the dependencies are still installed to my local Python path. The same happens when I run the server in my Django project with python manage.py runserver: the system does not use the virtual environment, but the Python from my PC. What is the problem? Why does my activated virtual environment not work?
I am using macOS. Everything worked until I erased all my data and reinstalled Python.
Thank you!
If the packages aren't being installed within your virtual environment, it is most likely because you are not using the pip within your virtual environment (or you may not have pip in your virtual environment at all, in which case it defaults to the pip installed in /usr/local/bin/).
First, check that you have a separate copy of pip in your virtual environment, located at ./your_virtual_environment_name/bin/pip. If you don't, install it following the PyPA instructions (https://pip.pypa.io/en/stable/installing/#installing-with-get-pip-py).
When you go to install a package within your virtual environment, activate the environment first using source ./bin/activate from inside the environment directory, then cd .. and use the pip installed within the virtual environment by typing: ./your_virtual_environment_name/bin/pip install * . The installation will end up in ./your_virtual_environment_name/lib/python3.6/site-packages/
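If you want to be certain which pip is doing the installing, another option is to invoke pip through the interpreter itself; a minimal sketch (the package name "requests" is only a placeholder):
import subprocess
import sys

# Running pip as a module of the current interpreter guarantees that packages
# land in whatever environment that interpreter belongs to; after activation
# that interpreter is the one inside the virtual environment.
subprocess.check_call([sys.executable, "-m", "pip", "install", "requests"])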
I have been through numerous similar questions and I still don't understand how we activate a virtual environment for Django projects. Please explain how each command works. One more question: why do we not need to install Python in Django's virtual environment? I am getting confused. Thank you in advance, please help this newbie.
Benefits
You can use any version of Python you want for a specific environment without having to worry about collisions (shoutout to my Python 2.7 Mac users!)
You can organize your packages much better and know exactly which packages you need to run your code, in case someone else needs to run it on their machine
Your main Python package directory does not get flooded with unnecessary Python packages
To create a virtual environment
Step 1: install the environment package (virtualenv) using pip
pip install virtualenv
Step 2: create the virtual environment
virtualenv env_name  # <- env_name is the virtual environment's name; you can choose any name
Step 3: activate the virtual environment
env_name\Scripts\activate  # <- for Windows
Step 4: install the packages you want in the virtual environment
cmd(env_name): pip install django
Note that Python is installed in your virtual environment automatically; the version is the same as on your local machine.
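As a quick sanity check that the environment is really the one being used, you can ask the interpreter itself; a minimal sketch, assuming Python 3.3+:
import sys

# Inside an activated virtual environment, sys.prefix points at the environment
# directory, while sys.base_prefix still points at the base installation.
print("inside a virtual environment:", sys.prefix != sys.base_prefix)
print("interpreter in use:", sys.executable)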
There is no difference between activating a virtual environment for Django and activating one for any other purpose. Django on its own does not differ from any other Python library out there.
A Python virtual environment allows you to separate your system Python and its libraries and create a self-contained directory tree that contains a Python installation for a particular version of Python, plus a number of additional packages.
On Linux, assuming you already have Python 3 and pip3 installed:
# install the virtualenv package (skip if you already have it; note that the creation command below uses the built-in venv module, so this step is optional)
pip3 install virtualenv
# create virtual environment in directory "tutorial-env"
python3 -m venv tutorial-env
# activate virtual environment
source tutorial-env/bin/activate
Upon activation, the command below should give the path to the new Python binary:
which python3
Similarly for pip3:
which pip3
As long as your environment is activated, you can run pip3 install $package_name and it will install the package inside the virtual environment.
To deactivate your virtual environment:
deactivate
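Another way to verify activation, sketched here as a small Python check rather than a shell command, is to look at the VIRTUAL_ENV variable that the activate script exports (deactivate removes it again):
import os

# The activate script exports VIRTUAL_ENV with the path of the environment;
# after running "deactivate" the variable is gone again.
print(os.environ.get("VIRTUAL_ENV", "no virtual environment is active"))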
For more info and commands for Windows:
https://packaging.python.org/en/latest/guides/installing-using-pip-and-virtual-environments/
https://docs.python.org/3/tutorial/venv.html
I created a new VENV and pip installed awscli there.
I can see it installed with pip list --local.
But when I type aws, the command is not recognized (even though I activated that VENV).
How do I set up paths so that a newly installed library in the VENV is recognized?
There are many ways to create a virtual environment. Let's say:
python3 -m venv my_venv
But to install a package in that particular environment, you first have to activate it:
For windows:
my_venv\Scripts\activate
For Linux or Mac:
source my_venv/bin/activate
Then if you install a package by using pip, it will be installed in that particular virtual environment. Otherwise, it will be installed in the local environment.
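To see where a package actually ended up after installation, a short check like the following can help (here "django" is used purely as an example package):
import importlib.util

# If the package was installed into the activated environment, its files should
# live under my_venv/lib/.../site-packages (or my_venv\Lib\site-packages on Windows).
spec = importlib.util.find_spec("django")
print(spec.origin if spec else "django is not importable from this interpreter")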
I made a Virt2 virtual environment using $ python -m venv Virt2.
I want to install my custom packages in the "site-packages" directory. However, packages are installed in the "dist-packages" directory.
What should I do to install packages in my Python virtual environment's site-packages?
My Python version is 3.6.2 (in /usr/local/bin).
You use sudo, and sudo switches the user to root, i.e. you're completely outside of your virtual env. The system pip3 outside of the virtual env installs packages into a system directory, which is dist-packages.
Run pip install inside your virtual env without sudo.
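If you want to confirm where pip will put packages for the interpreter you are currently using, a small sketch with the standard sysconfig module shows the target directory:
import sysconfig

# "purelib" is the directory pip uses for pure-Python packages; inside an
# activated venv it should end in .../Virt2/lib/pythonX.Y/site-packages,
# not in a system-wide dist-packages directory.
print(sysconfig.get_paths()["purelib"])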
I've created a Python virtual environment, and activate it by doing:
joe@joe-mint $ source ./venvs/deep-learning/bin/activate
Which turns the prompt into:
(deep-learning) joe@joe-mint $
Now whenever I run a python package or try to install one, the system seems to ignore the fact that it's in a virtual environment and does things system-wide:
(deep-learning) joe#joe-mint $ which pip
/usr/local/bin/pip
The same happens when I try to install new packages that aren't on my system; it installs them to the system files (i.e. /usr/bin) instead of the virtual environment.
What's wrong with my virtual environment? How do I get it to ignore system files and do everything inside the environment?
I've looked at this question which says to use an explicit flag when creating the virtual environment to make it use the local environment packages, but I used python-3.5 -m venv to create the virtual environment, and this flag is removed in this version as it's now a default option.
I've also looked at this question and can confirm that the VIRTUAL_ENV variable is set correctly in the activate file of the virtual environment.
Here was the problem:
It seems that if you run pip in a venv that has no local pip installation of its own, it will default to the system's pip outside the venv. Even if you've activated the virtual environment, it will then install packages on the system rather than in the venv.
Here was the solution:
First, I had to create the virtual environment without pip due to a bug that has long remained unresolved.
Second, I installed pip in the virtual environment as per the instruction here. However, doing so required using some temporary folders that for some reason my user didn't have access to. So this failed, and the only way I could get it to work was to become root.
sudo su
source ..../venvs/deep-learning/bin/activate to activate the virtual environment.
curl --silent --show-error --retry 5 https://bootstrap.pypa.io/get-pip.py | python as per the answer linked above.
Although which pip now indicated the correct pip (inside the venv) was being used, running pip would use the system one! Deactivating (deactivate) and reactivating the venv solved this.
It took me a while to realise that having installed pip as root caused permission errors when trying to install more packages using pip inside the virtual environment. This was fixed by changing ownership back to my user:
chown <user>:<group> -R ..../venvs/deep-learning/*
And that was it. After these steps, I could activate the venv and run pip correctly. It would use the pip inside the venv, and install packages inside the venv.
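As a side note, if a venv ends up without its own pip, it can usually be bootstrapped without root via the ensurepip module that ships with CPython; a minimal sketch, run with the venv's own interpreter:
import subprocess
import sys

# ensurepip installs (or upgrades) pip into the environment that owns the current
# interpreter, so no system directories are touched and no sudo is needed.
subprocess.check_call([sys.executable, "-m", "ensurepip", "--upgrade"])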
I'm creating a virtual environment at runtime using virtualenv.create_environment. If this env was already created, I skip the installation, but I would like to check whether a certain package is installed. Right now I'm doing this by running a command that activates the env and runs pip list.
My question is whether I can import the pip from the virtual env and call pip.get_installed_distributions()?
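One way to express the check described here without importing pip at all is to ask the environment's own interpreter; a sketch, where env_dir and "numpy" are placeholders:
import json
import subprocess
from pathlib import Path

env_dir = Path("env_dir")                  # placeholder: path of the created environment
venv_python = env_dir / "bin" / "python"   # on Windows: env_dir / "Scripts" / "python.exe"

# Ask the environment's own interpreter what it has installed;
# --format=json makes the output easy to parse.
output = subprocess.check_output([str(venv_python), "-m", "pip", "list", "--format=json"])
installed = {pkg["name"].lower() for pkg in json.loads(output)}
print("numpy installed:", "numpy" in installed)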