pip freeze shows all the libraries I have on my computer, not just the libraries in the virtual environment.
I am trying to create a requirements.txt file for my virtual environment. I'm using an Anaconda distribution and I am creating a Flask app. I have navigated to my project folder, created the virtual environment, and added Flask. But when I run pip freeze, it clearly shows items that are not in my virtual environment, like xlwings, pandas, and other things I use that have nothing to do with Flask.
Is there any way I can create a requirements file from just my virtual environment?
I can clearly see that my virtual environment is active, with (venv) to the left of the prompt.
Edit: I created a short video showing that I get the same list of libraries whether or not I'm in my virtual environment. I also show the site-packages folder in my virtual environment and that these libraries aren't there; I specifically point out xlwings.
https://youtu.be/xEFZ3dSaqoY
So I'm not sure why it was happening, but I deleted the virtual environment and re-created it (I had a previous requirements.txt that was correct). Then I ran pip freeze again and it all worked. Not sure what happened, but it works for me now.
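For anyone wanting to try the same fix, a minimal recreate-and-freeze sequence on Windows (cmd) might look like this; it assumes the environment folder is called venv and Flask is the only direct dependency:
deactivate
rmdir /s /q venv
python -m venv venv
venv\Scripts\activate
pip install flask
pip freeze > requirements.txt
If pip freeze still lists system-wide packages after this, the shell is most likely picking up a different pip, which the answers below cover.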
I had this same problem, and it happened because I changed the name of my venv folder.
To solve this I used this command in my terminal:
[yourVenv]\Scripts\python -m pip freeze
If you don't want to do it like this every time, try creating a new environment like this:
[yourVenv]\Scripts\python -m pip freeze > requirements.txt
python -m venv [yourVenv]
[yourVenv]\Scripts\activate
python -m pip install --upgrade pip
pip install -r requirements.txt
You need to activate your virtual environment in the console where you are trying to run pip freeze. That way it uses the environment's pip and not your global pip.
So in your console, navigate to your virtual environment folder, go to the "Scripts" folder inside it, and then enter "activate".
You should then see the name of your virtual environment next to the console prompt. At that point you are using the pip inside your virtual environment, and all the normal pip commands will point to it.
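For example, on Windows the whole sequence might look like this (assuming the environment lives in a folder named venv inside your project folder):
cd venv\Scripts
activate
where pip
pip freeze > ..\..\requirements.txt
where pip should now print a path inside the venv folder; if it still prints a system path, you are not using the environment's pip.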
In my case I was using the VS Code IDE. When I created the .venv, the editor recommended using the workspace interpreter and I chose yes; that's why it picked up all the libraries from my system.
To fix this, delete the old .venv and create a new one, and this time don't accept the plugin's recommendation.
To create a virtual environment on Windows/Linux:
python -m venv .venv or python3 -m venv .venv
To activate the .venv on Windows, use .venv\Scripts\activate
and on Linux use source .venv/bin/activate
After that, commands like pip freeze, pip list, and pip freeze > requirements.txt will all work.
I have been through numerous similar questions and I still don't understand how we activate a virtual environment for Django projects. Please explain how each command works. One more question: why do we not need to install Python in Django's virtual environment? I am getting confused. Thank you in advance, and please help this newbie.
Benefits
You can use any version of Python you want for a specific environment without having to worry about collisions (shout-out to my Python 2.7 Mac users!)
You can organize your packages much better and know exactly which packages you need to run your code, in case someone else needs to run it on their machine
Your main Python package directory does not get flooded with unnecessary Python packages
To create a virtual environment:
Step 1: install the environment package (virtualenv) using pip
pip install virtualenv
Step 2: create the virtualenv
virtualenv env_name #<- env_name is the virtualenv name; you can use any name
Step 3: activate the virtual env
env_name\Scripts\activate #<- for Windows
Step 4: install the packages you want in the virtual env
(env_name) pip install django
Note that Python is installed in your virtual env automatically; the version is the same as on your local machine.
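As a quick sanity check after step 4, you could run the following inside the activated environment (this is only an illustrative follow-up, not part of the original steps):
python -m django --version
pip freeze > requirements.txt
deactivate
The first command confirms Django is importable from the environment's Python, the second writes only the environment's packages to requirements.txt, and deactivate returns you to the system Python.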
There is no difference between activating a virtual environment for Django and activating one for any other purpose. Django on its own does not differ from any other Python library out there.
A Python virtual environment allows you to separate your system Python and its libraries, creating a self-contained directory tree that contains a Python installation for a particular version of Python plus a number of additional packages.
On Linux, assuming you already have Python 3 and pip3 installed:
# install virtualenv package (skip if you have it already)
pip3 install virtualenv
# create virtual environment in directory "tutorial-env"
python3 -m venv tutorial-env
# activate virtual environment
source tutorial-env/bin/activate
Upon activation, the command below should give the path to the new Python binary:
which python3
Similarly for pip3:
which pip3
As long as your environment is activated, you can run pip3 install $package_name and it will install the package inside the virtual environment.
To deactivate your virtual environment:
deactivate
For more info and commands for Windows:
https://packaging.python.org/en/latest/guides/installing-using-pip-and-virtual-environments/
https://docs.python.org/3/tutorial/venv.html
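For completeness, a rough Windows (cmd) equivalent of the Linux commands above would be the following; it assumes the py launcher that ships with the python.org installer:
rem create virtual environment in directory "tutorial-env"
py -m venv tutorial-env
rem activate virtual environment
tutorial-env\Scripts\activate
rem check which python/pip is being used
where python
where pip
rem leave the environment
deactivate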
I created a virtual environment with venv.
venv --python=/workspace/dev/python3.9 /var_path/var_virtual_env
In Visual Studio Code(portable version -Linux) I have set:
"python.venvPath": "/var_path/var_virtual_env",
"python.formatting.provider": "black"
The environment is seen in VSCode:
Code with python built-ins works without a problem.
From the VS Code terminal, with the virtual environment activated, I installed different packages using pip and expected them to end up in the virtual environment, but instead it seems they were installed in the /home/user_name/local/lib/python3.6 directory.
I also noticed that configurations like pylint.d etc. are in the same "local" location. Pylint was set up through VSCode.
To fix it I installed the packages in the environment outside VSCode, but the editor doesn't see them. Example with httpx:
But they are in the environment, as the pip freeze output shows:
How can I fix this?
In a terminal where the scarlette environment is activated, copy the output of which python.
Set in settings.json the following line:
{
"python.pythonPath": "paste_the_output_here"
....
}
I don't know if it will work in your case but that did the trick for me several times.
Click on:
And then choose the Python from your venv.
Let me know what happened.
I think the virtual env is OK, but you didn't activate it in the console.
I also think it would be more intuitive to create the venv relative to the app, in the same folder, and then just add it to .gitignore and .dockerignore.
Execute this command to create virtual environment:
python -m venv .venv --prompt ${PWD##*/}
Then activate:
source .venv/bin/activate
More advanced:
echo "creating virtual environment"
python -m venv .venv --prompt ${PWD##*/}
echo "upgrading pip & wheel"
.venv/bin/python -m pip install -U pip
.venv/bin/pip install wheel
echo "installing dependencies from requirements.txt"
.venv/bin/pip install -r requirements.txt
Then activate:
source .venv/bin/activate
(You can also activate it by opening a new terminal instance in VS Code, or by destroying the current one and opening a new one.)
While developing my app I didn't use an environment. Now I want to use one and export all the dependencies of my app into an environment.yml / requirements.txt file that I can afterwards use to build a Docker image.
The issue is, if I create an environment and then export it with:
conda env export > environment.yml
I get no dependencies in that file.
Or if I use:
pip freeze --local > requirements.txt
I see all the system modules, which have nothing to do with my project.
I would imagine conda or pip has something that would just go through all the files in the directory I am in and place all the imports and their dependencies inside the environment.yml/requirements.txt file.
I can't find the command to do that.
You can use virtualenv to isolate the pip environment of your application from the rest of your system. Use:
virtualenv <your_project_path>/venv
This will create a virtual environment for your app. Then use:
source venv/bin/activate
This will isolate your pip environment. Reinstall all your dependencies and run pip freeze; you will see only project-related dependencies.
pip freeze by default lists all pip modules installed on the system. If you use virtualenv and then install your dependencies, your pip modules will reside in your application folder.
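Putting that together, the end-to-end flow being described is roughly the following (directory names are only placeholders):
cd <your_project_path>
virtualenv venv
source venv/bin/activate
pip install <your actual dependencies>
pip freeze --local > requirements.txt
Because the new environment starts out empty, the resulting requirements.txt contains only what you explicitly installed, which is the file you can then copy into your Docker image and install with pip install -r requirements.txt.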
Edit
Based on your comments, I would recommend a good IDE such as PyCharm. You can follow the tutorial here for setting up a venv and handling all your dependencies. Once done, you can run pip freeze for your requirements.txt.
I updated the versions of mkvirtualenv and virtualenv:
$ sudo pip install --upgrade virtualenv virtualenvwrapper
because my whole life I had only used Python 2 and now wanted to use Python 3, and virtualenvwrapper had some issues.
Then I tried creating a virtual environment for my python3 installation:
$ mkvirtualenv py3test -p /usr/bin/python3
The environment is created in ~/.virtualenvs/py3test. Once active, I want to install a package I made:
(py3test)$ pip install python-cantrips
(py3test)$ pip freeze
And the package is appropriately installed. Then I install ipython and run it:
(py3test)$ pip install ipython
(py3test)$ ipython
And I enter ipython appropriately. But then I...
import cantrips
And it explodes with an ImportError. Then I check sys.path, and the issue is here: sys.path includes a path like '/home/myuser/.virtualenvs/py3test/lib/python2.7/site-packages'. I don't remember whether the path is exact, since I am not at that computer right now. But I am sure of one thing: the environment was created with Python 3 (the directory in my virtualenv is not python2.7 but python3.5).
So: why is virtualenv creating an environment for Python 3 but adding the paths as if it were a Python 2.7 environment instead?
Found it!
There was no issue with virtualenv or virtualenvwrapper. The issue was with ipython. Actually, there is no issue specifically with ipython either, but with the way scripts become accessible inside a virtualenv.
Globally, I had ipython installed (which works with the global Python 2.7). When I installed ipython in my local Python 3 environment, the (shell) path was not updated until I somehow refreshed the environment (e.g. by deactivating and activating it again). When I tried again, the ipython was the right one (the local ipython in my environment, with 3.5), and the generated path was the expected one.
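In other words, something along these lines (paths are illustrative) is usually enough to pick up a newly installed console script:
deactivate
workon py3test
which ipython   # should now point inside ~/.virtualenvs/py3test/bin
Alternatively, in bash, hash -r clears the shell's cached command locations without leaving the environment.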
I've created a Python virtual environment, and activate it by doing:
joe#joe-mint $ source ./venvs/deep-learning/bin/activate
Which turns the prompt into:
(deep-learning) joe#joe-mint $
Now whenever I run a python package or try to install one, the system seems to ignore the fact that it's in a virtual environment and does things system-wide:
(deep-learning) joe#joe-mint $ which pip
/usr/local/bin/pip
The same happens when I try to install new packages that aren't on my system; it installs them to the system files (i.e. /usr/bin) instead of the virtual environment.
What's wrong with my virtual environment? How do I get it to ignore system files and do everything inside the environment?
I've looked at this question, which says to use an explicit flag when creating the virtual environment to make it use the local environment's packages, but I used python-3.5 -m venv to create the virtual environment, and that flag was removed in this version since it's now the default behaviour.
I've also looked at this question and can confirm that the VIRTUAL_ENV variable is set correctly in the activate file of the virtual environment.
Here was the problem:
It seems that if you run pip in a venv that has no local pip installation, it will fall back to the system's pip outside the venv. Even though you've activated a virtual environment, it then installs packages into the system rather than into the venv.
Here was the solution:
First, I had to create the virtual environment without pip, due to a bug that has long remained unresolved.
Second, I installed pip in the virtual environment as per the instructions here. However, doing so required some temporary folders that for some reason my user didn't have access to, so it failed, and the only way I could get it to work was to become root:
sudo su
source ..../venvs/deep-learning/bin/activate to activate the virtual environment.
curl --silent --show-error --retry 5 https://bootstrap.pypa.io/get-pip.py | python as per the answer linked above.
Although which pip now indicated that the correct pip (inside the venv) was being used, running pip would still use the system one! Deactivating (deactivate) and reactivating the venv solved this.
It took me a while to realise that having installed this as root caused permission errors when trying to install more packages using pip inside the virtual environment, so I had to fix the ownership:
chown <user>:<group> -R ..../venvs/deep-learning/*
And that was it. After these steps, I could activate the venv and run pip correctly. It would use the pip inside the venv, and install packages inside the venv.
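For reference, the whole sequence, with illustrative paths, was roughly:
python3.5 -m venv --without-pip ./venvs/deep-learning
source ./venvs/deep-learning/bin/activate
curl --silent --show-error --retry 5 https://bootstrap.pypa.io/get-pip.py | python
deactivate
source ./venvs/deep-learning/bin/activate
which pip   # should now point inside ./venvs/deep-learning/bin
chown <user>:<group> -R ./venvs/deep-learning/*
The --without-pip flag is what "install the virtual environment without pip" refers to, and the chown at the end is only needed if, as above, get-pip.py had to be run as root.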