I have a .env file that I used to create a virtual environment.
I want to install the same packages (as specified in the .env file), but this time
I don't want them installed in a virtual environment. How can I do this?
Many thanks.
Note:
A .env file can be used by Miniconda or Anaconda to create virtual environments like so:
conda create --name optimus --file alpha.env
then you can run
source activate optimus
to activate your virtual environment. But how can I do something similar to:
pip install -r myFile.env
to install all packages specified in myFile.env, but not in a virtual environment.
Here is my alpha.env file:
cairo=1.12.2=2
dateutil=2.1=py27_2
freetype=2.4.10=0
numpy=1.6.2=py27_4
If you have access to your original virtual env, or can create a new one as you describe, a very simple solution would be to create a requirements.txt file. This is simply a file that names all the packages installed in a specific Python environment.
The file can be created by running
pip freeze > requirements.txt
This will create a file called requirements.txt in your working directory (your virtual environment has to be active when you run this). To then install the packages in your new environment, you just have to enter it (deactivate your virtualenv, or whatever you need to do) and run
pip install -r requirements.txt
I have never seen a .env file, but by the sound of it, it could actually be exactly the same as a requirements.txt file under a different name. See if the output of
pip freeze
looks the same (as far as format etc.) as what you have in your .env file. In that case you could also just run the command you would in your virtual env, i.e.
pip install -r myFile.env
EDIT:
After seeing your .env file output, I am fairly confident this approach will work, with one caveat: conda exports lines as name=version=build, while pip expects name==version, so the file may need converting before
pip install -r myFile.env
will succeed. (Also note that some conda packages, such as the cairo and freetype C libraries, have no direct pip equivalent.)
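If pip rejects the conda-style lines, a minimal conversion sketch (assuming every line has exactly the name=version=build form shown above):
sed 's/=\([^=]*\)=.*$/==\1/' myFile.env > requirements.txt
pip install -r requirements.txt
This keeps each package name, turns the first = into ==, and drops the conda build string.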
Related
pip freeze shows all the libraries I have on my computer, not just the libraries in the virtual environment.
I am trying to create a requirements.txt file for my virtual environment. I'm using an Anaconda distribution and I am creating a Flask app. I have navigated to my project folder, created the virtual environment, and added Flask, but when I run pip freeze it clearly shows items that are not in my virtual environment, like xlwings, pandas, and other things I use that have nothing to do with Flask.
Is there any way I can create a requirements file from just my virtual environment?
I can clearly see my virtual environment is active with (venv) to the left.
Edit: I created a short video showing that I get the same list of libraries whether I'm in my virtual environment or not. I also show the site-packages folder in my virtual environment and that these libraries aren't there, specifically pointing out xlwings.
https://youtu.be/xEFZ3dSaqoY
So I'm not sure why it was happening, but I deleted the virtual environment and re-created it (I had a previous requirements.txt that was correct). Then I ran pip freeze again and it all worked. Not sure what happened, but it works for me now.
I had this same problem, and it happened because I changed the name of my venv folder.
To solve it, I used this command in my terminal: [yourVenv]\Scripts\python -m pip freeze
If you don't want to do it like this every time, try creating a new environment like this:
[yourVenv]\Scripts\python -m pip freeze > requirements.txt
python -m venv [yourVenv]
[yourVenv]\Scripts\activate
python -m pip install --upgrade pip
pip install -r requirements.txt
You need to activate your virtual environment in the console that you are trying to run pip freeze in. That way it uses the environment's pip and not your global pip.
So in your console, navigate to your virtual environment folder. From there go to the "Scripts" folder. Then enter the word "activate" into the console.
You should then see next to the console cursor the name of your virtual environment. At that point you can use the pip that's inside of your virtual environment and all the normal pip commands will point to it.
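For example, on Windows (the folder names here are illustrative):
cd myproject\venv\Scripts
activate
where pip
The first path that where pip prints should be the pip.exe inside the environment's Scripts folder, confirming that plain pip commands now use the virtual environment.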
In my case I was using the VS Code IDE, so when I created the .venv it recommended using the workspace interpreter; I chose yes, and that's why it picked up everything from my system's libraries.
To fix this, you can delete the old .venv and create a new one; this time, don't accept the plugin's recommendation.
To create a virtual environment on Windows/Linux:
python -m venv .venv or python3 -m venv .venv
To activate the .venv on Windows, use .venv\Scripts\activate,
and on Linux, use source .venv/bin/activate.
After that, all commands like pip freeze, pip list, and pip freeze > requirements.txt will work as expected.
So I want to send my Python script to another computer, but it doesn't have the same packages installed on it. Is there any way to send the whole Python code as a folder that also includes all the packages? (I have tried creating a virtual environment, but the problem is that a lot of the files in a virtual environment are aliases/symlinks which might not exist on the other computer.)
Thank you very much in advance for your help.
You can make a requirements.txt file, which will list all the packages from your project. If you already have a virtualenv, create your requirements.txt by running
pip freeze > requirements.txt
Example of a requirements.txt:
absl-py==0.2.1
amqp==2.3.1
asn1crypto==0.24.0
After this, take your whole project (without your virtualenv), copy it to the other computer, create a new virtualenv there, enter your virtualenv, and run
pip install -r requirements.txt
and you will then have all your packages.
Barring the longer process of creating an installable package with a setup.py file, you can place your script in its own folder, then add a pip requirements file. While your virtualenv is active, run the bash/terminal command:
pip freeze > requirements.txt
Ensure that you send the requirements file with the script; the recipient can then simply run the bash/terminal command (in their own virtualenv, which they hopefully will use)
pip install -r requirements.txt
before running the script.
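If you do later want the longer, installable-package route mentioned above, a minimal setup.py sketch (the project name and dependency here are hypothetical):
from setuptools import setup, find_packages

setup(
    name="myscript",                     # hypothetical project name
    version="0.1.0",
    packages=find_packages(),
    install_requires=["requests>=2.0"],  # hypothetical; list your real dependencies
)
The recipient can then run pip install . in the project folder and pip will pull in the dependencies automatically.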
While developing my app I didn't use an environment. Now I want to use one and export all the dependencies of my app into an environment.yml / requirements.txt file that I can afterwards use to build a Docker image.
The issue is, if I create an environment and then export it with:
conda env export > environment.yml
I get no dependencies in that file.
Or if I use:
pip freeze --local > requirements.txt
I see all system modules that have nothing to do with my project.
I would imagine conda or pip has something that would just go through all my files in the directory I am in and place all imports and their dependencies inside the environment.yml/requirements.txt file.
I can't find a command to do that.
You can use virtualenv to isolate your app's pip environment from the rest of your system. Use:
virtualenv <your_project_path>/venv
This will create a virtual environment for your app. Then use:
source venv/bin/activate
This will isolate your pip environment. Reinstall all your dependencies, then run pip freeze; you will see only project-related dependencies.
pip freeze by default lists all pip modules installed across the system. If you use virtualenv and then install your dependencies, your pip modules will reside inside your application's environment folder.
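An illustrative contrast (the package names and versions here are made up):
$ pip freeze
pandas==1.3.0
xlwings==0.24.7
$ source <your_project_path>/venv/bin/activate
(venv) $ pip freeze
flask==2.0.1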
EDIT:
Based on your comments, I would recommend a good IDE such as PyCharm. You can follow the tutorial here for setting up a venv and handling all your dependencies. Once done, you can run pip freeze to produce your requirements.txt.
I've created a Python virtual environment, and activate it by doing:
joe@joe-mint $ source ./venvs/deep-learning/bin/activate
Which turns the prompt into:
(deep-learning) joe@joe-mint $
Now whenever I run a Python program or try to install a package, the system seems to ignore the fact that it's in a virtual environment and does things system-wide:
(deep-learning) joe@joe-mint $ which pip
/usr/local/bin/pip
The same happens when I try to install new packages that aren't on my system; it installs them to the system files (i.e. /usr/bin) instead of the virtual environment.
What's wrong with my virtual environment? How do I get it to ignore system files and do everything inside the environment?
I've looked at this question, which says to use an explicit flag when creating the virtual environment to make it use the local environment's packages, but I used python3.5 -m venv to create the virtual environment, and this flag is removed in that version as it's now the default behaviour.
I've also looked at this question and can confirm that the VIRTUAL_ENV variable is set correctly in the activate file of the virtual environment.
Here was the problem:
It seems that if you run pip in a venv that has no local pip installation, it will fall back to the system's pip outside the venv. So even though you've activated the virtual environment, pip wants to install packages on the system rather than in the venv.
Here was the solution:
First, I had to install the virtual environment without pip due to a bug that has long remained unresolved.
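(With the standard venv module, this is done with the --without-pip flag, reusing the path from above:)
python3.5 -m venv --without-pip ..../venvs/deep-learning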
Second, I installed pip in the virtual environment as per the instruction here. However, doing so required using some temporary folders that for some reason my user didn't have access to. So this failed, and the only way I could get it to work was to become root.
sudo su
source ..../venvs/deep-learning/bin/activate to activate the virtual environment.
curl --silent --show-error --retry 5 https://bootstrap.pypa.io/get-pip.py | python as per the answer linked above.
Although which pip now indicated that the correct pip (inside the venv) was being used, running pip would still use the system one! Deactivating (deactivate) and reactivating the venv solved this.
Now it took me a while to realise that having installed this as root caused permission errors when trying to install more packages using pip inside the virtual environment. I fixed the ownership with:
chown <user>:<group> -R ..../venvs/deep-learning/*
And that was it. After these steps, I could activate the venv and run pip correctly. It would use the pip inside the venv, and install packages inside the venv.
So I have a virtualenv folder called venv for my python project.
I can run:
venv/bin/pip install -r requirements.txt
Which installs all requirements I need for the project except one, M2Crypto. The only way to install it is through apt-get:
apt-get install python-m2crypto
How can I then add this package installed through apt to venv folder?
--system-site-packages
gives the virtual environment access to the global site-packages modules.
you could do:
$ sudo apt-get install python-m2crypto
$ virtualenv env --system-site-packages
... and you would then have access to m2crypto (along with all other system-wide installed packages) inside your virtualenv.
What I did in the end:
cp -R /usr/lib/python2.7/dist-packages/M2Crypto /home/richard/hello-project/venv/lib/python2.7/site-packages/
cp -R /usr/lib/python2.7/dist-packages/OpenSSL /home/richard/hello-project/venv/lib/python2.7/site-packages/
Real simple solution.
In the virtual environment directory, edit the file pyvenv.cfg. Set the parameter include-system-site-packages = true, and save the file.
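For reference, the edited pyvenv.cfg might then look like this (the home path and version are illustrative):
home = /usr/bin
include-system-site-packages = true
version = 3.10.12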
The globally installed modules will appear the next time you activate (source venv/bin/activate) your environment.
It can be verified via pip list.
Enjoy!
toggleglobalsitepackages will toggle access to the system-wide site-packages.
Note: You need to pip install virtualenvwrapper to get this command; the vanilla virtualenv doesn't include it. With virtualenvwrapper you also get the very useful mkvirtualenv and rmvirtualenv commands, among others.
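A quick usage sketch (the environment name is illustrative, and the location of virtualenvwrapper.sh varies by system):
pip install virtualenvwrapper
source /usr/local/bin/virtualenvwrapper.sh
mkvirtualenv myenv
toggleglobalsitepackages
Running toggleglobalsitepackages a second time turns the global site-packages access back off.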
venv/bin/pip install -I M2Crypto
The -I forces it to also be installed into the virtualenv, even if it's already globally installed.
Another way to transfer packages locally from one environment (or from the global install) to another virtual environment is to
copy the "Lib" folder, or the individual package folders with all their contents, from the source environment into the environment where you want the package to work.
If you don't know its location, search for it within the environment folder using your file explorer.
The Lib folder contains all the installed packages of the environment.
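For example, on Windows (the paths and package name are illustrative):
xcopy /E /I C:\envs\old\Lib\site-packages\somepackage C:\envs\new\Lib\site-packages\somepackage
Note that packages with compiled components may not survive this kind of copy; reinstalling with pip is more reliable when possible.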