How do you check which packages and versions are installed inside a virtualenv?
My thought was to create a requirements.txt file. But is there another way to do this from the CLI?
Once you activate your virtual environment, you can list the installed packages with pip list and check the interpreter version with python --version.
pip list will show you all the packages installed in the virtualenv. Since you want to create the requirements.txt file from the CLI, you can run this command:
pip freeze > requirements.txt
This will create a requirements.txt file in your current directory listing ALL the packages and libraries installed in that virtualenv.
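Under the hood, pip list and pip freeze report the distributions visible to the current interpreter, and the standard library can do the same (Python 3.8+). A minimal sketch, not a replacement for pip freeze (it ignores editable installs, URL requirements, and so on):

```python
from importlib import metadata

def installed_packages():
    """Return a sorted list of 'name==version' strings, pip-freeze style."""
    pins = {
        dist.metadata["Name"]: dist.version
        for dist in metadata.distributions()
        if dist.metadata["Name"]  # skip entries with broken metadata
    }
    return sorted(f"{name}=={version}" for name, version in pins.items())

if __name__ == "__main__":
    for line in installed_packages():
        print(line)
```

Run inside an activated virtualenv, this prints the same package set pip freeze would write to requirements.txt.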
Personally, I like using pigar. After installing it, all you need to do is run pigar: it will search through the Python files in the current directory and create a requirements.txt file for you based on the imports in those files.
Related
I have installed Python on my system and wrote a simple script that fetches Jenkins data via its REST API (GET requests).
I installed all the required modules using pip. Now I want to package this script with all its dependencies and run it on another machine. However, on that machine I don't want to repeat all the pip installation steps.
I know we can list all the modules in requirements.txt and use pip install -r requirements.txt. But is there any way to avoid installing each dependency with pip, so that I only need to install Python and all the other dependencies come along when I run the zip file?
You can install pip dependencies to a certain directory using -t (target).
pip install -r requirements.txt -t .
That will install your pip modules into the current directory. You can then zip the whole thing and deploy it. Make sure that the environment where you install the dependencies matches your intended deployment environment; for consistency you can run the command in a Docker container, for example.
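Once the bundle is unzipped next to your script, the interpreter has to find those vendored modules. A minimal sketch, assuming the dependencies were installed into a vendor/ subdirectory with pip install -r requirements.txt -t vendor (the directory name is my choice, not from the answer):

```python
import os
import sys

def add_vendor_dir(base_dir, name="vendor"):
    """Prepend base_dir/name to sys.path so vendored packages win over
    any system-wide versions of the same libraries."""
    vendor = os.path.join(os.path.abspath(base_dir), name)
    if vendor not in sys.path:
        sys.path.insert(0, vendor)
    return vendor

# Call this at the very top of your entry-point script, before the
# imports that depend on the vendored packages:
# add_vendor_dir(os.path.dirname(os.path.abspath(__file__)))
```

With pip install -r requirements.txt -t . (dot, as in the answer), no path tweak is needed as long as the script is run from inside the unzipped directory, since the current directory is already on sys.path for scripts run there.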
I think you should use the virtualenv module, which makes your project easily deployable.
A virtual environment should be used whenever you work on a Python-based project. It is generally good to have one new virtual environment for every project you work on, so that the dependencies of each project are isolated from the system and from each other.
I came across a link that explains this well: Virtual Env explained
I want to send my Python script to another computer, but it doesn't have the same packages installed. Is there any way to send the whole project as a folder that also includes all the packages? (I have tried creating a virtual environment, but the problem is that much of a virtual environment consists of symlinked files that may not exist on the other computer.)
Thank you very much in advance for your help.
You can make a requirements.txt file that lists all the packages of your project. If you already have a virtualenv, create your requirements.txt by executing:
pip freeze > requirements.txt
Example of requirements.txt:
absl-py==0.2.1
amqp==2.3.1
asn1crypto==0.24.0
After this, copy your whole project (without the virtualenv) to the other computer, create a new virtualenv there, activate it, and run:
pip install -r requirements.txt
and you will have all your packages installed.
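This round trip works because requirements.txt is, at its simplest, one name==version pin per line, as in the example above. A rough sketch of reading such a file, ignoring comments and blank lines (real pip accepts many more formats, e.g. ranges, URLs, and -r includes, which this deliberately skips):

```python
def parse_pins(text):
    """Parse simple 'name==version' pins; skip blanks, comments, and
    anything that is not an exact pin."""
    pins = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop inline comments
        if not line or "==" not in line:
            continue
        name, _, version = line.partition("==")
        pins[name.strip()] = version.strip()
    return pins

# parse_pins("absl-py==0.2.1\namqp==2.3.1")
# -> {"absl-py": "0.2.1", "amqp": "2.3.1"}
```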
Barring the longer process of creating an installable package with a setup.py file, you can place your script in its own folder, then add a pip requirements file. While your virtualenv is active, run the bash/terminal command:
pip freeze > requirements.txt
Make sure you send the requirements file along with the script; the recipient can then simply run the bash/terminal command (in their own virtualenv, which they hopefully will use)
pip install -r requirements.txt
before running the script.
I have a question about Python virtualenvs. I received a virtualenv for a project with all the packages required to run it. But when I run the project for the first time, it crashes because Python reports unsatisfied requirements. So I checked whether all the packages are present inside:
virtualenv/lib/python2.7/site-packages/
and all the required packages are there.
But when I type:
pip list
the packages are not shown. So I have to run:
pip install -r requirements.txt
and pip downloads them all again.
So my question is: why does pip download and reinstall the packages if they are already installed? And how can I force pip to use the packages already inside the virtualenv?
The problem was that all the scripts inside the virtualenv were created on another PC, with that machine's paths. So when I launched python or pip from the virtualenv, the global versions ran instead, because the scripts couldn't resolve the virtualenv's path; in particular, pip showed my global packages.
Fixing the interpreter paths of all the scripts inside virtualenv/bin/ to point at my real virtualenv path solved the issue.
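The paths being fixed here are the shebang (#!) lines that pip writes into every script in the venv's bin/ directory; they are absolute paths from the machine where the venv was created. A rough sketch of rewriting them in bulk (old_prefix and new_prefix are hypothetical paths of my choosing; recreating the venv on the target machine is usually the safer fix):

```python
import os

def fix_shebangs(bin_dir, old_prefix, new_prefix):
    """Rewrite '#!<old_prefix>...' shebang lines in bin_dir to use
    new_prefix instead. Returns the names of the files changed."""
    fixed = []
    for name in os.listdir(bin_dir):
        path = os.path.join(bin_dir, name)
        if not os.path.isfile(path):
            continue
        try:
            with open(path, "r", encoding="utf-8") as f:
                lines = f.readlines()
        except (UnicodeDecodeError, OSError):
            continue  # skip binary files such as compiled launchers
        marker = "#!" + old_prefix
        if lines and lines[0].startswith(marker):
            lines[0] = "#!" + new_prefix + lines[0][len(marker):]
            with open(path, "w", encoding="utf-8") as f:
                f.writelines(lines)
            fixed.append(name)
    return fixed

# Example (paths hypothetical):
# fix_shebangs("myvenv/bin", "/home/olduser/myvenv", "/home/me/myvenv")
```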
I am trying to capture my Python libraries in requirements.txt using pip freeze > requirements.txt while my virtual environment is activated.
But instead of getting my current virtual env's libraries, my requirements.txt contains all the libraries from another environment I used earlier.
What is the issue?
Your virtualenv is almost certainly messed up.
To find the packages which are not required, run:
pip list --not-required
But, I think it's time to recreate your virtualenv.
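Before recreating anything, it is worth confirming which interpreter and environment your shell is actually using, since a stale PATH often explains pip freeze reporting the wrong packages. A small sketch of the relevant checks, stdlib only:

```python
import sys

def env_info():
    """Report where this interpreter lives and whether it is a virtualenv."""
    return {
        "executable": sys.executable,    # the python binary actually in use
        "prefix": sys.prefix,            # root of the active environment
        "base_prefix": sys.base_prefix,  # root of the underlying install
        # In a venv/virtualenv these two prefixes differ:
        "in_virtualenv": sys.prefix != sys.base_prefix,
    }

if __name__ == "__main__":
    for key, value in env_info().items():
        print(f"{key}: {value}")
```

If in_virtualenv comes back False even though your prompt says the environment is active, the activate script or PATH is the culprit; running python -m pip freeze instead of bare pip freeze also ties pip to a specific interpreter.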
I'm very new to Python, so sorry if my question is very basic. I have a shell script that I use to run a .py file on a cluster. Here is my shell script:
#!/bin/bash
module add python/2.6
python Myfile.py
Python is installed on the cluster, but some of the libraries and packages still need to be installed. For example, I need to install the NumPy package. Is there any way I can do that inside my shell script or my .py file before I import it?
Thanks
For this (and similar) use case, I would recommend a combination of pip and virtualenv.
You would install pip into your system Python install (e.g. sudo apt-get install python-pip), and then install virtualenv via pip (pip install virtualenv).
You can then create a specific virtualenv for this project. This represents a sandboxed environment with specific versions of libraries that are specified traditionally through a requirements file (using the -r option), but can also be specified individually through the command line.
You would do this with a command like virtualenv venv_test, which will create a virtualenv directory named venv_test in the current directory. You can then run pip from that virtualenv's bin directory to install packages.
For example, to install the flask package in that virtualenv, you would run:
venv_test/bin/pip install flask
You can then either run source venv_test/bin/activate to put the current shell into the virtualenv, or invoke a script directly with the virtualenv's interpreter, i.e.:
venv_test/bin/python foo.py
Here's a link to a virtualenv introduction for some additional details/steps.
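To address the original question directly (installing a package from inside the .py file before importing it), a hedged sketch: try the import, and shell out to pip only if it fails. The helper name is mine, and runtime installation is a last resort compared to a proper virtualenv; on a managed cluster it may also require --user or fail outright without write permissions.

```python
import importlib
import subprocess
import sys

def ensure_module(module_name, pip_name=None):
    """Import module_name, installing pip_name via pip first if needed.
    pip_name defaults to module_name (they can differ, e.g. cv2 vs
    opencv-python)."""
    try:
        return importlib.import_module(module_name)
    except ImportError:
        # `python -m pip` guarantees we install into *this* interpreter,
        # not whichever pip happens to be first on PATH.
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", pip_name or module_name]
        )
        return importlib.import_module(module_name)

# Example: numpy = ensure_module("numpy")
```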