So I have a virtualenv folder called venv for my Python project.
I can run:
venv/bin/pip install -r requirements.txt
This installs all the requirements I need for the project except one: M2Crypto. The only way to install it is through apt-get:
apt-get install python-m2crypto
How can I then add this package, installed through apt, to the venv folder?
--system-site-packages
gives the virtual environment access to the system-wide site-packages modules.
You could do:
$ sudo apt-get install python-m2crypto
$ virtualenv env --system-site-packages
... and you would then have access to m2crypto (along with all other system-wide installed packages) inside your virtualenv.
What I did in the end:
cp -R /usr/lib/python2.7/dist-packages/M2Crypto /home/richard/hello-project/venv/lib/python2.7/site-packages/
cp -R /usr/lib/python2.7/dist-packages/OpenSSL /home/richard/hello-project/venv/lib/python2.7/site-packages/
Real simple solution.
In the virtual environment directory, edit the file pyvenv.cfg. Set the parameter include-system-site-packages = true, and save the file.
The globally installed modules will appear the next time you activate (source venv/bin/activate) your environment.
It can be verified via pip list.
Enjoy!
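The same edit can be scripted. A minimal sketch, assuming a standard pyvenv.cfg layout (the helper name and the throwaway file used for the demo are hypothetical):

```python
from pathlib import Path
import tempfile

def enable_system_site_packages(cfg_path):
    """Flip include-system-site-packages to true in a pyvenv.cfg file."""
    cfg = Path(cfg_path)
    lines = cfg.read_text().splitlines()
    for i, line in enumerate(lines):
        key = line.split("=")[0].strip()
        if key == "include-system-site-packages":
            lines[i] = "include-system-site-packages = true"
    cfg.write_text("\n".join(lines) + "\n")

# Demo on a throwaway config file standing in for venv/pyvenv.cfg:
tmp = Path(tempfile.mkdtemp()) / "pyvenv.cfg"
tmp.write_text(
    "home = /usr/bin\n"
    "include-system-site-packages = false\n"
    "version = 3.9.2\n"
)
enable_system_site_packages(tmp)
print(tmp.read_text())
```

As in the answer above, the change only takes effect the next time the environment is activated.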
toggleglobalsitepackages will toggle access to the system-wide site-packages.
Note: You need to pip install virtualenvwrapper to get this command; the vanilla virtualenv doesn't include it. With virtualenvwrapper you also get the very useful mkvirtualenv and rmvirtualenv commands, among others.
venv/bin/pip install -I M2Crypto
The -I forces it to also be installed into the virtualenv, even if it's already globally installed.
The only way to transfer packages locally from one environment (or from the global install) to another virtual environment is to copy the "Lib" folder, or the individual package folder with all its contents, from one environment into the environment where you want the package to work.
If you don't know its location, search for it within the environment folder using your file explorer.
The Lib folder contains all of the environment's installed packages.
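That copy step can also be scripted. A sketch, using throwaway directories to stand in for the two environments' site-packages folders (the package name and paths are hypothetical):

```python
import shutil
import tempfile
from pathlib import Path

def copy_package(pkg_name, src_site, dst_site):
    """Copy one installed package folder from one site-packages dir to another."""
    src = Path(src_site) / pkg_name
    dst = Path(dst_site) / pkg_name
    shutil.copytree(src, dst, dirs_exist_ok=True)

# Demo: fake "source" and "destination" environments
src_site = Path(tempfile.mkdtemp())
dst_site = Path(tempfile.mkdtemp())
(src_site / "mypkg").mkdir()
(src_site / "mypkg" / "__init__.py").write_text("VERSION = '1.0'\n")

copy_package("mypkg", src_site, dst_site)
print((dst_site / "mypkg" / "__init__.py").exists())  # True
```

Note that this only works cleanly for pure-Python packages; compiled extensions may depend on the exact interpreter version, as the other answers here caution.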
Related
I'm working with a distributed package system (happens to be CVMFS but I don't think the particulars are relevant). The setup script for that environment adds some locations to PYTHONPATH.
Now, working in that environment, I want to install newer versions of some packages that are already found within that path. Installing the packages is easy, I can either do
python -m pip install --user --upgrade <packages>
or, my preferred approach, use a virtual environment
python -m venv myenv
source myenv/bin/activate
python -m pip install --upgrade <packages>
Installation works fine. But then when I try to run python, the directories in PYTHONPATH are searched before either my user-site directory or the virtual environment, and I get the old versions of the packages.
Is there any way to force packages in my virtual environment or user-site path to have priority, without having to manually edit sys.path or PYTHONPATH?
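The behaviour being described can be reproduced with a small sketch: whichever directory appears first on sys.path wins the import, which is why PYTHONPATH entries (which the interpreter places ahead of site-packages) shadow the newer installs. The module name here is hypothetical:

```python
import sys
import tempfile
from pathlib import Path

old = Path(tempfile.mkdtemp())  # stands in for a PYTHONPATH directory
new = Path(tempfile.mkdtemp())  # stands in for the venv's site-packages
(old / "demo_mod.py").write_text("VERSION = 'old'\n")
(new / "demo_mod.py").write_text("VERSION = 'new'\n")

sys.path.insert(0, str(old))
sys.path.insert(1, str(new))
import demo_mod
print(demo_mod.VERSION)  # 'old' -- the earlier sys.path entry wins

# Promoting the "venv" directory to the front flips the result:
sys.path.remove(str(new))
sys.path.insert(0, str(new))
del sys.modules["demo_mod"]  # force a re-import
import demo_mod
print(demo_mod.VERSION)  # 'new'
```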
I'm about to deploy Python into a production system, and the Python script I have has a number of modules associated with it.
Is there a way to install Python with only a specific list of modules? A bit like generating a jar: you can have a folder with all the other dependency jars in it, which is nice and clean. I don't want to compile the Python code, so I want something similar.
(Note: I also don't want to create a virtual environment - I want the default environment like this)
You can either use virtualenv, which is basically what the name suggests, or you can use Docker, which I personally prefer.
If you don't want to do what Amir is suggesting above, then two other options are available:
Copy those modules and place them in the same folder where your script is installed
Create a requirements.txt file with the name & version of those modules and then run "pip install -r requirements.txt" to install these modules in your site-packages folder
To manage your Python packages you can use the great virtualenv tool; it's really simple and works well on Linux/macOS/Windows. Any package installed in an activated virtualenv will be available only in that virtualenv, so you can have, for example, 3 different versions of the "Django" package on your machine and work with them using different virtual environments:
Install virtualenv:
$ pip3 install virtualenv
Create your virtualenv:
$ virtualenv -p python3 my_virtualenv_name
Activate your virtualenv:
$ . my_virtualenv_name/bin/activate
Check what packages have been installed:
$ pip freeze
Install any package for example "Django":
$ pip install Django
Confirm installation:
$ pip freeze | grep Django
Uninstall any package from your virtual environment:
$ pip uninstall Django -y
Uninstall all packages from your virtual environment:
$ pip freeze | xargs pip uninstall -y
Deactivate the virtualenv:
$ deactivate
More info in the official documentation: https://virtualenv.pypa.io/en/latest/
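For comparison, an environment can also be created from Python itself using the standard-library venv module (a sketch; with_pip=False skips pip bootstrapping so the demo runs quickly, and the target path is a throwaway directory):

```python
import tempfile
import venv
from pathlib import Path

target = Path(tempfile.mkdtemp()) / "my_virtualenv_name"

# system_site_packages=False keeps the environment isolated,
# matching virtualenv's default behaviour.
builder = venv.EnvBuilder(with_pip=False, system_site_packages=False)
builder.create(target)

# A pyvenv.cfg file marks the directory as a virtual environment:
print((target / "pyvenv.cfg").exists())  # True
```

Activation afterwards works exactly as shown above (`. my_virtualenv_name/bin/activate`).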
I made a Virt2 virtual environment using
$ python -m venv Virt2
I want to install my custom packages in the "site-packages" directory. However, packages are being installed in the "dist-packages" directory.
What should I do to install packages in my Python virtual environment's site-packages?
My Python version is 3.6.2 (in /usr/local/bin).
You use sudo, and sudo switches the user to root, i.e. you're completely outside of your virtual env. The system pip3 outside of a virtual env installs packages into a system directory, which is dist-packages.
Run pip install inside your virtual env, without sudo.
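A quick way to check which interpreter (and therefore which pip) is actually in use: inside a virtual environment, sys.prefix points at the env while sys.base_prefix still points at the system install. A sketch (the helper name is hypothetical):

```python
import sys

def in_virtualenv():
    """True when the running interpreter belongs to a venv/virtualenv."""
    # venv (and virtualenv >= 20) set sys.prefix != sys.base_prefix
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix)

print("virtual env active:", in_virtualenv())
print("packages will land under:", sys.prefix)
```

Running this under sudo would show the system prefix, which is exactly why the packages ended up in dist-packages.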
If it's possible, of course.
For example - I can download python-dbus like this:
$ sudo apt-get download python-dbus
But what should I do next with this .deb package in my current virtualenv?
If you really need to do it this way, you can just copy the files that get installed globally directly into your virtualenv. For example I couldn't get pycurl working since the required libraries weren't installing, but apt-get install python-pycurl did. So I did the following:
sudo apt-get install python-pycurl
cp /usr/lib/python2.7/dist-packages/pycurl* ~/.virtualenvs/myenv/lib/python2.7/site-packages/
The install said it was adding it to /usr/lib/python2.7. So I looked in that directory for a site-packages or dist-packages folder containing pycurl; after looking at the files, I copied them into my virtualenv. You'd also have to copy any executables from bin into your virtualenv's bin directory.
Also, running a pip install -r requirements.txt successfully found pycurl in there and just skipped over it as if I had installed it via pip.
To include system site packages in your existing virtual environment open the config file:
<PATH_TO_YOUR_VENV_FOLDER>/pyvenv.cfg
and change false to true for include-system-site-packages
include-system-site-packages = true
Save and reload your virtual environment.
(tested with virtualenv 20.2.2 on Raspbian GNU/Linux 10 (buster) to pull in python3-pyqt5 installed with apt into my virtual environment)
If it is for a new environment, @Joshua Kan's answer using the --system-site-packages flag with the venv command is probably what you want.
Why would you want to do this? The whole point is to avoid doing stuff like that...
virtualenv whatever
cd whatever
bin/pip install dbus-python
You may also choose to specify --no-site-packages to virtualenv to keep it extra isolated.
An alternative solution is to install the package globally, and then allow the virtualenv to see it.
As an example, let's say we want to install matplotlib for Python 3:
sudo apt update # Update first
sudo apt install python3-matplotlib # Install globally
sudo pip3 install -U virtualenv # Install virtualenv for Python 3 using pip3
virtualenv --system-site-packages -p python3 ./venv  # --system-site-packages lets the venv see all global packages, including matplotlib
source ./venv/bin/activate  # activate the venv to use matplotlib within the virtualenv
deactivate  # don't deactivate until you're done using the virtualenv
First install the dbus development libraries (you may need some other dev libraries, but this is all I needed)
sudo apt-get install libdbus-1-dev libdbus-glib-1-dev
Next, with your virtualenv activated, run the following. It'll fail but that's ok.
pip install dbus-python
Finally, go into your virtualenv's build directory and install it the non-pythonic way.
cd $VIRTUAL_ENV/build/dbus-python
chmod +x configure
./configure --prefix=$VIRTUAL_ENV
make
make install
I'm using virtualenv for sandboxing my Python environment, pip to install/uninstall packages and yolk to list the packages.
I can install packages into my virtual environment by using pip install <package name> -e=<environment name>, and I guess I don't need to have pip inside my virtual environment. Am I correct?
If I need to list all the installed packages in my virtual environment, can I use yolk -l to do so? I know I can do this by keeping yolk installed inside the environment, but is this also possible by keeping yolk outside the environment, i.e. in the global Python installation?
Thanks.
Here is your workflow:
Add virtualenv and pip to your global environment.
Create virtualenvs
Within a virtualenv, add new packages
I recommend you look into virtualenvwrapper. It makes the maintenance of virtualenvs way easier.
Download and install virtualenvwrapper in your global environment
Create directory ~/.virtualenvs
Modify your ~/.bashrc with these statements:
export WORKON_HOME=$HOME/.virtualenvs
export VIRTUALENVWRAPPER_VIRTUALENV_ARGS='--no-site-packages --python=python2.6'
source /usr/local/bin/virtualenvwrapper.sh
Then you can create, delete, modify, and change between virtualenvs easily.
So, for your questions:
Should I put pip inside my virtualenv?
No, do not do that.
Should I use yolk to list the packages?
Not familiar with yolk. I just use pip freeze, and then I get a requirements file that lists all the packages for recreating my environment.
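What pip freeze reports can be approximated from the standard library alone; a sketch using importlib.metadata (Python 3.8+; the helper name is hypothetical):

```python
from importlib import metadata

def freeze():
    """Return name==version lines for every installed distribution,
    roughly what `pip freeze` prints."""
    return sorted(
        f"{dist.metadata['Name']}=={dist.version}"
        for dist in metadata.distributions()
    )

# Inside an activated virtualenv this lists only that env's packages:
for line in freeze():
    print(line)
```

Redirecting that output to a file gives you the same kind of requirements file (`pip freeze > requirements.txt`) for recreating the environment later.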