I was able to install 0MQ on Ubuntu 12.04 by doing the following:
$ sudo apt-get install libzmq-dev
$ sudo apt-get install python-zmq
but when I went to use it in a virtualenv it could not find the module. What do I have to do in my virtualenv to see it?
Once you make your virtualenv and activate it, use pip to install Python packages. They will install into your virtualenv.
Alternatively, when you create your virtualenv, enable system-wide packages (with the --system-site-packages switch) so that system-installed packages are visible inside the virtualenv.
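For instance, a minimal sketch of the pip route, using the modern stdlib `venv` module (the environment name `myenv` is a placeholder; `pyzmq` is the PyPI name of the Python 0MQ bindings):

```shell
# Create and activate a virtualenv; add --system-site-packages here
# if you want apt-installed packages such as python-zmq visible too
python3 -m venv myenv
. myenv/bin/activate
# pip now resolves inside the venv, so 'pip install pyzmq' would land there
command -v pip
deactivate
```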
I'm on Ubuntu 22.04 and installed pip using the Linux package manager:
sudo apt update
sudo apt install python3-venv python3-pip
The problem is that when I activate the virtual environment and install dependencies into it:
source venv/bin/activate
pip install -r requirements.txt
they get installed into the system Python rather than into the virtual environment.
which pip
outputs /usr/bin/pip, which is the same path as without the virtual environment:
deactivate
which pip
the output is again /usr/bin/pip
The objective is to install the packages only into the environment, separate from the system-installed Python.
In the directory of your project, use this command to create a virtual environment:
python3 -m venv venv
If you wish to use another version of Python, first install it and its venv module with the following command:
sudo apt update && sudo apt install python3.9 python3.9-venv
To use the venv, first activate it with:
source venv/bin/activate
or
. venv/bin/activate
Then use the venv's Python with this command:
python3.9 -m pip install -r requirements.txt
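To verify the fix, `which pip` should then point inside the environment rather than at /usr/bin; a quick check, assuming an environment named `venv` in the current directory:

```shell
python3 -m venv venv
. venv/bin/activate
command -v pip     # should print .../venv/bin/pip, not /usr/bin/pip
command -v python  # should print .../venv/bin/python
deactivate
```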
I've been working on building debian package of my project.
When a user installs my package, my project's binaries are installed to /usr/bin/*. Then a bash script is invoked at the end that creates a Python virtual environment at /usr/share/my_project/venv and installs the required Python packages into that virtual environment.
$ sudo dpkg -i my_project.deb
# binaries are installed to /usr/bin/*
# then `post_install.sh` is invoked automatically as root (Debian postinst)
$ cat post_install.sh
python3 -m pip install -U virtualenv # sudo
python3 -m venv /usr/share/my_project/venv # sudo
/usr/share/my_project/venv/bin/python -m pip install ${REQUIRED_PACKAGES}
And my project's binaries are using that virtual environment's python.
AFAIK, running pip with sudo is a security problem. But I only use the virtualenv's Python (/usr/share/my_project/venv/bin/python) directly; I still install virtualenv with sudo and create the venv with it. Is that still dangerous?
Can I use this virtual environment's python with multiple users?
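On the sudo-pip concern, one commonly suggested variant (a sketch under assumptions, not a verified fix) is to skip `pip install virtualenv` entirely and use the stdlib `venv` module, so nothing is pip-installed into the system Python; a root-created venv can then be used read-only by other users:

```shell
# Sketch: stdlib venv instead of pip-installed virtualenv.
# A scratch prefix is used here for illustration; the real postinst
# would use /usr/share/my_project.
PREFIX="$PWD/opt_my_project"
python3 -m venv "$PREFIX/venv"
# pip comes bundled via ensurepip, so no 'sudo pip' touches the system Python
"$PREFIX/venv/bin/python" -m pip --version
```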
I'm trying to set up a standard virtual environment (venv) with Python 3.7 on Ubuntu 18.04, with pip (or some way to install packages in the venv). The standard way to install Python 3.7 seems to be:
% sudo apt install python3.7 python3.7-venv
% python3.7 -m venv py37-venv
but the second command fails, saying:
The virtual environment was not created successfully because ensurepip
is not available. On Debian/Ubuntu systems, you need to install the
python3-venv package using the following command.
apt-get install python3-venv
You may need to use sudo with that command. After installing the
python3-venv package, recreate your virtual environment.
Failing command: ['/py37-venv/bin/python3.7', '-Im', 'ensurepip',
'--upgrade', '--default-pip']
This is true; there is no ensurepip nor pip installed with this python. And I did install python3.7-venv already (python3-venv is for python3.6 on Debian/Ubuntu). I gather there has been some discussion about this in the python community because of multiple python versions and/or requiring root access, and alternate ways to install python modules via apt or similar.
Creating a venv without pip (--without-pip) succeeds, but then there's no way to install packages in the new venv which seems to largely defeat the purpose.
So what's the accepted "best practice" way to install and use python3.7 on 18.04 with a venv?
I don't know if it's best practices or not, but if I also install python3-venv and python3.7-venv then everything works (this is tested on a fresh stock Debian buster docker image):
% sudo apt install python3.7 python3-venv python3.7-venv
% python3.7 -m venv py37-venv
% . py37-venv/bin/activate
(py37-venv) %
Note that it also installs all of python3.6 needlessly, so I can't exactly say I like it, but at least it does work and doesn't require running an unsigned script the way get-pip.py does.
sudo apt install python3-venv
python3 -m venv env
I have installed virtualenv on my system using http://www.pythonforbeginners.com/basics/how-to-use-python-virtualenv
According to these guidelines, the initial step is:
$ sudo apt-get install python-pip python-dev python-virtualenv
However, I do not want to touch my parent environment. The only reason I believe virtualenv might help in my case is that I have some weird errors pointing to Python version inconsistencies.
So my requirements are:
virtualenv with e.g. python 3.5
tensorflow
no influence on my parent environment
ability to disable virtualenv with no side effects
Is it doable, and how?
You could follow the steps in this answer for instance, which will be essentially the same as the guide you've mentioned.
virtualenv installs libraries into a subfolder of your main system and directs Python to use only those, so they don't interfere with your main installation.
If you really don't want to touch anything in your system, you could always run tensorflow in a docker container (see this answer for some tips). But even that will require some installation in the "parent" system.
create the env
virtualenv -p python3 path/to/your/env
activate the env
source path/to/your/env/bin/activate
install packages
pip install pkgname
deactivate the env
deactivate
If you do not want to touch your parent environment, install package using pip after activating the environment. Next time you activate the environment, the installed packages will remain there. If you want to delete the environment, just delete the folder path/to/your/env.
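The tear-down really is just a directory delete; a sketch using the stdlib `venv` module and an illustrative path:

```shell
# Create a throwaway environment, then remove it completely
python3 -m venv path/to/your/env
test -x path/to/your/env/bin/python
rm -rf path/to/your/env   # nothing else to clean up; the parent Python is untouched
```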
Just run this single command:
It installs the Python package manager, pip.
It creates a virtual environment named my_env.
It activates the virtual environment.
sudo apt-get install python3-pip -y && sudo apt install python3.8-venv && python3 -m venv my_env/ && source my_env/bin/activate
If it's possible, of course.
For example, I can download python-dbus like this:
$ sudo apt-get download python-dbus
But what should I do next with this .deb package in my current virtualenv?
If you really need to do it this way, you can just copy the files that get installed globally directly into your virtualenv. For example I couldn't get pycurl working since the required libraries weren't installing, but apt-get install python-pycurl did. So I did the following:
sudo apt-get install python-pycurl
cp /usr/lib/python2.7/dist-packages/pycurl* ~/.virtualenvs/myenv/lib/python2.7/site-packages/
The install said it was adding it to /usr/lib/python2.7, so I looked in that directory for a site-packages or dist-packages directory containing pycurl; after looking at the files, I copied them into my virtualenv. You'd also have to copy any executables from bin into your virtualenv's bin directory.
Also, running a pip install -r requirements.txt successfully found pycurl in there and just skipped over it as if I had installed it via pip.
To include system site packages in your existing virtual environment open the config file:
<PATH_TO_YOUR_VENV_FOLDER>/pyvenv.cfg
and change false to true for include-system-site-packages
include-system-site-packages = true
Save and reload your virtual environment.
(tested with virtualenv 20.2.2 on Raspbian GNU/Linux 10 (buster) to pull in python3-pyqt5 installed with apt into my virtual environment)
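The same edit can also be scripted; a sketch assuming an environment directory named `venv` created with the stdlib `venv` module (which writes the flag as `false` by default):

```shell
python3 -m venv venv
# Flip the flag in place; the key name is exactly as venv/virtualenv writes it
sed -i 's/^include-system-site-packages = false$/include-system-site-packages = true/' venv/pyvenv.cfg
grep '^include-system-site-packages' venv/pyvenv.cfg
```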
If it is for a new environment, @Joshua Kan's answer using the --system-site-packages flag with the venv command is probably what you want.
Why would you want to do this? The whole point is to avoid doing stuff like that...
virtualenv whatever
cd whatever
bin/pip install dbus-python
You may also choose to specify --no-site-packages to virtualenv to keep it extra isolated.
An alternative solution is to install the package globally, then allow the virtualenv to see it.
As an example, let's say we want to install matplotlib for Python 3:
sudo apt update # Update first
sudo apt install python3-matplotlib # Install globally
sudo pip3 install -U virtualenv # Install virtualenv for Python 3 using pip3
virtualenv --system-site-packages -p python3 ./venv #the system-site-packages option allows venv to see all global packages including matplotlib
source ./venv/bin/activate #activate the venv to use matplotlib within the virtualenv
deactivate # run this when you're done using the virtualenv
First install the dbus development libraries (you may need some other dev libraries, but this is all I needed)
sudo apt-get install libdbus-1-dev libdbus-glib-1-dev
Next, with your virtualenv activated, run the following. It'll fail but that's ok.
pip install dbus-python
Finally, go into your virtualenv's build directory and install it the non-pythonic way.
cd $VIRTUAL_ENV/build/dbus-python
chmod +x configure
./configure --prefix=$VIRTUAL_ENV
make
make install