Where does venv get pip from? - python

On a Linux (Manjaro) system I had several copies of pip: some at the system level, some at the user level, and some within virtual environments, where they had been upgraded to versions newer than the system or user copies.
I did not like this because, depending on what I was doing, I would have to know which pip was being used, so I wanted to remove them all and install pip once at the user level, upgraded to the latest version. I have removed every copy of pip that I can find; to find them I did an exhaustive search on the command line with the find command.
If I try to use pip in any terminal now I get a command-not-found error. BUT if I run python -m venv ./venv I find that a copy of pip has made its way into the venv folder.
Where is this copy of pip coming from?
I have checked inside ~/.local/, /etc/, /root/, ... like I said, I have done an exhaustive search to see whether it is just residing somewhere on my system.
EDIT:
I decided to look into the venv module because that was the built-in virtual environment that I have been using.
In /usr/lib/python3.9/venv/ there are __main__.py and __init__.py. Without delving into Python implementation specifics, I assume that __main__.py imports the main() function via from . import main. Then, in __init__.py, main() calls the create() method of the EnvBuilder class defined there. That create() method eventually leads to _setup_pip(), which contains this:
cmd = [context.env_exe, '-Im', 'ensurepip', '--upgrade',
       '--default-pip']
subprocess.check_output(cmd, stderr=subprocess.STDOUT)
So if pip is not found locally, venv hands off to ensurepip to bootstrap it - presumably either from a wheel bundled with CPython or from the network.
If that is true, it makes me want to pass a flag to venv when creating the virtual environment so that I can see what it is doing. As of now, python -m venv ./venv just pauses for a moment while it works, then returns to the command prompt. With some sort of verbose output I might have seen where that pip was coming from and moved forward more quickly.
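For what it's worth, you can ask ensurepip directly what it would install, without creating a venv at all. On my reading of the docs, it installs from a wheel bundled with CPython rather than from the network; the _bundled directory below is a private implementation detail, so its layout may differ between versions:

```python
import ensurepip
import os

# The pip version ensurepip would bootstrap into a new venv.
print(ensurepip.version())

# The wheel(s) it installs from ship inside the standard library itself.
bundled = os.path.join(os.path.dirname(ensurepip.__file__), "_bundled")
if os.path.isdir(bundled):  # private layout; absent on some builds
    print(os.listdir(bundled))
```

If this prints a pip version even with no pip on your PATH, that is where the copy in your venv is coming from.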


Auto-completion in Click does not work outside virtualenv

I have a Python script that is designed to make some basic text edits and accepts command line parameters. The whole project is enclosed in a virtual environment too.
I am using the Python Click module to accept command-line parameters; it supports auto-completion in bash. I have a basic setup.py file to install the main script as a command, whether globally or in a virtual environment, using pip install --editable ..
Now, let's say the main script is called edits. I should be able to use _EDITS_COMPLETE=bash_source edits to generate a bash script that, once sourced, enables auto-completion. This works fine inside the virtualenv, where the completion script gets generated. Outside of it, however, the normal script output is given instead.
In other words, inside the virtualenv the script returns the correct auto-completion script, but outside of it, nothing happens.
So where might the issue be? I expect it to generate the auto-completion script outside the virtualenv too.
Link to repository with the script I am trying to use: https://github.com/Astra3/DiscordText
Run pip install click (or pip install -r requirements.txt) in your host environment (outside the virtual env) and it will work.

How can I add paths to the $PYTHONPATH or $PATH variables when a script within my python project is executed?

I have a python project with multiple scripts (scriptA, scriptB, scriptC) that must be able to find packages located in subpath of the project that is not a python package or module. The organization is like so:
|_project
  |_scriptA.py
  |_scriptB.py
  |_scriptC.py
  |_thrift_packages
    |_gen-py
      |_thriftA
        |__init__.py
        |_thriftA.py
On a per-script basis I am adding the absolute path to this directory to sys.path.
Is there a way that I can alter the PYTHONPATH or sys.path every time a script is executed within the project so that I do not have to append the path to this directory to sys.path on a per-script basis?
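For reference, the per-script workaround I mean is something like this (a sketch; the directory names are taken from the tree above):

```python
import os
import sys

# Prepend the generated-code directory to sys.path so that
# `import thriftA` resolves regardless of the current working directory.
PROJECT_ROOT = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(PROJECT_ROOT, "thrift_packages", "gen-py"))
```

Repeating this block at the top of scriptA, scriptB, and scriptC is exactly the duplication I want to avoid.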
You have an XY problem, albeit an understandable one since the "proper" way to develop a Python project is not obvious, and there aren't a lot of good guides to getting started (there are also differing opinions on this, especially in the details, though I think what I'll describe here is fairly commonplace).
First, at the root of your project, you can create a setup.py. These days this file can just be a stub; eventually the need for it should go away entirely, but some tools still require it:
$ cat setup.py
#!/usr/bin/env python
from setuptools import setup
setup()
Then create a setup.cfg. For most Python projects this should be sufficient--you only need to put additional code in setup.py for special cases. Here's a template for a more sophisticated setup.cfg, but for your project you need at a minimum:
$ cat setup.cfg
[metadata]
name = project
version = 0.1.0
[options]
package_dir =
    =thrift_packages/gen-py
packages = find:
Now create and activate a virtual environment for your project (going in-depth into virtual environments will be out of scope for this answer but I'm happy to answer follow-up questions).
$ mkdir -p ~/.virtualenvs
$ python3 -m venv ~/.virtualenvs/thrift
$ source ~/.virtualenvs/thrift/bin/activate
Now you should have a prompt that looks something like (thrift) $, indicating that you're in an isolated virtualenv. Here you can install any dependencies for your package, as well as the package itself, without interfering with your main Python installation.
Now install your package in "editable" mode, meaning that the path to the sources you're actively developing on will automatically be added to sys.path when you run Python (including your top-level scripts):
$ pip install -e .
If you then start Python and import your package you can see, for example, something like:
$ python -c 'import thriftA; print(thriftA)'
<module 'thriftA' from '/path/to/your/source/code/project/thrift_packages/gen-py/thriftA/__init__.py'>
If this seems like too much trouble, trust me, it isn't. Once you get the hang of it (and there are several project templates, e.g. built with cookiecutter, to take the thinking out of it) you'll see that it makes things like paths far less trouble. This is how I start any non-trivial project (anything more than a single file); if you set everything up properly you'll never have to worry about fussing with sys.path or $PYTHONPATH manually.
In this guide, although the first part is a bit application-specific, a lot of it is actually pretty generic if you ignore the code's particular purpose, especially the section titled "Packaging our package", which repeats some of this advice in more detail.
As an aside, if you're already using conda you don't need to create a virtualenv, as conda environments are just fancy virtualenvs.
An advantage to doing this the "right" way, is that when it comes time to install your package, whether by yourself, or by users, if your setup.cfg and setup.py are set up properly, then all users have to do is run pip install . (without the -e) and it should work the same way.
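As a quick sanity check after the editable install, you can ask the import machinery where it would resolve a package from, without importing it (json is used as a stand-in below, since I can't assume your package is installed; substitute your own package name, e.g. thriftA):

```python
import importlib.util

# find_spec returns None if the package can't be found; otherwise
# spec.origin is the file the import system would load it from.
spec = importlib.util.find_spec("json")  # substitute your package name
print(spec.origin if spec else "not importable")
```

If this prints a path under your source tree, the editable install worked.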
You should add an __init__.py in each package, and then import your scripts properly.
|_project
  |_scriptA.py
  |_scriptB.py
  |_scriptC.py
  |__init__.py <====== Here
  |_thrift_packages
    |__init__.py <====== Here
    |_gen-py
      |_thriftA
        |__init__.py
        |_thriftA.py
Assuming that project is on your PYTHONPATH, you can then do:
from project.thrift_packages.thriftA import thriftA
Yes, you can.
However, I think the other answers are better adapted to your current issue.
Here I just explain how to call subprocesses with another environment and another current working directory, which can come in handy in other situations.
You can get the current environment as a dict (os.environ), copy it to another dict, modify that, and pass it to a subprocess call (the subprocess functions have an env parameter).
import os
import subprocess

new_env = dict(os.environ)
new_env["PYTHONPATH"] = "/path/you/want"  # add here the path that you want
cmd = ["mycmd.py", "myarg1", "myarg2"]  # replace with your command
subprocess.call(cmd, env=new_env)
# or alternatively, if you also want to change the current directory:
# subprocess.call(cmd, env=new_env, cwd=path_of_your_choice)

Why is my virtualenv's pip listing packages in my lib/python2.7 directory?

In my home directory, I have a directory named lib/python2.7 (there are actually five directories like that, for different Python versions). Since this is shared hosting (Webfaction), that directory is fundamental to me. There I have things like virtualenv and virtualenvwrapper installed, since as a customer of a shared host I have no sudo access for installing global packages.
However, when I create a virtualenv:
$ mkvirtualenv myenvironment
$ workon myenvironment
$ which pip
# outputs the myenvironment's path to pip
$ pip freeze
The command shows the whole package list under my lib/python2.7 (this includes the same virtualenv packages, plus conflicting packages I have for... legacy... reasons). This also bites me if I want to install a package with the same name as one in lib/python2.7, since it does not let me update it.
While inside the activated (workon) environment, I check whether PYTHONPATH contains anything weird, but it is empty:
$ echo $PYTHONPATH
# shows a blank line
It is also empty if I run that command outside of any virtual environment.
It seems that --no-site-packages is the default, but that solves only part of the problem. That is, pip freeze | wc -l shows a smaller count inside an environment than it does globally (outside any environment), which tells me certain already-provided packages are being excluded; those come from the hosting itself (and were not installed by me since, again, the hosting is shared and I have no access to the global space).
My question is: how can I solve this? I want my virtualenv not to list the packages in $HOME/lib/python2.7.
Please avoid dupe-linking to this question; nothing there was useful and it still has no accepted answer. I wrote this question after reading and trying every solution in that question.
I think you need to specify the Python version. You can specify which Python to build the virtual environment with using a command like:
virtualenv -p /usr/bin/python3.4 virt/virtname --no-site-packages
When you don't specify a Python version, virtualenv creates an environment with Python 2.7, and hence all packages end up in the folder you mentioned.
Found the solution after some deep digging. This is a Webfaction customization, but the same fix could apply to any installation where this problem occurs.
The core of the problem is that Webfaction has configured a sitecustomize.py file for us. The file is located at /usr/local/lib/pythonX.Y/sitecustomize.py and, on its own, adds the contents of ~/lib/pythonX.Y to the path (and, conditionally, the contents of any Python app under ~/webapps if you are working under one of its directories when running a Python script).
Even when the virtualenv's python executable is a different one, it will load that sitecustomize.py file each time it runs, just as the base python executable does.
The workaround here? Create an empty sitecustomize.py in your virtualenv to override the other:
touch ~/.virtualenvs/yourvenv/lib/pythonX.Y/sitecustomize.py
And it will work. Take this as a reference if you are stuck here like I was.
Notes: replace X.Y in each case with the version you are working with. Also remember: you cannot remove or edit the base sitecustomize.py, since you are on shared hosting in this case. However, overriding will work as long as you do it for every virtualenv you create.
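To verify which sitecustomize.py (if any) a given interpreter actually picked up, something like this should work; note that the module simply may not exist, which is the normal case on most systems:

```python
# The site module imports sitecustomize automatically at interpreter
# startup if it exists anywhere on the module search path.
try:
    import sitecustomize
    print("loaded from:", getattr(sitecustomize, "__file__", "<unknown>"))
except ImportError:
    print("no sitecustomize.py on this interpreter's path")
```

Run it with the virtualenv's python before and after creating the empty override to confirm the swap took effect.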

Can I move a virtualenv?

This question is not a duplicate.
It pertains not just to renaming a virtual environment, but to actually moving it to a different directory, including, potentially, a different user's directory.
This is not the same as merely renaming a virtual environment, especially to people unfamiliar with virtualenvs.
If I create a virtualenv, and I move it to a different folder, will it still work?
$ virtualenv -p /usr/bin/python3 /home/me/Env/my-python-venv
$ source Env/my-python-venv/bin/activate
(my-python-venv) $
...later that day, the virtual environment MOVED...
(my-python-venv) $ deactivate
$ mkdir -p /home/me/PeskyPartyPEnvs
$ mv /home/me/Env/my-python-venv /home/me/PeskyPartyPEnvs/
Question:
Will this work?
$ source /home/me/PeskyPartyPEnvs/my-python-venv/bin/activate
(my-python-venv) $ /home/me/PeskyPartyPEnvs/my-python-venv/bin/pip3 install foaas
I mean this as less of a question about the wisdom of trying this (unless that wisdom is humorous, of course), and more about whether it's possible. I really want to know whether it's possible to do in Python 3, or whether I just have to suck it up and clone it.
Can I just mv a virtualenv like that without sadness? I do want to avoid sadness.
Yes. It is possible to move it on the same platform. You can use --relocatable on an existing environment.
From --help:
--relocatable -- Make an EXISTING virtualenv environment relocatable.
This fixes up scripts and makes all .pth files relative.
HOWEVER, this does NOT seem to change the activate script; it only changes the pip* and easy_install* scripts. In the activate script, the $VIRTUAL_ENV environment variable is hardcoded as the original /path/to/original/venv. The $VIRTUAL_ENV variable is also used to set the PATH of your active environment, so it must be changed to the new location in order to call python, pip, etc. without an absolute path.
To fix this issue, you can change the $VIRTUAL_ENV environment variable in the activate script (for example using sed), and everything should be good to go.
An example of usage:
$ cd ~/first
$ virtualenv my-venv
$ grep 'VIRTUAL_ENV=' my-venv/bin/activate
VIRTUAL_ENV="/home/username/first/my-venv"
$ virtualenv --relocatable my-venv
Making script my-venv/bin/easy_install relative
Making script my-venv/bin/easy_install-2.7 relative
Making script my-venv/bin/pip relative
Making script my-venv/bin/pip2 relative
Making script my-venv/bin/pip2.7 relative
### Note that `activate` has not been touched
$ mkdir ~/second
$ mv my-venv ~/second
$ cd ~/second
$ grep 'VIRTUAL_ENV=' my-venv/bin/activate
VIRTUAL_ENV=/home/username/first/my-venv
### (This variable hasn't been changed, it still refers to the old, now non-existent directory!)
$ sed -i -e 's|username/first|username/second|' my-venv/bin/activate
## sed can be used to change the path.
## Note that the `-i` (in place) flag won't work on all machines.
$ source my-venv/bin/activate
(my-venv) $ pip install foass
...
(my-venv) $ python
[...]
> import foass
Hooray, now you can install things and load them into your newly located virtual environment.
For Python 3.3+ (with new venv built-in module)
Short Answer (regardless of version):
There's no clean, direct way to move a virtual environment
Just recreate, it's easy!!
Long Answer:
As of Python 3.3, a subset of the virtualenv package has been integrated into the standard library as the built-in venv module.
The --relocatable option mentioned in other answers has not been included in venv, and currently there is no good, safe way that I'm aware of to either rename or relocate a Python virtual environment.
However, it is fairly simple to recreate a virtual environment, with all its currently installed packages. See this answer, or see the section below. During the process you can recreate the new environment in whatever location and with whatever name you desire.
In the answer linked above, the author mentions some third-party packages which may support direct renames or moves. If you are set on pursuing a way to move a virtual environment, you could look into whether those work with venv as well.
Note: In that answer, it is focused on virtualenv, rather than venv. See next section for how to translate.
venv vs. older virtualenv command syntax
The command to use venv is:
python -m venv
rather than just virtualenv, which the original package installs as a command. Here "python" refers to however you invoke your Python executable, which could be any of:
python
py, or py -3.7, or similar (the Python Launcher for Windows, bundled with Python 3.3+ on Windows; there is also a py package that can be installed separately on Linux [and macOS?])
python3 (the convention for Linux environments that install Python 2 and 3 side by side)
If you are having issues, use the absolute path to the python executable you want to run: e.g. c:\program files\python37\python.exe
If you are unsure which version is being run, you can always python --version to find out.
How to recreate a virtual environment
Creating or recreating a virtual environment is easy and should become second nature once you've worked with them for a bit. The first half of this process mirrors what you would do to distribute your script as a package (with its dependencies), and the second half mirrors what someone would do to install your script/package for further development.
First, get an updated list of what is in the virtual environment. With it active, get the Python version it uses and save out the list of dependencies to a file.
Use python --version with the virtual environment activated to see what version of Python it is using.
This matters because you may want to update the Python version for various reasons - at least to the latest patch version.
For example, if the existing venv uses Python v3.7.4 but v3.7.6 is now out, use v3.7.6 instead; a patch release should include only non-breaking security and bug fixes.
Use python -m pip freeze > requirements.txt to capture the current package dependencies in the requirements.txt file. This command definitely works in Linux or Git Bash - I'm not 100% sure about PowerShell or the Command Prompt in Windows.
Now create a new virtual environment and then add the dependencies from the old one.
Make your new venv.
Make sure you are using the correct version of python that you want to install to the venv.
If you want it to be exactly the same Python version:
While in the old venv, type "python --version", then make sure you create the new venv with that version of the python command.
For the new venv folder entry in the command:
Either add an absolute or relative path to the desired final folder location.
Use python -m venv my_new_venv to create a new virtual environment in the current working directory in a new my_new_venv folder.
The name of the venv folder will be the name of the venv (what shows up in the prompt when it is activated).
Install your dependencies from the requirements.txt file.
python -m pip install -r requirements.txt
You might need to reinstall local packages that are in development mode.
Note, if you ever need to see the specific location a package is installed to, use:
python -m pip list -v
The -v or "verbose" option will add some extra information about each package that is installed, including the path it is installed in. This is useful to make sure you are keeping virtual, user, and system installed packages straight.
At this point you can just delete the old venv folder and all contents. I recommend using a GUI for that - file deletions are often permanent from the linux command line, and a small typo can be bad news.
BUT ALAS:
No, you can't simply mv. There are workarounds, but it might be easier to reinstall.
(my-python-venv)$ /home/me/PeskyPartyPEnvs/pip3 install foaas
zsh: /home/me/PeskyPartyPEnvs/pip3: bad interpreter: /home/me/Env/my-python-venv/bin/python3: no such file or directory
(my-python-venv)$ deactivate
$
... presses enter a lot in frustration, and the following works
$
$
$ pip3 search foaas
Except it is not from my-python-venv, ergo sadness.
Want to mv your virtualenv and use it, otherwise unmodified?
Short Answer:
Well, ya can't.
The --relocatable argument to virtualenv appears to allow you to do this.
YES, YOU CAN! (In windows)
The workaround is easy, just move your virtual environment anywhere then edit activate.bat inside scripts\:
Move the virtual environment to the desired directory.
Right-click and edit activate.bat located at venv_folder\scripts.
Change VIRTUAL_ENV variable from:
set VIRTUAL_ENV=C:\old_directory\venv_name
into
set VIRTUAL_ENV=C:\new_directory\venv_name
Save the edited batch file, and that's it!
NOTE: My solution should work and save Windows users from setting up new virtual environments. I doubt this will work on other operating systems, since .bat comes from MS-DOS.
Yes, this should be possible if you haven't done anything that depends on the current directory of the virtualenv.
However, if you have the choice, the best thing to do is to create new virtualenv and start using the new virtualenv instead. This is the safest choice and least likely to cause issues later.
The documentation does mention that:
Each virtualenv has path information hard-coded into it,
For example, if you have run setvirtualenvproject then it won't be able to switch to the right directory after you run workon ... so in that case you'd need to fix that manually.
In general a virtualenv is little more than a directory with the necessary Python interpreter files plus packages that you need.
Using answers of this and other threads about similar topic, I've made a bash script that, located and executed within the virtualenv directory itself, will help with your virtualenv moves.
After running virtualenv --relocatable yourenv you'll still need to change the VIRTUAL_ENV variable every time you move the directory, so if you don't want to change it manually, use this.
#!/bin/bash
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
EXISTING=$(grep 'VIRTUAL_ENV=' bin/activate)
NEWDIR=VIRTUAL_ENV=\"$DIR\"
sed -i -e "s|$EXISTING|$NEWDIR|" bin/activate
source bin/activate
I hope it helps.
I wrote a venv-move script.
The first argument is the path to the venv. It deletes any __pycache__ under that path.
It detects the old path, and replaces it with the current path, after confirming. It seems to work okay, even when moving to a different machine of the same type.
It would make sense to re-write this in Python, but the program would be longer.
#!/bin/bash -eu
venv=$1
old=`perl -ne '/VIRTUAL_ENV="(.*?)"/ && print "$1\n"' "$venv/bin/activate"`
new=$PWD/$venv
find "$venv" -name __pycache__ | xargs rm -rf --
files=`fgrep -r "$old" "$venv" -l`
echo "replace $old with $new in:"
echo "$files"
read -p "[yn] ? " YN
[ "$YN" = y ]
sed -i "s:$old:$new:g" $files
TL;DR
virtualenv-clone is included as part of virtualenvwrapper
virtualenv-clone /path/to/old/venv /path/to/new/venv
Alternatively
You could also try cpvirtualenv
cpvirtualenv /path/to/old/venv /path/to/new/venv
But cpvirtualenv expects /path/to/old/venv to exist inside $WORKON_HOME, and if it doesn't, it fails. Since it calls virtualenv-clone anyway, you may as well use that instead, to avoid errors like:
mark#Desktop:~/venvs$ cpvirtualenv ./random/ $WORKON_HOME/random
Copying random as /home/mark/.virtualenvs/venvs/random...
Usage: virtualenv-clone [options] /path/to/existing/venv /path/to/cloned/venv
virtualenv-clone: error: src dir '/home/mark/.virtualenvs/venvs/random' does not exist
Warning as per virtualenvwrapper documentation
Copying virtual environments is not well supported. Each virtualenv
has path information hard-coded into it, and there may be cases where
the copy code does not know it needs to update a particular file. Use
with caution.
What does it actually do ?
As per virtualenv-clone PyPi page
A script for cloning a non-relocatable virtualenv.
Virtualenv provides a way to make virtualenv's relocatable which could
then be copied as we wanted. However making a virtualenv relocatable
this way breaks the no-site-packages isolation of the virtualenv as
well as other aspects that come with relative paths and /usr/bin/env
shebangs that may be undesirable.
Also, the .pth and .egg-link rewriting doesn't seem to work as
intended. This attempts to overcome these issues and provide a way to
easily clone an existing virtualenv.
It performs the following:
copies sys.argv[1] dir to sys.argv[2]
updates the hardcoded VIRTUAL_ENV variable in the activate script to
the new repo location. (--relocatable doesn't touch this)
updates the shebangs of the various scripts in bin to the new Python
if they pointed to the old Python. (version numbering is retained.)
it can also change /usr/bin/env python shebangs to be absolute too,
though this functionality is not exposed at present.
checks sys.path of the cloned virtualenv and if any of the paths are
from the old environment it finds any .pth or .egg link files within
sys.path located in the new environment and makes sure any absolute
paths to the old environment are updated to the new environment.
finally it double checks sys.path again and will fail if there are
still paths from the old environment present.
NOTE: This script requires Python 2.7 or 3.4+

How to disable flask global installation

I have created a Python project using Flask. Let's call it projectA. I ran the command . flask/bin/activate to make it global. I then created another project called projectB and ran the same command to make that Flask installation global. Next I tried to install the python-mysql module in projectB. However, I noticed that it gets installed into projectA.
How to fix this issue?
I assumed that if I could deactivate projectA's global installation, this issue might be fixed. However, I didn't find a suitable command in the Flask documentation. Even though I deleted projectA, it still tries to install the mysql module into projectA.
I'm not sure I've understood this correctly: what are your flask folder's contents?
I ask it because sometimes it is the virtualenv – a lot of Flask tutorials suggest installing a virtualenv under the flask folder.
If that is the case, it doesn't make the project global. On the contrary: it makes your commands use the local version of Python (the one installed inside the flask/bin folder), not the global Python installed within your operating system.
So, your problem might not be with Flask itself, but with the lack of understanding of virtualenv.
When you run . flask/bin/activate inside project A, whatever you do in terms of Python (pip and easy_install included) will affect only the Python installation under project A's flask folder, until you run deactivate. Does that make sense?
So, maybe the command you need is deactivate so you can jump from one virtualenv to another.
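One quick way to tell, from Python itself, whether one of these environments is currently active (sys.base_prefix exists on Python 3.3+; the old virtualenv tool sets sys.real_prefix instead):

```python
import sys

def in_virtualenv():
    # In an active environment, sys.prefix points into the env while
    # sys.base_prefix (or sys.real_prefix, for old virtualenv) still
    # points at the base installation; outside one, they are equal.
    return (hasattr(sys, "real_prefix")
            or getattr(sys, "base_prefix", sys.prefix) != sys.prefix)

print("virtual environment active:", in_virtualenv())
```

Running this from each project's python makes it obvious which environment pip is about to install into.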
And, as a final advice, take some time to study virtualenv and, from there, go to virtualenvwrapper.
