Python system libraries leak into virtual environment

While working on a new Python project and trying to learn my way around virtual environments, I've run into the following problem twice:
I create my virtual environment called venv. Running pip freeze shows nothing.
I install my dependencies using pip install dependency. The venv's library folder starts to populate, as confirmed by pip freeze.
After a couple of days, I go back to my project and activate the virtual environment via source venv/bin/activate. Now, when running pip freeze, I see the whole list of libraries installed in the system Python distribution (I'm on OS X 10.9.5) instead of the small subset I wanted to keep inside my virtual environment.
I'm sure I must be doing something wrong in between, but I have no idea how this could happen. Any ideas?
Update:
After looking at this answer, I realized that when running pip freeze, the pip command being invoked is the one at /usr/local/bin/pip instead of the one inside my virtual environment. So the virtual environment is fine, but I wonder what changes in the path might be causing this, and how to prevent it from happening again (my PYTHONPATH variable is not set).
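For reference, a quick way to check which pip is actually being invoked (this is how the wrong path above showed up for me):
$ which pip
/usr/local/bin/pip
With a correctly activated environment, this should instead print a path inside venv/bin.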

I realized that my problem arose when I moved my virtual environment folder around the system. The fix was to modify the activate and pip scripts located inside the venv/bin folder so that they point to the new venv location, as suggested by this answer. Now pip freeze is showing the right packages.
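For anyone hitting the same issue after moving a venv, these are the two places that hard-code the old location (the path below is a placeholder for wherever your venv now lives):
# venv/bin/activate sets the environment root:
VIRTUAL_ENV="/new/absolute/path/to/venv"
# venv/bin/pip (and the other scripts in venv/bin) begin with a shebang
# that must point at the moved interpreter:
#!/new/absolute/path/to/venv/bin/python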

Related

Waitress installed via pip, but cannot be executed

Overall, pip is working fine on my server. I have just installed the package waitress and the installation seems successful. I checked it with pip freeze:
$ pip freeze | grep waitress
waitress==2.1.0
Waitress also can be imported via python3:
>>> import waitress
>>>
However, waitress-serve cannot be executed:
$ waitress-serve
Command 'waitress-serve' not found, but can be installed with:
apt install python3-waitress
Please ask your administrator.
I am not a root user on this server. Could this be a reason why the package was installed partially, or am I speculating here?
Since I am not authorized to run apt install and since the simple pip install worked in my virtualenv, I would like to be able to get this to work without using the suggested apt install python3-waitress command.
The conclusion is that, while it is installed both inside and outside the virtualenv, it is only actually executable inside it.
When an installable Python package gives you an actual entry point (an executable command), it generally will not be on your path.
When you use a virtual environment, activating the environment puts various parts of that environment onto the path temporarily. This is intended to ensure that commands like python or python3 run the environment's Python, but it also allows those entry points to be found on the path.
A system Python installation (by which I mean not just a Python that comes with your operating system, but also one that you install manually after the fact - not a virtual environment) will generally not have its library folders on the path by default - only enough to make python and pip work. (On Windows, often even these are not added to the path; instead, a program py is placed in the Windows installation folder, and it does the work of looking for Python executables.) Even if you are allowed to install things directly into a system Python (and you should normally avoid doing so, even if you're allowed to), their entry points still won't be findable on the path.
Of course, you could execute these things just fine by explicitly specifying their paths. However, the normally correct approach is to just ensure that, when you want to run the program, the same virtual environment is activated into which you installed the package.
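For example, assuming the environment lives in a folder called venv inside your project (adjust the path to match yours), the entry point can be run directly without activating anything:
$ ./venv/bin/waitress-serve --help
Console scripts are ordinary files in the environment's bin directory (Scripts on Windows), so calling one by its full path works the same as activating the environment first.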
(On my system, I have one main "sandbox" virtual environment that I use for all my projects - unless I am specifically testing the installation process, or testing how the code works on a different version of Python. Then I use a wrapper script to open a terminal window, navigate to a folder that contains all my projects, and activate the environment.)
If waitress is installed inside a virtual environment, I think you may have accidentally (or not) gotten out of said virtual environment.
If you are running a virtual environment, you can try the following commands, one after the other:
source venv/bin/activate  # venv is assumed to be the name of the virtual environment you are using
pip install waitress
waitress-serve
Any time you need to use waitress, you will need to activate the virtual environment again:
source venv/bin/activate
waitress-serve
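Note that waitress-serve also needs to be told which WSGI application to serve; the module and callable names here are placeholders for your own app:
waitress-serve --listen=127.0.0.1:8080 myapp:application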
Please note that I am assuming you are running in a Linux environment.
If this is not the issue you are facing, then feel free to expand a bit more on your question.
Edit: Installing with pip and running it worked perfectly in my virtualenv, running on Python 3.8.10.

Virtualenv [duplicate]

This question already has answers here:
How can I set up a virtual environment for Python in Visual Studio Code?
(23 answers)
Closed 2 years ago.
I just wanted to set up my new MacBook Pro M1. As I want to keep my machine organized this time, I want to start using virtualenv.
So, what I've done so far:
installed brew
installed virtualenv
set up a dir and created my first env, called sec_env, in there
installed some packages for testing
Now I want to use my virtualenv:
I started it: source sec_env/dir/activate
AND now here we go: I want to code something in this env. So I start up my code-insiders and try to import the package I already installed... it does not work ;( (EDIT1: Maybe I failed to configure it inside VS Code?)
Do I misunderstand the use of virtualenv? I thought of it kind of like a virtual machine... so I can install the packages needed for one project and code it. But if I work on another, I would just switch, start up my VS Code again and keep writing on the other project.
Or is the problem just that all the projects I want to code have to be inside the dir of the virtualenv (sec_env)? At the moment, I have a dir virtualenvs where I store all my environments; I start one up and change to the desktop to work. And all the projects are on my desktop.
It would be awesome if someone could give me any tips on this, or another way to separate my different projects. I am super new to this topic, since I used different VirtualBox images before... now I am forced to use something else... M1 :D !
Your understanding is generally correct: virtualenvs are a way to keep projects' dependencies separated from each other, like a VM would.
Your code doesn't need to be in the same directory as your virtual environment, but many people tend to organize it that way for the sake of convenience. That way you don't need to think about which venv you coded a project with, since it's right there in the directory.
With your steps, I think you installed a package before activating the environment. Doing it in that order installs the package into your system site-packages, not your virtual environment's packages. Before you install a package, you need to activate your environment. Also, it appears from How to tell Homebrew to install inside virtualenv? that Homebrew doesn't support installing a package into a virtual env. So in order to install packages into a virtualenv, I would suggest using pip as your package manager.
So the sequence of commands would be...
source <path to virtualenv>/dir/activate
pip install <modules you want to install>
# Now you can run your code that references those installed modules.
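If the import still fails inside VS Code specifically, the editor may be running a different interpreter than the one you activated in the terminal. A quick sanity check you can run from whatever Python VS Code launches (assuming your env is named sec_env):
import sys
print(sys.executable)  # should point inside sec_env, e.g. .../sec_env/bin/python
print(sys.prefix)      # should be the sec_env directory, not the system Python
If it points at the system Python instead, use the "Python: Select Interpreter" command in VS Code to pick the environment's interpreter.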

Reuse PyCharm Python virtual environment in Cygwin?

I have created a PyCharm project containing several Python scripts; it's using a virtual environment. Everything is set up and running. Windows 10.
I would now like to run the same Python scripts from within the Cygwin command line. Is there a way to reuse the virtual environment created by PyCharm (C:\Users\joe_doe\.virtualenvs\prj_name)?
I would say: no, I don't believe it is possible, and even if it were it is not worth the trouble.
Virtual environments should probably be considered throwaway things. Use something like pip freeze > requirements.txt to save the list of packages installed in the virtual environment, and then pip install --requirement requirements.txt to install those packages in a new environment. It is a good habit to curate the list of requirements, and one should be comfortable deleting and recreating virtual environments on a whim without fear of losing any information.
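A sketch of that round trip, with placeholder names:
# inside the old environment
pip freeze > requirements.txt
deactivate
# create and populate a fresh one
python -m venv fresh_env
source fresh_env/bin/activate    # on Windows: fresh_env\Scripts\activate
pip install --requirement requirements.txt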

Will libraries installed to my "base" system Python be available in a virtualenv?

I'm new to python, so please be gentle.
In learning python and writing my first few scripts, I quickly glossed over any tutorial sections on virtualenv, figuring it wouldn't provide me any benefit in my nascent stage.
I proceeded to hack away, installing packages as I went with pip3 install package
Now I've built something that is potentially useful to my organization, and I'd like to share it. In this case, I want to distribute it as a Windows executable.
Before building this distribution, I figure it's now time to take the next leap from individual scripts to proper python projects. It seems like virtualenv should be part of that.
Given that I've installed a number of packages to my "base" python environment: in order to do development in a "clean" virtual environment, do I need to somehow "revert" my base python environment (i.e. uninstall all non-standard packages), or will virtualenv shield a project within a virtual environment from non-standard packages installed to my "base" environment?
If you are using the venv module, there is a --system-site-packages flag that will grant the created virtual environment access to the system-wide site-packages directory:
--system-site-packages
Give the virtual environment access to the system
site-packages dir.
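For example (the environment name here is arbitrary):
python -m venv --system-site-packages my_env
With that flag, the environment can see packages from the base installation, while anything you pip install inside it still goes into the environment itself.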
Go install virtualenvwrapper first. After that, create a new virtualenv, activate it, and run pip freeze. You should see nothing in there, because nothing is installed yet. Deactivate the env to go back to your 'base' environment and run pip freeze again. You will see all the installs you have.
A best practice is to create a requirements.txt file and version control it so everyone can use the same versions of the same packages. If you don't want to do this, simply activate your new virtual env and pip install everything you want.
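With virtualenvwrapper installed, the sequence looks roughly like this (the environment name is arbitrary and the prompts are illustrative):
$ mkvirtualenv myenv    # creates and activates the environment
(myenv) $ pip freeze    # empty: nothing installed yet
(myenv) $ deactivate
$ pip freeze            # back in the base environment: all your existing installs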
Alternatively, you can specify the required libraries separately, check whether they are installed, and install them automatically if they are not.
Have a look at:
https://packaging.python.org/discussions/install-requires-vs-requirements/
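The short version of that page: install_requires in your packaging metadata declares abstract dependencies, while requirements.txt pins a concrete environment. A minimal, hypothetical setup.py sketch:
from setuptools import setup

setup(
    name="myproject",            # placeholder project name
    version="0.1",
    install_requires=[
        "requests>=2.0",         # abstract, range-style dependency
    ],
)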

Create a virtual environment based on another virtual environment with Canopy

In my company I have a setup where I have an original Canopy distribution installed. Through some batch process, a virtual environment is then created from that, which contains additional Python packages.
The virtual environment works fine from pycharm, however, I have the following problems:
When starting pip or python from the command line, the original Canopy installation seems to be started. Am I right in thinking that 'activating' the virtual environment simply means adjusting the path variables to point to the folders of the virtual environment? How is this best done automatically? Does Canopy or Python provide a good script? I want pip to install packages into the virtual environment, which it currently doesn't.
What is the best way to create a new virtual environment based on the virtual environment I already have?
I know that with Anaconda this would all be easier, but my solution needs to be based on pure Python or Canopy.
Not sure about your specific environment, but for Python projects, I usually get by with
pip freeze > requirements.txt
to save the list of packages installed in a virtual environment to a file
and
pip install -r requirements.txt
to restore the packages on a new virtual environment.
I've used requirements.txt as the filename, but you can pretty much use any file name you want for this.
