Every time I want to do something with the terminal I have to type this sequence (I'm on the Mac OS X Lion terminal):
> cd Public/projects/installs # location of my venv
> . venv/bin/activate # activates the venv within the terminal
Is there any way to do this faster, or to create a custom function/command in the terminal?
There is virtualenvwrapper.
It allows you to switch virtualenvs by typing workon <env_name>. You create virtualenvs by mkvirtualenv <env_name> or mkproject <project_name> if you have set up a PROJECT_HOME and want the working directory there.
You can do a lot more than just switch venvs, though. For example, you can set up hooks that run for every new venv (installing IPython if you want to, setting up a .hgignore) and hooks that run when activating one (e.g. setting the PATH if you have things installed via npm).
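As a minimal sketch of such a hook (assuming the default $WORKON_HOME of ~/.virtualenvs): virtualenvwrapper runs $WORKON_HOME/postmkvirtualenv after every mkvirtualenv, with the new venv already active, so anything installed there ends up in each new venv:
# ~/.virtualenvs/postmkvirtualenv -- sourced after each new venv is created and activated
pip install ipython    # example: give every new venv IPython by default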
In addition to virtualenvwrapper (already described in two other answers), you might want to check out autoenv. That lets you get into a venv just by doing a cd to its directory.
For fancy stuff, there are a lot of differences between the two projects, and I think virtualenvwrapper is generally more powerful and flexible. But for simple use cases like yours, the choice comes down to which of these you'd prefer:
$ workon projects_installs
… or
$ cd Public/projects/installs
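With autoenv, the per-directory setup is just a file named .env in the project directory, which autoenv sources automatically whenever you cd into that directory. A minimal sketch, assuming the venv lives in venv/ as in your question:
# Public/projects/installs/.env -- sourced by autoenv on cd
source venv/bin/activate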
Check out virtualenvwrapper.
It can be installed with pip install virtualenvwrapper and requires adding a few lines to your .bashrc file. You then get the mkproject and workon commands, which make creating and switching virtualenvs much easier.
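As a rough sketch, the .bashrc additions usually look like this (the exact path to virtualenvwrapper.sh depends on how and where pip installed it):
export WORKON_HOME=$HOME/.virtualenvs      # where the virtualenvs are stored
export PROJECT_HOME=$HOME/projects         # optional, used by mkproject
source /usr/local/bin/virtualenvwrapper.sh # path varies by installation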
Related
I created an environment using virtualenvwrapper while my machine was running Python 3.8.3. Naturally, this made the environment with Python 3.8.3 as well.
Later, I updated Python on my main machine to 3.10 using Homebrew.
Now I have 3.10-specific code in that project, while the virtual env is still running 3.8.3.
The first entry in that project's $PATH is the virtual env itself, which uses the old Python. The second entry is Python 3.10. I believe this is standard; the virtual env is added to the front of $PATH by virtualenvwrapper upon activation.
Short of manually editing .zprofile or a virtualenvwrapper postactivate script, is there a more sweeping and automatic way of updating a virtual environment with virtualenvwrapper?
I am no expert on this stuff, so my concern is that manually editing files or applying one-time fixes will just be asking for trouble down the line.
Thanks for your help.
EDIT: I wanted to add that I am also learning git and have a repo set up in this project. Ideally I would like to preserve that information through the "upgrade," which it sounds like involves creating a new env. The virtualenvs are stored in the .virtualenvs directory under my home directory; the git repo is in the project directory.
You don't want to do this. As you said, even if you pulled it off you're sure to have hidden issues that'll be a major headache down the line. Fortunately, it's very easy to recreate the virtual env with exactly the same installed packages you had before but with a new Python version.
What you want is to compile a list of the packages installed in your old virtualenv, make your new venv with the desired Python version, then reinstall all the packages. You can do that like this:
workon oldenv                                # activate the old environment
pip freeze > env_requirements.txt            # record its installed packages
deactivate
mkvirtualenv newenv -p `which python3.10`    # create (and auto-activate) the new env with the new Python
pip install -r env_requirements.txt          # reinstall everything into the new env
If you're happy with the result, you can then delete the old venv:
rmvirtualenv oldenv
As to your concern with git, this should have absolutely no impact whatsoever on your git repo.
I use Python and pip through the asdf version manager. On my older PC, when I created a new venv it already came with some default packages. I'm now using a newer version of Python (3.9.12) and pip (22.0.4), and it doesn't come with basic and essential things such as gnureadline, which is needed to see the command history in Python's shell. It's really annoying having to install it again in every new venv, and I'm trying to avoid an alias that creates the venv and then installs what I want inside it. Is there a way to set some default packages for all venvs?
There is an option:
--system-site-packages
        Give the virtual environment access to the system site-packages dir.
So if you create your venvs with python3 -m venv --system-site-packages .venv, then you can install the packages you want available to all envs at the system level. You'll have to be hygienic about the packages installed at the system level.
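For example, a minimal sketch using gnureadline from your question (the exact python3 on your path depends on your asdf setup):
python3 -m pip install gnureadline              # install once at the asdf/system level
python3 -m venv --system-site-packages .venv    # venvs created like this can see it
.venv/bin/python -c "import gnureadline"        # available without reinstalling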
If the system python on your distribution is a hot mess, and you can't remove things you don't want visible in venvs from it, then you'll want to look for another option.
One possibility is just to install common packages to some target directory:
python3 -m pip install dep1 dep2 --target=/path/to/common
And then make this directory of packages always visible:
export PYTHONPATH=/path/to/common
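To make that stick across sessions, you can put the export in your shell profile. A quick sketch (dep1 is just the placeholder package from above, and the profile file depends on your shell):
echo 'export PYTHONPATH=/path/to/common' >> ~/.zshrc   # or ~/.bashrc
python3 -c "import dep1"    # works inside any venv, since PYTHONPATH is always on sys.path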
I'm a beginner to Django and Python, and I've never used virtualenv before. I do know the exact commands to activate and deactivate virtual environments (from an online search). However, this learning course takes time and sometimes I need to split the work over two days.
When I create a virtualenv today and do some work, I'm unable to access the same virtualenv tomorrow. Even when I navigate to that folder and type .\venv\Scripts\activate, it says "the system cannot find the path specified".
How can I open already existing virtual environments and the projects within them? Could it be that I need to end my previous session in a certain way for me to access it the next time?
Even though pipenv has plenty of problems, I suggest you use it while you are new to virtual envs.
Just run:
pip install pipenv
cd $your-work-directory
pipenv shell
That creates your project env.
You can activate it with:
cd $your-work-directory
pipenv shell
You can install packages by:
cd $your-work-directory
pipenv install $yourpackage --skip-lock
Open the command prompt
Go to the project directory where you created the virtual environment
and type the path that the error shows. In my case the error was
File D:\Coding files\Python\recommendation-system\venv\Scripts\activate.ps1 cannot be loaded because running scripts is disabled on this system.
So I typed recommendation-system\venv\Scripts\activate.ps1
and that resolved my problem.
Use this and it will work:
cd $your-work-directory # the working directory that holds your virtual env
pipenv shell
You can reuse your existing venv with this command; it worked for me:
$ workon <name_of_your_existing_venv>
There are a few Python dependencies that I would like to be available in every venv (virtual environment) that I create for each project. For example black, flake8 and pytest. Is that possible and if so, how to achieve that?
I'd like to install these three once under my main Python installation; instead, I have to reinstall all of them in every venv that I create when I start a new project. This is especially annoying when using VSCode, which throws popups complaining that "Linter flake8 is not installed" or "...black is not installed", etc., when you switch to a venv where you haven't installed these packages.
Let me answer my own question, based on a comment from @jonrsharpe.
Assuming you want to have black, flake8 and pytest available 'globally' or in other words you want to have these packages in every new venv that you create but don't want to repeat pip install black flake8 pytest each and every time. Here's what you can do:
install the packages once under your main Python version (the one you'd like to use for your venvs; note that you may have several Python versions installed)
when creating a new venv use --system-site-packages option. For example:
python -m venv --system-site-packages .venv/dev
activate your venv, i.e. source .venv/dev/bin/activate, and check with pip list that the packages are available
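Put together, the whole flow is just a few commands (a condensed sketch of the steps above; .venv/dev is only an example location):
python -m pip install black flake8 pytest          # once, under your main Python
python -m venv --system-site-packages .venv/dev    # new venv can see system site-packages
source .venv/dev/bin/activate
pip list | grep -E 'black|flake8|pytest'           # confirm the tools are visible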
One possible solution would be:
Deactivate the virtual environment.
Install all the packages you need to be available globally.
Activate the virtual environment again.
Make sure the environment is set to inherit the global site-packages (the --system-site-packages option).
Note: Please refer to these SO threads as well for more information.
How do I install a pip package globally instead of locally?
What does sudo -H do?
Don't use sudo pip install
If you use sudo pip install black, it will pollute your global Python installation. You don't want to do that.
You also don't need to use --system-site-packages in your virtualenvs. You can use it for other reasons, but using it just so that black etc. can work is a bad idea.
Intermediate solution: Use pip install --user
Deactivate your virtualenv, then execute pip install --user black (--user is the default in many systems when pip isn't in a venv/virtualenv and isn't run as root). This will install black somewhere in the user's home directory, such as $HOME/.local.
You will subsequently be able to run black regardless which venv/virtualenv you have activated, and regardless whether the venv/virtualenv has been created with --system-site-packages.
Why this works: If you wanted to import black in your code, this wouldn't work in a venv/virtualenv (unless with --system-site-packages), because black is installed "system-wide", albeit in a user's home dir ("user-wide" would be a more correct term in this case). However you don't want to import black; what you want is to execute the black command, and this will work regardless which venv/virtualenv you have activated, because to your shell black is just a command it can find in the path, just like ls or more (it's in $HOME/.local/bin, which should be in the PATH). This is true of pretty much everything you want to install "system-wide" (i.e. outside of any venv/virtualenv): shell commands like black, flake8 and pytest.
If you look at $HOME/.local/bin/black, you'll see that its first line is something like #!/usr/bin/python3. This means it uses the system-wide python installation and not a venv/virtualenv. (If it was #!/usr/bin/env python then it would use the activated venv/virtualenv instead.)
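You can see both effects from the shell; for illustration (the exact paths will differ on your machine):
source .venv/dev/bin/activate
which black                # -> $HOME/.local/bin/black, still found on PATH inside the venv
head -1 "$(which black)"   # -> something like #!/usr/bin/python3, i.e. the system interpreter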
Best solution: Use pipx
The problem with the above solution is that it pollutes your "user-wide" python installation, which is almost like polluting your system-wide installation. You can do this instead (with venvs/virtualenvs deactivated):
pip install --user pipx
pipx install black
What pipx does is it creates a venv specifically for black and installs black in that venv. The $HOME/.local/bin/black executable that it creates is such that it runs black in that venv. pipx uninstall black removes both the executable and the venv.
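The same applies to the other tools from the question; each one gets its own isolated venv:
pipx install flake8
pipx install pytest
pipx list    # shows each tool and the venv it was installed into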
Further reading
If there's still something unclear, my article virtualenv demystified may help.
I'm attempting to create a virtual environment (and store my Python scripts) in the directory /Users/MyName/Desktop/TestFolder. Specifically, I'd like to use numpy 1.8.0 in the virtual environment.
My default project interpreter in PyCharm is ~/anaconda/bin/python. How do I do this in PyCharm?
Use the venv module to create an isolated environment:
python3 -m venv ~/VirtualPython
~/VirtualPython/bin/pip3 install numpy # and whatever else you want as dependencies
Make PyCharm be using that virtualenv: https://www.jetbrains.com/help/pycharm/2017.1/creating-virtual-environment.html
Use ~/VirtualPython/bin/python3 as interpreter
More about virtualenv: https://docs.python.org/3/library/venv.html
Don't be afraid to use your new venv. A common fear is "do I need to do some magic before the virtualenv's scripts are really virtual?" The answer is no: when you run ~/VirtualPython/bin/python3 my_script.py, it is always executed by the virtualenv's interpreter.
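For instance, you can confirm which environment actually runs your code (assuming numpy was installed into ~/VirtualPython as above):
~/VirtualPython/bin/python3 -c "import sys, numpy; print(sys.prefix, numpy.__version__)"
# prints the venv's path (~/VirtualPython) and the numpy version installed there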