I'm attempting to create a virtual environment (and store my Python scripts) in the directory /Users/MyName/Desktop/TestFolder. Specifically, I'd like to use numpy 1.8.0 in the virtual environment.
My default project interpreter in PyCharm is ~/anaconda/bin/python. How do I do this in PyCharm?
Use the built-in venv module to create an isolated environment:
python3 -m venv ~/VirtualPython
~/VirtualPython/bin/pip3 install numpy # and whatever you want to have as dependencies
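Since the question asks for numpy 1.8.0 specifically: pip can pin an exact version with ==, assuming that release is still available for your interpreter:
~/VirtualPython/bin/pip3 install numpy==1.8.0 # pin the exact version the project needs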
Make PyCharm use that virtualenv: https://www.jetbrains.com/help/pycharm/2017.1/creating-virtual-environment.html
Use ~/VirtualPython/bin/python3 as interpreter
More about venv: https://docs.python.org/3/library/venv.html
Don't be afraid to use your new venv. A common worry is whether you have to do something special before the venv's scripts are really "virtual". The answer is no: when you run ~/VirtualPython/bin/python3 my_script.py, it will always be executed by the virtualenv's interpreter.
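If you want to convince yourself of that, sys.prefix points at the environment the running interpreter belongs to; a quick check (the output shown is illustrative):
~/VirtualPython/bin/python3 -c "import sys; print(sys.prefix)"
# prints the venv path, e.g. /Users/MyName/VirtualPython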
How does one create a virtual environment that runs on its own executable?
I used Anaconda to create a new venv (to experiment with installing new modules) by cloning an old venv:
conda create --prefix ./new_env --clone C:\full_path_to_old_env\old_env
But the new venv is still using the "python.exe" from the old venv, as can be seen by calling
import sys
print(sys.executable)
Because of that, when I install new modules, using for example
pip install selenium
inside the new venv, it gets installed in the folder of the new venv (as expected) but is unreachable for the old executable.
I saw a similar question addressed here (Changing Python Executable) by changing where new modules are installed, but that defeats the idea of independent environments.
The only way I found for the venv to run its own copy of Python is using
sys.executable = r'C:\full_path_to_new_venv\python.exe'
at the beginning of the notebook, but this seems more like a forced patch than a solution.
It also gets overridden every time a new venv session starts, so every notebook running in the new venv would need it.
I found what was wrong: I was using Jupyter Notebook to run the Python virtual environments, but I had not installed an IPython kernel into the new venv.
Apparently, in that case Jupyter doesn't throw an error when a kernel exists somewhere else; it just runs the kernel it found.
Hence, the fix is to install an IPython kernel into the new venv:
$ ipython kernel install --name .venv --user
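As a hedged sanity check, these are standard pip/Jupyter commands (.venv is the kernel name used above):
$ pip install ipykernel # run with the new venv's pip; provides the kernel machinery
$ jupyter kernelspec list # the ".venv" kernel should now appear in this list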
I need Python 3.6 for a TensorFlow installation, so I downloaded python3.6.12.tar, and I gathered that I should pip install that tar file. However, in this case it is an older version of Python. FYI, on my computer (laptop) I have Python 3.9 installed.
My question is: can I pip install python.tar inside a virtualenv?
This is not how virtual environments work. I suggest you do a bit more research on virtual environments in Python.
Virtual Environments and Packages
Basically you need to install the necessary Python version onto your machine. Then go ahead and use that specific Python (which is version 3.6 in your case) to create a virtual environment with the command
$ /usr/bin/<path-to-python3.6> -m venv venv
This command will create a folder called venv. Now you need to source the activation script inside this folder to activate your environment.
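On Linux/macOS that typically looks like this (venv is the folder name from the command above):
$ source venv/bin/activate # prompt gains a (venv) prefix
$ python --version # should now report Python 3.6.x
$ deactivate # leave the environment when done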
Handy note: if you are dealing with different versions of Python, a more robust way of handling such situations is to use a tool called pyenv, as sketched below.
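A minimal sketch of that pyenv workflow, assuming pyenv is already installed (3.6.12 is the version from the question):
$ pyenv install 3.6.12 # build this Python under ~/.pyenv
$ pyenv local 3.6.12 # pin it for the current project folder
$ python -m venv venv # now creates a Python 3.6 virtual environment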
After moving over to Fedora (from Windows), I realized that it came with both installations of Python 2.7.5 and Python 3.6.6.
As I familiarized myself with using Python, I learned of the great utilities of virtual environments and how organized they keep everything.
However, my current dilemma is for which Python version I should run pip(2 or 3) install virtualenv virtualenvwrapper.
From my research, I understand that the virtualenvwrapper provides the ability to create a virtual environment using a specified version of Python: mkvirtualenv -p /usr/bin/python(2 or 3) {name}.
Therefore, should I only install virtualenv and virtualenvwrapper for one of the Python versions and use the aforementioned feature? Or should I install virtualenv and virtualenvwrapper for both versions of Python?
Would there be any conflicts?
Edit
More importantly, assuming that I have virtualenv and virtualenvwrapper installed for both Python 2.7.5 and Python 3.6.6, which version's command is called when I run any of the following: workon, mkvirtualenv, rmvirtualenv, etc.?
Would there be any conflicts?
Not unless you mistakenly run the default system python command on a script written for the other version, rather than the more specific python2 or python3 commands.
The virtualenvs do not conflict, and must be activated to be used. You can of course have as many virtualenvs as you wish.
To avoid any problems setting up an environment, it's suggested to run, for example, python2 -m virtualenv rather than the bare virtualenv command itself.
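For example (the environment names are just placeholders, and this assumes virtualenv is installed for both interpreters):
$ python2 -m virtualenv env27 # unambiguously tied to Python 2
$ python3 -m virtualenv env36 # unambiguously tied to Python 3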
For the commands listed at the bottom of the question, it depends on how your PATH is configured. Personally, I use pyenv rather than virtualenv directly, which injects itself into the OS PATH variable.
In my company I have a setup where an original Canopy distribution is installed. Through some batch process, a virtual environment is then created from it which contains additional Python packages.
The virtual environment works fine from PyCharm; however, I have the following problems:
When starting pip or python from the command line, the original Canopy installation seems to be started. Am I right in thinking that 'activating' the virtual environment simply means adjusting the path variables to point at the virtual environment's folders? How is this best done automatically? Does Canopy or Python provide a good script? I want pip to install packages into the virtual environment, which it currently doesn't.
What is the best way to create a new virtual environment based on the virtual environment I already have?
I know that with Anaconda this would all be easier, but my solution needs to be based on pure Python or Canopy.
Not sure about your specific environment, but for python projects, I usually get by with
pip freeze > requirements.txt
to save the list of packages installed in a virtual environment to a file
and
pip install -r requirements.txt
to restore the packages on a new virtual environment.
I've used requirements.txt as the filename, but you can pretty much use any file name you want for this.
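For reference, the file is just one pinned requirement per line; a hypothetical example (package versions invented for illustration):
$ cat requirements.txt
numpy==1.16.4
selenium==3.141.0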
I'm developing a Python project on my desktop (OS X) and it's running well. Now I need to run it on a computing cluster that runs Linux, where I have no root access. In my home directory on the cluster I installed Anaconda locally, but when I ran my project there I got many strange bugs and errors. The code runs fine on my desktop, so I suspect the problem is that Anaconda has newer modules than the ones I have, and that is causing the errors (backward compatibility issues).
Is there any way to use the same python modules versions I have in my desktop on the computing cluster?
I wonder if there is a command that lists all the packages and modules I have on my desktop with their versions, so I could import that file into some tool on the Linux machine and get a cloned version of my setup. Is there something like that?
Use pyenv to control the python version
pyenv is designed for installing python into the home folder so you don't need root access (but even with root access it's a great idea), it lets you use a specified version of python within your home folder hierarchy rather than the OS version. You should also install pyenv virtualenv so you can use a virtual environment for your project (not strictly essential, but virtual environments are a great idea and you should always use one, with pyenv they're practically effortless).
One of the neat things about pyenv is the pyenv local command, which specifies which version of Python (or which virtualenv) should be used for a folder and its subfolders. Once you've used pyenv local in your project folder to set the Python version, any time you use the python command it'll use the version set by pyenv local. It's not needed if you only install one version of Python and don't use a virtualenv (in that case you can use pyenv global to set the version for the user). The neatest thing about the pyenv local/global setup is that it works great with both scripts and manually invoked python: it is simply set-and-forget, unlike other Python environment managers which require activation.
In brief, once you've set up pyenv you control exactly what version of python will run, and as it is installed into the home folder the OS has no power to affect it.
Having installed pyenv and pyenv-virtualenv, you would then use them to install the same version of Python as is used on your development machine; the commands you'd run would be something like this:
pyenv install 3.4.2
pyenv virtualenv 3.4.2 my_project_env
cd my_project
pyenv local my_project_env
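As a sanity check, pyenv local records its choice in a .python-version file in the project folder, and the plain python command now resolves through it (output shown matches the versions above):
$ python --version # Python 3.4.2, via my_project_env
$ cat .python-version # written by pyenv local
my_project_env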
Install modules into python environment
To get a list of Python module versions, use pip freeze; you can do this on your development machine:
pip freeze > requirements.txt
Now copy requirements.txt to your deployment machines (with pyenv already set up using pyenv local or pyenv global) and run:
pip install -r requirements.txt
This will install the same module versions into the Python environment.
Duplicating a pyenv
While it's kind of dodgy, once you've done this you can even copy the entire installation (i.e. at least the .bashrc file and the .pyenv folder) onto other machines, and if the machines have the same OS and the home folders have the same name, the transplanted environment should work fine. It might be more responsible to use a setup script, but I have copied pyenv environments without anything terrible happening.
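If you do go the copying route, something like rsync preserves the symlinks pyenv relies on (the user and host names here are placeholders):
$ rsync -a ~/.pyenv user@cluster:~/ # -a keeps symlinks and permissions intact
$ rsync -a ~/.bashrc user@cluster:~/ # or merge the pyenv init lines by hand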
If you have pip installed, you could try pip freeze.
If you want to get a list of modules from the Python shell:
help('modules')