I just wanted to set up my new MacBook Pro M1. As I want to keep this machine organized, I want to start using virtualenv.
So, what I've done so far:
installed brew
installed virtualenv
set up a directory and created my first environment in it, called sec_env
installed some packages for testing
Now I want to use my virtualenv:
I activated it: source sec_env/bin/activate
And now, here we go: I want to code something in this env. So I start up VS Code Insiders and try to import the package I already installed... it does not work. (EDIT 1: Maybe I failed to configure it inside VS Code?)
Do I misunderstand the use of virtualenv? I thought of it kind of like a virtual machine: I can install the packages one project needs and work on it, and if I work on another project, I would just switch environments, start up VS Code again, and keep writing the other project.
Or is the problem just that all the projects I want to code have to be inside the directory of the virtualenv (sec_env)? At the moment, I have a directory virtualenvs where I store all my environments; I start one up and change to the desktop to work. All the projects are on my desktop.
It would be awesome if someone could give me any tips on this, or another way to separate my different projects. I am super new to this topic: I used different VirtualBox images before, but now the M1 forces me to use something else. :D
Your understanding is generally correct: virtualenvs are a way to keep projects' dependencies separated from each other, much like a VM would.
Your code doesn't need to be in the same directory as your virtual environment, but many people tend to organize it that way for convenience. That way you don't need to think about which venv a project was developed with, since it's right there in the directory.
Looking at your steps, I think you installed the packages before activating the environment. Doing it in that order installs them in your system site-packages, not your virtual environment's packages; you need to activate the environment before installing. Also, it appears from How to tell Homebrew to install inside virtualenv? that Homebrew doesn't support installing a package into a virtual environment, so I would suggest using pip as your package manager for anything that should live inside the virtualenv.
So the sequence of commands would be...
source <path to virtualenv>/bin/activate
pip install <modules you want to install>
# Now you can run your code that references those installed modules.
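To confirm which environment you are actually using, here is a quick sanity check from the activated shell (requests is just an example package name):

which python          # should resolve to .../sec_env/bin/python
pip -V                # the reported path should also be inside sec_env
pip install requests  # example package; it installs into sec_env only

Regarding your EDIT 1: VS Code runs whatever interpreter is selected for the workspace, so you also need to point the editor at the environment via the Python: Select Interpreter command; otherwise imports that work in your terminal will fail in the editor.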
I am completely new to PyCharm, so this might be obvious, but currently every time I load a project in PyCharm I need to reinstall the packages. Is there a way to load them automatically, or to keep them on the project?
Where are you installing the packages: into the system Python interpreter or into a virtual environment? If you create a fresh virtual environment for every project, you need to install the packages every time; if you use the system Python interpreter, a one-time installation is enough and you don't need to install the same package again.
PyCharm's default is to create a fresh virtual environment for each new project.
That is safe, but it means you need to reinstall packages.
See below how to select an existing environment instead.
So you could set up one environment, install what you usually need, and always use that. This is fast, but comes with a big downside: later updates to this shared environment may break programs you wrote earlier.
When creating the interpreter, there is a checkbox labeled Make this environment available to all projects, which will help you do this.
I'm working on a script in Python that relies on several different packages and libraries. When this script is transferred to another machine, the packages it needs in order to run are sometimes not present, or are older versions that do not have the same functionality, which causes the script to fail.
I was considering using a virtual environment, but I can't find a way to have the script use the specific environment I design as its default; in order to use the environment, a user must manually activate it from the command line.
I've also looked into trying to check the versions of the packages installed on the machine, and if they are not sufficient then updating them from the script as described here:
Installing python module within code
Is there any easier/surefire way to make sure that the needed packages will always be available regardless of where it's run?
The normal approach is to create an installation script and have that manage your dependencies. Then when you move your project to a new environment your installer will check that all dependencies are present.
I recommend you check out setuptools: https://setuptools.readthedocs.io/en/latest/
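A minimal setup.py sketch of that approach (the project name and dependency list below are hypothetical):

from setuptools import setup, find_packages

setup(
    name="myscript",          # hypothetical project name
    version="0.1.0",
    packages=find_packages(),
    install_requires=[
        "requests>=2.20",     # example pinned dependencies
        "numpy>=1.15",
    ],
)

With this in place, pip install . pulls in the listed dependencies automatically on the target machine.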
If you don't want to install dependencies whenever you need to use your script somewhere new, then you could package your script into a Docker container.
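A minimal Dockerfile sketch for that route (the image tag and file names are placeholders):

FROM python:3.9-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY myscript.py .
CMD ["python", "myscript.py"]

The container then carries its own Python and packages, so nothing needs to be present on the target machine beyond Docker itself.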
If the problem is ensuring the required packages are available in a new environment or virtual environment, you could use pip to generate a requirements.txt and check it into version control, or use a tool that does that for you, like pipenv.
If you would prefer to generate a requirements.txt by hand, you should:
Install your dependencies using pip
Type pip freeze > requirements.txt to generate a requirements.txt file
Check requirements.txt into your source management software
When you need to set up a new environment, use pip install -r requirements.txt (full sequence sketched below)
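Putting those steps together as a shell sketch (the package names and commit message are just examples):

pip install requests numpy        # example dependencies
pip freeze > requirements.txt     # pin the exact versions in use
git add requirements.txt
git commit -m "Pin project dependencies"
# later, on another machine or in a fresh virtualenv:
pip install -r requirements.txt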
The solution that I've been using has been to include a custom library (folder with all of my desired packages) in the folder with my script, and I simply import them from there:
from Customlib import pkg1, pkg2,...
As long as the custom library and script stay together in the same folder, it will always have access to the right packages and the correct versions of those packages.
I'm not sure how robust this solution actually is or what possible bugs may arise from this if it is passed from machine to machine, but for now this seems to work.
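For reference, a sketch of how a script can make such a bundled folder importable regardless of the directory it is launched from (Customlib is the folder name from above; the package names are hypothetical):

import os
import sys

# Make the folder containing this script (and the bundled Customlib
# directory inside it) importable no matter where we are launched from.
script_dir = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, script_dir)

from Customlib import pkg1, pkg2  # hypothetical bundled packages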
I'm new to python, so please be gentle.
In learning python and writing my first few scripts, I quickly glossed over any tutorial sections on virtualenv, figuring it wouldn't provide me any benefit in my nascent stage.
I proceeded to hack away, installing packages as I went with pip3 install package
Now I've built something that is potentially useful to my organization, and I'd like to share it. In this case, I want to distribute it as a windows executable.
Before building this distribution, I figure it's now time to take the next leap from individual scripts to proper python projects. It seems like virtualenv should be part of that.
Given that I've installed a number of packages to my "base" python environment: in order to do development in a "clean" virtual environment, do I need to somehow "revert" my base python environment (i.e. uninstall all non-standard packages), or will virtualenv shield a project within a virtual environment from non-standard packages installed to my "base" environment?
If you are using the venv module, there is a --system-site-packages flag that grants the created virtual environment access to the system-wide site-packages directory:
--system-site-packages
Give the virtual environment access to the system
site-packages dir.
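For example (a quick sketch; my_env and my_env_shared are hypothetical names):

# default: the env is isolated from the system site-packages
python3 -m venv my_env
source my_env/bin/activate
pip freeze                  # prints nothing: the env starts empty

# with the flag: system-wide packages are visible inside the env
python3 -m venv --system-site-packages my_env_shared
source my_env_shared/bin/activate
pip freeze                  # now also lists the system-wide packages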
Install virtualenvwrapper first. After that, create a new virtualenv, activate it, and run pip freeze; you should see nothing listed, because nothing is installed yet. Deactivate the env to go back to your base environment and run pip freeze again: you will see all the packages you have installed.
A best practice is to create a requirements.txt file and version control it so everyone can use the same versions of the same packages. If you don't want to do this, simply activate your new virtual env and pip install everything you want.
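With virtualenvwrapper the workflow looks roughly like this (myproject is a hypothetical environment name):

mkvirtualenv myproject            # create and activate a new env
pip freeze                        # prints nothing: fresh environment
pip install -r requirements.txt   # restore pinned dependencies
deactivate                        # back to the base environment
workon myproject                  # re-activate it later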
You can specify the required libraries separately, check whether they are installed, and install them automatically if they are not (a sketch follows below).
Have a look at:
https://packaging.python.org/discussions/install-requires-vs-requirements/
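A minimal sketch of that idea, assuming pip is available at runtime (the REQUIRED list is hypothetical):

import importlib
import subprocess
import sys

REQUIRED = ["requests", "numpy"]  # hypothetical dependency list

for name in REQUIRED:
    try:
        importlib.import_module(name)
    except ImportError:
        # Install with the same interpreter that is running this script.
        subprocess.check_call([sys.executable, "-m", "pip", "install", name])

Note that installing at runtime is more fragile than declaring dependencies up front; the linked discussion covers the trade-offs.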
While working on a new python project and trying to learn my way through virtual environments, I've stumbled twice with the following problem:
I create my virtual environment called venv. Running pip freeze shows nothing.
I install my dependencies using pip install dependency. the venv library starts to populate, as confirmed by pip freeze.
After a couple of days, I go back to my project, and after activating the virtual environment via source venv/bin/activate, when running pip freeze I see the whole list of libraries installed in the system python distribution (I'm using Mac Os 10.9.5), instead of the small subset I wanted to keep inside my virtual environment.
I'm sure I must be doing something wrong in between, but I have no idea how could this happen. Any ideas?
Update:
After looking at this answer, I realized that when running pip freeze, the pip command being invoked is the one at /usr/local/bin/pip instead of the one inside my virtual environment. So the virtual environment is fine, but I wonder what changes in the PATH might be causing this, and how to prevent them from happening again (my PYTHONPATH variable is not set).
I realized that my problem arose when moving my virtual environment folder around the system. The fix was to modify the activate and pip scripts located inside the venv/bin folder to point to the new venv location, as suggested by this answer. Now pip freeze shows the right packages.
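A quick way to diagnose this kind of breakage (a sketch, assuming a venv named venv in the current directory):

source venv/bin/activate
which pip               # should resolve to .../venv/bin/pip
head -1 venv/bin/pip    # the shebang must point at the venv's python;
                        # a stale absolute path means the venv was moved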
I need to get Python code, which relies on Python 2.6, running on a machine with only Python 2.3 (I have no root access).
This is a typical scenario for virtualenv. The only problem is that I cannot convince it to copy all libraries to the new environment as well.
virtualenv --no-site-packages my_py26
does not do what I need. The library files are still only links to the /usr/lib/python2.6 directory.
Now I'm wondering whether virtualenv is the right solution for this scenario at all. From what I understand, it is only targeted at running on machines with exactly the same Python version.
Tools like cx_Freeze and the like do not work for me, as I start the Python file after some environment variable tweaking.
Is there maybe a hidden virtualenv option that copies all the Python library files into the new environment? Or some other tool that can help here?
No, I think you have completely misunderstood what virtualenv does. Virtualenv creates a new environment on the same machine that is isolated from the main environment. In such an environment you can install packages that do not get installed in the main environment, and with --no-site-packages you can also isolate yourself from the modules installed in the main environment.
If you need to run a program that requires Python 2.6 on a machine that does not have 2.6, you need to install Python 2.6 on that machine.
I can't help you with your virtualenv problem as I have never used it. But I will just point something out for future use.
You can install software from source into your home folder and run it without root access. For example, to install Python 2.6:
~/src/Python-2.6.2 $ ./configure --prefix=$HOME/local    # install under ~/local, no root needed
~/src/Python-2.6.2 $ make
...
~/src/Python-2.6.2 $ make install
...
export PATH=$HOME/local/bin:$PATH                        # prefer the locally built python
export LD_LIBRARY_PATH=$HOME/local/lib:$LD_LIBRARY_PATH  # let it find its shared libraries
~/src/Python-2.6.2 $ which python
/home/name/local/bin/python
This is what I have used at Uni to install software where I don't have root access.
You haven't clearly explained why cx_Freeze and the like wouldn't work for you. The normal approach to distributing Python applications to machines which have an older version of Python, or even no Python at all, is a tool like PyInstaller (in the same class of tools as cx_Freeze). PyInstaller makes copies of all your dependencies and allows you to create a single executable file which contains all your Python dependencies.
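For reference, the basic PyInstaller invocation is short (your_script.py is a placeholder name):

pip install pyinstaller
pyinstaller --onefile your_script.py   # the bundled executable lands in dist/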
You mention tweaking environment variables as a reason why you can't use such tools; if you expand on exactly why this is, you may be able to get a more helpful answer.