I am a beginner trying to learn a bit of Python; my first practical applications will be data analytics. My learning setup consists of Mac OS X, Miniconda2, PyCharm, and Git.
Is it better to set up a project folder 'bar' within a conda environment folder 'foo' (~/miniconda2/envs/foo/bar)?
Or is it better to leave the conda environment alone at ~/miniconda2/envs/foo and set up the project folder as ~/repos/bar?
I've seen virtualenv users put the env and the project in a single folder, but I haven't seen a similar popular or recommended workflow for conda.
Thank you in advance for any advice.
While I haven't used conda myself, I expect they aren't trying to change the concept of a virtual environment too much. That being said, I personally find it better to keep them separate, i.e. have a ~/.virtualenvs and a ~/repos folder.
As you mentioned, though, it's pretty common to store both the virtualenv and the project itself in the same folder. What I would stress here is that the virtualenv should then be in the project folder, not the other way around. For example:
~/repos/Foo/.fooenv
The reason for this is that virtualenvs should be disposable, whereas your projects are not. That means that you should be able to freely remove a virtualenv without fearing you've accidentally deleted your project folder along with it.
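For example, a minimal sketch of that layout with plain virtualenv (the env name .fooenv is illustrative):

$ cd ~/repos/Foo
$ virtualenv .fooenv          # the env lives inside the project, not the other way around
$ source .fooenv/bin/activate
$ # ... work ...
$ deactivate
$ rm -rf .fooenv              # disposable: the project itself is untouched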
I created a Django project whose interpreter is a 3.5.2 ENV. None of the packages I install through PyCharm are recognized; when I try to add them to INSTALLED_APPS, they aren't available.
But if the interpreter is plain python.exe, they are recognized.
So, how do I change the interpreter of a project that is set to the 3.5.2 ENV to another one? I don't know exactly what an ENV is, or why it doesn't let me use the packages I've installed.
Go to preferences then project. You can set the interpreter there.
I assume the ENV you are talking about is a virtual environment. You normally create your project inside a virtual environment in order to maintain project-specific dependencies. For example, if you install a dependency in your virtual environment, it can be accessed ONLY from that environment; it is not installed system-wide, so nothing outside the ENV can use it.
This makes sense because you don't really want to install project-specific things system-wide. What if you wanted to work on one project with Django 1.10 and another with 1.8? You would create two virtualenvs to encapsulate each!
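For instance, a minimal sketch (the env names are illustrative):

$ virtualenv env_django110
$ env_django110/bin/pip install "Django==1.10"
$ virtualenv env_django18
$ env_django18/bin/pip install "Django==1.8"

Each env now carries its own Django, and neither can interfere with the other.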
I know it doesn't answer your main question but it may help to understand what's going on.
https://virtualenv.pypa.io/en/stable/
Where is the common location/directory to store configuration in a Python virtualenv?
On Linux there is /etc, and for user-level configuration there is XDG_CONFIG_HOME (~/.config), but for a virtualenv ...?
I know that I can store my configuration in any location I want, but maybe there is a common location that would make my application easier for Python experts to understand.
So I think this is the most common approach...
1. postactivate with virtualenvwrapper
I've always done this in the postactivate file myself. In this approach, you can either define environment variables directly in that file (my preference) or in a separate file in your project dir which you source in the postactivate file. To be specific, this is actually a part of virtualenvwrapper, as opposed to virtualenv itself.
http://virtualenvwrapper.readthedocs.io/en/latest/scripts.html#postactivate
(If you want to be really clean, you can also unset your environment vars in the postdeactivate file.)
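For illustration, hedged sketches of the two hook files (the variable names are hypothetical):

# $VIRTUAL_ENV/bin/postactivate -- runs every time the env is activated
export DATABASE_URL="postgres://localhost/myapp_dev"
export DEBUG=1

# $VIRTUAL_ENV/bin/postdeactivate -- undo whatever postactivate set
unset DATABASE_URL
unset DEBUG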
Alternatively, you can do this directly in the activate file. I find that approach less desirable because there are other things going on in there too.
https://virtualenv.pypa.io/en/latest/userguide.html#activate-script
Two popular alternatives I've also used are:
2. .env with autoenv
Independent of virtualenv, another approach towards solving the same problem is Kenneth Reitz' autoenv, which automatically sources a .env whenever you cd into the project directory. I do not use this one much anymore.
https://github.com/kennethreitz/autoenv
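A sketch of what such a .env might contain -- autoenv simply sources it as a shell script (the names below are hypothetical):

# .env at the project root, sourced automatically on `cd` into the directory
source .fooenv/bin/activate
export APP_SETTINGS="development"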
3. .env with Python Decouple
If you only need the environment variables for Python code (and not, for example, in a shell script inside your project) then Python Decouple is a related approach which uses a simplified .env file in the root of your project. I find myself using it more and more these days personally.
https://github.com/henriquebastos/python-decouple/
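For illustration, a minimal sketch of the Decouple pattern (the setting names are hypothetical):

# settings.py -- values come from a .env file at the project root, e.g.:
#   SECRET_KEY=dev-secret
#   DEBUG=True
from decouple import config

SECRET_KEY = config('SECRET_KEY')                    # required; raises if missing
DEBUG = config('DEBUG', default=False, cast=bool)    # optional, with a type cast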
I'm somewhat surprised to see this is not discussed in detail in The Hitchhiker's Guide to Python - Virtual Environments. Perhaps we can generate a pull request about it from this question.
I am a beginner in Python.
I read that virtualenv is preferred during Python project development, but I couldn't understand this point at all. Why is virtualenv preferred?
Virtualenv keeps your Python packages in a virtual environment localized to your project, instead of forcing you to install your packages system-wide.
There are a number of benefits to this:

The first and principal one is that you can have multiple virtualenvs, so you can have multiple sets of packages for different projects, even if those sets of packages would normally conflict with one another. For instance, if one project you are working on runs on Django 1.4 and another runs on Django 1.6, virtualenvs can keep those projects fully separate so you can satisfy both requirements at once.

The second is that it makes it easy to release your project along with its own dependent modules; in particular, you can easily generate a requirements.txt file (a minimal sketch follows below).

The third is that it allows you to switch to another installed Python interpreter for that project. Very useful (think old 2.x scripts), but sadly not available in the now built-in venv.
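As a sketch of the second point, from inside an activated virtualenv:

$ pip freeze > requirements.txt      # pin exactly what this env uses
$ pip install -r requirements.txt    # recreate the same set elsewhere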
Note that virtualenv is about "virtual environments" but is not the same as "virtualization" or "virtual machines" (this is confusing to some). For instance, VMWare is totally different from virtualenv.
A Virtual Environment, put simply, is an isolated working copy of Python which allows you to work on a specific project without worry of affecting other projects.
For example, you can work on a project which requires Django 1.3 while also maintaining a project which requires Django 1.0.
Virtualenv helps you create a local environment (not system-wide) specific to the project you are working on.
Hence, as you start working on multiple projects, your projects will have different dependencies (e.g. different Django versions), so you will need a different virtual environment for each project. Virtualenv does this for you.
Since you are using virtualenv, try virtualenvwrapper: https://pypi.python.org/pypi/virtualenvwrapper
It provides utilities to create, switch between, and remove virtualenvs easily, e.g.:
mkvirtualenv <name>: create a new virtualenv
workon <name>: switch to the specified virtualenv
and some others
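A quick illustrative session (the env names are made up):

$ mkvirtualenv myproject      # create and activate ~/.virtualenvs/myproject
$ workon otherproject         # switch to another env
$ deactivate                  # leave the current env
$ rmvirtualenv myproject      # dispose of it entirely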
Suppose you are working on multiple projects, where one project requires a certain version of Python and another requires a different one. If you are not working in virtual environments, both projects will use the same version installed locally on your machine, which can break one of them.
With virtual environments, by contrast, you create isolated instances in which libraries and versions are stored separately. For each project you can create a new virtual environment and work in it independently.
There is no real point to them in 2022; they are a mechanism to accomplish what C#, Java, Node, and many other ecosystems have done for years without virtual environments.
Projects need to be able to specify their package and interpreter dependencies in isolation from other projects. Virtual environments are a fine but legacy solution to that issue (versus a config file that specifies the interpreter version plus a local __pypackages__ directory).
PEP 582 aims to address this gap in the Python ecosystem.
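For context, a sketch of the directory layout PEP 582 proposes (no activation step involved):

myproject/
    __pypackages__/
        3.10/
            lib/         # dependencies installed here, found automatically
    main.py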
As my limited brain has come to understand it after much reading, relative imports bad, absolute imports good. My question is, how can one effectively manage a "live" and "development" version of a package? That is, if I use absolute imports, my live code and development code are going to be looking at the same thing.
Example
/admin/project1/
    __init__.py
    scripts/
        __init__.py
        main1.py
        main2.py
    modules/
        __init__.py
        helper1.py
with "/admin" on my PYTHONPATH, the contents of project1 all use absolute imports. For example:
main1.py
import project1.modules.helper1
But I want to copy the contents of project1 to another location and use that copy for development and testing. Because everything is absolute, and because "/admin" is on PYTHONPATH, my copied version will still reference the live code. I could add the new location to PYTHONPATH and rename every file by hand (i.e. append "dev" to everything), do my work, then, when I'm ready to go live, remove "dev" from everything by hand again. This will work, but it is a huge hassle and prone to error.
Surely there must be some better way of handling "live" and "development" versions of a Python project.
You want to use virtualenv (or something like it).
$ virtualenv mydev
$ source mydev/bin/activate
This creates a local Python installation in the mydev directory and modifies several key environment variables to use mydev instead of the default Python directories.
Now, Python looks in mydev first for any imports, and anything you install (using pip, setup.py, etc.) will go into mydev. When you are finished using the mydev virtual environment, run
$ deactivate
to restore your shell's environment to its previous state. mydev remains, so you can always reactivate it later.
chepner's virtualenv suggestion is a good one. Another option, assuming your project is not installed on the machine as a Python egg, is to just add your development path to the front of PYTHONPATH. Python will find your development project1 before the regular one, and everyone is happy. Eggs can spoil the fun because they tend to get resolved before the PYTHONPATH paths.
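A minimal sketch of that trick (the paths are hypothetical):

$ export PYTHONPATH=~/dev_copy:$PYTHONPATH                # dev copy of project1 lives in ~/dev_copy
$ python -c "import project1; print(project1.__file__)"   # verify which copy wins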
I'm developing a Python utility module to help with file downloads, archives, etc. I have a project set up in a virtual environment, along with my unit tests. When I want to use this module on the same computer (essentially as "Production"), I move the files to the mymodule directory at ~/dev/modules/mymodule.
I keep all 3rd-party modules under ~/dev/modules/contrib. This contrib path is on my PYTHONPATH, but mymodule is NOT because I've noticed that if mymodule is on my PYTHONPATH, my unit tests cannot distinguish between the "Development" version and the "Production" version. But now if I want to use this common utility module, I have to manually add it to the PYTHONPATH.
This works, but I'm sure there's a better, more automated way.
What is the best way to have a Development and Production module on the same computer? For instance, is there a way to set PYTHONPATH dynamically?
You can add or modify Python's import paths via sys.path; just make sure that the first path is the current directory ".", because some third-party modules rely on importing from the directory of the current module.
More information on python paths:
http://djangotricks.blogspot.com/2008/09/note-on-python-paths.html
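For illustration, a minimal sketch of that approach (the development path is hypothetical):

# Prepend paths so the development tree wins over the installed copy
import sys
sys.path.insert(0, ".")                       # current directory first, as noted above
sys.path.insert(1, "/home/me/dev/modules")    # then the development tree
import mymodule                               # resolves from the development tree if present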
I'm guessing by virtual environment you mean the virtualenv package?
http://pypi.python.org/pypi/virtualenv
What I'd try (and apologies if I've not understood the question right) is:
Keep the source somewhere that isn't referenced by PYTHONPATH (e.g. ~/projects/myproject)
Write a simple setuptools or distutils script for installing it (see Python distutils - does anyone know how to use it?); a minimal sketch follows this list.
Use the virtualenv package to create a dev virtual environment with the --no-site-packages option - this way your "dev" version won't see any packages installed in the default python installation.
(Also make sure your PYTHONPATH doesn't have any of your source directories)
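For the install script, a minimal sketch along these lines (the name, version, and package layout are placeholders):

# setup.py -- minimal setuptools sketch
from setuptools import setup, find_packages

setup(
    name="myproject",
    version="0.1.0",
    packages=find_packages(),
)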
Then, for testing:
Activate dev virtual environment
Run the install script (usually something like python setup.py build install). Your package ends up in /path/to/dev_virtualenv/lib/python2.x/site-packages/
Test, break, fix, repeat
And, for production:
Make sure dev virtualenv isn't activated
Run install script
All good to go, the "dev" version is hidden away in a virtual environment that production can't see...
...And there's no (direct) messing around with PYTHONPATH
That said, I write this with the confidence of someone who's not actually tried using virtualenv in anger, and the hope that I've vaguely understood your question... ;)
You could set the PYTHONPATH as a global environment variable pointing to your Production code, and then in any shell in which you want to use the Development code, change the PYTHONPATH to point to that code.
(Is that too simplistic? Have I missed something?)
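For example, a minimal sketch (the paths are hypothetical):

# Globally (e.g. in ~/.bashrc): point at the Production code
export PYTHONPATH=~/dev/modules

# In a single shell, when you want the Development code instead:
export PYTHONPATH=~/dev/modules-dev     # applies only to this shell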