Should we gitignore the .python-version file?

I have a .python-version file, and when I create a Python repo on GitHub and specify that it should have a .gitignore, it adds the .python-version file to it. It seems to me that that file should NOT be ignored, since other people running the code on different machines would want to know what version of Python they need.
So why is it .gitignored?

Even though it is quite specific, you can still version that file (meaning: not include it in the default .gitignore), because:
it will be used only by pyenv
it is a good addition to the README, in order to illustrate what version of Python is recommended for the specific project,
it can be overridden easily (if you are using pyenv), or simply ignored (if you don't have pyenv).
As the article "How to manage multiple Python versions and virtual environments" states:
When setting up a new project that is to use Python 3.6.4, pyenv local 3.6.4 would be run in its root directory.
This would both set the version, and create a .python-version file, so that other contributors’ machines would pick it up.
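A minimal sketch of that workflow (the version number is only an example):

pyenv local 3.6.4      # writes the version to .python-version in the current directory
cat .python-version    # prints: 3.6.4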
But:
pyenv looks in four places to decide which version of Python to use, in priority order:
The PYENV_VERSION environment variable (if specified).
You can use the pyenv shell command to set this environment variable in your current shell session.
The application-specific .python-version file in the current directory (if present).
You can modify the current directory's .python-version file with the pyenv local command.
The first .python-version file found (if any) by searching each parent directory, until reaching the root of your filesystem.
The global version file. You can modify this file using the pyenv global command.
If the global version file is not present, pyenv assumes you want to use the "system" Python. (In other words, whatever version would run if pyenv weren't in your PATH.)
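That priority order is easy to check; a hedged sketch (version numbers are examples only):

pyenv shell 3.8.2      # sets PYENV_VERSION, which outranks .python-version
python --version       # resolves to 3.8.2 despite what .python-version says
pyenv shell --unset    # drop the override; .python-version applies again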

The reason why .python-version should be gitignored is that its version is too specific. Patch versions of Python (e.g. 2.7.1 vs 2.7.2) are generally compatible with each other, so you don't want to lock down to a specific patch version. Furthermore, many Python apps or libraries should work with a range of Python versions, not just a specific one. Using .python-version indicates that you want other developers to use an exact, specific Python version, which is usually not a good idea.
If you want to indicate the minimum Python version needed, or otherwise a version range, then I believe documenting that in a README is a more appropriate solution.
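If the project is packaged with setuptools, a complementary option is to declare that range in machine-readable form as well, so pip can enforce it. A minimal sketch (the project name and version bounds are examples only):

# setup.py
from setuptools import setup

setup(
    name="myproject",              # hypothetical project name
    version="0.1.0",
    python_requires=">=3.6, <4",   # the supported Python range, enforced by pip
)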

It can also be a bit problematic when using Python virtual environments, as people may want to use virtual environment names different from 3.7.2/envs/myvenv.

Old post but still relevant.
My answer would be "it depends".
The name of a virtual env can also be used in .python-version, if it is managed with the help of the virtualenv plugin of pyenv. This makes the file pretty useless if the project is managed in a public Git repo, and you can exclude it (though keeping it is harmless, as noted in other answers).
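For example, with the pyenv-virtualenv plugin (names and versions here are examples only):

pyenv virtualenv 3.7.2 myvenv   # create an env named myvenv under Python 3.7.2
pyenv local myvenv              # writes "myvenv" to .python-version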
But (and I am in this situation) if you manage the project in a private repo and share virtual envs, it can make sense not to exclude it from Git. This allows you to work with a different environment (including the Python version) on an experimental branch of the project. Of course, it would have been far cleaner to fork or clone the original project and experiment with the new env in the copy, but sometimes it's easier to just create a new branch.
At the end of the day, IMHO there is no universal answer to the question, and it depends on your workflow.

Well, I think the answer to your question is yes. I just opened GitHub's official gitignore repo and checked the Python .gitignore.
It shows the .python-version file listed there.
And if it's not being ignored for you, simply check that it is referenced correctly in your .gitignore.

Related

How do I replace my Homebrew-based Python configuration with Conda

I currently have a rather complex Python configuration that has evolved over the years, and I'd like to clean it up and "modernize" it.
The existing configuration has the default macOS Python, and Homebrew's Python 3 and Python 2, all existing side by side, along with their associated pips. I also have some Python command-line tools that these Pythons or their associated installed packages have created, and which I use more or less frequently.
What I'd like to do is:
Leave macOS Python untouched
Eliminate all Homebrew Pythons
Remove non-macOS Python 2 entirely
Switch to Conda Python as my Python 3
Have access to mkvirtualenv (as an alternative way of creating environments) with virtualenvwrapper
Have access to Jupyter
I'm not sure how to do this without creating problems, and want to confirm that the obvious thing is the safe thing:
use Homebrew to uninstall its Pythons,
install Conda, and then
use (Conda's) pip to install mkvirtualenv, virtualenvwrapper, and Jupyter (and any other tools I subsequently need)
Is that the correct procedure? If so, are there particular forms of the commands I should use, or options I should choose for them?
The biggest and/or first issue is how to not break existing functionality that relies on Python. There are two broad camps here:
1) tools and other scripts that hard-code the Python executable's location, and
2) tools and other scripts that rely on the/a system PATH variable.
#1 is the easier one. If you aren't going to remove any Python versions, then these require no work at all: they will keep working. If you do want to uninstall some Python versions, then you have to switch any tools relying on those versions over to another version that also works for each tool. The path in question is commonly in a shebang ('#! xxx') line at the top of each tool's main script, but there are other ways the path to the Python binary can be formed. In short, why uninstall anything? Disk space is cheap. Maybe instead just make sure that these unwanted versions are not referenced by any PATH variables.
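For instance, a tool in camp #1 might begin with a hard-coded shebang like this (the interpreter path is purely illustrative):

#!/usr/local/opt/python@3.9/bin/python3
# This script keeps working only as long as that exact interpreter path exists.
print("still alive")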
#2 is the hard one. It isn't necessarily the case that all of the tools in this category are using the version of Python you get when you just type "python" at a command prompt for your primary account. There can be other modes of operation that initialize the execution environment (the PATH variable) in different ways, and so may be running different Python versions despite depending on the value of PATH.
Part of #2 is worrying about not just "python" references, but "python2", "python3", and possibly other variants as well.
Only once you've got a plan for dealing with the above so you don't break things can you worry about possibly getting rid of Python versions and installing new ones. Hopefully, Brew does a good job of uninstalling the versions it's installed, so if you can remove dependencies on one or more of them, they can potentially be easily removed. If you've got self-installed Python versions, those should be easy to uninstall as well by just removing references to them in PATH variables (or not...shouldn't be a big problem if you miss some) and then deleting the install directory.
Then there's adding the new version(s) of Python. This can only affect #2 above. You have to think about that one and know what effect you're going to have if the new install(s) manipulate any PATH variables. If an install only manipulates your own user's PATH, or leaves that to you, this is a much easier task to understand, but any change to the environment is a chance to break existing functionality.
Finally, there's the mechanisms for choosing different Python versions for new development, including the use of virtual envs. This is probably the easiest part, as you can do research, try things, and test that you can do whatever you want to do. This part of the problem is the best bounded.
I don't know anything about Jupyter, other than knowing vaguely what it is, so I don't know how that complicates all this.
UPDATE: A final note. As you may already know, Python does a good job of isolating itself in terms of each version keeping its unique identity. If you use the right 'pip' and 'easy_install' that are sitting right next to the 'python' binary you're going to run with, you should be cleanly affecting just that one environment. I can't know that it's this easy for all Python versions, but I've never seen this convention broken by a version of Python that I've used. The complications here, of course, involve which versions of these tools you're getting in various situations when they are found via a PATH variable.
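One hedged way to stay on the safe side of that convention is to route pip through the specific interpreter you intend to affect, rather than whichever pip happens to be first on PATH (the path below is illustrative):

/path/to/some/python -m pip install somepackage   # touches only that interpreter's packages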
First, install anaconda or miniconda. The installation is non-destructive and does not conflict with your other Python installations. Check that it works before you consider removing homebrew installed Pythons.
The conda command is used both as a package manager and as an environment manager. You cannot avoid creating conda environments: the default installation is already part of an environment named base. I'm not sure why you would want to, either.
You can use pip to install any package you choose into a conda environment, but since you can use conda install for any package available on any conda channel (e.g. 'defaults', 'conda-forge'), using pip often is redundant.
You could use non-conda virtual environments, but again: why? conda create -n foo python=x.x jupyter #etc and then conda activate foo is all you need to get one up and running.
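Spelled out as a minimal sketch (the env name and version are examples only):

conda create -n foo python=3.9 jupyter   # create an env with Python and Jupyter
conda activate foo                       # switch into it
conda install numpy                      # add more packages from a conda channel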

Location for configuration in a virtualenv

Where is the common location/directory to store configuration in a Python virtualenv?
For Linux there is /etc; for user stuff there is XDG_CONFIG_HOME (~/.config); but for a virtualenv ...?
I know that I can store my configuration in any location I want, but maybe there is a common location, which would make my application easier to understand for Python experts.
So I think this is the most common approach...
1. postactivate with virtualenvwrapper
I've always done this in the postactivate file myself. In this approach, you can either define environment variables directly in that file (my preference) or in a separate file in your project dir which you source in the postactivate file. To be specific, this is actually a part of virtualenvwrapper, as opposed to virtualenv itself.
http://virtualenvwrapper.readthedocs.io/en/latest/scripts.html#postactivate
(If you want to be really clean, you can also unset your environment vars in the postdeactivate file.)
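A minimal sketch of both hooks (the variable name and value are examples only):

# $VIRTUAL_ENV/bin/postactivate
export DATABASE_URL="postgres://localhost/mydb"

# $VIRTUAL_ENV/bin/postdeactivate
unset DATABASE_URL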
Alternatively, you can do this directly in the activate file. I find that approach less desirable because there are other things going on in there too.
https://virtualenv.pypa.io/en/latest/userguide.html#activate-script
Two popular alternatives I've also used are:
2. .env with autoenv
Independent of virtualenv, another approach to solving the same problem is Kenneth Reitz's autoenv, which automatically sources a .env file whenever you cd into the project directory. I do not use this one much anymore.
https://github.com/kennethreitz/autoenv
3. .env with Python Decouple
If you only need the environment variables for Python code (and not, for example, in a shell script inside your project) then Python Decouple is a related approach which uses a simplified .env file in the root of your project. I find myself using it more and more these days personally.
https://github.com/henriquebastos/python-decouple/
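A minimal sketch of that setup (names and values are examples only). First, a .env file in the project root:

SECRET_KEY=not-a-real-secret
DEBUG=True

and then, in Python:

from decouple import config

SECRET_KEY = config("SECRET_KEY")                  # required; raises if missing
DEBUG = config("DEBUG", default=False, cast=bool)  # optional, with a type cast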
I'm somewhat surprised to see this is not discussed in detail in The Hitchhiker's Guide to Python - Virtual Environments. Perhaps we can generate a pull request about it from this question.

Why is virtualenv necessary?

I am a beginner in Python.
I read that virtualenv is preferred during Python project development.
I couldn't understand this point at all. Why is virtualenv preferred?
Virtualenv keeps your Python packages in a virtual environment localized to your project, instead of forcing you to install your packages system-wide.
There are a number of benefits to this:
The first and principal one is that you can have multiple virtualenvs, so you can have multiple sets of packages for different projects, even if those sets of packages would normally conflict with one another. For instance, if one project you are working on runs on Django 1.4 and another runs on Django 1.6, virtualenvs can keep those projects fully separate so you can satisfy both requirements at once.
The second is that it makes it easy to release your project with its own dependent modules, and thus makes it easy to create your requirements.txt file.
The third is that it allows you to switch to another installed Python interpreter for that project (see the sketch below). Very useful (think old 2.x scripts), but sadly not available in the now built-in venv.
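A hedged sketch of those last two points (paths and versions are examples only):

pip install "Django==1.4"        # inside an activated env, affects only that env
pip freeze > requirements.txt    # capture the env's exact package set

virtualenv -p /usr/bin/python2.7 env   # create an env against a specific interpreter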
Note that virtualenv is about "virtual environments" but is not the same as "virtualization" or "virtual machines" (this is confusing to some). For instance, VMWare is totally different from virtualenv.
A Virtual Environment, put simply, is an isolated working copy of Python which allows you to work on a specific project without worry of affecting other projects.
For example, you can work on a project which requires Django 1.3 while also maintaining a project which requires Django 1.0.
VirtualEnv helps you create a local environment (not system-wide) specific to the project you are working on.
Hence, as you start working on multiple projects, your projects will have different dependencies (e.g. different Django versions), so you will need a different virtual environment for each project. VirtualEnv does this for you.
Since you are using VirtualEnv, try VirtualEnvWrapper: https://pypi.python.org/pypi/virtualenvwrapper
It provides some utilities to create, switch between, and remove virtualenvs easily, e.g.:
mkvirtualenv <name>: To create a new Virtualenv
workon <name> : To use a specified virtualenv
and some others
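A typical session (the env name is an example):

mkvirtualenv myproject   # create and activate a new env
workon myproject         # switch back to it in a later shell
deactivate               # leave the env
rmvirtualenv myproject   # remove it entirely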
Suppose you are working on multiple projects: one project requires a certain version of Python and the other requires some other version. If you are not working in virtual environments, both projects will use the same version installed locally on your machine, which can lead to errors for one of them.
With virtual environments, by contrast, you create an isolated environment per project where its libraries and versions are stored separately. Each time, you can create a new virtual environment and work in it as a fresh one.
There is no real point to them in 2022; they are a mechanism to accomplish what C#, Java, Node, and many other ecosystems have done for years without virtual environments.
Projects need to be able to specify their package and interpreter dependencies in isolation from other projects. Virtual environments are a fine but legacy solution to that issue (versus a config file that specifies the interpreter version plus a local __pypackages__ directory).
PEP 582 aims to address this lack of functionality in the Python ecosystem.

Why use Pythons 'virtualenv' on Linux when one has 'chroot' (and union/overlay filesystems)?

First of all, let me state that I am a proponent of generic software (in general ;-). I am no expert on Python, but it seems that the virtualenv utility solves pretty much the same problem chroot can help to solve: bootstrapping a directory tree that can be passed as root, thus effectively protecting the real directory tree, if needed.
Since I am no expert in Python, as already mentioned, I wonder: what problem can virtualenv solve that chroot cannot? I mean, can't I just set up a nice fake root tree (possibly using union mounting), chroot into it, pip install the packages I want in my new environment, and then play around within the bounds of my new environment, running Python scripts and whatnot?
Am I missing something here?
Update:
Can't one install packages/modules locally in whatever application directory, I mean, without root privileges and subsequently without overwriting or adding files to /usr/lib or /usr/local/lib? It appears that this is what virtualenv does, however I think it has to symlink or otherwise provide a python interpreter for each environment one creates, does it not?
bootstrapping a directory tree that can be passed as root
That's not what virtualenv does, except (to some degree) for Python packages. It provides a place where these can be installed without replacing the rest of the filesystem. It also works without root privileges and it's portable as it needs no kernel support, unlike chroot, which (I presume) won't work on Windows.
Can't one install packages/modules locally in whatever application directory
Yes, but virtualenv does one more thing, which is that it disables (by default at least) the system's Python package directories. That means you can test whether your package correctly installs all of its dependencies (you might have forgotten to list one because it's already installed on your system) and it allows installing different versions in case you need either newer or older versions. The ability to install older versions should not be overlooked because sometimes new versions of packages introduce bugs.
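A hedged sketch of using that isolation to check a package's declared dependencies (the package name is hypothetical):

virtualenv env               # system site-packages are hidden by default
source env/bin/activate
pip install .                # pulls in only the dependencies the package declares
python -c "import mypkg"     # a forgotten dependency surfaces as ImportError here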

Setting Python path while developing library module

I am developing a library and an application that uses the library in Python 2.6. I've placed a "mylib.pth" file in "site-packages" so that I can import mylib from within my application.
I am using a DVCS so when I want to fix a bug or add a feature to the library I make a branch of the repository and work within that branch. To test my application with the changes I am making to the library I edit the path in "mylib.pth" to point to the new development branch.
This gets a little tedious if I have a few parallel branches of my library going at once. I have to keep editing the "mylib.pth" file before testing to ensure I am testing against the correct version of my library. Is there a way to use the current path (i.e. the development branch of the library that I am currently in) to set the library path when I invoke my application, instead of using the "mylib.pth" file in the global "site-packages" directory?
Is virtualenv what you're looking for? From the description:
Imagine you have an application that needs version 1 of LibFoo, but another application requires version 2. How can you use both these applications? If you install everything into /usr/lib/python2.4/site-packages (or whatever your platform's standard location is), it's easy to end up in a situation where you unintentionally upgrade an application that shouldn't be upgraded.
Suggested reading: Tools of the Modern Python Hacker: Virtualenv, Fabric and Pip. It addresses a number of problems with development and deployment of Python apps.
If you use setuptools, then you can run setup.py develop in your working tree, and it will do the .pth file manipulation for you.
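For example, from the root of whichever branch you are currently working in:

python setup.py develop   # points site-packages at this checkout via a .pth entry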
Sure, you can alter sys.path to add the current directory (or a subdirectory of it) to the search path. site.addsitedir is a good way to do it. Since you'd be doing this from Python, you can have any sort of logic you like for deciding which directory to add; you could base it on os.path.normpath-ing the current directory if it looks like a branch, on looking for the newest branch on disc, or on something else.
You could put this code in the sitecustomize.py module or other startup-triggered location.
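A minimal sketch of that idea (the directory layout is hypothetical):

# sitecustomize.py
import os
import site

# Prefer a 'mylib' checkout sitting in the current working directory,
# so whichever branch you invoke the application from wins.
branch_lib = os.path.join(os.getcwd(), "mylib")
if os.path.isdir(branch_lib):
    site.addsitedir(branch_lib)  # also processes any .pth files found there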
You might also consider using zc.buildout. It allows you to create entry points with customized python paths.
I set my PYTHONPATH to point to the latest-and-greatest version. No editing.
export PYTHONPATH=.:/the/new/version
