/usr/bin/python vs /usr/local/bin/python

On Linux, specifically Debian Jessie, should I use /usr/bin/python or should I install another copy in /usr/local/bin?
I understand that the former is the system version and that it can change when the operating system is updated. This would mean that I can update the version in the latter independently of the OS. As I am already using python 3, I don't see what significant practical difference that would make.
Are there other reasons to use a local version?
(I know there are ~42 SO questions about how to change between versions, but I can't find any about why.)

I don't think I'd recommend either of these approaches; I would simply stick to a virtualenv to further isolate Python instances.
The biggest reason that you'd use a specific Python environment - be it the system, a local one, or a virtualenv - would be control. The more control you have over the environment and what gets installed in it, the less surface area you have to find or encounter bugs due to libraries which you didn't realize you introduced. If it's a virtualenv, that also makes cleanup easier; just delete the virtualenv when you no longer need it as opposed to trying to uninstall libraries installed at the system level.
Not just that, but more and more distros are converting their scripts to use Python 3. The less stomping around you do in that environment, the better.
Finally - just as a general shell-scripting tip - I'd also encourage the use of /usr/bin/env python in shebang lines to ensure that you're using whichever version of Python comes first on your PATH.
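For instance, a quick way to see which interpreter /usr/bin/env would pick up on your current PATH (a small sketch, assuming a POSIX shell):
/usr/bin/env python -c 'import sys; print(sys.executable)'
If that prints /usr/bin/python rather than the interpreter you expected, adjust your PATH instead of hard-coding interpreter paths in your scripts.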

1) You should not modify the system's binaries yourself directly
2) If your $PATH variable doesn't contain /usr/local/bin, the naming of that secondary directory isn't really important. You can install / upgrade independently wherever you have installed your extra binaries.
3) For Python specifically, you could also just use conda / virtualenv invoked by your system's python to manage your versions & projects.
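As a minimal sketch of the virtualenv route in 3), using the standard library venv module (virtualenv proper works similarly); the environment path and package name here are just placeholders:
/usr/bin/python3 -m venv ~/envs/myproject
source ~/envs/myproject/bin/activate
pip install requests   # installed into the environment, not the system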

Related

How to install multiple versions of Python in Windows?

Up until recently I have only worked with one version of Python and used virtual environments every now and then. Now I am working with some libraries that require an older version of Python, so I am very confused. Could anyone please clear up some of my confusion?
How do I install multiple Python versions?
I initially had Python version 3.8.x but upgraded to 3.10.x last month. There is currently only that one version on my PC now.
I wanted to install one of the Python 3.8.x versions and went to https://www.python.org/downloads/. It lists a lot of versions and subversions like 3.6, 3.7, 3.8, etc., with 3.8.1, 3.8.2 up to 3.8.13. Which one should I pick?
I actually went ahead with 3.8.12 and downloaded the Tarball on the page: https://www.python.org/downloads/release/python-3812/
I extracted the tarball (23.6MB) and it created a folder with a setup.py file.
Is Python 3.8.12 now installed? Clicking on the setup.py file simply flashes the terminal for a second.
I have a few more questions. Hopefully, they won't get me downvoted. I am just confused and couldn't find proper answers for them.
Why does Python have such heavy dependency on the exact versions of libraries and packages etc?
For example, this question: How can I run Mozilla TTS/Coqui TTS training with CUDA on a Windows system? This seems very beginner-unfriendly. A slightly mismatched package version can prevent a program from running at all.
Do virtual environments copy all the files from the main Python installation to create a virtual environment and then install specific packages inside it? Isn't that a lot of wasted resources in duplication, because almost all projects require their own virtual environment?
Your questions depend a bit on "all the other software". For example, as #leiyang indicated, the answer will be different if you use conda vs just pip on vanilla CPython (the standard Windows Python).
I'm also going to assume you're actually on Windows, because on Linux I would recommend looking at pyenv. There is a pyenv-win, which may be worth looking into, but I don't use it myself because it doesn't play as nice if you also want (mini)conda environments.
1. (a) How do I install multiple Python versions?
Simply download the various installers and install them in sensible locations. E.g. "C:\Program Files\Python39" for Python 3.9, or some other location where you're allowed to install software.
Don't have Python add itself to the PATH though, since that'll only find the last version to do so and can really confuse things.
Also, you probably want to use virtual environments consistently, as this ties a specific project very clearly to a specific Python version, avoiding future confusion or problems.
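A sketch of that workflow, assuming the py launcher is installed and using placeholder paths and package names:
py -3.8 -m venv C:\projects\myapp\.venv
C:\projects\myapp\.venv\Scripts\activate
pip install some-package   # installs into the 3.8 environment only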
1. (b) "3.8.1, 3.8.2 till 3.8.13" which should I pick?
Always pick the latest 3.x.y, so if there's a 3.8.13 for Windows, but no 3.8.14, pick that. Check if the version is actually available for your operating system, sometimes there are later versions for one OS, but not for another.
The reason is that between versions like 3.6 and 3.7, there may be major changes that affect how Python works. Generally there will be backwards compatibility, but some changes may break how some of your packages work. When going up a patch version (3.8.12 to 3.8.13, say), however, there won't be any such breaking changes, just fixes and additions that don't get in the way of what was already there. A change from 2.x to 3.x only happens if the language itself goes through a major change, and that rarely happens (and perhaps never will again, depending on who you ask).
An exception to the "no breaking changes in a patch version" rule is of course if you run some script that very specifically relies on something that was broken in 3.8.6 but fixed in 3.8.7+ (as an example). However, relying on broken behaviour and never fixing it is very bad practice, so only go along with that if you have no other recourse. Otherwise, just take the latest patch release of whatever version you're after.
Also: make sure you pick the correct architecture. If there's no specific requirement, just pick 64-bit, but if your script needs to interact with other installed software at the binary level, it may require you to install 32-bit Python (and 32-bit packages as well). If you have no such requirement, 64-bit allows more memory access and has some other benefits on modern computers.
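If you're not sure which architecture an already-installed Python is, you can ask it directly (a sketch; works the same via py -3.8 or a full path to the interpreter):
python -c "import struct; print(struct.calcsize('P') * 8)"   # prints 64 or 32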
2. Why does Python have such heavy dependency on the exact versions of libraries and packages etc?
It's not just Python; this is true for many languages. It's just more visible to the end user with Python, because you run it as an interpreted language: it's only compiled at the very last moment, on the computer it's running on.
This has the advantage that the code can run on a variety of computers and operating systems, but the downside that you need the right environment wherever you're running it. People who code in languages like C++ have to deal with this problem while coding, but they target a much smaller number of environments (although there are still runtimes to contend with, DirectX versions, etc.). Other languages just roll everything up into the program that's being distributed, while a Python script by itself can be tiny. It's a design choice.
There are a lot of tools to help you automate the process, though, and well-written packages make it quite painless. If you feel Python is very shaky when it comes to this, that's probably down to the packages or scripts you're using, not the language itself. The only fault of the language is that it makes it very easy for developers to create such a mess for you and make your life hard with very specific requirements.
Look for alternatives, but if you can't avoid using a specific script or package, once you figure out how to install or use it, document it or better yet, automate it so you don't have to think about it again.
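One common way to do that documenting and automating is pip's standard requirements mechanism, sketched here (the file name is just the usual convention):
python -m pip freeze > requirements.txt     # record the exact package versions once
python -m pip install -r requirements.txt   # recreate them later, e.g. in a fresh virtual environment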
3. Do virtual environments copy all the files from the main Python installation to create a virtual environment and then install specific packages inside it? Isn't that a lot of wasted resources in duplication, because almost all projects require their own virtual environment?
Not all of them, but quite a few of them. However, you still need the original installation to be present on the system. Also, you can't pick up a virtual environment and put it somewhere else, not even on the same PC without some careful changes (often better to just recreate it).
You're right that this is a bit wasteful - but this is a difficult choice.
Either Python would be even more complicated, having to manage many different versions of packages in a single environment (Java developers will be able to tell you war stories about this and their dependency management, or wax lyrical about it once they get it working).
Or you get what we have: a bit wasteful, but in the end disk space is a lot cheaper than your time. And unlike your time, disk space is almost infinitely expandable.
You can share virtual environments between very similar projects, but especially if you get your code from someone else, it's best not to worry about it and just give up a few dozen MB for the project. On the upside: you can just delete a virtual environment's directory and that pretty much gets rid of the whole thing. Some applications like PyCharm may remember that it was once there, but other than that, the virtual environment is gone.
Just install them. You can have any number of Python installations side by side. Unless you need two different patch releases of the same minor version, for example 3.10.1 and 3.10.2, there is no need to do anything special. (And if you do need that, then you don't need any advice.) Just set up separate shortcuts for each one.
Remember you have to install any 3rd-party libraries you need in each version. To do this, navigate to the Scripts folder in the version you want to do the install in, and run pip from that folder.
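For example, to install a package into a Python 3.9 installation specifically (the path and package name are placeholders):
cd "C:\Program Files\Python39\Scripts"
pip install requests
or, equivalently, without changing directories, invoke that interpreter with -m pip:
"C:\Program Files\Python39\python.exe" -m pip install requests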
Python's 3rd-party libraries are open-source and come from projects that have release schedules that don't necessarily coincide with Python's. So they will not always have a version available that coincides with the latest version of Python.
Often you can get around this by downloading unofficial binaries from Christoph Gohlke's site. Google Python Gohlke.
Install Python using the windows executable installers from python.org. If the version is 3.x.y, use the highest y that has a windows executable installer. Unless your machine is very old, use the 64-bit versions. Do not have them add python to your PATH environment variable, but in only one of the installs have it install the python launcher py. That will help you in using multiple versions. See e.g. here.
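With the launcher installed, you can select a version per invocation instead of relying on PATH (a sketch; the version numbers are examples):
py -0                    # list the Python versions the launcher knows about
py -3.8 script.py        # run a script with 3.8
py -3.10 -m venv .venv   # create a virtual environment with 3.10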
Python itself does not. But some modules/libraries do, especially those that are not purely written in Python but contain extensions written in C(++). The reason for this is that compiling programs on MS Windows can be a real PITA. Unlike UNIX-like operating systems such as Linux, MS Windows doesn't come with development tools as standard, nor does it have decent package management. Since the official Python installers are built with Microsoft's tools, you need to use those for C(++) extensions as well. Before 2015, you even had to use exactly the same version of the compiler that Python was built with. That is still a good idea, but no longer strictly necessary. So it is a significant amount of work for developers to release binary packages for each supported Python version. It is much easier for them to say "requires Python 3.x".

How do I replace my Homebrew based Python configuration with Conda

I currently have a rather complex Python configuration that has evolved over the years, and I'd like to clean it up and "modernize" it.
The existing configuration has the default macOS Python, and Homebrew's Python 3 and Python 2, all existing side by side along with their associated pips. I also have some Python command-line tools that these Pythons or their associated installed packages have created, and which I use more or less frequently.
What I'd like to do is:
Leave macOS Python untouched
Eliminate all Homebrew Pythons
Remove non-macOS Python 2 entirely
Switch to Conda Python as my Python 3
Have access to mkvirtualenv (as an alternative to creating environments) with virtualenvwrapper
Have access to Jupyter
I'm not sure how to do this without creating problems, and want to confirm that the obvious thing is the safe thing:
use Homebrew to uninstall its Pythons,
install Conda, and then
use (Conda's) pip to install mkvirtualenv, virtualenvwrapper, and Jupyter (and any other tools I subsequently need)
Is that the correct procedure? If so, are there particular forms of the commands I should use, or options I should choose for them?
The biggest and/or first issue is how to not break existing functionality that relies on Python. There are two broad camps here:
1) tools and other scripts that hard-code the Python executable's location, and
2) tools and other scripts that rely on the/a system PATH variable.
#1 is the easier one. If you aren't going to remove any Python versions, then these are no work at all; they will keep working. If you do want to uninstall some Python versions, then you have to switch any tools relying on those versions to another version that also works for that tool. The path in question is commonly in a shebang ('#! xxx') line at the top of each main Python script, but there are other ways that the path to the Python binary can be formed. In short, why uninstall anything? Disk space is cheap. Maybe instead just make sure that these unwanted versions are not referenced by any PATH variables.
#2 is the hard one. It isn't necessarily the case that all of the tools in this category are using the version of Python you get when you just type "python" at a command prompt for your primary account. There can be other modes of operation that initialize the execution environment (the PATH variable) in different ways, and so they may be running different Python versions, depending on how PATH is set in each context.
Part of #2 is worrying about not just "python" references, but "python2", "python3", and possibly other variants as well.
Only once you've got a plan for dealing with the above so you don't break things can you worry about possibly getting rid of Python versions and installing new ones. Hopefully, Brew does a good job of uninstalling the versions it's installed, so if you can remove dependencies on one or more of them, they can potentially be easily removed. If you've got self-installed Python versions, those should be easy to uninstall as well by just removing references to them in PATH variables (or not...shouldn't be a big problem if you miss some) and then deleting the install directory.
Then there's adding the new version(s) of Python. This can only affect #2 above. You have to think about that one and know what effect you're going to have if the new install(s) manipulate any PATH variables. If it only manipulates your own user's PATH, or leaves that to you, this is a much easier task to understand, but any change to the environment is a chance to break existing functionality.
Finally, there's the mechanisms for choosing different Python versions for new development, including the use of virtual envs. This is probably the easiest part, as you can do research, try things, and test that you can do whatever you want to do. This part of the problem is the best bounded.
I don't know anything about Jupyter, other than knowing vaguely what it is, so I don't know how that complicates all this.
UPDATE: A final note. As you may already know, Python does a good job of isolating itself in terms of each version keeping its unique identity. If you use the right 'pip' and 'easy_install' that are sitting right next to the 'python' binary you're going to run with, you should be cleanly affecting just that one environment. I can't know that it's this easy for all Python versions, but I've never seen this convention broken by a version of Python that I've used. The complications here, of course, involve which versions of these tools you're getting in various situations when they are found via a PATH variable.
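A concrete way to follow that convention is to run pip through the interpreter you intend to affect, rather than whatever pip happens to be first on PATH, e.g. (paths are illustrative):
/usr/local/bin/python3 -m pip install jupyter   # affects only that Python
/usr/bin/python3 -m pip list                    # see what a different Python has installed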
First, install anaconda or miniconda. The installation is non-destructive and does not conflict with your other Python installations. Check that it works before you consider removing homebrew installed Pythons.
The conda command is used both as a package manager and as an environment manager. You cannot avoid creating conda environments: the default installation is already part of an environment named base. I'm not sure why you would want to, either.
You can use pip to install any package you choose into a conda environment, but since you can use conda install for any package available on any conda channel (e.g. 'defaults', 'conda-forge'), using pip often is redundant.
You could use non-conda virtual environments, but again: why? conda create -n foo python=x.x jupyter #etc and then conda activate foo is all you need to get one up and running.

How can I escape from python environment hell?

I don't know how I've gotten here but I have many competing installations of python on my Ubuntu 16.04 path. Some I use, some I don't.
I'm at the point now where I want to clean up things to save headache when troubleshooting issues but I don't know any strategies or tools of tackling this.
What is the best way I can find out which environments are being used and not used?
How can I determine which python directories are being pointed to and which ones are abandoned?
What's a quick way to get a list of non-standard packages installed in each environment?
Here is what you can try:
Run which python (usually Python 2.x) and which python3 (for Python 3.x) to see what your shell currently resolves.
Then decide which version you want to use by default. You can add alias python='your required python interpreter path' to your shell startup file (e.g. ~/.bashrc) for a permanent change, or run the same alias in the current shell for temporary use.
Also see where pip and pip3 are pointing by running which pip and which pip3; then use the right one to install the required packages.
I would recommend using virtualenv or pipenv so that you get finer-grained control over interpreter selection according to the needs of your project.
Note: do not uninstall any of the above Python packages without some research, as there may be system dependencies, and removing them could break your system.
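To answer the "which packages live where" part, you can ask each interpreter directly rather than trusting whichever pip is on PATH (a sketch; substitute the interpreters actually found on your system):
which -a python python3           # every python/python3 on your PATH
/usr/bin/python3 -m pip list      # packages installed for that specific interpreter
/usr/local/bin/python3 -m pip list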

How do I switch from Python 3.5 back to 3.6 on mac terminal

I just dealt with the worst bug in my entire 3 years of computer programming! It turns out that because I wanted to work with the Natural Language Toolkit I had to install Python 3.5, even though I'm using Python 3.6. So I downloaded 3.5 and now my terminal is using Python 3.5 by default and I can't get it back to 3.6. Because I was using Python 3.5, which does not automatically order dictionaries, it was throwing my program off, because it relies on ordered dictionaries. It took me 4 hours to figure that out.
You want to use virtualenv and/or virtualenvwrapper. This is a utility that allows you to use multiple different environments, with different Python versions, different pip packages installed, etc.
To find the 3.5 version, run which python in your terminal to find the path to the python executable; then look at your PATH environment, and see where the location of that Python is on your PATH. Then you need to find out where that path is getting added; this will depend on your OS/Shell.
Tough times for sure, sorry to hear that.
I use pyenv to manage the different python versions on my system. This allows you to create virtual environments using whichever version you want.
EDIT to address comments.
I totally understand that setting up virtualenv or something like pyenv is not simple. However, it is unfortunately the easiest way to deal with (and avoid) situations like this. There are two essential concepts that are important here:
1) Isolation - Virtualenv takes care of this. When you install dependencies in a virtual environment, they will not affect your other environments or system python installation.
2) Multiple Python Versions - In your case, you needed to use a module that did not support 3.6. Instead of creating a virtual environment using python 3.5, you accidentally messed up your system installation of 3.6. Recovering from these types of misconfigurations can be difficult, and it is often easier to simply prevent it in the first place.
Again, I completely understand that this might be complicated, I remember thinking the same thing, but it is less complicated than troubleshooting the misconfigurations that can occur without this tooling.
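For reference, the pyenv workflow mentioned above looks roughly like this (the version numbers are just examples):
pyenv install 3.6.15
pyenv install 3.5.10
pyenv local 3.5.10    # pin the current project directory to 3.5, leaving everything else alone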

Upgrade python in linux

I have a linux VPS that uses an older version of python (2.4.3). This version doesn't include the UUID module, but I need it for a project. My options are to upgrade to python2.6 or find a way to make uuid work with the older version. I am a complete linux newbie. I don't know how to upgrade python safely or how I could get the UUID modules working with the already installed version. What is a better option and how would I go about doing it?
The safest way to upgrade Python is to install it to a different location (away from the default system path).
To do this, download the Python source and run
./configure --prefix=/opt
(assuming you want to install it to /opt, which is where most people install non-system-dependent software).
The reason I say this is that some other system libraries may depend on the current version of Python.
Another reason is that, as you are doing your own custom development, it is much better to have control over which versions of the libraries (and interpreters) you are using, rather than have an operating-system patch break something that was working before. A controlled upgrade is better than having the application break on you all of a sudden.
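The full source build under such a prefix looks roughly like this (the prefix and version are examples; make altinstall avoids installing a bare "python" name that could shadow the system interpreter):
./configure --prefix=/opt/python2.6
make
make altinstall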
The UUID module exists as a separate package for Python 2.3 and up:
http://pypi.python.org/pypi/uuid/1.30
So you can either install that in your Python2.4, or install Python2.6. If your distro doesn't have it, then Python is quite simple to compile from source. Look through the requirements to make sure all the libraries you need/want are installed before compiling Python. That's it.
The best solution is to install Python 2.6 into a directory of your choosing; it will give you access to many great features and better memory handling (the infamous Python 2.4 memory-leak problem).
I have several Pythons installed on my two computers, and I have found that the best solution for me is two directories:
$HOME/usr-32
$HOME/usr-64
depending on which operating system is in use (I share $HOME between 32-bit and 64-bit versions of Linux).
In each I have one directory for every application/program, for example:
ls ~/usr-64/python-2.6.2/
bin include lib share
This completely avoids conflicts between versions and gives great portability (you can even use USB pen drives, etc.).
Python 2.6.2 in the previous example was installed with the option:
./configure --prefix=$HOME/usr-64/python-2.6.2
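To use that installation without touching the system Python, call its binary directly or put its bin directory first on PATH for the current session (a sketch based on the layout shown above):
~/usr-64/python-2.6.2/bin/python2.6
export PATH="$HOME/usr-64/python-2.6.2/bin:$PATH"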
