I have installed Python 2.7.9 in /usr/local/bin. Now it doesn't work any more. I have another Python in /usr/bin/, but /usr/local/bin/ comes first in my PATH. How can I remove the 2.7.9 Python?
Your question is lacking in details, the most pertinent being how you actually installed Python into /usr/local/bin. The installation method would indicate how to remove the installed files.
The most common way of installing packages into the /usr/local hierarchy of directories is to compile from source and run sudo make install after compiling and linking. If you didn't already remove the original (uncompressed) source directory, you can change into it and remove the compiled Python package by running:
sudo make uninstall
If the source code has been deleted, you could try re-downloading the source again.
If there's no uninstall target for make (unfortunately more common than you might think), another (inelegant) option is to use the find command to search the /usr/local directory tree for all files that have the same modification time as files you know belong to the application you want to remove.
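A rough sketch of that approach with GNU find and date; the reference file below is an assumption, so pick any file you are certain came from the unwanted install:
# print each file with its modification minute, then keep the ones matching
# the minute of a known file from the install
ref=$(date -r /usr/local/bin/python2.7 '+%Y-%m-%d %H:%M')
find /usr/local -type f -printf '%TY-%Tm-%Td %TH:%TM %p\n' | grep -F "$ref"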
These days, I would recommend installing the checkinstall tool. Instead of running make install, this can be used to create an RPM or Debian package which can then be installed (and uninstalled) using the system’s regular software installation tools.
DISCLAIMER: I've since learned a lot, and would recommend setting environment variables for a shell or shell session rather than use this answer. For example, if you manually relink the system's Python2 interpreter to a Python3 interpreter, you may wreak havoc on your system. Please use this answer with caution.
Just reset the symlink.
First, find out which python:
$ which python
In my case, I get:
/usr/local/bin/python
Then find where the symlink points to
$ file /usr/local/bin/python
/usr/local/bin/python: symbolic link to `/usr/bin/python'
Then just repoint the symlink back to the default (in this case, /usr/bin/python).
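For example, a minimal sketch, assuming the system interpreter lives at /usr/bin/python:
sudo ln -sf /usr/bin/python /usr/local/bin/python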
No uninstalls necessary.
Update
I've since found a number of better ways to get this exact same behavior without affecting the entire system.
Say I have an undesired python install in /usr/bin, and a desired python install in /opt/bin. Let's say, for the sake of comparison, that /usr/bin holds Python 3.5 and /opt/bin holds Python 2.7. Using the wrong interpreter would then fail immediately, rather than with subtle errors down the line.
Application Defaults
If you would like to change which interpreter runs Python scripts (on Linux systems), you can change this either via a GUI or via xdg-mime (a walkthrough can be found here). For macOS or Windows, this can be done easily through a GUI.
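On a Linux desktop this boils down to a MIME-type default. A hedged sketch follows; the .desktop file name is an assumption and depends on what your system actually provides:
xdg-mime query default text/x-python             # see what currently opens .py files
xdg-mime default python3.desktop text/x-python   # associate .py files with another entry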
Interactive Shell
If you would like to change the default Python for a specific shell, I can see two good ways of doing this. One is to change the search PATH so that /opt/bin comes before /usr/bin for that situation; however, if you have numerous alternative installs alongside system packages, this might pose issues too. The other is to set an alias for python pointing to the version you want to use. This is the preferred solution, as it only changes the interpreter and is merely a shortcut to an existing command.
For example, to set the alias I could use:
alias python="/opt/bin/python"
And to change the default path, I could use:
export PATH=/opt/bin:$PATH
Adding these lines to ~/.bashrc or ~/.bash_aliases (the latter is Ubuntu-only by default) will make these shortcuts be the default on any interactive shell you start. Combining application defaults and interactive shell scripting allows you to have tight control over which interpreter runs your code, but does not require interfering with potentially crucial system files.
The key is your PATH environment variable: it holds the list of directories that bash searches, in order, when it's looking for a program to execute. Basically you want to put /usr/local/bin at the start of your PATH environment variable. Add the following to your ~/.bashrc file:
export PATH=/usr/local/bin:$PATH
You can have a look at the current setting by running the set command in bash.
Alternatively, you can simply rename /usr/bin/python to /usr/bin/python2.3 and create a symlink pointing to the new version, e.g.
ln -s /usr/local/bin/python /usr/bin/python
You can use checkinstall to remove Python (a rough sketch follows the steps):
Install checkinstall
Use checkinstall to make a deb of your Python installation
Use dpkg -r to remove the deb.
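Something along these lines on a Debian/Ubuntu system; the source directory and package name are assumptions, and checkinstall will prompt you for the latter:
sudo apt-get install checkinstall
cd ~/src/Python-2.7.9       # the directory you originally built in
sudo checkinstall           # runs "make install" and wraps the result in a .deb
sudo dpkg -r python-local   # remove it later, using whatever package name you chose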
See this post for more details.
I tried to update Python 3.8.5 to 3.8.10 on a Windows 7 machine, but some part of Python's and/or pip's messy installer/path/package management system bricked everything. Nobody I asked knows a canonical solution, and pretty much everybody is suggesting a complete reinstallation.
Which is why I've now completely removed Python and have to reinstall Python, pip, and all my packages one by one. I've already uninstalled/removed Python and pip and downloaded the official Python 3.8.10 64-bit Windows installer as well as get-pip.py.
But despite spending days and days of reading, I can't see through Python's complicated mess of "user-specific vs. local vs. system-wide" installation schemes, varying package installation paths, the seemingly arbitrary variations introduced by using python vs. python -m, pip install vs. pip install --user etc. during package installations, and the regular whining about PATH environment variables not being set properly, and so on - if you've ever used Python professionally, you'll have an idea of what I'm describing here.
Anyway - what I want to do now is make one clean installation where I stick to one set of rules for everything. All packages installed to one single superdirectory (vs. getting scattered all over the system) and all PATH variables set accordingly to the most universal and complete configuration possible (I don't want to see any complaints from Python ever again in this regard). Note that I'm the administrator of the machine, but working from a normal user account with Windows UAC enabled and want an installation for all users - the most general solution possible, no limiting scenarios that may cause the very problems I'm trying to avoid.
Also, I do not want to use virtual environments for now; that is a different topic I'm already working on independently. So no suggestions regarding venv.
Question: How to proceed with the installation?
Possible sub-issues that need to be addressed:
Correct privileges for the Windows installer, e.g. the confusing "for all users (requires elevation)" and subsequent (second!) "Install for all users" options. The latter changes the installation path from C:\Users\<Username>\AppData\Local\Programs\Python\Python38 to C:\Program Files\Python38, and Windows UAC may prevent access of Python and/or pip to C:\Program Files\ without proper exception handling (e.g. a user prompt) in place.
Usage of get-pip.py vs. some other methods to install pip, which also concerns the usage of python vs. python -m vs. pip install vs. pip install --user etc. in this context and subsequent installation of user packages.
Prevention of scattering/fragmentation of the Python development framework over the system/different folders/different users, causing e.g. annoying PATH or dependency issues/conflicts.
Defining the correct set of Windows path variables under these requirements and addressing concerns/doubts about Python's and pip's ability of reliably handling this issue on their own.
Note: I'm the owner/only user on this machine and therefore have administrator rights. Managing installations/environments for multiple users is not the subject here and of no interest for me.
Anyway - what I want to do now is make one clean installation where I stick to one set of rules for everything. All packages installed to one single superdirectory (vs. getting scattered all over the system) and all PATH variables set accordingly to the most universal and complete configuration possible (I don't want to see any complaints from Python ever again in this regard).
This is contradictory. A single installation with a common set of rules, every possible installed package, a comprehensive PATH, etc. is inherently not "clean". The point of maintaining separate installations is so that one user's changes do not unexpectedly impact the operation of another user's code. It is not possible to do this with a single installation. If you upgrade a library, for example, it affects every user who wrote code that uses that library, and they must now check for incompatibilities.
Correct privileges for the Windows installer, e.g. the confusing "for all users (requires elevation)" and subsequent (second!) "Install for all users" options.
The first one means "Install the launcher for all users", which is why that checkbox is on the same line as "py launcher". (If you are not familiar with the Python launcher for Windows, please read the documentation.) It says "(requires elevation)" because the installer requires elevation in order to install this feature for all users. You do not need to do this if you have already installed it from a previous version of Python.
and subsequent (second!) "Install for all users" options.
This is the option to install this version of Python for all users. The install directory changes according to the "for all users" setting, in the expected way.
Windows UAC may prevent access of Python and/or pip to C:\Program Files\ without proper exception handling (e.g. user prompt) in place
It will prevent write access without elevation, yes. This is by design, and it is why the option exists for per-user installations.
pip requires write access, because its purpose is to install libraries. I do not know whether it can request elevation from the command prompt; probably not. You can work around this by running the command in an elevated command window. With UAC enabled, you will not otherwise be able to install libraries into a Python for-all-users installation (i.e., in Program Files): it is the explicit purpose of UAC to prevent that sort of thing.
Python normally does not require write access to these directories. The standard library does not need write access once the library is installed. There might be a problem if you uncheck "Precompile standard libraries"; I have not tested this and have stopped using Windows. Third-party libraries normally will not require write access for their own installation directories, either. If you encounter a problem, consult the documentation for that library.
Usage of get-pip.py vs. some other methods to install pip, which also concerns the usage of python vs. python -m vs. pip install vs. pip install --user etc. in this context and subsequent installation of user packages.
get-pip etc. are deprecated. They are tools intended to account for the fact that Python didn't always come bundled with pip. It does now. pip works the same way regardless of whether it was bundled with Python or installed to an older Python version separately. There is not a clear question here; there are several specific things that need to be researched and understood about how to use pip.
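For what it's worth, a typical per-user install from a Windows command prompt looks something like this (the version selector and package name are only examples):
py -3.8 -m pip install --user requests
Running pip as a module of a specific interpreter (python -m pip, or py -X.Y -m pip with the launcher) removes any ambiguity about which installation a bare pip command belongs to.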
For testing my libraries on multiple Python versions I have a single virtual environment that I install them into, and reference them with their complete name/version (i.e. python3.7). Recently I noticed that sys.path is still referencing the source library instead of the copied library (i.e. /source/python/... instead of /source/virtualenv/lib/python3.7/...)
I've tried make install instead of make altinstall [1], and I've searched for answers -- so far nothing has helped.
How do I fix this?
[1] PSA: If you use make (alt)install and you don't want to clobber your system Python, make sure to use
./configure --prefix /path/to/install_to/here
TL;DR Remove the pyvenv.cfg in the virtualenv root directory.
The issue is the interaction with the virtualenv, and not make. Somewhere in Python's startup it checks whether it is running in a virtualenv and, if so, uses the libraries from its original installation (and I had created the virtualenv from my source copy).
The solution is to remove the pyvenv.cfg file in the root of the virtualenv. This will completely isolate the virtualenv from the system (so no sharing of site-packages nor dist-packages), but is exactly what I wanted for my purposes.
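In concrete terms, assuming the virtualenv location from the question:
rm /source/virtualenv/pyvenv.cfg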
I currently have a rather complex Python configuration that has evolved over the years, and I'd like to clean it up and "modernize" it.
The existing configuration has the default macOS Python, and Homebrew's Python 3 and Python 2, all existing side-by-side, along with their associated pips. I also have some python command line tools that these Pythons or their associated installed packages have created, and which I use more or less frequently.
What I'd like to do is:
Leave macOS Python untouched
Eliminate all Homebrew Pythons
Remove non-macOS Python 2 entirely
Switch to Conda Python as my Python 3
Have access to mkvirtualenv (as an alternative to creating environments) with virtualenvwrapper
Have access to Jupyter
I'm not sure how to do this without creating problems, and want to confirm that the obvious thing is the safe thing:
use Homebrew to uninstall its Pythons,
install Conda, and then
use (Conda's) pip to install mkvirtualenv, virtualenvwrapper, and Jupyter (and any other tools I subsequently need)
Is that the correct procedure? If so, are there particular forms of the commands I should use, or options I should choose for them?
The biggest and/or first issue is how to not break existing functionality that relies on Python. There are two broad camps here:
1) tools and other scripts that hard-code the Python executable's location, and
2) tools and other scripts that rely on the/a system PATH variable.
#1 is the easier one. If you aren't going to remove any Python versions, then these are no work at all...these will keep working. If you do want to uninstall some Python versions, then you have to work to switch any tools relying on those versions over to another version that also works for that tool. The path in question is commonly in a shebang ('#! xxx') line at the top of each main Python script, but there are other ways that the path to the Python binary can be formed. In short, why uninstall anything? Disk space is cheap. Maybe instead just make sure that these unwanted versions are not referenced by any PATH variables.
#2 is the hard one. It isn't necessarily the case that all of the tools in this category are using the version of Python you get when you just type "python" at a command prompt for your primary account. There can be other modes of operation that initialize the execution environment (the PATH variable) in different ways, and so may be running different Python versions despite depending on the value of PATH.
Part of #2 is worrying about not just "python" references, but "python2", "python3", and possibly other variants as well.
Only once you've got a plan for dealing with the above so you don't break things can you worry about possibly getting rid of Python versions and installing new ones. Hopefully, Brew does a good job of uninstalling the versions it's installed, so if you can remove dependencies on one or more of them, they can potentially be easily removed. If you've got self-installed Python versions, those should be easy to uninstall as well by just removing references to them in PATH variables (or not...shouldn't be a big problem if you miss some) and then deleting the install directory.
Then there's adding the new version(s) of Python. This can only affect #2 above. You have to think about that one and know what effect you're going to have if the new install(s) manipulate any PATH variables. If it only manipulates your own user's PATH, or it leaves that to you, this is a much easier task to understand, but any change to the environment is a chance to break existing functionality.
Finally, there's the mechanisms for choosing different Python versions for new development, including the use of virtual envs. This is probably the easiest part, as you can do research, try things, and test that you can do whatever you want to do. This part of the problem is the best bounded.
I don't know anything about Jupyter, other than knowing vaguely what it is, so I don't know how that complicates all this.
UPDATE: A final note. As you may already know, Python does a good job of isolating itself in terms of each version keeping its unique identity. If you use the right 'pip' and 'easy_install' that are sitting right next to the 'python' binary you're going to run with, you should be cleanly affecting just that one environment. I can't know that it's this easy for all Python versions, but I've never seen this convention broken by a version of Python that I've used. The complications here, of course, involve which versions of these tools you're getting in various situations when they are found via a PATH variable.
First, install anaconda or miniconda. The installation is non-destructive and does not conflict with your other Python installations. Check that it works before you consider removing homebrew installed Pythons.
The conda command is used both as a package manager and as an environment manager. You cannot avoid creating conda environments: the default installation is already part of an environment named base. I'm not sure why you would want to, either.
You can use pip to install any package you choose into a conda environment, but since you can use conda install for any package available on any conda channel (e.g. 'defaults', 'conda-forge'), using pip often is redundant.
You could use non-conda virtual environments, but again: why? conda create -n foo python=x.x jupyter #etc and then conda activate foo is all you need to get one up and running.
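Spelled out as a short sketch (the environment name, Python version, and packages are placeholders):
conda create -n foo python=3.9 jupyter
conda activate foo
conda install -c conda-forge numpy   # or: pip install numpy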
I'm a novice at this, and I have started learning Python, but there are some things I'm not able to understand:
1. What exactly is the PYTHONPATH (on Ubuntu)? Is it a folder?
2. Is Python provided by default on Ubuntu, or does it have to be installed explicitly?
3. Where is the folder in which all the modules live (I have a lot of folders called python_)?
4. If I want a new module (such as pyopengl) to work while I'm programming, where should I put all the folders I got in the download?
5. Coming back to the PYTHONPATH issue, how do I configure PYTHONPATH in order to start working with my new module?
PYTHONPATH is an environment variable which you can set to add additional directories where python will look for modules and packages. e.g.:
# make python look in the foo subdirectory of your home directory for
# modules and packages
export PYTHONPATH=${PYTHONPATH}:${HOME}/foo
Here I use the sh syntax. For other shells (e.g. csh, tcsh), the syntax would be slightly different. To make it permanent, set the variable in your shell's init file (usually ~/.bashrc).
Ubuntu comes with python already installed. There may be reasons for installing other (independent) python versions, but I've found that to be rarely necessary.
The folder where your modules live is dependent on PYTHONPATH and where the directories were set up when python was installed. For the most part, the installed stuff you shouldn't care about where it lives -- Python knows where it is and it can find the modules. Sort of like issuing the command ls -- where does ls live? /usr/bin? /bin? 99% of the time, you don't need to care -- Just use ls and be happy that it lives somewhere on your PATH so the shell can find it.
I'm not sure I understand the question. 3rd party modules usually come with install instructions. If you follow the instructions, python should be able to find the module and you shouldn't have to care about where it got installed.
Configure PYTHONPATH to include the directory where your module resides and python will be able to find your module.
1. PYTHONPATH is an environment variable.
2. Yes (see https://unix.stackexchange.com/questions/24802/on-which-unix-distributions-is-python-installed-as-part-of-the-default-install).
3. /usr/lib/python2.7 on Ubuntu.
4. You shouldn't install packages manually. Instead, use pip. When a package isn't available through pip, it usually has a setuptools setup script which will install the package into the proper location (see point 3, and the short example after these points).
5. If you use pip or setuptools, then you don't need to set PYTHONPATH explicitly.
If you look at the instructions for pyopengl, you'll see that they are consistent with points 4 and 5.
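A short, hedged example of points 4 and 5 in practice (PyOpenGL is, to the best of my knowledge, the PyPI name for pyopengl):
pip install PyOpenGL               # the usual case: let pip fetch and place the package
python setup.py install --user     # for a package that only ships a setuptools script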
PYTHONPATH is an environment variable whose content is added to sys.path, the list of places where Python looks for modules. You can set it to whatever you like.
However, do not mess with PYTHONPATH. More often than not, you are doing it wrong and it will only bring you trouble in the long run. For example, virtual environments could do strange things…
I would suggest you learn how to package a Python module properly, maybe using this easy setup. If you are especially lazy, you could use cookiecutter to do all the hard work for you.
Is there anything equivalent or close in terms of functionality to Python's virtualenv, but for Perl?
I've done some development in Python and a possibility of having non-system versions of modules installed in a separate environment without creating any mess is a huge advantage. Now I have to work on a new project in Perl, and I'm looking for something like virtualenv, but for Perl. Can you suggest any Perl equivalent or replacement for python's virtualenv?
I'm trying to setup X different sets of non-system Perl packages for Y different applications to be deployed. Even worse, these applications may require different versions of the same package, so each of them may require to be installed in a separate module/library environment. You may want to do this manually for X < Y < 3. But you should not do this manually for 10 > Y > X.
Ideally what I'm looking should work like this:
perl virtualenv.pl my_environment
. my_environment/bin/activate
wget http://.../foo-0.1.tar.gz
tar -xzf foo-0.1.tar.gz ; cd foo-0.1
perl Makefile.PL
make install # <-- package foo-0.1 gets installed inside my_environment
perl -MCPAN -e 'install Bar' # <-- now package Bar with all its deps gets installed inside my_environment
There's a tool called local::lib that wraps up all of the work for you, much like virtualenv. It will:
Set up @INC in the process where it's used.
Set PERL5LIB and other such things for child processes.
Set the right variables to convince CPAN, MakeMaker, Module::Build, etc. to install libraries and store configuration in a local directory.
Set PATH so that installed binaries can be found.
Print environment variables to stdout when used from the commandline so that you can put eval $(perl -Mlocal::lib)
in your .profile and then mostly forget about it.
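A hedged sketch of bootstrapping it with cpanminus (the cpanm tool and the ~/perl5 target directory are assumptions; see the local::lib documentation for the canonical steps):
cpanm --local-lib="$HOME/perl5" local::lib
echo 'eval "$(perl -I "$HOME/perl5/lib/perl5" -Mlocal::lib)"' >> ~/.profile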
I've used schroot for this purpose. It is a bit heavier than virtualenv but you can be sure that nothing will leak in that shouldn't.
Schroot manages a chroot environment for you, but mounts your home directory in the chroot so it appears like a normal shell session, just using the binaries and libraries in the chroot.
I think it may be debian/ubuntu only though.
After setting up the schroot, your script above would look like
schroot -c my_perl_dev
wget ...
See http://www.debian-administration.org/articles/566 for an interesting article about it
Also check out perl-virtualenv; this seems to be a wrapper around local::lib, as suggested by Hobbs, but it creates a bin/activate and bin/deactivate so you can use it just like the python tool.
I've been using it quite successfully for a month or so without realising it wasn't as standard as perhaps it should be.
It makes it a lot easier to set up a working virtualenv for perl: while local::lib will tell you which variables you need to set, perl-virtualenv creates an activate script which does it for you.
While investigating, I discovered this and some other pages (this one is too old and misses new technologies, this reddit post is a slight misdirect).
The problem with perlbrew and plenv is that they seem to be replacements for pyenv, not virtualenv. As noted here pyenv is for managing python versions, virtualenv is for managing per-project module versions. So, yes, in some ways similar to local::lib, but with better usability.
I've not seen a proper answer to this question yet, but from what I've read, it looks like the best solution is something along the lines of:
Perl version management: plenv/perlbrew (with most people favouring the more contemporary bash-based plenv over the perl-based perlbrew, from what I can see)
Module version management: Carton
Module installation: cpan (well, cpanminus anyway, ymmv)
To be honest, this is not an ideal set up, although I'm still learning, so it may yet be superior. It just doesn't feel right. It certainly isn't a like for like replacement for virtualenv.
There are a couple of posts I've found saying "it is possible" but neither has gone any further.
I am not sure whether this is the same as that virtualenv thing you are talking about, but have a look at the @INC special variable in the perlvar manpage.
Programs can modify which directories they check for libraries with use lib. This lib directory can be relative to the current directory. Libraries from these directories will be used before system libraries, as they are placed at the beginning of the @INC array.
I believe cpan can also install libraries to specific directories. Granted, cpan draws from the CPAN site in order to install things, so this may not be the best option.
It looks like you just need to use the INSTALL_BASE configuration for Makefile.PL (or the --install_base option for Build.PL)? What exactly do you need the solution to do for you? It sounds like you just need to get the installed module in the right place. You've presented your problem as an XY Problem by specifying what you think the solution is rather than letting us help you with your task.
See How do I keep my own module/library directory? in perlfaq8, for instance.
If you are downloading modules from CPAN, the latest cpan command (in App::Cpan) has a -j switch to allow you to choose alternate CPAN.pm configuration files. In those configuration files you can set the CPAN.pm options to install wherever you like.
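For instance, a hedged sketch (the config path and module name are placeholders):
cpan -j ~/apps/myapp/cpan/MyConfig.pm Some::Module   # install using that app's own CPAN.pm config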
Based on your clarification, it sounds like local::lib might work for you in single, simple cases, but I do this for industrial-strength deployments where I set up custom, private CPANs per application and install directly from those custom CPANs. See my MyCPAN::App::DPAN module, for instance. From that, I use custom CPAN.pm configs that analyze their environment and set the proper values so that each application can install everything in a directory just for that application.
You might also consider distributing your application as a Task::. You install it like any other Perl module, but dependencies share that same setup (i.e. INSTALL_BASE).
What I do is start the CPAN shell (cpan) and install my own Perl 5.10 from it (I believe the command is install perl-5.10). This will ask for various configuration settings; I make sure to make it point to paths under /usr/local (or some other installation location other than the default).
Then I put its resulting location in my executable $PATH before the standard perl, and use its CPAN shell to install the modules I need (usually, a lot).
My Perl scripts all start with the line
#!/usr/bin/env perl
Never had a problem with this approach.