I'm new to Python. Are there alternatives to downloading a tarball for a Python module and installing it via python setup.py install? Anything like RubyGems?
UPDATE: I'm surprised that there are so many solutions to this one problem.
setuptools is one option. Install that and then you can install many Python modules using the easy_install command-line tool.
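For example, to install a package from PyPI by name (SomePackage here is just a placeholder):
easy_install SomePackage
easy_install "SomePackage==1.2"   # a specific version, if you need one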
There are many options - you can stick with your system's default packaging system (if it has one), or you can pick one (or more) of the existing Python tools like easy_install, zc.buildout or pip. I would recommend using Distribute together with pip.
easy_install or pip. I also recommend checking out virtualenv to create isolated environments for test-running packages. The Python Package Index (PyPI, also called the Cheese Shop) is the official third-party software repository for Python.
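A minimal sketch of that combination (the environment name and package are placeholders):
pip install virtualenv
virtualenv test-env              # create an isolated environment
source test-env/bin/activate     # activate it (Windows: test-env\Scripts\activate)
pip install SomePackage          # installed into test-env only, not system-wide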
Pip is great:
$ pip install some_python_module
$ pip freeze > requirements.txt
$ cat requirements.txt
Output from freeze:
Creoleparser==0.7.3
Django==1.3
Genshi==0.6
PIL==1.1.7
South==0.7.3
django-debug-toolbar==0.8.5
....
After this, in any other place:
$ pip install -r requirements.txt
Check out pip.
Related
I've been looking around for a package manager that can be used with Python. I want to list project dependencies in a file.
For example, Ruby uses a Gemfile, where you can run bundle install.
How can I achieve this in Python?
The pip tool is becoming the standard equivalent of Ruby's gems.
Like distribute, pip uses the PyPI package repository (by default) for resolving and downloading dependencies.
pip can install dependencies from a file listing project dependencies (called requirements.txt by convention):
pip install -r requirements.txt
You can "freeze" the current packages on the Python path using pip as well:
pip freeze > requirements.txt
When used in combination with the virtualenv package, you can reliably create per-project Python environments with a project's required dependencies.
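For example, a typical way to reproduce a project's environment from its requirements file (names are illustrative):
virtualenv env
source env/bin/activate
pip install -r requirements.txt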
Pipenv
(I know it's an old question, and it already has an answer, but this is for anyone coming here looking for a different answer, like me.)
I've found a very good equivalent to npm: it's called pipenv. It handles both the virtualenv and the pip requirements at the same time, so it's more like npm.
Simple Use Case
pip install pipenv
Then you can create a new virtualenv with Python 3, along with a Pipfile that will be filled with your project's requirements and other settings:
pipenv install --three
Using your created virtualenv:
pipenv shell
Installing a new Python package:
pipenv install requests
Running your .py file:
pipenv run python somefile.py
You can find its docs here.
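A few other pipenv commands that tend to be useful (check the docs for the version you have installed):
pipenv install -r requirements.txt   # import an existing requirements file into the Pipfile
pipenv lock                          # pin exact versions into Pipfile.lock
pipenv graph                         # show the dependency graph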
Python uses pip for a package manager. The pip install command has a -r <file> option to install packages from the specified requirements file.
Install command:
pip install -r requirements.txt
Example requirements.txt contents:
Foo >= 1.2
PickyThing <1.6,>1.9,!=1.9.6,<2.0a0,==2.4c1
SomethingWhoseVersionIDontCareAbout
See the Requirements Parsing section of the docs for a full description of the format: https://pip.pypa.io/en/stable/user_guide/#requirements-files
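Beyond plain version specifiers, the format also supports comments (#), including other requirements files (-r), and editable VCS installs (-e); a small illustrative sketch (the URL is made up):
# comments start with a hash
requests>=2.0
-r dev-requirements.txt
-e git+https://github.com/example/project.git#egg=project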
This is how I restrict pip's scope to the current project. It feels like the opposite of what you do if you're coming from Node.js's npm or PHP's Composer, where you explicitly specify global installations with -g or --global.
If you don't already have virtualenv installed, then install it globally with:
pip install virtualenv
Each Python project should have its own virtualenv installation. It's easy to set one up: just cd to your project's root and run:
python3 -m virtualenv env # creates env folder with everything you need
Activate virtualenv:
source env/bin/activate
Now, any interaction with pip is contained within your project.
Run pip install package_name==version for each of your dependencies. They are installed in ./env/lib/python3.x/site-packages/
When you want to save your project's dependencies to a file, run:
pip freeze > requirements.txt
You actually don't need -l or --local if you're in an activated project-specific virtualenv (which you should be).
Now, when you want to install your dependencies from requirements.txt, set up your virtualenv, and run:
pip install -r requirements.txt
That's all.
This is an old question but things are constantly evolving.
Further to the other answer about pipenv, there is also a Python package manager called Poetry.
There is a detailed comparison between pipenv and Poetry here: Feature comparison between npm, pip, pipenv and poetry package managers. It also maps the features to common npm features.
Here is a comparison of pipenv vs poetry vs pdm: https://dev.to/frostming/a-review-pipenv-vs-poetry-vs-pdm-39b4
The conclusion is that pdm is the winner.
But in my experience, Poetry is easier than pdm to integrate with IDEs.
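For reference, a minimal Poetry workflow looks roughly like this (app.py is a placeholder; see the Poetry docs for current details):
pip install poetry         # or use Poetry's own installer
poetry init                # interactively create pyproject.toml
poetry add requests        # add a dependency and update poetry.lock
poetry install             # install everything from pyproject.toml / poetry.lock
poetry run python app.py   # run a script inside the managed virtualenv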
I usually just use the command:
pip install --user <package>
but I've seen here that this:
pip install <package> --install-option="--prefix=~"
can also be used to bypass the need for sudo privileges. About this command, the site says:
There is also a --user option with pip install, which installs into ~/.local. This is fine for the python module, but it puts the corr2 executable into ~/.local/bin, which is probably not in your path. The above command will instead install corr2 into ~/bin.
So apparently it does not behave the same way as the first command.
Is one way preferred over the other and if so why?
The official Python package installation guide is here:
https://packaging.python.org/en/latest/installing.html
It recommends creating Python virtual environments per project using the virtualenv command (or python3.4 -m venv).
This is because different Python projects have different dependencies, and having per-project installation environments is the sane way to deal with this in Python.
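A minimal sketch of that per-project setup (directory names are arbitrary):
cd myproject
python3 -m venv .venv               # or: virtualenv .venv
source .venv/bin/activate           # Windows: .venv\Scripts\activate
pip install -r requirements.txt     # this project's dependencies only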
I have some problems with Jenkins and creating a virtualenv. I'm using the ShiningPanda plugin and the "Virtualenv Builder" build step combined with pyenv.
I can install packages with "pip install package" but I cannot install requirements from a requirements file, because the subsequent packages cannot find the installed packages, e.g. numexpr cannot find/import numpy.
As I was typing my question, I found the answer to that problem: the current version (v0.21) of the ShiningPanda plugin does NOT support pip's requirements.txt in virtualenv builders.
https://wiki.jenkins-ci.org/display/JENKINS/ShiningPanda+Plugin
The current version (0.23) works in our setup like this (in a Build-Virtualenv Builder step, with Nature: Shell):
pushd %run_dir%
SET PYTHONPATH=%CD%
python -m pip install --upgrade -r configurations/requirements.txt
This has worked well even if libraries require each other.
Python's easy_install makes installing new packages extremely convenient. However, as far as I can tell, it doesn't implement the other common features of a dependency manager - listing and removing installed packages.
What is the best way of finding out what's installed, and what is the preferred way of removing installed packages? Are there any files that need to be updated if I remove packages manually (e.g. by rm /usr/local/lib/python2.6/dist-packages/my_installed_pkg.egg or similar)?
pip, an alternative to setuptools/easy_install, provides an "uninstall" command.
Install pip according to the installation instructions:
$ wget https://bootstrap.pypa.io/get-pip.py
$ python get-pip.py
Then you can use pip uninstall to remove packages installed with easy_install.
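For example (SomePackage is a placeholder name):
$ pip freeze                  # list what is currently installed
$ pip uninstall SomePackage   # remove a package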
To uninstall an .egg, you need to rm -rf the egg (it might be a directory) and remove the matching line from site-packages/easy-install.pth.
First you have to run this command:
$ easy_install -m [PACKAGE]
This deactivates the package (multi-version mode), so that Python no longer finds it on the default path.
Then remove egg file of that package:
$ sudo rm -rf /usr/local/lib/python2.X/site-packages/[PACKAGE].egg
All the info is in the other answers, but none summarizes both of your requests, or they seem to make things needlessly complex:
For your removal needs use:
pip uninstall <package>
(install using easy_install pip)
For your 'list installed packages' needs either use:
pip freeze
Or:
yolk -l
which can output more package details.
(Install via easy_install yolk or pip install yolk)
There are several sources on the net suggesting a hack of reinstalling the package with the -m option and then just removing the .egg file in lib/ and the binaries in bin/. Also, discussion about this setuptools issue can be found on the Python bug tracker as setuptools issue 21.
Edit: added the link to the Python bug tracker.
If the problem is a serious-enough annoyance to you, you might consider virtualenv. It allows you to create an environment that encapsulates python libraries. You install packages there rather than in the global site-packages directory. Any scripts you run in that environment have access to those packages (and optionally, your global ones as well). I use this a lot when evaluating packages that I am not sure I want/need to install globally. If you decide you don't need the package, it's easy enough to just blow that virtual environment away. It's pretty easy to use. Make a new env:
$>virtualenv /path/to/your/new/ENV
virtualenv installs setuptools for you in the new environment, so you can do:
$>ENV/bin/easy_install
You can even create your own bootstrap scripts that set up your new environment. So, with one command, you can create a new virtualenv with, say, Python 2.6, psycopg2 and Django installed by default (you can install an env-specific version of Python if you want).
Official(?) instructions: http://peak.telecommunity.com/DevCenter/EasyInstall#uninstalling-packages
If you have replaced a package with another version, then you can just delete the package(s) you don't need by deleting the PackageName-versioninfo.egg file or directory (found in the installation directory).
If you want to delete the currently installed version of a package (or all versions of a package), you should first run:
easy_install -mxN PackageName
This will ensure that Python doesn't continue to search for a package you're planning to remove. After you've done this, you can safely delete the .egg files or directories, along with any scripts you wish to remove.
try
$ easy_install -m [PACKAGE]
then
$ rm -rf .../python2.X/site-packages/[PACKAGE].egg
To list installed Python packages, you can use yolk -l. You'll need to use easy_install yolk first though.
Came across this question while trying to uninstall the many random Python packages installed over time.
Using information from this thread, this is what I came up with:
cat package_list | xargs -n1 sudo pip uninstall -y
The package_list file is the output of pip freeze in a virtualenv, cleaned up with awk.
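A rough sketch of how such a package_list can be produced (the awk step strips the ==version suffix; adjust the exclusions to taste):
pip freeze | awk -F'==' '{print $1}' | grep -Ev '^(pip|setuptools)$' > package_list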
To remove almost all Python packages:
yolk -l | cut -f 1 -d " " | grep -Ev "setuptools|pip|ETC.." | xargs -n1 pip uninstall -y
I ran into the same problem on my MacOS X Leopard 10.6.blah.
Solution is to make sure you're calling the MacPorts Python:
sudo port install python26
sudo port install python_select
sudo python_select python26
sudo port install py26-mysql
Hope this helps.
For me, only deleting this file worked: easy-install.pth.
The rest was just pip install django==1.3.7.
This worked for me. It's similar to previous answers but the path to the packages is different.
sudo easy_install -m <package>
sudo rm -rf /Library/Python/2.7/site-packages/<package>.egg
Platform: macOS High Sierra version 10.13.3
Simple Question ;)
Is there a way to simply install the package using pip once you have built it? Using easy_install, I would simply build my package (python setup.py build), then, if I was happy, do easy_install . and this would put the resulting egg into the right place. How do I do this using pip?
pip install -e . will install from a local source like easy_install . would.
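For example, from the directory that contains setup.py (paths are illustrative):
python setup.py build   # optional: build first, as before
pip install .           # regular install from the local source tree
pip install -e .        # or an editable ("develop") install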
Most of pip's commands and functionality are designed around installing from source repositories or PyPI package listings, or around maintaining consistently versioned dependencies, though.
If you are going through the steps of building the package yourself, are you sure you don't want to python setup.py install manually after you are satisfied with the build?