I always thought you could install a Python package by:
Checking out the code (e.g. git clone ...)
cd into that folder
Run pip install .
But now I read that you need to run
python setup.py install
to install all dependencies defined in install_requires in setup.py (see HERE).
Can someone please explain the differences? And why does pip install ignore the package list in install_requires? Or am I doing something completely wrong?
Related
If I clone a repos that uses setuptools, I can install it using python3 setup.py install --user.
For example:
git clone https://github.com/pybliometrics-dev/pybliometrics
cd pybliometrics
python3 setup.py install --user
However, I cannot pip uninstall it anymore. In fact:
$ pip3 uninstall pybliometrics
Found existing installation: pybliometrics 3.2.1.dev2
Can't uninstall 'pybliometrics'. No files were found to uninstall.
I have to change directory for the uninstallation command to be successful. Then change directory back if I want to reinstall it.
Why is that?
How can I uninstall from the same folder that I used to install it?
Here is the output of pip show as asked in the comment:
$ pip show -f pybliometrics
Name: pybliometrics
Version: 3.2.1.dev2
Summary: Python-based API-Wrapper to access Scopus
Home-page: https://pybliometrics.readthedocs.io/en/stable/
Author: 'John Kitchin and Michael E. Rose
Author-email: Michael.Ernst.Rose#gmail.com
License: MIT
Location: /run/media/MYNAME/MYID/data/progetti_miei/pybliometrics
Requires: pbr, requests, simplejson, tqdm
Required-by:
Files:
Cannot locate RECORD or installed-files.txt
In the output of the pip show -f pybliometrics command, we can read:
Files:
Cannot locate RECORD or installed-files.txt
This might explain why it cannot be uninstalled. I am not sure how this happened, nor how to fix it.
But with that said, here are some notes:
The commands shown in your question are inconsistent. On one hand you call pip show -f pybliometrics and on the other you call pip3 uninstall pybliometrics. But pip and pip3 are not necessarily the same thing, and do not necessarily interact with the same projects.
Do not use python setup.py install. Calling setup.py directly is now deprecated; the recommended way of installing a Python project is via pip.
One should never call the pip scripts directly, but should always prefer explicitly calling the pip executable module with the targeted Python interpreter (see this reference article and this other answer for details).
So in your case, here is what you probably should have done (no guarantee it would have solved your issue, but it would have minimized the risks):
Clearly identify which Python interpreter you want to use, let's say it is path/to/bin/pythonX.Y
Install project with: path/to/bin/pythonX.Y -m pip install --user path/to/pybliometrics
Check installed project with path/to/bin/pythonX.Y -m pip show -f pybliometrics
Uninstall project with: path/to/bin/pythonX.Y -m pip uninstall pybliometrics
I bundle my package as an sdist zip file; after that I can install my package anywhere using pip install, but I want to run some post-install commands automatically after calling pip install.
I cannot use python setup.py install because it is an sdist and I am using pip to install it. I do have a PostInstall class, but nothing runs after pip install package. Is there a way to automatically run a script after pip install package?
I have tried using postinstall but it doesn't work, and I am also not sure how to use the scripts attribute in the setup() method.
This is my setup.py file :
Setup.py
Is there a way, using setup.py, to install a python package as a wheel/pip-style package (i.e. dist-info) instead of the egg installation that setup.py does by default (i.e. egg-info)?
For example, if I have a python package with a setup.py script and I run the following command, it will install the package as an egg.
> python setup.py install
However, I can build a wheel first, and then use pip to install that wheel as a wheel/dist-info type installation
> python setup.py bdist_wheel
> pip install ./dist/package-0.1-py2-none-any.whl
Is there a way to install the package as a wheel/dist-info installation directly from setup.py? Or is the two-step process using both setuptools and pip necessary?
Update: Confirmed, this has landed in pip now. If you are still seeing .egg-info installs when pip installing from a directory, then just upgrade your pip installation. Note that --editable installs will still use egg-info.
Original answer below:
This feature is coming soon. This was issue #4611. Follow the trail and you will find PR 4764 to pip, merged into master approximately a week ago. In the meantime, you can:
pip wheel .
pip install ./mypackage.whl
For me the proposed solution still didn't work (even with pip 21.0.1), and due to versioning (package-name-XX.YY), I also didn't know the name of the .whl file. You can tell pip to look in the directory and take the .whl from there:
python setup.py bdist_wheel
pip install package-name --find-links dist/
I've been looking around for a package manager that can be used with python. I want to list project dependencies in a file.
For example ruby uses Gemfile where you can use bundle install.
How can I achieve this in Python?
The pip tool has become the standard Python equivalent of Ruby's gem.
Like distribute, pip uses the PyPI package repository (by default) for resolving and downloading dependencies.
pip can install dependencies from a file listing project dependencies (called requirements.txt by convention):
pip install -r requirements.txt
You can "freeze" the current packages on the Python path using pip as well:
pip freeze > requirements.txt
When used in combination with the virtualenv package, you can reliably create project Python environments with a project's required dependencies.
Pipenv
(I know it's an old question, and it already has an answer but for anyone coming here looking for a different answer like me.)
I've found a very good equivalent of npm: it's called pipenv. It handles both virtualenv and pip requirements at the same time, so it's more like npm.
Simple Use Case
pip install pipenv
Then you can create a new virtualenv with Python 3, along with a Pipfile that will be filled with your project's requirements and other metadata:
pipenv install --three
To use the created virtualenv:
pipenv shell
To install a new Python package:
pipenv install requests
To run your .py file:
pipenv run python somefile.py
You can find its docs here.
Python uses pip for a package manager. The pip install command has a -r <file> option to install packages from the specified requirements file.
Install command:
pip install -r requirements.txt
Example requirements.txt contents:
Foo >= 1.2
PickyThing <1.6,>1.9,!=1.9.6,<2.0a0,==2.4c1
SomethingWhoseVersionIDontCareAbout
See the Requirements Parsing section of the docs for a full description of the format: https://pip.pypa.io/en/stable/user_guide/#requirements-files
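If you want to check how such specifier strings are interpreted, the packaging library (the same project pip vendors for this purpose) can parse them. A minimal sketch, with a made-up package name:

```python
# Parse a requirements-style specifier and test candidate versions
# against it, using the `packaging` library.
from packaging.requirements import Requirement

req = Requirement("Foo >= 1.2")
print(req.name)                       # Foo
print(req.specifier.contains("1.2"))  # True: 1.2 satisfies >= 1.2
print(req.specifier.contains("1.1"))  # False: below the minimum
```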
This is how I restrict pip's scope to the current project. It feels like the opposite if you're coming from NodeJS's npm or PHP's composer where you explicitly specify global installations with -g or --global.
If you don't already have virtualenv installed, then install it globally with:
pip install virtualenv
Each Python project should have its own virtualenv installation. It's easy to set one up: just cd to your project's root and run:
python3 -m virtualenv env # creates env folder with everything you need
Activate virtualenv:
source env/bin/activate
Now, any interaction with pip is contained within your project.
Run pip install package_name==version for each of your dependencies. They are installed in ./env/lib/python3.x/site-packages/
When you want to save your project's dependencies to a file, run:
pip freeze > requirements.txt
You actually don't need -l or --local if you're in an activated project-specific virtualenv (which you should be).
Now, when you want to install your dependencies from requirements.txt, set up your virtualenv, and run:
pip install -r requirements.txt
That's all.
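As a side note, modern Python ships the stdlib venv module, which can do the same job as virtualenv without installing anything extra. A minimal sketch (the env directory name just follows the convention above):

```python
# Create a project-local virtual environment with the stdlib venv
# module -- roughly equivalent to `python3 -m virtualenv env`.
import venv
from pathlib import Path

env_dir = Path("env")                  # conventional name, as above
venv.create(env_dir, with_pip=False)   # with_pip=True also bootstraps pip

# pyvenv.cfg marks the directory as a virtual environment
print((env_dir / "pyvenv.cfg").exists())  # True
```

Activation then works the same way: source env/bin/activate.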
This is an old question but things are constantly evolving.
Further to the other answer about pipenv. There is also a python package manger called poetry.
There is a detailed comparison between pipenv and poetry here: Feature comparison between npm, pip, pipenv and poetry package managers. It also links the features to common npm features.
Here is a comparison of pipenv vs poetry vs pdm: https://dev.to/frostming/a-review-pipenv-vs-poetry-vs-pdm-39b4
The conclusion is that pdm is the winner.
But in my experience, poetry is easier than pdm to integrate with IDEs.
In nodejs, I can do npm install package --save-dev to save the installed package into the package.
How do I achieve the same thing in Python package manager pip? I would like to save the package name and its version into, say, requirements.pip just after installing the package using something like pip install package --save-dev requirements.pip.
There isn't an equivalent with pip.
The best way is pip install package && pip freeze > requirements.txt
You can see all the available options on their documentation page.
If it really bothers you, it wouldn't be too difficult to write a custom bash script (pips) that takes a -s argument and freezes to your requirements.txt file automatically.
Edit 1
Since writing this, there has been no change in pip providing an automatic --save-dev option similar to npm's. However, Kenneth Reitz (author of requests and many more) has released some more info about a better pip workflow to better handle pip updates.
Edit 2
Linked from the "better pip workflow" article above it is now recommended to use pipenv to manage requirements and virtual environments. Having used this a lot recently I would like to summarise how simple the transition is:
Install pipenv (on Mac)
brew install pipenv
pipenv creates and manages its own virtual environments, so in a project with an existing requirements.txt, installing all requirements (I use Python 3.7, but you can remove the --three if you do not) is as simple as:
pipenv --three install
Activating the virtualenv to run commands is also easy
pipenv shell
Installing requirements will automatically update the Pipfile and Pipfile.lock
pipenv install <package>
It's also possible to update out-of-date packages
pipenv update
I highly recommend checking it out especially if coming from a npm background as it has a similar feel to package.json and package-lock.json
This simple line is a starting point. You can easily build a bash command to reuse the PACKAGE in the line.
pip install PACKAGE && pip freeze | grep PACKAGE >> requirements.txt
Thanks to @devsnd for the simple bash function example:
function pip-install-save {
pip install $1 && pip freeze | grep $1 >> requirements.txt
}
To use it, just run:
pip-install-save some-package
I've created a Python package called pipm that wraps the actual pip. All pip commands work as-is, and in addition they are reflected in the requirements file. Unlike pip-save (a similar tool I found, inactive for some time, which I wasn't able to use), it can handle many files and environments (test, dev, production, etc.). It also has a command to upgrade all or any of your dependencies.
installation
pipm install pkg-name
installation as development dependency
pipm install pkg-name --dev
installation as testing dependency
pipm install pkg-name --test
removal
pipm uninstall pkg-name
update all your dependencies
pipm update
install all your dependencies from the requirements file
pipm install
including development dependencies
pipm install --dev
Update: apparently, pipenv is not officially endorsed by Python maintainers, and the previously-linked page is owned by a different organization. The tool has its pros and cons, but the below solution still achieves the result that the OP is seeking.
pipenv is a dependency management tool that wraps pip and, among other things, provides what you're asking:
https://pipenv.kennethreitz.org/en/latest/#example-pipenv-workflow
$ pipenv install <package>
This will create a Pipfile if one doesn't exist. If one does exist, it will automatically be edited with the new package you provided.
A Pipfile is a direct equivalent of package.json, while Pipfile.lock corresponds to package-lock.json.
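For illustration, a minimal Pipfile might look like this (the package names and Python version are placeholders):

```toml
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
requests = "*"

[dev-packages]
pytest = "*"

[requires]
python_version = "3.7"
```

The [packages] and [dev-packages] tables play the roles of dependencies and devDependencies in package.json.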
You can manually save it in a Makefile (or in a text file that is then imported into your Makefile):
PYTHON=.venv/bin/python # path to python
PIP=.venv/bin/pip # path to pip
SOURCE_VENV=. .venv/bin/activate
install:
	virtualenv .venv
	$(SOURCE_VENV) && $(PIP) install -e PACKAGE
	$(SOURCE_VENV) && $(PIP) install -r requirements.txt # other required packages
and then just run make install
How about making a shell function to do this?
Add the code below to your ~/.profile or ~/.bashrc:
pips() {
    local pkg=$1
    if [ -z "$1" ]; then
        echo "usage: pips <pkg name>"
        return 1
    fi
    # only record the version if the install succeeded
    pip install "$pkg" && pip freeze | grep -i "$pkg" >> requirements.txt
}
Then run source ~/.profile or source ~/.bashrc to load it into your current shell.
When you want to install and save a package, just run, for example, pips requests.
After the package is installed, its version will be saved into requirements.txt in your current directory.
I am using this small command line to install a package and save its version in requirements.txt:
pkg=package && pip install $pkg && echo $(pip freeze | grep -i $pkg) >> requirements.txt
I made a quick hack on pip to add --save option to install/uninstall commands.
Please have a look at my blog for more information about this hack:
http://blog.abhiomkar.in/2015/11/12/pip-save-npm-like-behaviour-to-pip/
Installation (GitHub):
https://github.com/abhiomkar/pip-save
Hope this helps.
What about this one:
pip freeze >> requirements.txt