I'm trying to release a demo repository (where people can directly run Python scripts to demonstrate some experiment). I also need to include dependencies (numpy, etc.), and I'd like to use pip to make installation easy.
I've already made a setup.py file listing all the dependencies. I'd now like to install my repo's code in the current directory, and all the dependencies to the default path (e.g. ./venv/lib/python2.7, venv/src/, etc.).
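Roughly, the setup.py looks like this (the name, version, and dependency list here are illustrative, not the real ones):
from setuptools import setup, find_packages

setup(
    name='my_repo',
    version='0.1',
    packages=find_packages(),
    install_requires=[
        'numpy',
        # ...the other dependencies
    ],
)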
Now, if I just run
pip install -e git+http://github.com/petered/my_repo.git#egg=my_repo
Everything works, except the code of my_repo gets checked out into ./venv/src (whereas I want it in the root directory).
I can also run
pip install -e git+http://github.com/petered/my_repo.git#egg=my_repo --target=.
Which installs everything in the root (current) directory, but then all dependencies also end up in this directory.
How can I pip install just the source code of a package in the current directory, but all dependencies in the default directory for dependencies?
My projects usually have a setup.py file that defines all dependencies. To install the project in a virtualenv I then first clone the repository and then simply install the cloned repository:
git clone http://github.com/petered/my_repo.git .
pip install -e .
This will install my_repo where it is, but install all dependencies into lib/python2.7/site-packages/.
You will notice that this layout makes it possible to later publish my_repo to PyPI, or to install it as a dependency into lib/... if you wish to do so, since the library itself has no idea how it was installed.
Whenever I have several "private dependencies" (closed source, only available on our git server), I write installation instructions like
git clone http://github.com/petered/my_repo.git
git clone http://github.com/petered/my_repo_dependency_1.git
git clone http://github.com/petered/my_repo_dependency_2.git
pip install -e my_repo_dependency_1
pip install -e my_repo_dependency_2
pip install -e my_repo
in the readme file. This will install all private dependencies in place, but install all public PyPI dependencies in lib/python2.7/site-packages/.
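Alternatively, the same clone-and-install sequence can be collapsed into a requirements file using pip's VCS URL support, so a single pip install -r requirements.txt does everything; a sketch (the URLs mirror the ones above):
-e git+http://github.com/petered/my_repo_dependency_1.git#egg=my_repo_dependency_1
-e git+http://github.com/petered/my_repo_dependency_2.git#egg=my_repo_dependency_2
-e git+http://github.com/petered/my_repo.git#egg=my_repo
With this variant the private dependencies are checked out into src/ rather than in place, which may or may not matter for your setup.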
Related
Is there a way to build a Python package that contains only the dependencies needed by another project? The goal here is to install this module in a different project using a single pip install command. The dependent project installs almost 30 packages (public and private).
If such a thing is possible:
How should it be structured?
What other files are needed apart from the requirements file?
If such a thing is not possible, what are my options?
For example, when preparing an offline environment you can use pip's download functionality: https://pip.pypa.io/en/stable/cli/pip_download/
Basically, instead of installing, you can download the packages listed in your requirements file to a directory of your choice with
pip download -r requirements.txt -d <directory>
then move that directory to the offline deployment and install the packages from it with
pip install --no-index --find-links <directory> -r requirements.txt
I've been looking around for a package manager that can be used with Python. I want to list project dependencies in a file.
For example, Ruby uses a Gemfile, where you can run bundle install.
How can I achieve this in Python?
The pip tool is becoming the standard equivalent of Ruby's gem.
Like distribute, pip uses the PyPI package repository (by default) for resolving and downloading dependencies.
pip can install dependencies from a file listing project dependencies (called requirements.txt by convention):
pip install -r requirements.txt
You can "freeze" the current packages on the Python path using pip as well:
pip freeze > requirements.txt
When used in combination with the virtualenv package, you can reliably create project Python environments with a project's required dependencies.
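For example, a typical sketch of that combination (the directory name venv is arbitrary):
virtualenv venv
source venv/bin/activate
pip install -r requirements.txt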
Pipenv
(I know it's an old question and it already has an answer, but this is for anyone coming here looking for a different answer, like me.)
I've found a very good equivalent of npm: it's called pipenv. It handles both virtualenv and pip requirements at the same time, so it's more like npm.
Simple Use Case
pip install pipenv
then you can make a new virtualenv with Python 3, which also creates a Pipfile that will be filled with your project's requirements and other metadata:
pipenv install --three
using your created virtualenv:
pipenv shell
installing a new Python package:
pipenv install requests
running your .py file:
pipenv run python somefile.py
You can find its docs here.
Python uses pip as its package manager. The pip install command has a -r <file> option to install packages from the specified requirements file.
Install command:
pip install -r requirements.txt
Example requirements.txt contents:
Foo >= 1.2
PickyThing <1.6,>1.9,!=1.9.6,<2.0a0,==2.4c1
SomethingWhoseVersionIDontCareAbout
See the Requirements Parsing section of the docs for a full description of the format: https://pip.pypa.io/en/stable/user_guide/#requirements-files
This is how I restrict pip's scope to the current project. It feels like the opposite of NodeJS's npm or PHP's composer, where you explicitly specify global installations with -g or --global.
If you don't already have virtualenv installed, then install it globally with:
pip install virtualenv
Each Python project should have its own virtualenv installation. It's easy to set one up; just cd to your project's root and run:
python3 -m virtualenv env # creates env folder with everything you need
Activate virtualenv:
source env/bin/activate
Now, any interaction with pip is contained within your project.
Run pip install package_name==version for each of your dependencies. They are installed in ./env/lib/python3.x/site-packages/.
When you want to save your project's dependencies to a file, run:
pip freeze > requirements.txt
You actually don't need pip freeze's -l or --local flag if you're in an activated project-specific virtualenv (which you should be).
Now, when you want to install your dependencies from requirements.txt, set up your virtualenv, and run:
pip install -r requirements.txt
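(When you're done working in the environment, run deactivate to drop back to the global Python.)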
That's all.
This is an old question but things are constantly evolving.
Further to the other answer about pipenv, there is also a Python package manager called poetry.
There is a detailed comparison between pipenv and poetry here: Feature comparison between npm, pip, pipenv and poetry package managers. It also links the features to common npm features.
Here is a comparison of pipenv vs poetry vs pdm: https://dev.to/frostming/a-review-pipenv-vs-poetry-vs-pdm-39b4
The conclusion is that pdm is the winner.
But in my experience, poetry is easier than pdm to integrate with IDEs.
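For orientation, poetry's day-to-day workflow looks roughly like this (the requests package is just an example):
pip install poetry
poetry init          # interactively creates pyproject.toml
poetry add requests  # records the dependency in pyproject.toml and installs it
poetry install       # installs everything from pyproject.toml/poetry.lock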
In man pip it says under --editable <path/url>,
Install a project in editable mode (i.e. setuptools "develop mode")
from a local project path or a VCS url
What does that mean? Can I give it a repo branch on Github, and it'll go get it and install it and keep it updated as the branch changes?
If you just want to install a package from a git repo, see the linked answer.
-e or --editable is a little bit different: as stated in the docs, it is used for setuptools's development mode. It makes installed packages editable.
Yes, you can give it a link to GitHub; read this answer for more info. But the link will only work if the repository contains a setup.py with all the installation instructions. The package will be updated when you call
pip install -U -e <url>
but only if the version of the package in setup.py is higher than the one already in your environment. If the source changed but the version didn't, you can force a reinstall:
pip install -I -e <url>
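As for the branch part of the question: a branch can be pinned directly in the VCS URL (the user, repo, and branch names below are placeholders), but pip will not track it automatically afterwards; you re-run the install command to pick up new commits:
pip install -e git+https://github.com/user/repo.git@some-branch#egg=repo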
In Node.js, I can do npm install package --save-dev to save the installed package into package.json.
How do I achieve the same thing in Python package manager pip? I would like to save the package name and its version into, say, requirements.pip just after installing the package using something like pip install package --save-dev requirements.pip.
There isn't an equivalent with pip.
The best way is to run pip install package && pip freeze > requirements.txt
You can see all the available options on their documentation page.
If it really bothers you, it wouldn't be too difficult to write a custom bash script (pips) that takes a -s argument and freezes to your requirements.txt file automatically.
Edit 1
Since writing this, there has been no change in providing an auto --save-dev option similar to npm's. However, Kenneth Reitz (author of requests and many more) has released some more info about a better pip workflow to better handle pip updates.
Edit 2
Linked from the "better pip workflow" article above, it is now recommended to use pipenv to manage requirements and virtual environments. Having used this a lot recently, I would like to summarise how simple the transition is:
Install pipenv (on Mac)
brew install pipenv
pipenv creates and manages its own virtual environments, so in a project with an existing requirements.txt, installing all requirements (I use Python 3.7, but you can remove the --three if you do not) is as simple as:
pipenv --three install
Activating the virtualenv to run commands is also easy:
pipenv shell
Installing requirements will automatically update the Pipfile and Pipfile.lock:
pipenv install <package>
It's also possible to update out-of-date packages:
pipenv update
I highly recommend checking it out, especially if you're coming from an npm background, as Pipfile and Pipfile.lock have a similar feel to package.json and package-lock.json.
This simple line is a starting point. You can easily build a bash command to reuse the PACKAGE name in the line.
pip install PACKAGE && pip freeze | grep PACKAGE >> requirements.txt
Thanks to @devsnd for the simple bash function example:
function pip-install-save {
    # install the quoted package, then append its frozen name==version
    # line to requirements.txt (only if the install succeeded)
    pip install "$1" && pip freeze | grep -i "$1" >> requirements.txt
}
To use it, just run:
pip-install-save some-package
I've created a Python package called pipm that wraps around the actual pip. All pip commands work as they are, plus they are reflected in the requirements file. Unlike pip-save (inactive for some time), a similar tool I found and wasn't able to use, pipm can handle many files and environments (test, dev, production, etc.). It also has a command to upgrade all or any of your dependencies.
installation
pipm install pkg-name
installation as development dependency
pipm install pkg-name --dev
installation as testing dependency
pipm install pkg-name --test
removal
pipm uninstall pkg-name
update all your dependencies
pipm update
install all your dependencies from the requirements file
pipm install
including development dependencies
pipm install --dev
Update: apparently, pipenv is not officially endorsed by Python maintainers, and the previously-linked page is owned by a different organization. The tool has its pros and cons, but the below solution still achieves the result that the OP is seeking.
pipenv is a dependency management tool that wraps pip and, among other things, provides what you're asking:
https://pipenv.kennethreitz.org/en/latest/#example-pipenv-workflow
$ pipenv install <package>
This will create a Pipfile if one doesn't exist. If one does exist, it will automatically be edited with the new package you provided.
A Pipfile is a direct equivalent of package.json, while Pipfile.lock corresponds to package-lock.json.
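For a feel of the format, a minimal Pipfile might look like this (the listed packages are just examples):
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true

[packages]
requests = "*"

[dev-packages]
pytest = "*"

[requires]
python_version = "3.7"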
You can manually save it in a Makefile (or in a text file that is then imported into your Makefile):
PYTHON=.venv/bin/python # path to python
PIP=.venv/bin/pip # path to pip
SOURCE_VENV=. .venv/bin/activate
install:
	virtualenv .venv
	$(SOURCE_VENV) && $(PIP) install -e PACKAGE
	$(SOURCE_VENV) && $(PIP) install -r requirements.txt # other required packages
and then just run make install
How about making a shell function to do this?
Add the code below to your ~/.profile or ~/.bashrc:
pips() {
    local pkg=$1
    if [ -z "$1" ]; then
        echo "usage: pips <pkg name>"
        return 1
    fi
    # append the frozen name==version line only if the install succeeded
    pip install "$pkg" && pip freeze | grep -i "$pkg" >> requirements.txt
}
Then run source ~/.profile or source ~/.bashrc to load it into your current terminal.
When you want to install and save a package, just run, for example, pips requests.
After the package is installed, its name and version will be saved into requirements.txt in your current directory.
I am using this small command line to install a package and save its version in requirements.txt:
pkg=package && pip install $pkg && echo $(pip freeze | grep -i $pkg) >> requirements.txt
I made a quick hack on pip to add a --save option to the install/uninstall commands.
Please have a look at my blog for more information about this hack:
http://blog.abhiomkar.in/2015/11/12/pip-save-npm-like-behaviour-to-pip/
Installation (GitHub):
https://github.com/abhiomkar/pip-save
Hope this helps.
What about this one:
pip freeze >> requirements.txt
Note that >> appends to the file; use > instead if you want to overwrite it and avoid duplicate entries.
By default, pip installs editable packages into the src subdirectory of the directory where Python is installed.
I'd like to install a package from version control to a directory of my choosing using pip's support for checking out a package from source control, for example:
pip install -e git+https://github.com/kennethreitz/requests.git@355b97165c#egg=requests-org
Is this possible?
pip help install says:
--src=DIR, --source=DIR, --source-dir=DIR, --source-directory=DIR
Check out --editable packages into DIR
For example:
pip install -e git+https://github.com/kennethreitz/requests.git@355b97165c#egg=requests-org --source-directory=/tmp
This will install the requests source in /tmp/requests-org.
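Since --src is the primary spelling of the same option (as the help output above shows), this shorter form is equivalent:
pip install -e git+https://github.com/kennethreitz/requests.git@355b97165c#egg=requests-org --src=/tmp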