In man pip, under --editable <path/url>, it says:
Install a project in editable mode (i.e. setuptools "develop mode")
from a local project path or a VCS url
What does that mean? Can I give it a repo branch on GitHub, and will it go get it, install it, and keep it updated as the branch changes?
If you just want to install a package from a git repo, read this answer.
-e or --editable is a little bit different: as stated in the docs, it is used for setuptools' development mode. It makes installed packages editable.
Yes, you can give it a link to GitHub; read this answer for more info. But the link will only work if the repository contains a setup.py with all installation instructions. The package will be updated when you call
pip install -U -e <url>
But only if the version of the package in setup.py is higher than the one already in your environment.
You can forcefully reinstall the package if you need to, when the source changed but the version didn't:
pip install -I -e <url>
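For example, an editable install of a specific branch from GitHub (the user, repository, branch and egg names below are placeholders) would look something like this:
pip install -e git+https://github.com/someuser/somerepo.git@somebranch#egg=somepackage
Re-running the install command later (with -U or -I as described above) refreshes it from that branch.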
Hello internet strangers,
I need to do XSD verification with lxml, but do not have sudo abilities on the machine I'm using - so pip is not an option. I'm on Fedora 27 and found the source code for lxml, which requires:
sudo apt-get install libxml2-dev libxslt-dev python-dev
But I can't use sudo on the machine I'm deploying to. Once I have those dependencies, all I need to do is get the lxml source code through wget or GitHub and then run setup.py install, but I can't do that if I can't install the above dependencies.
Help?
Useful links:
https://gist.github.com/blite/868292
https://github.com/lxml/lxml
I'd look into creating a virtual environment and installing dependencies there. Here are the docs: https://docs.python.org/3.6/library/venv.html
Basically you create a virtual environment like this:
$ python -m venv my_venv_name
That creates a folder called my_venv_name with a few folders under it (do this somewhere in your home directory). Then issue the command
$ source my_venv_name/bin/activate
That activates your environment. At this point you can pip install into your virtual environment to your heart's content. You just need to remember to activate the environment for each session that you want to use it in.
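For instance, with the environment active, installing lxml is just the usual command (and if a prebuilt wheel is available for your platform, it also sidesteps the need for the system -dev packages):
pip install lxml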
I'm trying to release a demo repository (where people can directly run Python scripts to demonstrate some experiment). I also need to include dependencies (numpy, etc.). I'd like to use pip to make it easy.
I've already made a setup.py file listing all the dependencies. I'd now like to install my repo's code to the current directory, and all the dependencies to the default path (e.g. ./venv/lib/python2.7, ./venv/src/, etc.).
Now, if I just run
pip install -e git+http://github.com/petered/my_repo.git#egg=my_repo
Everything works, except the code in my_repo gets saved in ./venv/src (whereas I want it in the root directory).
I can also run
pip install -e git+http://github.com/petered/my_repo.git#egg=my_repo --target=.
which installs everything in the root (current) directory, but then all the dependencies also end up in this directory.
How can I pip install just the source code of a package in the current directory, but all dependencies in the default directory for dependencies?
My projects usually have a setup.py file that defines all dependencies. To install the project in a virtualenv I then first clone the repository and then simply install the cloned repository:
git clone http://github.com/petered/my_repo.git .
pip install -e .
This will install my_repo where it is, but install all dependencies into lib/python2.7/site-packages/.
You will notice that this layout makes it possible to later publish my_repo to PyPI, or to install it as a dependency into lib/... if you wish to do so, since the library itself has no idea about how it was installed.
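For reference, a minimal sketch of such a setup.py (the dependency list here is only illustrative) could look like:
from setuptools import setup, find_packages

setup(
    name='my_repo',
    version='0.1.0',
    packages=find_packages(),
    install_requires=[
        'numpy',  # public PyPI dependencies are declared here
    ],
)
With this in place, pip install -e . keeps my_repo where it is and pulls everything in install_requires into site-packages.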
Whenever I have several "private dependencies" (closed source, only available on our git server), I write installation instructions like
git clone http://github.com/petered/my_repo.git
git clone http://github.com/petered/my_repo_dependency_1.git
git clone http://github.com/petered/my_repo_dependency_2.git
pip install -e my_repo_dependency_1
pip install -e my_repo_dependency_2
pip install -e my_repo
in the readme file. This will install all private dependencies in place, but install all public PyPI dependencies in lib/python2.7/site-packages/.
We have a Python/Django based web application, many components of which are installed using pip. I would like to ask if there is a way to save, or download and save, the particular Python packages that we are having pip install (example: pip install django==1.5.1). We would like to end up with a collection of the packages in the versions known to be working and with which the app was developed locally. Any and all advice will be appreciated.
If I understood your question right, you can use pip freeze > requirements.txt. This command will write all the libraries you have used/"downloaded" for your app to the file requirements.txt (if the file already exists, it will be overwritten). You can then later do pip install -r requirements.txt. However, be aware that your Django project must be running in a virtual environment; otherwise the freeze will include every Python package on your development machine, and the install command will attempt to install them all.
The freeze command records the exact versions currently in use, so a later installation will install those same versions. Your requirements file will look something like:
Flask==0.8
Jinja2==2.6
Werkzeug==0.8.3
certifi==0.0.8
chardet==1.0.1
distribute==0.6.24
gunicorn==0.14.2
requests==0.11.1
Your packages are installed (if using virtualenv) at: ../<your project>/<your virtual env>/<lib>/<python version>/<site-packages>/
As for downloading, you can use the pip install --download command as @atupal suggested in his response; however, think about whether this is really needed, since you could also fork those libraries on GitHub to accomplish the same.
Here is a good source of information on how this works: http://www.pip-installer.org/en/latest/cookbook.html
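Putting it together, a sketch of the workflow described above (paths and package names are just examples):
virtualenv venv
source venv/bin/activate
pip install django==1.5.1
pip freeze > requirements.txt       # record the exact working versions
pip install -r requirements.txt     # run this in the target environment to reproduce them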
Maybe what you want is:
Download the packages:
pip install --download /path/to/download/to packagename
OR
pip install --download=/path/to/packages/downloaded -r requirements.txt
Install all of those libraries you just downloaded:
pip install --no-index --find-links="/path/to/downloaded/dependencies" packagename
OR
pip install --no-index --find-links="/path/to/downloaded/packages" -r requirements.txt
Shamelessly stolen from this question
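Note that newer versions of pip replace pip install --download with a standalone pip download command; assuming a recent pip, the download step would be roughly:
pip download -d /path/to/packages/downloaded -r requirements.txt
The --no-index --find-links install step above stays the same.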
Create a requirements.txt file.
Put:
django==1.5.1
in the first line.
Then run pip install -r requirements.txt
Then you can complete that file...
In Node.js, I can do npm install package --save-dev to save the installed package into package.json.
How do I achieve the same thing in Python package manager pip? I would like to save the package name and its version into, say, requirements.pip just after installing the package using something like pip install package --save-dev requirements.pip.
There isn't an equivalent with pip.
The best way is to pip install package && pip freeze > requirements.txt
You can see all the available options on their documentation page.
If it really bothers you, it wouldn't be too difficult to write a custom bash script (pips) that takes a -s argument and freezes to your requirements.txt file automatically.
Edit 1
Since writing this, there has been no change in pip to provide an automatic --save-dev option similar to npm. However, Kenneth Reitz (author of requests and many more) has released some more info about a better pip workflow to better handle pip updates.
Edit 2
Linked from the "better pip workflow" article above, it is now recommended to use pipenv to manage requirements and virtual environments. Having used it a lot recently, I would like to summarise how simple the transition is:
Install pipenv (on Mac)
brew install pipenv
pipenv creates and manages its own virtual environments, so in a project with an existing requirements.txt, installing all requirements (I use Python 3.7, but you can remove the --three if you do not use Python 3) is as simple as:
pipenv --three install
Activating the virtualenv to run commands is also easy
pipenv shell
Installing requirements will automatically update the Pipfile and Pipfile.lock
pipenv install <package>
It's also possible to update out-of-date packages
pipenv update
I highly recommend checking it out, especially if you are coming from an npm background, as Pipfile and Pipfile.lock have a similar feel to package.json and package-lock.json.
This simple line is a starting point. You can easily build a bash command that reuses the PACKAGE name in the line.
pip install PACKAGE && pip freeze | grep PACKAGE >> requirements.txt
Thanks to @devsnd for the simple bash function example:
function pip-install-save {
pip install $1 && pip freeze | grep $1 >> requirements.txt
}
To use it, just run:
pip-install-save some-package
I've created a Python package called pipm that wraps the actual pip. All pip commands will work as they are, plus they will be reflected in the requirements file. Unlike pip-save (inactive for some time), a similar tool I found but wasn't able to use, it can handle many files and environments (test, dev, production, etc.). It also has a command to upgrade all or any of your dependencies.
installation
pipm install pkg-name
installation as development dependency
pipm install pkg-name --dev
installation as testing dependency
pipm install pkg-name --test
removal
pipm uninstall pkg-name
update all your dependencies
pipm update
install all your dependencies from the requirements file
pipm install
including development dependencies
pipm install --dev
Update: apparently, pipenv is not officially endorsed by Python maintainers, and the previously-linked page is owned by a different organization. The tool has its pros and cons, but the below solution still achieves the result that the OP is seeking.
pipenv is a dependency management tool that wraps pip and, among other things, provides what you're asking:
https://pipenv.kennethreitz.org/en/latest/#example-pipenv-workflow
$ pipenv install <package>
This will create a Pipfile if one doesn't exist. If one does exist, it will automatically be edited with the new package you provided.
A Pipfile is a direct equivalent of package.json, while Pipfile.lock corresponds to package-lock.json.
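To illustrate the comparison, this is roughly what a minimal Pipfile looks like (the listed packages are just examples):
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true

[packages]
requests = "*"

[dev-packages]
pytest = "*"

[requires]
python_version = "3.7"
pipenv install and pipenv install --dev add entries to [packages] and [dev-packages] respectively, and the resolved versions are pinned in Pipfile.lock.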
You can manually save it in a Makefile (or in a text file that is then imported in your Makefile):
PYTHON=.venv/bin/python # path to python
PIP=.venv/bin/pip # path to pip
SOURCE_VENV=. .venv/bin/activate
install:
virtualenv .venv
$(SOURCE_VENV) && $(PIP) install -e PACKAGE
$(SOURCE_VENV) && $(PIP) install -r requirements.txt # other required packages
and then just run make install
How about making a shell function to do this?
Add the code below to your ~/.profile or ~/.bashrc:
pips() {
    # install a package and append its pinned version to requirements.txt
    local pkg=$1
    if [ -z "$1" ]; then
        echo "usage: pips <pkg name>"
        return 1
    fi
    local _ins="pip install $pkg"
    eval $_ins
    # record the installed version (case-insensitive match on the package name)
    pip freeze | grep -i "$pkg" >> requirements.txt
}
Then run source ~/.profile or source ~/.bashrc to load it into your current terminal.
When you want to install and save a package, just run, for example, pips requests.
After the package is installed, its version will be saved into requirements.txt in your current directory.
I am using this small command line to install a package and save its version in requirements.txt:
pkg=package && pip install $pkg && echo $(pip freeze | grep -i $pkg) >> requirements.txt
I made a quick hack on pip to add a --save option to the install/uninstall commands.
Please have a look at my blog for more information about this hack:
http://blog.abhiomkar.in/2015/11/12/pip-save-npm-like-behaviour-to-pip/
Installation (GitHub):
https://github.com/abhiomkar/pip-save
Hope this helps.
What about this one:
pip freeze >> requirements.txt
By default, pip installs editable packages into the src subdirectory of the directory where Python is installed.
I'd like to install a package from version control to a directory of my choosing using pip's support for checking out a package from source control, for example:
pip install -e git+https://github.com/kennethreitz/requests.git#355b97165c#egg=requests-org
Is this possible?
pip help install says:
--src=DIR, --source=DIR, --source-dir=DIR, --source-directory=DIR
Check out --editable packages into DIR
For example:
pip install -e git+https://github.com/kennethreitz/requests.git#355b97165c#egg=requests-org --source-directory=/tmp
This will install the requests source in /tmp/requests-org.
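Equivalently, the shorter --src form listed in the help output above can be used (the commit pin is omitted here for brevity):
pip install -e git+https://github.com/kennethreitz/requests.git#egg=requests-org --src=/tmp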