Install Local Python Package with pip

I'm building a Python package for 'global' functions (i.e. stuff that I will use in multiple other projects). I have built the package using py -m build, which puts MyPackage-0.1.0.tar.gz into the dist directory of my project folder.
My goal is to be able to run pip install MyPackage from within any other project, and have it install the latest build of my package. In other words, I do not want to use something like --find-links. This way, I could also include the package in a requirements.txt file.
I have tried putting the tarball in a directory which is on my system's PATH, and into a subfolder within there (e.g. PathDir/MyPackage/MyPackage-0.1.0.tar.gz), but I keep getting the same 'No matching distribution found' error.
The documentation for pip install says:
pip looks for packages in a number of places: on PyPI (if not disabled via --no-index), in the local filesystem, and in any additional repositories specified via --find-links or --index-url.
When it says 'in the local filesystem', where does it begin its search? Is there a way to change this (e.g. by setting some environment variable)?

When looking for files in the local filesystem, pip has no notion of a search path. You must give a path accessible from the current working directory. It can be an absolute path:
pip install /path/to/MyPackage-0.1.0.tar.gz
a relative path:
cd /path
pip install to/MyPackage-0.1.0.tar.gz
or a simple name if the package file is inside the current working directory:
cd /path/to
pip install MyPackage-0.1.0.tar.gz
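The same forms also work inside a requirements.txt file, so a project can reference the local archive directly (the path below is just an illustration):
# requirements.txt
/path/to/MyPackage-0.1.0.tar.gz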

I found the answer after a lot of searching, and so here is the solution:
pip uses configuration files to define its internal settings. In these configuration files, you can specify default values for find-links. This means that pip will look there for compatible packages, as well as online.
You can check which configuration values are set, and which files pip searches for them, by running pip config list -v. You just need to edit or create one of the files listed (e.g. pip.ini on Windows) and add the following:
[install]
find-links=file://C:/Users/.../PathDir/MyPackage/
By creating this at the User/Global level (rather than the site level), this installation also works when inside a virtual environment.
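With that configuration in place, a plain install by name resolves against the local directory as well as PyPI, and the package can appear in a requirements.txt like any other dependency (the version specifier below is an assumption):
pip install MyPackage
# requirements.txt
MyPackage>=0.1.0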
Source: https://pip.pypa.io/en/stable/topics/configuration/

pip can install packages from the local file system. They do not even need to be package files; they can be plain working directories or git checkouts.
Usually I use pip install --editable:
pip install --editable /path/to/my/python/package
With --editable, changes to the .py files in the folder are automatically reflected in your application.
You can use --editable in a requirements.txt file as well.
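For example, a requirements.txt entry for a local editable package would look like this (the path is illustrative):
-e /path/to/my/python/package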

Related

Transferring a Python project to different computers

I am working on a Python project on both my work computer and my home computer. GitHub has made the experience pretty seamless.
But I'm having a problem with the pyvenv.cfg file in my venv folder. Because my Python SDK has a different file path on my work computer compared to my home computer, I have to manually go into pyvenv.cfg to change the home = C:\Users\myName\... filepath each time I pull the updated version of my project from my other computer, or else the interpreter doesn't work.
Does anyone know a solution to this problem?
As confirmed in the comments, you've added the virtual environment folder to your project and included it in the files that you put on GitHub.
That's generally a bad idea, since it defeats part of the purpose of having a virtual environment in the first place. Your virtual environment will contain packages specific to the platform and configuration of the machine it is on - you could be developing on Linux on one machine and Windows on another and you'd have bigger problems than just one line in a configuration file.
What you should do:
create a virtual environment in a folder outside your project / source folder.
assuming you're using pip, you can run pip freeze > requirements.txt to create a requirements.txt file which you can then use on the other system with pip install -r requirements.txt to recreate the exact same virtual environment.
All you have to do is keep that requirements.txt up to date and update the virtual environments on either computer whenever it changes, instead of pulling it through GitHub.
In more detail, a simple example (for Windows, very similar for Linux):
create a project folder, e.g. C:\projects\my_project
create a virtual environment for the project, e.g. python -m venv C:\projects\venv\my_project
activate the environment, i.e. C:\projects\venv\my_project\Scripts\activate.bat
install packages, e.g. pip install numpy
save the list of installed packages to a file called requirements.txt in C:\projects\my_project with pip freeze > requirements.txt
store the project in a Git repo, including that file
on another development machine, clone or pull the project, e.g. git clone https://github.com/my_project D:\workwork\projects\my_project
on that machine, create a new virtual environment, e.g. python -m venv D:\workwork\venv\my_project
activate the environment, i.e. D:\workwork\venv\my_project\Scripts\activate.bat
install the packages that are required with pip install -r D:\workwork\projects\my_project\requirements.txt
Since you say you're using PyCharm, it's a lot easier still: just make sure that the environment created by PyCharm sits outside your project folder. I like to keep all my virtual environments in one folder, with venv names that match the project names.
You can still create a requirements.txt in your project and when you pull the project to another PC with PyCharm, just do the same: create a venv outside the project folder. PyCharm will recognise that it's missing packages from the requirements file and offer to install them for you.
You shouldn't keep the full virtualenv in source control, since more often than not it's much larger than your code, and there may be platform-and-interpreter-version specific bits and bobs in there.
Instead, you should save the packages required -- the tried and tested way is a requirements.txt file (but there are plenty of alternatives such as Pyenv, Poetry, Dephell, ...) -- and recreate the virtualenv on each machine you need to run the project on.
To save a requirements file for a pre-existing project, you can
pip freeze > requirements.txt
when the virtualenv is active.
Then, you can use
pip install -r requirements.txt
to install exactly those packages.
In general, I like to use pip-tools, so I only manage a requirements.in file with package requirement names, and the pip-compile utility then locks those requirements with exact versions into requirements.txt.
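A minimal sketch of that pip-tools workflow (the package name is just an example):
# requirements.in
requests

# lock exact versions into requirements.txt:
pip-compile requirements.in
# make the active environment match the lock file exactly:
pip-sync requirements.txt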

Where is Pip's list of files to remove for a package

Where does pip keep a record of what files to remove when uninstalling a package?
I have an application package which is installed from pip, often in editable mode. For convenience after install there are user scripts to add desktop and menu shortcuts. How do I tell pip and other package managers about these extra files so they can be automatically removed when uninstalled?
Typical install scenario:
git clone {application}
pip install --editable path\to\myapp-code
python user-scripts\make-menu-shortcuts.py
cross posted to https://discuss.python.org/t/how-to-add-to-list-of-files-to-uninstall .
draft answer in progress. If you have something better than this, jump in!
Wheel installs
Look for PYTHONHOME/Lib/site-packages/{package}-{version}.dist-info.
In there is a file called RECORD. The Wheel specification has some details on the RECORD format:
https://www.python.org/dev/peps/pep-0427/#the-dist-info-directory
This directory isn't created for packages installed in editable mode (pip install --editable path/to/code).
# extract from *.dist-info/RECORD:
../../Scripts/myapp.exe,sha256=tQaANRLxdJ3Su3vLNakbzlNhRtnU-HBhdwHGTpJHTxc,103271
myapp-0.1.20.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
...
myapp/__pycache__/__init__.cpython-36.pyc,,
myapp/__pycache__/_version.cpython-36.pyc,,
pip uninstall will remove any files at the paths we add to this file. The hash isn't required, but take care to keep the trailing commas if it is omitted.
Windows: it's okay to use the native format (C:\users\...\myapp.lnk), but recorded files must exist on the same drive as the dist-info directory (ref).
# myapp.dist-info/RECORD:
myapp-0.1.20.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
...
myapp/__pycache__/__init__.cpython-36.pyc,,
myapp/__pycache__/_version.cpython-36.pyc,,
C:\Users\me\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\myapp.lnk,,
> pip uninstall myapp
Uninstalling myapp-0.1.20:
Would remove:
c:\tools\miniconda3\envs\test\lib\site-packages\myapp-0.1.20.dist-info\*
c:\tools\miniconda3\envs\test\lib\site-packages\myapp\*
c:\tools\miniconda3\envs\test\scripts\myapp.exe
c:\users\me\appdata\roaming\microsoft\windows\start menu\programs\myapp.lnk
Proceed (y/n)? y
Editable installs
Look in PYTHONHOME/lib/site-packages for myapp.egg-link. That file contains a path to the code location:
D:\code-external\app-code
.
In that folder look for myapp.egg-info. It has a similar structure to dist-info but is not the same.
Running pip show --files myapp yields an error:
Name: myapp
...
Files:
Cannot locate installed-files.txt
Create .egg-info/installed-files.txt and put the extra filename in it. Now pip show works, but unfortunately pip uninstall still misses it.
> pip show -f myapp
Name: myapp
...
Files:
..\..\test-link.lnk
> pip uninstall myapp
Uninstalling myapp-0.1.20:
Would remove:
c:\tools\miniconda3\envs\test\lib\site-packages\myapp.egg-link
c:\tools\miniconda3\envs\test\scripts\myapp-script.py
c:\tools\miniconda3\envs\test\scripts\myapp.exe
Proceed (y/n)? n
Notes
I figured the dist-info/RECORD part out by creating a bare conda virtual environment, noting the install time, installing a single small package with pip, and then using Windows Advanced Query Syntax to search for files modified after that timestamp, to locate what had changed within the envs folder.
installed-files.txt: On Windows the paths must be on the same drive as the egg-info folder. pip show always resolves the path as relative, but that's impossible across drives on Windows. You've hit this bug if you get the error ValueError: path is on mount 'C:', start on mount 'D:'.

When would the -e, --editable option be useful with pip install?

When would the -e, or --editable option be useful with pip install?
For some projects, the last line in requirements.txt is '-e .'. What does it do exactly?
As the man page says:
-e, --editable <path/url>
Install a project in editable mode (i.e. setuptools "develop mode") from a local project path or a VCS url.
So you would use this when trying to install a package locally, most often when you are developing it on your system. It will just link the package to the original location, meaning any changes to the original package will be reflected directly in your environment.
An example run can be:
pip install -e .
or
pip install -e ~/ultimate-utils/ultimate-utils-proj-src/
note that the second is the full path to the directory where setup.py lives.
Concrete example of using --editable in development
If you play with this test package as in:
cd ~
git clone https://github.com/cirosantilli/vcdvcd
cd vcdvcd
git checkout 5dd4205c37ed0244ecaf443d8106fadb2f9cfbb8
python -m pip install --editable . --user
it outputs:
Obtaining file:///home/ciro/bak/git/vcdvcd
Installing collected packages: vcdvcd
Attempting uninstall: vcdvcd
Found existing installation: vcdvcd 1.0.6
Can't uninstall 'vcdvcd'. No files were found to uninstall.
Running setup.py develop for vcdvcd
Successfully installed vcdvcd-1.0.6
The Can't uninstall 'vcdvcd' is normal: it tried to uninstall any existing vcdvcd to then replace them with the "symlink-like mechanism" that is produced in the following steps, but failed because there were no previous installations.
Then it generates a file:
~/.local/lib/python3.8/site-packages/vcdvcd.egg-link
which contains:
/home/ciro/vcdvcd
.
and acts as a "symlink" for the Python interpreter.
So now, if I make any changes to the git source code under /home/ciro/vcdvcd, they are reflected automatically for importers, who from any directory can do:
python -c 'import vcdvcd'
Note however that, with my pip version at least, scripts installed with --editable, such as the vcdcat script provided by that package via scripts= in setup.py, do not get symlinked, just copied to:
~/.local/bin/vcdcat
just like for regular installs, and therefore updates to the git repository won't directly affect them.
By comparison, a regular non --editable install from the git source:
python -m pip uninstall vcdvcd
python -m pip install --user .
produces a copy of the installed files under:
~/.local/lib/python3.8/site-packages/vcdvcd
Uninstall of an editable package as done above requires a new enough pip as mentioned at: How to uninstall editable packages with pip (installed with -e)
Tested in Python 3.8, pip 20.0.2, Ubuntu 20.04.
Recommendation: develop directly in-tree whenever possible
The editable setup is useful when you are testing your patch to a package through another project.
If however you can fully test your change in-tree, just do that instead of generating an editable install which is more complex.
E.g., the vcdvcd package above is set up in a way that you can just cd into the source and run ./vcdcat without pip installing the package itself (in general, you might need to install dependencies from requirements.txt, though), and the import vcdvcd that the executable does (or possibly your own custom test) just finds the package correctly in the same directory it lives in.
From Working in "development" mode:
Although not required, it’s common to locally install your project in
“editable” or “develop” mode while you’re working on it. This allows
your project to be both installed and editable in project form.
Assuming you’re in the root of your project directory, then run:
pip install -e .
Although somewhat cryptic, -e is short for
--editable, and . refers to the current working directory, so together, it means to install the current directory (i.e. your
project) in editable mode.
Some additional insights into the internals of setuptools and distutils from “Development Mode”:
Under normal circumstances, the distutils assume that you are going to
build a distribution of your project, not use it in its “raw” or
“unbuilt” form. If you were to use the distutils that way, you would
have to rebuild and reinstall your project every time you made a
change to it during development.
Another problem that sometimes comes up with the distutils is that you
may need to do development on two related projects at the same time.
You may need to put both projects’ packages in the same directory to
run them, but need to keep them separate for revision control
purposes. How can you do this?
Setuptools allows you to deploy your projects for use in a common
directory or staging area, but without copying any files. Thus, you
can edit each project’s code in its checkout directory, and only need
to run build commands when you change a project’s C extensions or
similarly compiled files. You can even deploy a project into another
project’s checkout directory, if that’s your preferred way of working
(as opposed to using a common independent staging area or the
site-packages directory).
To do this, use the setup.py develop command. It works very similarly
to setup.py install, except that it doesn’t actually install anything.
Instead, it creates a special .egg-link file in the deployment
directory, that links to your project’s source code. And, if your
deployment directory is Python’s site-packages directory, it will also
update the easy-install.pth file to include your project’s source
code, thereby making it available on sys.path for all programs using
that Python installation.
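In practice that boils down to a single command run from the project root; a minimal invocation (assuming a setup.py-based project) is:
python setup.py develop
which has the same effect as the pip install -e . described above.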
It is important to note that pip uninstall can not uninstall a module that has been installed with pip install -e, at least with older versions of pip. So if you go down this route, be prepared for things to get very messy if you ever need to uninstall. A partial solution is to (1) reinstall, keeping a record of files created, as in sudo python3 setup.py install --record installed_files.txt, and then (2) manually delete all the files listed, as in e.g. sudo rm -r /usr/local/lib/python3.7/dist-packages/tdc7201-0.1a2-py3.7.egg/ (for release 0.1a2 of module tdc7201). This does not clean everything up 100%, however; even after you've done it, importing the (removed!) local library may succeed, and attempting to install the same version from a remote server may fail to do anything (because it thinks your (deleted!) local version is already up to date).
As suggested in previous answers, no symlinks are created.
How does the '-e' option work? It just updates the file PYTHONDIR/site-packages/easy-install.pth with the project path specified in the pip install -e command.
So each time Python searches for a package, it will check this directory as well, which means any changes to the files in the project directory are instantly reflected.
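For illustration, after the editable install of vcdvcd above, PYTHONDIR/site-packages/easy-install.pth would contain a line like this (path taken from the earlier example):
/home/ciro/vcdvcd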

How to install python modules in a local directory? --user and exporting pythonpath isn't working

I don't know very much about python but would like to install some python modules in a local directory on a server on which I don't have sudo access.
I start by going into my desired directory (not root) and creating the directory tree needed to store my custom modules
cd /root/example/sub-example
mkdir -p local/lib/python2.7/site-packages
I then export this local path to PYTHONPATH
export PYTHONPATH=$PYTHONPATH:/root/example/sub-example/local/lib/python2.7/site-packages
I then make a new sub-directory to store the python package while extracting
mkdir example-python-directory
cd example-python-directory
wget http://example-python-package
tar -xvf example-python-package.tar.gz
cd example-python-package
Last, I run the setup.py script with the --user flag to try to get it to install in my specified /local directory
python setup.py install --user
The problem is, nothing is installed in my /root/example/sub-example/local/lib/python2.7/site-packages directory, and instead I find that I now have a new directory at root: /root/.local/lib/python2.7/site-packages
Is there a way to prevent this? I feel like my lack of Python knowledge is causing me to make some silly error that is probably obvious to others. Thanks for the help!
Create a folder called lib, then install into it:
For Python 3
pip3 install <your_python_module_name> -t lib/
For Python 2
pip install <your_python_module_name> -t lib/
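Python won't search that lib/ folder on its own; one way to make the modules importable is to add it to PYTHONPATH (bash syntax assumed):
export PYTHONPATH=$PYTHONPATH:/path/to/lib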
I found that on servers where you don't have root access, you can usually install the Python module into your brew lib using this:
CFLAGS=-I$(brew --prefix)/include LDFLAGS=-L$(brew --prefix)/lib pip install <package>
virtualenv is what I would recommend for this case (and pretty much any other case). I use it for pretty much everything I do in Python.
It allows you to essentially create a sandbox containing a Python environment that is bootstrapped from a Python install on your machine, and to install any modules you want into it.
It should not, in general, require the use of sudo, since you're not touching the system install. You can generally pip install a module right into the virtualenv, and then you run your scripts out of that virtualenv. You would basically just need a location you can read/write/execute from, say a directory you create in your user's home directory.
You can keep track of what's installed by doing pip freeze > requirements.txt and checking that file into an SCM; a new virtualenv can then be recreated from that file, ready to run your scripts.
The link I provided above has more details about how to use virtualenv.
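A minimal sketch of that workflow (directory names are just an example, and virtualenv is assumed to be available):
python -m virtualenv ~/envs/myproject    # create the sandbox, no sudo needed
source ~/envs/myproject/bin/activate     # its python/pip now take precedence
pip install requests                     # installs into the virtualenv only
pip freeze > requirements.txt            # record what's installed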
Edit in response to comment from OP:
You can still use pip install outside of virtualenv, and I would recommend that. However, that can only operate on various Python installs that may be on the box (invoke pip from the bin directory of the install you want to use).
However, that won't work for installation into arbitrary directories. For that, you could try to unzip the egg file (they are supposed to be zip files) into the directory you want, and then make sure that directory is on the PYTHONPATH. Some egg files are available for direct download off of PyPI, although some are source only.
I think this approach is much more complex and prone to problems than virtualenv would be, though.

Pip creates build/ directories

I use virtualenv to create isolated environments for my Python projects. Then I install dependencies with pip, the Python package manager. Sometimes I forget to do source venv/bin/activate, and then pip creates build/ directories inside my projects. Why does pip create them? May I delete them, and if not, may I put them in my .hgignore file?
As far as I understand, pip stores the source of downloaded packages there, along with a file called pip-delete-this-directory.txt. But when I delete it, everything still works, as the real code is put into venv/lib/python2.7/site-packages/. Then what is build/ really for?
The build directory is where a package gets unpacked and built. When the package is installed successfully, pip removes the unpacked directory from build, unless you've removed pip-delete-this-directory.txt. As described in pip-delete-this-directory.txt:
This file is placed here by pip to indicate the source was put
here by pip.
Once this package is successfully installed this source code will be
deleted (unless you remove this file).
Thus it's less important for the runtime environment; you can safely ignore it.
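Since the question mentions Mercurial, ignoring the directory is a one-line addition (glob syntax assumed):
# .hgignore
syntax: glob
build/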
Also, you could use pip install -b customized_build_directory to specify another directory as the build base directory, for example /tmp.
Furthermore, you could run pip install --no-download package_name to rebuild the package without downloading it, if a previous installation of the package failed.
