So I decided to install Python packages (technically Django apps) directly from a downloaded tar file, by extracting it and running the following command:
python setup.py install
However, inside my site-packages directory, I find that the package was installed inside a .egg directory that also carries the version number. The directories look annoyingly like this:
site-packages/django_cms-2.1.3-py2.7.egg/cms
site-packages/django_cms-2.1.3-py2.7.egg/mptt
I need the packages to be installed as directories named after the package, with no .egg suffix or version number; otherwise Django can't find them. It should look like this:
site-packages/cms
site-packages/mptt
Installing the same package with pip works fine. This is frustrating, so some help would be appreciated.
I tracked down a thread that discusses something similar, but it didn't give a solution that worked.
The Django project is unable to locate the packages I installed because they aren't at the root of the site-packages directory; instead, they reside inside the .egg directories. What I had to do was manually move the packages to the root, but how do you correctly install Python packages?
Also, I didn't suppress the easy-install.pth file, yet it isn't picked up by PyDev.
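As noted above, installing the same tarball through pip avoids the .egg layout entirely. A minimal sketch (the tarball filename is assumed from the version shown above):

pip install /path/to/django-cms-2.1.3.tar.gz

or, from inside the extracted source directory:

pip install .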
I'm building a python package to use for 'global' functions (i.e. stuff that I will use in multiple other projects). I have built the package using py -m build and it then puts the MyPackage-0.1.0.tar.gz into the dist directory in my folder.
My goal is to be able to run pip install MyPackage from within any other projects, and it will install the latest build of my package. In other words, I do not want to use something like --find-links. This way, I could also include the package in a requirements.txt file.
I have tried putting the tarball in a directory that is on my system's PATH, and in a subfolder within it (e.g. PathDir/MyPackage/MyPackage-0.1.0.tar.gz), but I keep getting the same 'No matching distribution found' error.
The documentation for pip install says:
pip looks for packages in a number of places: on PyPI (if not disabled via --no-index), in the local filesystem, and in any additional repositories specified via --find-links or --index-url.
When it says 'in the local filesystem', where does it begin its search? Is there a way to change this (e.g. by setting some environment variable)?
When looking for files in the local filesystem, pip has no notion of search path. You must give a path accessible from the current working directory. It can be an absolute path:
pip install /path/to/MyPackage-0.1.0.tar.gz
a relative path:
cd /path
pip install to/MyPackage-0.1.0.tar.gz
or a simple name if the package file is inside the current working directory:
cd /path/to
pip install MyPackage-0.1.0.tar.gz
I found the answer after a lot of searching, and so here is the solution:
pip uses configuration files to define its internal settings. In these configuration files you can specify default values for find-links, which means pip will look there for compatible packages as well as online.
You can check which configuration values are set, and which files they are read from, by running pip config list -v. You just need to edit or create one of the files listed (e.g. pip.ini) and add the following:
[install]
find-links=file://C:/Users/.../PathDir/MyPackage/
By creating this at the User/Global level (rather than the site level), this installation also works when inside a virtual environment.
Source: https://pip.pypa.io/en/stable/topics/configuration/
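If you'd rather not edit the configuration file by hand, pip can write the setting for you via pip config set; a sketch using the same (elided) path as above:

pip config --user set install.find-links file://C:/Users/.../PathDir/MyPackage/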
pip can install packages from the local file system. They do not even need to be package files; they can be plain working directories or git checkouts.
Usually I use pip install --editable:
pip install --editable /path/to/my/python/package
With --editable, changes to .py files in the folder are automatically reflected in your application.
You can use --editable in a requirements.txt file as well.
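For example, a requirements.txt entry for an editable local package might look like this (the path is a placeholder):

-e /path/to/my/python/package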
Can't seem to get imports to work. I've installed using
pip install pyperclip
I can confirm that it was successfully installed (pip reports the install location), but when I attempt to import it in the shell, the import fails. Is there another step to importing that I'm just missing?
Your problem is that pip is installing into the global (all users) Python, while you're running a Python installed only for your user (c:\Users\bbarker\AppData\Local\Programs\Python\Python36). You'll want to either use the global install (c:\program files (x86)\python36-32) instead, or change your pip defaults as described here.
You'll notice that the folder where pip said pyperclip was installed does not show up in sys.path, so Python does not know to search there for libraries. The few paths you do have in sys.path are automatically generated defaults, relative to the install directory of the Python instance you're currently using. If you use the instance in your \program files (x86)\ folder, the paths will be relative to that folder instead.
tldr;
You have two instances of Python installed, and you're installing libraries into one while running the other.
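A common way to be sure pip targets the interpreter you actually run (standard practice, not part of the answer above) is to invoke pip through that interpreter:

python -c "import sys; print(sys.executable); print(sys.path)"
python -m pip install pyperclip

The first command shows which Python you are running and where it searches for libraries; the second installs pyperclip into that same instance's site-packages.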
I'm trying to include the pyobjc package in my pip requirements file, but I need a committed version that doesn't have a release yet, in order to pull in a much-needed bug fix. The pyobjc package is a pseudo-package that installs all the other framework dependencies.
I can specify the HG path in the pip requirements just fine. The problem I'm facing is that the repository doesn't have a setup.py in the root directory. Instead it has a subdirectory labeled pyobjc (with all the framework subdirectories alongside) that contains setup.py. In the root directory of the repo, there's a file labeled install.py that pyobjc's readme recommends using when installing from source.
Does anyone have any idea how to call install.py from pip instead of setup.py or point to the subdirectory location?
In the pyobjc/pyobjc subdirectory I see setup.py, not install.py.
pip can be advised to look into a subdirectory of a VCS repository for setup.py:
pip install -e 'hg+https://bitbucket.org/ronaldoussoren/pyobjc@39c254b20bf2d63ef2891cb57bba027dfe7e06e8#egg=pyobjc&subdirectory=pyobjc'
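The same URL should also work as an editable line in a requirements.txt file (same revision and subdirectory assumed):

-e hg+https://bitbucket.org/ronaldoussoren/pyobjc@39c254b20bf2d63ef2891cb57bba027dfe7e06e8#egg=pyobjc&subdirectory=pyobjc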
For a number of reasons, such as when a package takes a long time to compile (e.g. lxml), it is often recommended to symlink such packages from the system site-packages directory into a virtualenv.
Some example questions:
Use a single site package (as exception) for a virtualenv
How to install lxml into virtualenv from the local system?
But such packages are not recognized by pip, which will happily try to reinstall them. How to deal with this?
Okay, it seems the trick is to also link the egg-info directory.
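A sketch of what that looks like for lxml, assuming a system-wide Python 2.7 install (the exact paths and egg-info name will vary on your system):

# link both the package and its egg-info so pip considers it installed
ln -s /usr/lib/python2.7/site-packages/lxml venv/lib/python2.7/site-packages/
ln -s /usr/lib/python2.7/site-packages/lxml-*.egg-info venv/lib/python2.7/site-packages/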
I use virtualenv to create isolated environments for my Python projects, and then install dependencies with pip, the Python package manager. Sometimes I forget to run source venv/bin/activate, and then pip creates build/ directories inside my projects. Why does pip create them? May I delete them, and if not, may I put them in my .hgignore file?
As far as I understand, pip stores the source of downloaded packages there, along with a file called pip-delete-this-directory.txt. But when I delete it, everything still works, as the real code is put into venv/lib/python2.7/site-packages/. So what is build/ really for?
The build directory is where a package gets unpacked and built. When the package is installed successfully, pip removes the unpacked directory from build, unless you've removed pip-delete-this-directory.txt. As described in pip-delete-this-directory.txt:
This file is placed here by pip to indicate the source was put
here by pip.
Once this package is successfully installed this source code will be
deleted (unless you remove this file).
Thus it's less important for the runtime environment, and you can ignore it safely.
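Since the question mentions .hgignore, a minimal entry that ignores these directories could look like this:

syntax: glob
build/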
Also, you can use pip install -b customized_build_directory to specify another directory as the build base, for example /tmp.
Furthermore, you can run pip install --no-download package_name to rebuild the package without downloading it, if the previous installation of the package failed.
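Putting both together, a sketch (SomePackage is a placeholder, and these flags belong to the older pip versions this answer describes; recent pip releases have removed them):

pip install -b /tmp/pip-build --no-download SomePackage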