I have a package that depends on docker-py and I want to upgrade the dependency to docker.
Unfortunately those two packages don't play well with each other.
A safe way to do things would be to first uninstall docker-py and then install my package, which will install docker in its place (I already changed the requirements from docker-py to docker).
Is there a way for this to happen in setup.py when I upgrade my package (via pip or any other way) without messing up the python environment?
The first thing that came to my mind was to check, in setup.py, if docker-py is already installed and run pip uninstall like so:
from setuptools import setup
import subprocess

import pip  # note: pip.get_installed_distributions() only exists in pip < 10
...
if 'docker-py' in [x.project_name for x in pip.get_installed_distributions()]:
    # remove the conflicting package before setup() pulls in 'docker'
    subprocess.check_call("pip uninstall -y docker-py".split())

setup(
...
)
Setup will then install the new dependency and everything will work fine.
Is this safe?
Any better alternatives?
pip is not a full-blown package manager; it has no such concepts as "this package is incompatible with that" or "this package replaces that". What you're trying to do is emulate these important concepts, and unfortunately that doesn't work.
pip runs setup.py on user hosts only for source distributions (sdist). For eggs/wheels, pip runs setup.py on the developer host, and there is no way to configure a pre-install script to be run on user hosts; wheels are the preferred distribution format these days.
Your best bet is to ask users (via documentation) to uninstall docker-py manually.
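If you also want the installed package to fail fast when the conflict is present, a minimal runtime sketch (assuming setuptools' pkg_resources is available) could look like this; it refuses to run rather than trying to uninstall anything:

import pkg_resources

try:
    pkg_resources.get_distribution("docker-py")
    # found the conflicting package; refuse to continue
    raise RuntimeError("docker-py conflicts with docker; "
                       "please run 'pip uninstall docker-py' first")
except pkg_resources.DistributionNotFound:
    pass  # docker-py is absent, safe to proceed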
Related
I'm working on a DevOps project for a client who's using Python. Though I've never used it professionally, I know a few things, such as using virtualenv and pip, though not in great detail.
When I looked at the staging box, which I am trying to prepare for running a functional test suite, I saw chaos. Tons of packages installed globally, and those installed inside a virtualenv not matching the requirements.txt of the project. OK, thought I, there's a lot of cleaning up. Starting with global packages.
However, I ran into a problem at once:
$ pip uninstall PyYAML
Not uninstalling PyYAML at /usr/lib/python2.7/dist-packages, owned by OS
OK, someone must've done a 'sudo pip install PyYAML'. I think I know how to fix it:
$ sudo pip uninstall PyYAML
Not uninstalling PyYAML at /usr/lib/python2.7/dist-packages, owned by OS
Uh, apparently I don't.
A search revealed some similar conflicts caused by users installing packages bypassing pip, but I'm not convinced - why would pip even know about them, if that was the case? Unless the "other" way is placing them in the same location pip would use - but if that's the case, why would it fail to uninstall under sudo?
Pip refuses to uninstall these packages because Debian's developers patched it to behave this way. This allows you to use both pip and apt simultaneously. The "original" pip program doesn't have this functionality.
Update: my answer is relevant only to old versions of pip. The latest versions are configured to modify only files that reside in pip's own directory (/usr/local/lib/python3.* on Debian). With current tools, you will get these errors when you try to remove a package installed by apt:
For pip 9.0.1-2.3~ubuntu1 (installed from Ubuntu repository):
Not uninstalling pyyaml at /usr/lib/python3/dist-packages, outside environment /usr
For pip 10.0.1 (original, installed from pypi.org):
Cannot uninstall 'PyYAML'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall.
The point is not that pip cannot uninstall the package because you lack permissions; it refuses because the package was not installed through pip in the first place.
dist-packages is where packages installed by the OS package manager reside; as they are handled by another package manager (e.g. apt on Ubuntu/Debian, pacman on Arch, rpm/yum on CentOS, ... ) pip won't touch them (but still has to know about them as they are installed packages, so they can be used to satisfy dependencies of pip-installed packages).
You should also probably avoid touching them unless you use the correct package manager, and even then they may have been installed automatically to satisfy the dependencies of some program, so you may not be able to remove them without breaking it. This can usually be checked quite easily, although the exact way depends on the precise Linux distribution you are using.
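On Debian/Ubuntu, for instance, that check could look like the following; python-yaml is an assumption of the package name owning PyYAML there:

# which OS package owns the files pip complained about?
dpkg -S /usr/lib/python2.7/dist-packages/yaml/__init__.py

# what else depends on that package?
apt-cache rdepends python-yaml

# if nothing important does, remove it with the right tool
sudo apt-get remove python-yaml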
I've created a new RPM using python setup.py bdist_rpm. Normally python setup.py install would install Python dependencies like websocket-client or any other package, but the RPM just refuses to install anything.
Apparently the suggestion from various other posts seems to be along the lines of just requiring them in setup.cfg as RPM packages. This doesn't make sense to me, since most of the RPM packages seem to be on really old versions and I can't possibly create RPMs for all the Python packages I require. I need much more recent versions, and it doesn't make sense that installing the RPM via yum doesn't actually install the packages.
What is the right (clean and easy) way to do it? I believe if a setup.py has something like
install_requires=[
    "validictory",
    "requests",
    "netlogger>=4.3.0",
    "netifaces",
    "pyzmq",
    "psutil",
    "docopt"
],
then it should either include them in the RPM or try to install them.
I am testing on a clean CentOS VM using Vagrant, which I keep destroying and recreating before installing the RPM.
Well, the super-hack way I used was to just add a post-install script that installs all the requirements with easy_install (instead of pip, because older systems may not have pip, and even after installing pip the approach failed on systems with Python 2.6).
# Add this to the setup() call in setup.py
options={'bdist_rpm': {'post_install': 'scripts/rpm_postinstall.sh'}},
Then the script is as follows:
easy_install -U <pkgnames>
Of course, a post_uninstall script can also be added if you want to clean up, but I wouldn't, because you have no idea what else is using the installed packages apart from this app.
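For context, a minimal sketch of how that option plugs into the setup() call (the package metadata here is made up for illustration):

from setuptools import setup

setup(
    name="myapp",       # hypothetical project name
    version="1.0",
    packages=["myapp"],
    options={
        "bdist_rpm": {
            "post_install": "scripts/rpm_postinstall.sh",
            # "post_uninstall": "scripts/rpm_postuninstall.sh",  # optional, see caveat above
        },
    },
)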
The logic of the RPM approach seems to be built for this, but it's honestly over-engineering, and I'd rather package all the modules with the RPM to ensure it always works. ** Screaming out for a cleaner solution **
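For comparison, the setup.cfg route mentioned in the question would look roughly like this; the RPM package names are assumptions, and they would have to exist at acceptable versions in the target yum repositories, which is exactly the staleness problem described above:

[bdist_rpm]
requires = python-requests python-psutil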
The question
Ansible is a Python module, installable via pip. It relies on several dependencies, also pip modules. Is it possible to "roll up" all of those dependencies and Ansible itself into some sort of single package that can be installed offline, without root? It's highly preferable not to need pip for the install, although it will be available for package creation.
Extra background
I'm trying to install Ansible on one of our servers. The server does not have access to the internet and there is no root access. Pip is not installed, but Python is. It is possible to get pip installed there, but it might be complicated. The only way to get anything onto the server is via an internal tar.gz package sharing solution.
I've tried fiddling around with rpm, saving dependencies, but the absence of root access put an end to that.
Use pip on an internet-connected machine to download all the deps into a local dir (pip download -r requirements.txt on current versions; older versions used pip install --download), then drop that dir on the disconnected machine (which needs pip installed) and install using --no-index and --find-links=(archive dir).
See https://pip.pypa.io/en/latest/user_guide/#fast-local-installs
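A minimal sketch of the two steps (the directory name is arbitrary):

# on a machine with internet access
pip download -r requirements.txt -d ./pkgdir

# on the offline machine (pip required there)
pip install --no-index --find-links=./pkgdir -r requirements.txt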
Two options in setup.py, develop and install, are confusing me. According to this site, using develop creates a special link in the site-packages directory.
People have suggested that I use python setup.py install for a fresh installation and python setup.py develop after any changes have been made to the setup file.
Can anyone shed some light on the usage of these commands?
python setup.py install is used to install (typically third party) packages that you're not going to develop/modify/debug yourself.
For your own stuff, you want to first install your package and then be able to frequently edit the code without having to re-install the package every time; that is exactly what python setup.py develop does: it installs the package (typically just a source folder) in a way that allows you to conveniently edit your code after it's installed to the (virtual) environment, and have the changes take effect immediately.
Note: it is highly recommended to use pip install . (regular install) and pip install -e . (developer install) to install packages, as invoking setup.py directly will do the wrong thing for many dependencies, such as pulling prereleases and incompatible package versions, or making the package hard to uninstall with pip.
Update:
The develop counterpart for the latest python -m build approach is an editable install: pip install -e .
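Side by side, the two flows look like this:

# regular install: builds the package and copies it into site-packages
pip install .

# developer ("editable") install: site-packages points back to your source tree,
# so edits take effect without reinstalling
pip install -e .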
From the documentation: develop will not install the package, but it will create a .egg-link in the deployment directory pointing back to the project source code directory.
So it's like installing, but instead of copying to site-packages it adds a link (the .egg-link acts as a multiplatform symbolic link).
That way you can edit the source code and see the changes directly without having to reinstall every time you make a little change. This is useful when you are the developer of that project, hence the name develop. If you are just installing someone else's package, you should use install.
Another thing that people may find useful when using the develop method is the --user option to install without sudo. Ex:
python setup.py develop --user
instead of
sudo python setup.py develop
I've developed a little script that searches an online wallpaper database and downloads the wallpapers. I want to give this script to another person who isn't exactly good with computers, and since I'm just starting with Python I don't know how to include the "imports" of the third-party modules in my program so it can be 100% portable. Is there something that can help me do this, or will I have to go through my third-party modules and copy and paste the functions that I use?
Worse thing to do
An easy thing you can do is simply bundle the other modules in with your code. That does not mean that you should copy/paste the functions from the other modules into your code; you should definitely not do that, since you don't know what dependencies you'll be missing. Your directory structure might look like:
/myproject
mycode.py
thirdpartymodule1.py
thirdpartymodule2.py
thirdpartymodule3/
<contents>
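With that layout, the bundled modules are importable as usual, because Python puts the script's own directory at the front of sys.path (the module names are the placeholders from the tree above):

# mycode.py
import thirdpartymodule1
import thirdpartymodule2
import thirdpartymodule3  # the bundled package directory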
Better thing to do
The real best way to do this is to include a list of dependencies (usually called requirements.txt) in your Python package, which Python's package installer, pip, can use to download them automatically. Since that might be a little too complicated, you could give your friend these instructions, assuming Mac or Linux:
Run $ curl http://python-distribute.org/distribute_setup.py | python. This provides you with tools you'll need to install the package manager.
Run $ curl https://raw.github.com/pypa/pip/master/contrib/get-pip.py | python. This installs the package manager.
Give your friend a list of the names of the third-party Python modules you used in your code. For purposes of this example, we'll say you used requests, twisted, and boto.
Your friend should run from command line $ pip install <list of package names>. In our example, that would look like $ pip install requests twisted boto.
Run the Python code! The lines like import boto should then work, since your friend will have the packages installed on their computer.
The easi(er) way:
1. Start with a clean virtual environment.
2. Install packages that you need to develop your code.
3. Once you are done, create a list of requirements for your project.
4. Send this file (from step 3) to your friend.
5. Your friend simply does pip install -r thefile.txt to get all the requirements for your application.
Here's an example:
D:\>virtualenv --no-site-packages myproject
The --no-site-packages flag is deprecated; it is now the default behavior.
New python executable in myproject\Scripts\python.exe
Installing setuptools................done.
Installing pip...................done.
D:\>myproject\Scripts\activate.bat
(myproject) D:\>pip install requests
Downloading/unpacking requests
Downloading requests-0.14.1.tar.gz (523Kb): 523Kb downloaded
Running setup.py egg_info for package requests
warning: no files found matching 'tests\*.'
Installing collected packages: requests
Running setup.py install for requests
warning: no files found matching 'tests\*.'
Successfully installed requests
Cleaning up...
(myproject) D:\>pip freeze > requirements.txt
(myproject) D:\>type requirements.txt
requests==0.14.1