I have developers who want to be able to publish a library as either a "beta" or a "release" version.
i.e.:
1.2.3-beta
1.2.3
On the consuming project, they couldn't give me any specific criteria for when they would want to use a beta versus a release package. We have CI in place, but without a definitive "when" I can't support two separate pip feeds, as they may flip-flop. So I've suggested taking advantage of the version range syntax in the requirements file, so they can simply specify what they want at check-in. They've never had to do anything like this, and I'm basically a Python rookie. Is it possible to filter on pre-release labels? I.e.
Will lib == 1.*.*-* pick up a beta package?
and
Will lib == 1.*.*, !=1.*.*-* pick up a release package and be sure to exclude any beta packages?
I would test my theory myself, but I don't know Python well enough to mock up sample libraries locally, and the developers are too busy to research it.
By default, pip will not install pre-releases, such as 1.0.0b1.
To enable installation of pre-releases, use the --pre flag with pip.
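For example (using a hypothetical package named lib):
pip install --pre lib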
You can also use pre-release version specifiers to force pip to consider pre-releases for individual packages without needing --pre. From https://pip.pypa.io/en/stable/reference/pip_install/#pre-release-versions:
If a Requirement specifier includes a pre-release or development version (e.g. >=0.0.dev0) then pip will allow pre-release and development versions for that requirement.
So in your requirements.txt file, you'd have something like:
package_a>=1.2.3      # will not install pre-releases
package_b>=1.2.3.dev0 # will install pre-releases
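Applied to the question above (again with the hypothetical package name lib), the two requirement lines might look like:
lib>=1.0.0,<2.0.0      # releases only on the 1.x line; pre-releases excluded unless pip runs with --pre
lib>=1.0.0.dev0,<2.0.0 # also considers 1.x pre-releases such as 1.2.3b1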
How do I list all libraries and their versions in a Python project?
(similar to Java Maven's pom.xml or Node.js npm's package.json)
There is definitely a way to specify a version on the command line,
e.g. pip install pandas==0.23, but at best that will be listed in a README and most likely forgotten.
Also, since both pip and conda exist, do they address this? Or does some third-party tool try to unify things regardless of whether pip or conda is used?
There are two main mechanisms to specify libraries for a package/repository:
For an out-of-the-box package that can be installed with all its dependencies, use setup.py/pyproject.toml. These are used by package managers to automatically install the required dependencies when the package is installed.
For a reproducible installation of a given setup of possibly many packages, use requirements.txt. This is used by package managers to explicitly install the required dependencies.
Notably, the two handle different use-cases: Package dependencies define the general dependencies, only restricting dependency versions to specific ranges when needed for compatibility. Requirements define the precise dependencies, restricting all direct and indirect dependency versions to replicate some pre-existing setup.
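As a minimal sketch of the difference (package names and versions are made up):

# pyproject.toml: abstract dependencies, ranged only where compatibility requires it
[project]
name = "mypackage"
version = "0.1.0"
dependencies = ["requests>=2.9", "h5py"]

# requirements.txt: concrete snapshot, e.g. produced by pip freeze
requests==2.9.1
h5py==2.2.1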
1. I get this warning when I pip install my Django project's requirements:
wechat-sdk 0.6.4 has requirement requests==2.6.0, but you'll have
requests 2.9.1 which is incompatible.
2. Following that hint, I uninstall requests and install the suggested version, but then I get another warning:
python-social-auth 0.2.21 has requirement requests>=2.9.1, but you'll
have requests 2.6.0 which is incompatible.
So I'm trapped in an endless loop.
Can anyone give any advice?
As far as I can see you have the following options:
Run pip with the --no-deps argument and hope it will just work: pip install wechatpy --no-deps (or whatever the package is called, I am not familiar with it), for example. This skips dependency resolution when installing. Maybe the pinned requirement is outdated and this will let you proceed. You most likely want to satisfy the requests>=2.9.1 requirement, so install python-social-auth normally and then try the other package without dependencies (see the sketch after this list).
Look for older versions of the packages you are installing that have compatible requirements. Depending on the setup of your project this might not be possible, because you need features of the later versions (or the old versions might be insecure).
You can try patching one of your requirements locally: download the source, change the code to make it work with the conflicting requests version, and import the local version of the package. In that case, remove the requirement from your project's requirements.txt so that other people working on or using your project don't hit the same issue, and include the local version as part of the project (track it in Git). Check the license of the package you are modifying to see whether you are allowed to modify and redistribute it. (Optional: make a pull request on the package's GitHub repository with your changes so other people can benefit from them.)
Replace one or both of the packages with something else. They might simply not be compatible, or using a local, modified version might not be viable.
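A sketch of the first option, using the package names from the warnings above (adjust to your actual project):
pip install python-social-auth  # pulls in requests>=2.9.1
pip install wechat-sdk --no-deps  # skip its requests==2.6.0 pin
pip check  # reports any requirements that are still broken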
I need to specify a version for a package that one of my dependencies depends on, but my package does not directly depend on. Say my CI is trying to build package_a, it depends on package_b and does not depend on package_c. But package_b depends on package_c and it needs to be a particular version. In other words, "don't install this, but if you end up installing this, please install this version."
This appears to be doable in conda with run_constrained, but I cannot determine whether it is possible with pip in the requirements file.
A less appealing workaround would just be to add this other package (package_c) as another requirement.
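For what it's worth, pip's constraints files (the -c option) are designed for exactly this pattern: a constraint pins a package's version if it gets installed, but never causes it to be installed. A minimal sketch with the hypothetical names above:

# constraints.txt
package_c==1.2.3  # applied only if something pulls package_c in

# requirements.txt lists package_b as usual; install with both files
pip install -r requirements.txt -c constraints.txt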
I am using requirements.txt to specify the package dependencies used in my Python application. Everything seems to work fine for packages that either have no nested dependencies or whose dependencies are not already installed.
The issue occurs when I try to install a package that has a nested dependency on some other package, and an older version of that package is already installed.
I know I can avoid this when installing a package manually by using pip install -U --no-deps <package_name>. I want to understand how to do this through requirements.txt, as deployment and requirements installation are an automated process.
Note:
The already installed package is not something I am directly using in my project; it is part of a different project on the same server.
Thanks in advance.
Dependency resolution is a fairly complicated problem. A requirements.txt just specifies your dependencies, with optional version ranges. If you want to "lock" your transitive dependencies (dependencies of dependencies) in place, you would have to produce a requirements.txt that contains exact versions of every package you install, with something like pip freeze. This doesn't solve the problem by itself, but it would at least point out which dependencies conflict at install time so that you can manually pick the right versions.
That being said, the new (as of writing) officially supported tool for managing application dependencies is Pipenv. This tool both manages the exact versions of transitive dependencies for you (so you won't have to maintain a requirements.txt manually) and isolates the packages your code requires from the rest of the system (it does this using the virtualenv tool under the hood). This isolation should fix your problem with breaking a colocated project, since your project can have different library versions than the rest of the system.
(TL;DR Try using Pipenv and see if your problem just disappears)
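A quick sketch of the basic Pipenv workflow (the project directory and package are just examples):
pip install pipenv
cd myproject
pipenv install requests  # records requests in Pipfile, pins exact versions in Pipfile.lock
pipenv run python app.py  # runs inside the project's isolated virtualenv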
If I type pip freeze > requirements.txt, the resulting file looks similar to this:
argparse==1.2.1
h5py==2.2.0
wsgiref==0.1.2
Some libraries are under ongoing development. This happened to me with h5py, which is now (as of this writing) available in version 2.2.1. Thus, pip install -r requirements.txt throws an error saying that version 2.2.0 of h5py was not found:
No distributions matching the version for h5py==2.2.0 (from -r requirements.txt (line 2))
Is it considered good practice to maintain requirements via pip freeze at all? Obviously, I cannot rely on specific version numbers still being available in the future. I would like to be able to deploy my applications in the future, even if they are several years old, without compatibility problems around version numbers. Is there a way to make the output of pip freeze future-safe?
I thought about manipulating the output of pip freeze by using the greater-than-or-equal operator >= instead of the equality operator ==, so the output would look like the following:
argparse>=1.2.1
h5py>=2.2.0
wsgiref>=0.1.2
But I can imagine that this will break my applications if any of the libraries breaks backward-compatibility in a future version.
To answer the first question: yes, it is pretty common to use pip freeze to manage requirements. If your project is packaged, you can also set dependencies directly in the setup.py file.
You can set a requirement to greater than or equal to version x, but as you speculate, this can turn around and bite you if a dependency changes its API in a way that breaks your required functionality. You can also ensure that an installed dependency stays below a certain version: if you're on version 1.0 of a package and would like minor updates, but a major release scares you (whether it's released yet or not), you can require example>=1.0.0,<2.0.0.
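For reference, PEP 440's compatible-release operator ~= expresses the same kind of range more compactly (example is a made-up package name):
example>=1.0.0,<2.0.0  # explicit range
example~=1.0           # equivalent to >=1.0, <2.0
example~=1.0.0         # stricter: >=1.0.0, <1.1.0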
More info on requirements files
In the end, pip freeze is just a tool that shows what you currently have installed; it doesn't know, or care, whether that set actually works for you. Whatever you use to replicate environments from this data doesn't really matter either: version conflicts, updates breaking backwards compatibility, bugs, and other such issues in dependencies will (at least once) cause you grief. Keeping tabs on the state of your project's major dependencies and testing automatically against new versions will save you a lot of time and headache (or at least headache).