Poetry has a very good version solver, sometimes too good :) I'm trying to use poetry in a project that uses two incompatible packages. However, they are incompatible only by declaration: one of them is no longer developed, but otherwise they work together just fine.
With pip I'm able to install both in one environment (an error is printed, but it works). Poetry will declare that the dependency versions can't be resolved and refuse to install anything.
Is there a way to force poetry to install these incompatible dependencies? Thank you!
No.
Alternative solutions might be:
contacting the offending package's maintainers and asking for a fix + release
forking the package and releasing a fix yourself
vendoring the package in your source code - there is no need to install it if it's already there, and many of the usual downsides to vendoring disappear if the project in question is not maintained anymore
installing the package by hand after poetry install, using an installer that can ignore the dependency resolver, such as pip (similar to what you're already doing)
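The last option can be sketched as follows; "legacy-package" and its version are placeholders for the actual unmaintained package, not real names:

```shell
# Let poetry install everything it can resolve first.
poetry install
# Then force-install the conflicting package into the same environment,
# telling pip not to touch the already-resolved dependency tree.
# "legacy-package==1.2.3" is a placeholder.
poetry run pip install --no-deps "legacy-package==1.2.3"
```

Note that you will have to repeat the second step after every fresh `poetry install`, since poetry does not know about the hand-installed package.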
I have come across some nasty dependencies.
Looking around I found solutions like upgrade this, downgrade that...
Some solutions work for some but not for others.
Is there a more 'rounded' way to tackle this issue?
To test this:
I wrote a script that creates a virtual environment, attempts to install a requirements file, and then deletes the environment. I run it to quickly test whether changes to the requirements result in a successful installation or not.
I wrote a "test app" which exercises the basic functionality of all the libraries I use in one place, to see if, despite a successful installation, there are dependencies that pip is unaware of (due to problematic third-party library architecture) that break. I can edit this app's dependencies, then commit to trigger build actions that run it and report whether it completes or crashes.
This way I can upgrade and add libraries more rapidly.
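The first of those two checks can be sketched as a short script, assuming python3 is available and a requirements.txt sits in the current directory:

```shell
#!/bin/sh
set -e
# Create a disposable virtual environment in a temp directory...
ENV_DIR="$(mktemp -d)/req-test-env"
python3 -m venv "$ENV_DIR"
# ...attempt the install and report the outcome...
if "$ENV_DIR/bin/pip" install -r requirements.txt; then
    echo "requirements install: OK"
else
    echo "requirements install: FAILED"
fi
# ...and throw the environment away again.
rm -rf "$ENV_DIR"
```

Running this after every change to requirements.txt gives a quick pass/fail signal without polluting any long-lived environment.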
If all libraries used semantic versioning, declared all their dependencies via requirement files and did not define upper versions in requirement files, this would be a lot simpler. Unfortunately, one of the DB-vendors I use (which I shall not name) is very problematic and has a problematic Python library (amongst many other problems).
First you need to understand that pip resolves problems one at a time, and when you back it into a corner, it can't go further.
But if you give pip the 'big problem' all at once, it has a nice way of trying to resolve it. It may not always work, but for most cases it will.
The solutions you normally find out there are in some cases a coincidence: someone has an environment similar to another person's, and one solution works for both.
But if the solution takes 'your present environment' into consideration, then it should work for more people than just 'the coincidences'.
Disclaimer: the commands below are Linux/terminal commands.
Upgrade pip. We want the smartest pip we can get.
pip install --upgrade pip
Extract the list of packages you want to install.
In my case (these and many others, trimmed for brevity)
google-cloud-texttospeech attrdict google-cloud-language transformers
Give them all at once to pip.
pip install google-cloud-texttospeech attrdict google-cloud-language transformers
It will try combinations of versions and dependencies' versions until it finds something suitable. This will potentially download a ton of packages just to see their dependencies, so you only want to do this once.
If happy with the result, extract the requirements file.
pip freeze > requirements.txt
This file contains all the packages installed, and we are not interested in all of them.
And from it, extract the specific versions of your desired packages.
cat requirements.txt | egrep -i "google-cloud-texttospeech|attrdict|google-cloud-language|transformers"
Note: The command above may not list all your required packages, just the ones that appear in the requirements.txt, so check that all show up.
attrdict==2.0.1
google-cloud-language==1.2.0
google-cloud-texttospeech==2.12.3
transformers==2.11.0
Now you can put that in a file like resolved-dependencies.txt
And next time, install the packages directly at these valid and compatible versions with:
pip install -r resolved-dependencies.txt
In some documents and resources, it has been noted that installing packages using pip after installing conda can cause problems, such as crashing conda. Could anybody tell me when and in which cases that happens? I tried pip for some packages because conda's repository is small, and I found no problem using both together.
I'm using pip 21.1.2 and conda 4.10.1.
The point is that pip is less strict with respect to dependency checks, so it's clearly better to prioritize conda installs over pip:
"Pip and conda differ in how dependency relationships within an environment are fulfilled. When installing packages, pip installs dependencies in a recursive, serial loop. No effort is made to ensure that the dependencies of all packages are fulfilled simultaneously. This can lead to environments that are broken in subtle ways, if packages installed earlier in the order have incompatible dependency versions relative to packages installed later in the order. In contrast, conda uses a satisfiability (SAT) solver to verify that all requirements of all packages installed in an environment are met. This check can take extra time but helps prevent the creation of broken environments." (https://www.anaconda.com/blog/understanding-conda-and-pip)
However, some packages are only available via PyPI, and you have to resort to pip. In that case stick to the best practices as recommended in the conda docs.
(https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#using-pip-in-an-environment)
So far I have never experienced any inconsistencies with conda and pip installs myself, and if you are a serious programmer and have your test suites properly set up, you'll become aware of issues well in advance, before they can cause any trouble.
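A sketch of those best practices as typed in an interactive shell (the environment name and package names here are examples only):

```shell
# Create a fresh environment and do all conda installs first.
conda create -n myenv python=3.9
conda activate myenv
conda install numpy pandas
# Use pip only afterwards, and only for packages conda doesn't have.
pip install some-pypi-only-package
# Record the mixed environment so it can be recreated if it breaks.
conda env export > environment.yml
```

The key point from the docs is the ordering: once pip has been used in an environment, prefer recreating the environment over mixing in further conda installs.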
Hi everyone,
Because of my network speed, when I conda install some packages, some of the related packages cannot be downloaded completely. But conda will not install the packages that were downloaded successfully without the other "related" packages (where "related" may mean the best version match, but not necessarily).
For example, when I install pytorch, it wants numpy-1.14.2, but I have numpy-1.15.1. I don't need numpy 1.14.2 in practice.
So I am a little confused: how can I make conda try to install the packages that were downloaded successfully, ignoring the ones that failed to download?
Thanks!
EricKani
From the conda documentation, there are two options that may help: https://docs.conda.io/projects/conda/en/latest/commands/install.html
--no-update-deps
Do not update or change already-installed dependencies.
--no-deps
Do not install, update, remove, or change dependencies. This WILL lead to broken environments and inconsistent behavior. Use at your own risk.
I believe by default conda tries with --no-update-deps first and then, if that fails, tries to update deps; giving it that option will make sure some version of each needed package is installed, if not necessarily the latest.
You could try --no-deps as well, which will literally prevent conda from installing ANYTHING other than the exact packages you tell it to, but things may not work with that.
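For the numpy example above, that would look something like this (a sketch; whether partially downloaded packages are reused depends on your channels and cache):

```shell
# Keep the numpy you already have; only install pytorch itself
# plus whatever is strictly missing.
conda install --no-update-deps pytorch

# More aggressive: install pytorch and touch nothing else at all.
# This can leave the environment broken; use at your own risk.
conda install --no-deps pytorch
```

Try the first form before resorting to the second.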
I am using requirements.txt to specify the package dependencies used in my Python application. Everything seems to work fine for packages that either have no internal dependencies or whose dependencies are not already installed.
The issue occurs when I try to install a package that has a nested dependency on some other package, and an older version of that package is already installed.
I know I can avoid this when installing a package manually by using pip install -U --no-deps <package_name>. I want to understand how to do this via requirements.txt, as deployment and requirements installation are an automated process.
Note:
The already installed package is not something I am directly using in my project, but it is part of a different project on the same server.
Thanks in advance.
Dependency resolution is a fairly complicated problem. A requirements.txt just specifies your dependencies with optional version ranges. If you want to "lock" your transitive dependencies (dependencies of dependencies) in place you would have to produce a requirements.txt that contains exact versions of every package you install with something like pip freeze. This doesn't solve the problem but it would at least point out to you on an install which dependencies conflict so that you can manually pick the right versions.
That being said the new (as of writing) officially supported tool for managing application dependencies is Pipenv. This tool will both manage the exact versions of transitive dependencies for you (so you won't have to maintain a "requirements.txt" manually) and it will isolate the packages that your code requires from the rest of the system. (It does this using the virtualenv tool under the hood). This isolation should fix your problems with breaking a colocated project since your project can have different versions of libraries than the rest of the system.
(TL;DR Try using Pipenv and see if your problem just disappears)
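A minimal Pipenv workflow might look like this (the project path, package name, and script name are illustrative):

```shell
# Install the tool itself (outside any project environment).
pip install --user pipenv

cd /path/to/project
# Adds the package to the Pipfile and installs it into an
# isolated virtualenv dedicated to this project.
pipenv install requests
# Pins exact versions of all transitive dependencies in Pipfile.lock.
pipenv lock
# Run your code inside the isolated environment.
pipenv run python app.py
```

Because each project gets its own virtualenv, the colocated project on the same server keeps its own library versions untouched.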
I have a strange problem with the co-existence of a Debian package and a pip package. For example, I have python-requests (deb version 0.8.2) installed. When I then install requests (pip version 2.2.1), the system only uses the deb version instead of the new pip version. Can anyone help me resolve this problem? Thank you in advance.
With regard to installing Python packages via system packages and pip, you have to define a clear plan.
Personally, I follow these rules:
Install only minimal set of python packages by system installation packages
There I include supervisord, in cases where I am not on a too-old system.
Do not install pip or virtualenv by system package.
Especially with pip, in the last year there were many situations when the system packages were far behind what was really needed.
Use Virtualenv and prefer to install packages (by pip) in here
This will keep your system-wide Python rather clean. It takes a moment to get used to, but it is rather easy to follow, especially if you use virtualenvwrapper, which helps a lot during development.
Prepare conditions for quick installation of compiled packages
Some packages require compilation and this often fails on missing dependencies.
Such packages include e.g. lxml, pyzmq, pyyaml.
Make sure you know which ones you are going to use, prepare the required packages in the system, and verify that you are able to install them into a virtualenv.
Fine-tuning speed of installation of compiled packages
There is a great package format (usable by pip) called wheel. It allows a package like lxml to install on the same platform within a fraction of a second (compared to minutes of compilation). See my answer on SO on this topic.
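For example, wheels for the slow-to-compile packages mentioned above can be built once and reused across virtualenvs (the wheelhouse directory name is arbitrary):

```shell
# Build binary wheels for the slow-to-compile packages once.
pip wheel --wheel-dir ./wheelhouse lxml pyzmq pyyaml
# Later installs pull the prebuilt wheels instead of compiling,
# turning minutes of compilation into a fraction of a second.
pip install --no-index --find-links ./wheelhouse lxml pyzmq pyyaml
```

The second command can be run in any fresh virtualenv on the same platform, with no compiler or development headers needed.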