I have the following issue:
One package that I want to install through conda demands a certain version of another package. However, I want to install a newer version of the second package.
Specifically, the package energysim 2.1.5 requires the package fmpy 0.2.14, and when I try to install a newer version of fmpy I get this error:
ERROR: energysim 2.1.5 has requirement fmpy==0.2.14, but you'll have fmpy 0.3.0 which is incompatible.
Is something like that possible, and if so, how?
In my answer, I am assuming the following case:
You want to install packageA, which requires packageB==v1
You also want to install packageB at version v2
Your goal: install packageB at both v1 and v2 to make this possible
I don't know of any way this can be achieved. I also don't see a way that this would even technically work. Say you do import packageB in your code. Which version should be imported? How should Python know that the import packageB done by packageA should resolve to v1, but the import packageB done by you should resolve to v2?
I see these options:
Not using packageA, so that you can have packageB at the version you need
If possible, have one environment where you install packageA and packageB, and another with only packageB at the version you want
Fork packageA and create your own custom version that works with your required version of packageB (see the sketch after this list)
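For the fork option, a minimal sketch with the package names from the question, assuming the pin sits in install_requires of the fork's setup.py (the real file has more entries; only the fmpy line would change):

from setuptools import setup, find_packages

setup(
    name="energysim",
    version="2.1.5",
    packages=find_packages(),
    install_requires=[
        "fmpy>=0.2.14",   # was: "fmpy==0.2.14" in the upstream setup.py
    ],
)

You would then install your fork directly, e.g. python -m pip install git+https://github.com/<your-user>/energysim (placeholder URL). Keep in mind that nobody has tested energysim against fmpy 0.3.0, so this is at your own risk.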
Wouldn't you just be able to do:
conda install packageB==2.0.0
conda install packageA
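With the actual package names from the question that would be:

conda install fmpy==0.3.0
conda install energysim

though be aware that conda's solver may well refuse the second command, since energysim hard-pins fmpy to 0.2.14.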
The package energysim is pinned [1] to use fmpy 0.2.14 and no other versions (older or newer). It looks like this was done intentionally [2]; the maintainer may have good reasons to enforce this pin. pip won't let you install 0.3 because of this pin.
I would reach out to them via a GitHub issue to ask if their package is compatible with newer versions. It looks like fmpy 0.2.14 is about a year and a half old, for what it's worth. It may work fine with 0.3.x, but IMHO it should be tested and released before using it.
[1] https://github.com/dgusain1/energysim/blob/07282257073058119664f9a5e8fd4300e138a64d/setup.py#L25-L29
[2] https://github.com/dgusain1/energysim/commit/f84dad3ab913b43eea3187da54c132319c23d1a7
Related
I recently had to bump a google cloud library due to a conflict that was generating a bug. Long story short, I had
google-cloud-pubsub==1.4.2
which I had to bump to 1.4.3. This in turn reverted the google-api-core module to 1.16.0, which generated a conflict with another module, google-cloud-secret-manager, which requires a higher version of google-api-core.
Now, I have removed google-cloud-secret-manager. But if I try to install the module again at its latest version, it will bump google-api-core to a version not compatible with google-cloud-pubsub. What I want instead is to pip install google-cloud-secret-manager at the highest version that is compatible with google-api-core==1.16.0, without manually trying all the versions until I find the right match. Is that possible?
Is there something like a pip install "fix dependency version" option that would let me install google-cloud-secret-manager without changing the version of its dependency google-api-core? Thank you
You can achieve this with a constraints file. Just put all your constraints into that file:
google-api-core==1.16.0
Then you can install via:
python -m pip install -c constraints.txt google-cloud-secret-manager
This will try every version of google-cloud-secret-manager, starting from the most recent version, until it finds a version that is compatible with the given constraints.
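The constraints file can hold as many pins as you need to keep in place (for example, you could add google-cloud-pubsub==1.4.3 from the question as well), and afterwards you can check that nothing was moved, for instance:

python -m pip show google-api-core google-cloud-secret-manager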
I'm trying to use get_worksheet_by_id function from the gspread package.
I can see the function is available in https://github.com/burnash/gspread/blob/master/gspread/models.py
It's also listed in documentation.
But it's missing from the pip and conda repositories. As a result I'm not able to use it.
https://pypi.org/project/gspread/#files
https://anaconda.org/conda-forge/gspread/files
Not sure where to report it.
As you can see if you look at the blame, the function was only added by this commit, which is from March 2021. The latest version available from PyPI and conda-forge is however from February. That is why you don't have it when you install through these channels.
Some suggestions:
You could simply edit the code of the library in your site-packages
Install from the GitHub sources, either by cloning the repo and doing python setup.py install or through python -m pip install git+https://github.com/burnash/gspread (see the example after this list for pinning to a specific revision)
Create an issue on the GitHub repo and ask that the version on conda-forge/PyPI is updated to include this feature.
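If you go the GitHub route and want the install to stay reproducible, you can also pin to a specific revision (the ref below is a placeholder; pick a commit, branch, or tag that contains the function you need):

python -m pip install "git+https://github.com/burnash/gspread@<commit-or-tag>"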
I have a package published on PyPI, versioned 3.0.0.
setup.py has never specified the python_requires directive.
Release 2.5.0 contained a change that made the package incompatible with Python 2, which went unnoticed until now.
Since 2.5.0 there have been plenty of releases of the package published on PyPI.
Now, if I install the package using Python 2, pip will install the latest release, 3.0.0, which won't work.
I need pip to install version 2.4.0, which has no compatibility issues. But how exactly can I accomplish that (without prior knowledge of pip install package==2.4.0, i.e. something like using pip's backtracking mechanism)?
If I specify the directive python_requires=">=3.6" in a new release 3.1.0, pip on Python 2 will backtrack to release 3.0.0 and install a package which won't work.
I can think of:
The cx_Oracle way: raise an exception in setup.py if the Python version does not meet the minimum required for installation, and tell the user how to install the correct release.
Create 2 new releases: one which is essentially 2.4.0, versioned as 3.1.0 with python_requires=">=2.7,<3.6", and one which is 3.0.0, versioned as 3.1.1 with python_requires=">=3.6" (roughly sketched below).
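Roughly what the setup.py of the first re-release would declare (the package name and surrounding arguments are illustrative):

from setuptools import setup, find_packages

setup(
    name="mypackage",               # hypothetical name
    version="3.1.0",                # the re-release of 2.4.0
    packages=find_packages(),
    python_requires=">=2.7,<3.6",
)

The 3.1.1 re-release of 3.0.0 would be identical except for version="3.1.1" and python_requires=">=3.6".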
Is there a better way?
There is a relatively new feature on PyPI: you could "yank" the release(s) which are incompatible with Python 2 but do not correctly specify that in their metadata.
A yanked release is a release that is always ignored by an installer, unless it is the only release that matches a version specifier (using either == or ===).
See PEP 592 -- Adding "Yank" Support to the Simple API for more information. In fact, what you've described is the main scenario described in the motivation section of the PEP.
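Assuming you yank everything from 2.5.0 up to 3.0.0 (the package name below is a placeholder), pip on Python 2 would then behave like this:

python -m pip install mypackage          # a plain install ignores yanked releases and falls back to 2.4.0
python -m pip install mypackage==3.0.0   # an exact pin still installs the yanked release, with a warning

In practice you would combine this with a new release that does set python_requires=">=3.6", so Python 3 users are not pushed back to 2.4.0 as well.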
I have manually installed datatable (from h2o.ai, https://github.com/h2oai/datatable) from the HEAD of master:
make build
make install
They were successful. However, when running pip3 freeze I see the (very old) default version (0.6.0) that had been installed via
pip3 install datatable
some months back:
$pip3 freeze | grep datatable
datatable==0.6.0
I am uncertain whether:
the locally built version of datatable is not being used
the locally built version of datatable is being used but not reported by pip3
if that were the case: how can I verify whether the locally built/installed version is being used (or not)?
Tips appreciated.
Updates
Based on (great) comments below:
import datatable then print(datatable.__version__)
0.6.0
But the datatable.__file__ shows the local version:
In [3]: print(datatable.__file__)
/git/datatable/datatable/__init__.py
Does this possibly mean that the local installation is being used, but that the version reported by that locally built one is still the same (very old) one that was published to the pip repositories months earlier?
To look precisely at the module being used, the best way, as mentioned by @duhaime, is to use import datatable; print(datatable.__file__).
If your local installation was done correctly, you should also make sure that 1) the location where you installed it is on your PYTHONPATH, and 2) if it is, that path comes before the standard paths (lookup is sequential).
An easy way to check, if you don't know where to look, is simply to uninstall the version installed through pip and see whether the import still works.
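A quick check from a Python prompt that combines both points (nothing here is specific to datatable):

import sys
import datatable

print(datatable.__file__)   # which copy of the package is actually imported
print(sys.path)             # the lookup order; earlier entries shadow later ones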
EDIT
Based on the edit to the question, yes, the version is still the same (see here)
I need to build a Python module from source. It is just my second build and I'm a bit confused about the interaction between packages built from source and binaries installed through the package manager.
Do I need to uninstall the binary first?
If I don't need to, will it overwrite the installed version, or will both be available?
If it will not overwrite it, how can I import the built version into Python?
Thank you all!
P.S.: In case it matters, I'm on Fedora 24 and the package is matplotlib, which is installed through a setup.py.
I strongly recommend using virtualenv and building your package inside it. Is it really necessary to install via setup.py? If not, you can consider using pip to install your package inside the virtualenv.
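A minimal sketch of that workflow, using the package named in the question:

python3 -m venv venv                  # or: virtualenv venv
source venv/bin/activate
python -m pip install matplotlib      # install from PyPI, or
python -m pip install .               # install from inside a cloned source tree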