How to make setup.py install a different module - python

I would like to use distutils (setup.py) to install a Python package from a local repository which requires another package from a different local repository. Since I am lacking decent documentation of the setup command (I only found some examples here and here, and was confused by the setup terms extras_require, install_requires and dependency_links found here and here), does anyone have a complete setup.py file that shows how this can be handled, i.e. how distutils can handle the installation of a package found in some SVN repository when the main package I am installing right now requires it?
More detailed explanation: I have two local svn (or git) repositories, basicmodule and extendedmodule. Now I check out extendedmodule and run python setup.py install. This setup.py file knows that extendedmodule requires basicmodule, and automatically downloads and installs it from the repository (in case it is not installed yet). How can I solve this with setup.py? Or maybe there is another, better way to do this?
EDIT: Followup question
Based on the answer by Tom I have tried to use a setup.py as follows:
from setuptools import setup

setup(
    name="extralibs",
    version="0.0.2",
    description="Some extra libs.",
    packages=["extralib"],
    install_requires=["basiclib==1.9dev-r1234"],
    dependency_links=["https://source.company.xy/svn/MainDir/SVNDir/basiclib/trunk@20479#egg=basiclib-1.9dev-r1234"]
)
When trying to install this as a normal user I get the following error:
error: Can't download https://source.company.xy/svn/MainDir/SVNDir/basiclib/trunk@20479: 401 Authorization Required
But when I do a normal svn checkout with the exact same link it works:
svn co https://source.company.xy/svn/MainDir/SVNDir/basiclib/trunk@20479
Any suggestion how to solve this without changing ANY configuration of the svn repository?

I think the problem is that your svn client is authenticated (it caches the realm somewhere in your ~/.subversion directory), which your distutils HTTP client does not know how to do.
Distutils supports the svn+http link type in dependency links, so you may try adding "svn+" before your dependency link and providing a username and password:
dependency_links = ["svn+https://user:password@source.company.xy/svn/MainDir/SVNDir/basiclib/trunk@20479#egg=basiclib-1.9dev-r1234"]
For security reasons you should not put your username and password in your setup.py file. One way to avoid that is to fetch the authentication information from an environment variable, or even from your subversion configuration directory (~/.subversion).
Hope that helps.
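The environment-variable approach can be sketched like this (the SVN_USER and SVN_PASS variable names are made up for this example; distutils does not read them itself, you would call the helper when building the dependency_links list):

```python
import os

def make_dependency_link(user=None, password=None):
    """Build the svn+https dependency link, taking credentials from the
    SVN_USER / SVN_PASS environment variables (hypothetical names) so they
    never appear in setup.py itself."""
    user = user or os.environ.get("SVN_USER", "")
    password = password or os.environ.get("SVN_PASS", "")
    # Only embed credentials when both parts are present.
    auth = f"{user}:{password}@" if user and password else ""
    return (
        f"svn+https://{auth}source.company.xy/svn/MainDir/SVNDir/"
        "basiclib/trunk@20479#egg=basiclib-1.9dev-r1234"
    )
```

You would then pass dependency_links=[make_dependency_link()] to setup().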

Check out the answers to these two questions. They both give specific examples on how install_requires and dependency_links work together to achieve what you want.
Can Pip install dependencies not specified in setup.py at install time?
Can a Python package depend on a specific version control revision of another Python package?
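Put together, a complete setup.py along the lines those answers describe might look like the sketch below. The repository URL, revision, and version pin are placeholders, not a verified working configuration; also note that dependency_links is honored by easy_install and old pip versions only (modern pip dropped support for it):

```python
from setuptools import setup

setup(
    name="extendedmodule",
    version="0.1.0",
    packages=["extendedmodule"],
    # Pin the dependency to the exact version advertised by the egg fragment...
    install_requires=["basicmodule==1.0dev-r100"],
    # ...and say where to fetch it: an svn+https link with a peg revision
    # and an #egg fragment naming the project and version.
    dependency_links=[
        "svn+https://example.com/svn/basicmodule/trunk@100"
        "#egg=basicmodule-1.0dev-r100"
    ],
)
```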

Related

Pipenv package hash does not match lock file

We have a lock file which has not changed since April 2021. Recently, we have started seeing the following error on pipenv install --deploy:
ERROR: THESE PACKAGES DO NOT MATCH THE HASHES FROM THE REQUIREMENTS FILE. If you have updated the package versions, please update the hashes. Otherwise, examine the package contents carefully; someone may have tampered with them.
gunicorn==20.1.0 from https://files.pythonhosted.org/packages/e4/dd/5b190393e6066286773a67dfcc2f9492058e9b57c4867a95f1ba5caf0a83/gunicorn-20.1.0-py3-none-any.whl (from -r /tmp/pipenv-g7_1pdnq-requirements/pipenv-d64a8p6k-hashed-reqs.txt (line 32)):
Expected sha256 e0a968b5ba15f8a328fdfd7ab1fcb5af4470c28aaf7e55df02a99bc13138e6e8
Got 9dcc4547dbb1cb284accfb15ab5667a0e5d1881cc443e0677b4882a4067a807e
We have opened an issue on the project's GitHub: https://github.com/benoitc/gunicorn/issues/2889
We believe that it would be unsafe to use this new version without confirmation it is correct and safe in case someone has maliciously updated the package in the package repository.
Is there a way we can grab the wheel file from a previous docker build and force that to be used for the time being so we can safely build with the existing version and checksum?
Thanks
Thanks to @Ouroborus for the answer:
e0... is for the .tar.gz (source) package, 9d... is for the .whl package. (See the "view hashes" links on PyPI's gunicorn files page) I'm not sure why your systems are choosing to download the wheel now when they downloaded the source previously. However, those are both valid hashes for that module and version.
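You can confirm which artifact you actually downloaded by hashing it the same way the lock file records hashes (a plain sha256 of the file bytes); a minimal helper:

```python
import hashlib

def pipfile_lock_hash(path):
    """Return a file's hash in Pipfile.lock notation: "sha256:<hexdigest>"."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large wheels don't have to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return "sha256:" + h.hexdigest()
```

Comparing the result for the downloaded .whl and .tar.gz against the entries under gunicorn in Pipfile.lock shows whether both artifact hashes are listed; a source tarball and a wheel of the same release always hash differently.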

ReadTheDocs + Sphinx + setuptools_scm: how to?

I have a project where I manage the version through git tags.
Then, I use setuptools_scm to get this information in my setup.py; it also generates a file (_version.py) that gets included when generating the wheel for pip.
This file is not tracked by git since:
it has the same information that can be gathered by git
it would create a circular situation where building the wheel will modify the version which changes the sources and a new version will be generated
Now, when I build the documentation, it becomes natural to fetch this version from _version.py and this all works well locally.
However, when I try to do this within ReadTheDocs, the building of the documentation fails because _version.py is not tracked by git, so ReadTheDocs does not find it when fetching the sources from the repository.
EDIT: I have tried to use the method proposed in the duplicate, which is the same as what the setuptools_scm documentation indicates, i.e. using in docs/conf.py:
from pkg_resources import get_distribution
__version__ = get_distribution('numeral').version
... # I use __version__ to define Sphinx variables
but I get:
pkg_resources.DistributionNotFound: The 'numeral' distribution was not found and is required by the application
(Again, building the documentation locally works correctly.)
How could I solve this issue without resorting to maintaining the version number in two places?
Eventually the issue was that ReadTheDocs did not have the option to build my package active by default and I was expecting this to happen.
All I had to do was to enable "Install Project" in the Advanced Settings / Default Settings.
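For reference, a conf.py sketch that degrades gracefully when the package is not installed, so a missing install produces a placeholder version instead of a crash. This uses importlib.metadata (Python 3.8+) rather than the pkg_resources call from the question, and the fallback string is an arbitrary choice:

```python
# docs/conf.py (excerpt): take the version from the installed distribution
# if available, otherwise fall back to a placeholder instead of raising.
from importlib.metadata import PackageNotFoundError, version as dist_version

def get_release(dist_name, fallback="0.0.0+unknown"):
    try:
        return dist_version(dist_name)
    except PackageNotFoundError:
        return fallback

release = get_release("numeral")
version = ".".join(release.split(".")[:2])  # short X.Y version for Sphinx
```

With "Install Project" enabled on ReadTheDocs the real version is picked up; without it, the build still succeeds with the placeholder.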

Python to .exe raises Exception: Versioning for this project requires either an sdist tarball

I'm trying to build an executable from Python files. I was able to correct most errors, but now I'm stuck with this one and I can't find out how to correct it. My program interacts with the Jira API.
I'm using cx_Freeze to build the .exe with the following setup.py file:
import sys
import setuptools
from cx_Freeze import setup, Executable

build_exe_options = {
    "includes": ["appdirs", "packaging.specifiers",
                 "packaging.requirements", "setuptools.msvc", "jira"]
}

setup(
    name="Quick",
    version="1.0",
    executables=[Executable("main.py")],
    options={"build_exe": build_exe_options},
    install_requires=['selenium', 'jira', 'cx_Freeze']
)
I enter in command prompt: python setup.py build and get a folder named build as a result. It contains a main.exe program. When I launch it from command prompt I get this error :
Exception: Versioning for this project requires either an sdist tarball, or access to an upstream git repository. It's also possible that there is a mismatch between the package name in setup.cfg and the argument given to pbr.version.VersionInfo. Project name jira was given, but was not able to be found.
I've tried to upgrade jira, setuptools and distutils with pip, but it didn't change anything.
I'm using Python 3.6.
I have finally gotten this working and thought I should share my results since there seem to be very few people using Jira and cx_Freeze together. It seems that cx_Freeze does not package Jira properly. Below is what I did to get my script working.
First, in setup.py, I included these packages:
packages = ["os", "sys", "atexit", "getpass", "subprocess", "datetime", "dateutil", "jira", "openpyxl", "appdirs", "packaging"]
Many of these are not necessary for everyone but jira, appdirs, and packaging helped me.
Then, after running python setup.py build, I copied:
C:\Users\me\AppData\Local\Programs\Python\Python36-32\Lib\site-packages\idna
C:\Users\me\AppData\Local\Programs\Python\Python36-32\Lib\site-packages\idna-2.6.dist-info
C:\Users\me\AppData\Local\Programs\Python\Python36-32\Lib\site-packages\jira
C:\Users\me\AppData\Local\Programs\Python\Python36-32\Lib\site-packages\jira-1.0.15.dist-info
into:
build\exe.win32-3.6\lib (the directory created by running setup.py) overwriting any conflicts.
This solved the problem for me. Let me know if you have any other issues.
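Those manual copies can be scripted. Below is a small helper I would use as a sketch (the function and its arguments are hypothetical, not part of cx_Freeze; pass the site-packages path, the build lib directory, and the package names from this answer):

```python
import shutil
from pathlib import Path

def copy_packages(site_packages, build_lib, names):
    """Copy the listed packages (and their matching *.dist-info directories)
    from site-packages into the cx_Freeze build's lib directory,
    overwriting any conflicts."""
    src = Path(site_packages)
    dst = Path(build_lib)
    for entry in src.iterdir():
        # Match "jira" itself and versioned entries like "jira-1.0.15.dist-info".
        if any(entry.name == n or entry.name.startswith(n + "-") for n in names):
            target = dst / entry.name
            if entry.is_dir():
                shutil.copytree(entry, target, dirs_exist_ok=True)
            else:
                shutil.copy2(entry, target)
```

For the paths in this answer that would be copy_packages(r"C:\Users\me\AppData\Local\Programs\Python\Python36-32\Lib\site-packages", r"build\exe.win32-3.6\lib", ["idna", "jira"]). Note that dirs_exist_ok requires Python 3.8+.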
I've got this exception while importing jira in a PyDev project, which links the jira git clone as a Project Reference.
The workaround for me was to extend the PATH environment to include the git executable.
Analysis:
pbr/packaging.py - get_version() raises
Exception: Versioning for this project requires either an sdist tarball, or access to an upstream git repository. ...
when _get_version_from_git() returns None. This happens when
pbr/git.py - _run_git_functions() - _git_is_installed() does not find a git executable.
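The check pbr performs boils down to a PATH lookup, which you can reproduce to verify the workaround (a minimal sketch, not pbr's actual code):

```python
import shutil

def git_is_on_path():
    """Mirror the check that makes pbr give up: can a git executable be found
    on the current PATH?"""
    return shutil.which("git") is not None
```

If this returns False in the environment where the frozen .exe runs, extending PATH as described above should fix the exception. pbr's documentation also describes a PBR_VERSION environment variable that short-circuits the git lookup entirely, which may be a cleaner workaround for frozen executables.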
I was getting the same issue when I tried to upload my code with packages to an AWS Lambda function. After multiple trials and errors, adding the idna package along with the jira package worked for me:
idna
idna-2.10.dist-info
jira
jira-2.0.0.dist-info

How to include python-dev in buildroot?

I'm making a buildroot for my Raspberry Pi 3 for a school project.
I've made a buildroot with everything from Python included because I want to use WebIOPi. The buildroot has been built and the image has been written to the SD card.
Now when I want to install it on the buildroot device, it asks for python-dev, which is not included by buildroot. With further research I've only found this. That's a python-dev 0.4.0, but I think there's a much more recent version on my virtual Ubuntu 16 OS. (My main OS is Windows 10, so I need the image for win32diskimager.)
But I don't know how to implement this in the Python buildroot packages. I've already read the manuals from buildroot; it's kind of confusing for me...
I've already tried to make a directory named 'python-dev' in the buildroot/package directory (Ubuntu OS), but with no success.
This is what I've got so far:
buildroot/package/python-dev:
    config.in
    python-dev.mk
in the config.in file:
config BR2_PACKAGE_PYTHON_DEV
    bool "python-dev"
    help
in the python-dev.mk file (copied from libconfig):
################################################################################
#
# python-dev
#
################################################################################
PYTHON_DEV_VERSION = 0.4.0
PYTHON_DEV_SOURCE = dev-0.4.0.tar.gz
PYTHON_DEV_SITE = https://pypi.python.org/packages/53/34/e0d4da6c3e9ea8fdcc4657699f2ca62d5c4ac18763a897feb690c2fb0574/dev-0.4.0.tar.gz
PYTHON_DEV_LICENSE = Python software foundation license v2, others
PYTHON_DEV_LICENSE_FILES = README
PYTHON_DEV_SETUP_TYPE = setuptools
PYTHON_DEV_DEPENDENCIES = libconfig boost
$(eval $(python-package))
When I run make menuconfig and search for python-dev, it's not there...
I hope someone could help me with this.
If there's an easier way, it's pretty much welcome.
Thank you in advance.
The python-dev package that the WebIOPi setup script is checking for has nothing to do with the dev python package that you found at https://pypi.python.org/pypi/dev.
The python-dev package is a package on Linux distributions that contains the development files for the Python library that is installed together with the Python interpreter. It installs the necessary files to allow C/C++ programs to link against libpython.
Buildroot has already installed what you need in STAGING_DIR. However, you are probably trying to install WebIOPi directly on the target, which is not how Buildroot is intended to be used. Buildroot does not allow development on the target: it provides neither a compiler on the target nor the necessary files for development.
Buildroot is intended to be used as a cross-compilation environment. So what you should do instead is create a Buildroot package for WebIOPi, and have it cross-compiled (from your host machine), so that it gets installed, ready to use, in your target filesystem.
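As a sketch of that approach, a hypothetical package/webiopi/webiopi.mk following Buildroot's python-package conventions might look like the fragment below. The version, site URL and setup type are placeholders you would have to fill in from the real WebIOPi release, and package/Config.in must also source the new package's Config.in for it to appear in menuconfig:

```make
################################################################################
#
# webiopi
#
################################################################################

WEBIOPI_VERSION = x.y.z
WEBIOPI_SOURCE = WebIOPi-$(WEBIOPI_VERSION).tar.gz
WEBIOPI_SITE = http://example.com/placeholder
WEBIOPI_SETUP_TYPE = distutils

$(eval $(python-package))
```

After enabling BR2_PACKAGE_WEBIOPI in menuconfig and rebuilding, the package would be cross-compiled on the host and installed into the target filesystem image.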

PIP install my OS project

I've created an open source project and tried to register it on PyPI so people can use pip install. Unfortunately I can't seem to get it to work. Here are the commands I've tried:
Created a setup.py file:
from distutils.core import setup

setup(
    name='AyeGotchoPayCheque',
    version='.9',
    description='Payment Gateway Interface',
    author='Rico Cordova',
    author_email='rico.cordova@rocksolidbox.com',
    url='http://www.python.org/sigs/ayegotchopaycheque-sig/',
    packages=['ayegotchopaycheque'],
)
Then I used the command python setup.py register and answered the questions.
I've tried several other solutions and can't seem to get this working.
Any suggestions?
EDIT 1:
It seems I've successfully registered my project with the wrong name=AyeGotchoPayChecque, note the extra "c". How can I "unregister" this project and re-register with the correct name?
To "unregister", log into PyPI and go to the page for the package you registered, then click the "Remove this package completely" button. Then you can re-register with the correct name. Don't forget to upload the project as well; I prefer to do it at the same time that I register:
python setup.py egg_info -RDb "" sdist register upload
Each time you upgrade your package's version number, re-run the above code, and PyPI will keep all versions of your package on the package's website.
