I am building a wheel using
python setup.py bdist_wheel
i.e. essentially the setuptools library.
My project contains a LICENSE.txt file in the root directory of the repository.
Aim:
Properly include this particular license file in the wheel
Relevant Code:
setup(
    ...,
    license_files='LICENSE.txt',
    ...
)
Error:
warning: Failed to find the configured license file 'L'
warning: Failed to find the configured license file 'C'
warning: Failed to find the configured license file 'N'
warning: Failed to find the configured license file 't'
https://setuptools.readthedocs.io/en/latest/userguide/declarative_config.html#metadata
The official setuptools documentation (above) states that the datatype of license_file is "str" while license_files is "list-comma", i.e. a comma-separated list. Passing a plain string to license_files therefore makes setuptools iterate over it character by character, which is why every character of 'LICENSE.txt' shows up as a missing license file.
Solution 1: Use license_file with str
setup(
    ...,
    license_file='LICENSE.txt',
    ...
)
Solution 2a: Use license_files with a list
setup(
    ...,
    license_files=['LICENSE.txt'],
    ...
)
Solution 2b setup.cfg with license_files
Instead, I created a new setup.cfg file and upgraded setuptools so that it can pick up metadata from setup.cfg:
[metadata]
license_files = <name-of-license-file>
Thanks to: https://stackoverflow.com/a/48691876/5157515
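For the LICENSE.txt in this question, the setup.cfg entry would simply be (a minimal sketch; newer setuptools also accepts comma-separated glob patterns here):
[metadata]
license_files = LICENSE.txt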
Related
Our build pipeline spits out an *.so / *.pyd Python extension built with pybind11. As a step in the pipeline I need to package the extension as a wheel for easy distribution through pip. I am trying to come up with a setup.py that takes the existing library as-is and does not re-compile the binaries through setup.py, since that would require a major rewrite of the devops scripts.
When having a folder structure such as:
setup.py
my_module.cpython-39-darwin.so
A very basic setup.py can create a functioning wheel (python setup.py bdist_wheel):
from setuptools import setup

setup(
    name='my_module',
    version='0.9.102',
    packages=["."],
    package_data={'': ['*.so', '*.pyi']},
)
Unfortunately, the wheel is missing the important python tag and platform name, etc: my_module-0.9.102-py3-none-any.whl vs. my_module-0.9.102-cp39-cp39-macosx_10_13_x86_64.whl
Setting --python-tag and --plat-name works, setting --py-limited-api does not, however.
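For example (the tag values here are taken from the desired wheel name above):
python setup.py bdist_wheel --python-tag cp39 --plat-name macosx_10_13_x86_64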
Through research I found that overriding the distclass adds the correct tag again, but Root-Is-Purelib is then set back to false. This, unfortunately, creates a broken wheel when installing through pip, as it puts the binary in a my_module-0.9.102.data/purelib folder...
Overriding is_pure seems to be ignored as well:
from setuptools import setup, find_packages, Distribution

class BinaryDistribution(Distribution):
    def is_pure(self):
        return True

    def has_ext_modules(foo):
        return True

setup(
    name='my_module',
    version='0.9.102',
    packages=["."],
    package_data={'': ['*.so', '*.pyi']},
    distclass=BinaryDistribution
)
What else can I do to wrap my pre-compiled python libraries to wheels for distribution without rewriting lots of the build pipeline?
Not a perfect solution but a functional wheel is created when using the wheel module instead:
from setuptools import setup

setup(
    name='my_module',
    packages=['my_module'],
    package_data={'': ['*.so', '*.pyi']},
)
create wheel with:
pip install wheel
pip wheel .
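Note that for packages=['my_module'] and the package_data glob to pick up the binary, the extension has to live inside a my_module/ directory; one assumed layout (the __init__.py is not part of the original answer):
setup.py
my_module/__init__.py
my_module/my_module.cpython-39-darwin.so
my_module/my_module.pyi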
This is what my setup.py looks like:
from distutils.core import setup

setup(
    author='...',
    description='...',
    download_url='...',
    license='...',
    long_description=open('README.md', 'r').read(),
    long_description_content_type='text/markdown',
    name='...',
    packages=['...'],
    url='...',
    version='...'
)
Then, I can run python setup.py sdist without any errors. But if I check the package with twine (twine check dist/*), I get the following warning:
`long_description` has syntax errors in markup and would not be rendered on PyPI.
warning: `long_description_content_type` missing. defaulting to `text/x-rst`.
All of my packages are up to date, and I have no duplicate or multi-line attributes. What is causing this, and how can I fix it?
This is because you're using the setup function provided by distutils.core. Use setuptools instead:
from setuptools import setup
distutils.core doesn't expect the long_description_content_type to be provided, and seemingly ignores it. It actually says this when you run setup.py:
UserWarning: Unknown distribution option: 'long_description_content_type'
Although that's easy to miss, since it's at the top of a long block of otherwise error-free logs.
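With the import swapped in the setup.py above, rebuilding the sdist and re-running the check should clear both warnings (assuming README.md itself is valid Markdown):
python setup.py sdist
twine check dist/*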
Let's say I have a simple library which uses setuptools for packaging and distributing. The library in this case also requires a minimum version of Python 3.6, meaning my setup.py would be something like as follows:
from setuptools import setup, find_packages

setup(
    name='something',
    version='0.0.1',
    description='description',
    long_description=long_description,
    # More metadata
    packages=find_packages(exclude=['tests', 'docs']),
    python_requires='>=3.6'
)
Now, when I run python setup.py bdist_wheel, I get a file named something-0.0.1-py3-none-any.whl. As evident here, wheel is ignoring the python_requires option in setuptools when determining the Python tag for my wheel (it should be py36 but is the default py3). Obviously, I realize that I can just pass in --python-tag py36 from the command line, which will do the job, but the continuous deployment service I am using for deploying my library only takes in the name of the distribution I am using (bdist_wheel). As such, I cannot pass any command line parameters.
After doing a bit of research, I found that I could inherit from the bdist_wheel class and override the python_tag member variable, but according to the wheel README:
It should be noted that wheel is not intended to be used as a library, and as such there is no stable, public API.
Because of this, I want to avoid inheriting from the bdist_wheel class which might force me to rewrite my class every time some breaking change occurs.
Is there any alternative way through setuptools which allows me to pass in the Python tag for a wheel?
Every command line argument for every distutils command can be persisted in the setup config file. Create a file named setup.cfg in the same directory your setup.py resides in and store the custom bdist_wheel configuration in there:
# setup.cfg
[bdist_wheel]
python-tag=py36
Now running python setup.py bdist_wheel will be essentially the same as running python setup.py bdist_wheel --python-tag py36.
Relevant article in the distutils docs: Writing the Setup Configuration File.
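For the setup.py in the question, this changes the generated file name accordingly:
python setup.py bdist_wheel
dist/something-0.0.1-py36-none-any.whl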
You could hack in something like
import sys

if 'bdist_wheel' in sys.argv:
    if not any(arg.startswith('--python-tag') for arg in sys.argv):
        sys.argv.extend(['--python-tag', 'py36'])
but it's arguably just as brittle...
I am creating a module and need to prepare my setup.py file to have some requirements. One of the requirements is a fork of one package that is already in PyPI so I want to reference my GitHub repository directly.
I tried two configurations, the first one is:
setup(
    name='mymodule',
    # other arguments
    install_requires=[
        'myrequirement',  # The dependency name
    ],
    dependency_links=[
        'https://github.com/ihhcarus/myrequirement.git#egg=myrequirement',  # This is my repository location
    ]
)
I create a local distribution of my module using python setup.py sdist and when I run pip install path/to/module/dist/mymodule-0.1.tar.gz it ends up installing the version on PyPI and not my repository.
In the other configuration, I changed the requirement name to force it to look at the dependency link, like so:
setup(
    name='mymodule',
    # other arguments
    install_requires=[
        'myrequirement_alt',  # The dependency name with a suffix
    ],
    dependency_links=[
        'https://github.com/ihhcarus/myrequirement.git#egg=myrequirement_alt',  # This is my repository location
    ]
)
But with this, I only end up getting an error that myrequirement_alt is not found...
So I ask, what is the right way to achieve this without using PyPI?
For dependency links to work you need to add the version number of the package to the end of the egg fragment, e.g. https://github.com/ihhcarus/myrequirement.git#egg=myrequirement_alt-1.3, or it will not know what to install. For example:
setup(
    name='mymodule',
    # other arguments
    install_requires=[
        'myrequirement_alt',  # The dependency name (must match the egg name below)
    ],
    dependency_links=[
        'https://github.com/ihhcarus/myrequirement.git#egg=myrequirement_alt-1.3'  # Link with the version at the end
    ]
)
Note that I wouldn't recommend using dependency links at all, as they are deprecated; you should probably use requirements files instead.
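A sketch of that approach, reusing the fork's URL from the question (pip's Git support handles the checkout):
# requirements.txt
git+https://github.com/ihhcarus/myrequirement.git#egg=myrequirement
Install with:
pip install -r requirements.txt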
Trying to create a Python package. It seems to work, but I get a warning.
My setup.py is:
#! /usr/bin/env python
from distutils.core import setup
setup(
    name='myPKG',
    version='0.02.01',
    url='http://someURL.02.01',
    packages=['scripts', 'statistics'],
    author='Research-Team',
    author_email='me#gmail.com',
    description='This is my package',
    scripts=['scripts/myScript.py'],
    entry_points={'console_scripts': ['myCommandlineName = scripts.myScript:testRequireDecorator']},
    install_requires=['numpy >= 1.5.1', 'scipy >= 0.9.0', 'poster']
)
I get the following warnings. Why, specifically, the first two UserWarnings?
root#TK: python setup.py sdist
/usr/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'entry_points'
warnings.warn(msg)
/usr/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'install_requires'
warnings.warn(msg)
running sdist
running check
package init file 'scripts/__init__.py' not found (or not a regular file)
reading manifest template 'MANIFEST.in'
no previously-included directories found matching '.git'
warning: no previously-included files matching '*.pyc' found under directory '.'
writing manifest file 'MANIFEST'
creating myPKG-0.02.01
creating myPKG-0.02.01/scripts
creating myPKG-0.02.01/statistics
making hard links in myPKG-0.02.01...
hard linking README -> myPKG-0.02.01
hard linking setup.py -> myPKG-0.02.01
hard linking scripts/myScript.py -> myPKG-0.02.01/scripts
hard linking statistics/RunningMedian.py -> myPKG-0.02.01/statistics
hard linking statistics/RunningStdev.py -> myPKG-0.02.01/statistics
hard linking statistics/__init__.py -> myPKG-0.02.01/statistics
Creating tar archive
removing 'myPKG-0.02.01' (and everything under it)
You're using distutils, but you need setuptools in order to use those options (entry_points and install_requires):
from setuptools import setup
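That is the only change needed in the setup.py above; with setuptools, entry_points and install_requires are recognized options and the first two UserWarnings disappear:
#! /usr/bin/env python
from setuptools import setup  # was: from distutils.core import setup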