My problem is that I can't upload my module to PyPI. When I run
twine upload dist/easy-email-0.0.1.tar.gz
I get
HTTPError: 400 Client Error: 'Easy-email-0.0.1.tar.gz' is an invalid value for Download-URL. Error: Invalid URI see https://packaging.python.org/specifications/core-metadata for url: https://test.pypi.org/legacy/
What am I doing wrong?
Here is the setup.py:
from distutils.core import setup
setup(
name = 'easy-email',
packages = ['easy-email'],
version = '0.0.1', # Ideally should be same as your GitHub release tag version
description = 'Send emails in python!',
author = 'myname',
author_email = 'myemail',
url = 'https://github.com/marmadukeandbob05/Easy-Email/',
download_url = 'Easy-Email-0.0.1.tar.gz',
keywords = ['email', 'gmail'],
classifiers = [],
)
Your download_url is invalid: it is not a valid URL. Note that you don't need to set that value at all when uploading your installation archive to PyPI, because the download URL will then simply point at PyPI itself.
Only set download_url when you are going to host your packages elsewhere, not on PyPI. It must be a full URL, one that starts with http:// or https://; pip or easy_install would then follow that URL from PyPI to find the installation archive. In that case you'd only use twine register to record the metadata, and not use twine upload at all.
The error message linked you to the documentation for the field:
A string containing the *URL* from which this version of the distribution can be downloaded.
Emphasis mine; Easy-Email-0.0.1.tar.gz is not a URL. It is merely a filename.
You'd use this when you want people to download the archive from a different host, such as GitHub. For example, if the requests project wanted people to download the release from GitHub instead of from the PyPI servers, they could use download_url = 'https://github.com/requests/requests/archive/v2.18.4.tar.gz', and then only use twine register to put the metadata on PyPI.
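A minimal sketch of the corrected setup.py, assuming you let PyPI host the archive itself (the commented-out GitHub download_url is only an illustration of what a valid full URL would look like):

```python
from distutils.core import setup

setup(
    name = 'easy-email',
    packages = ['easy_email'],  # note: an importable package directory can't contain a hyphen
    version = '0.0.1',
    description = 'Send emails in python!',
    author = 'myname',
    author_email = 'myemail',
    url = 'https://github.com/marmadukeandbob05/Easy-Email/',
    # download_url removed: twine upload puts the archive on PyPI itself.
    # Only set it if the archive lives elsewhere, and then as a full URL, e.g.:
    # download_url = 'https://github.com/marmadukeandbob05/Easy-Email/archive/0.0.1.tar.gz',
    keywords = ['email', 'gmail'],
    classifiers = [],
)
```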
Related
I have recently bumped into an issue while trying to release a new version of my PyPI package.
After updating the source code (and preparing the files for the release), I cannot find any up-to-date guide (see this, for instance) on how to release a new version to PyPI.
Most guides reference setup.py, which has now been replaced by pyproject.toml.
So, from Windows (IDE: VScode), the old command
py setup.py sdist bdist_wheel
does not work anymore. When replacing setup.py with pyproject.toml
I get the following error:
File "C:\Users\generic\pyproject.toml", line 3
build-backend = "hatchling.build"
^
SyntaxError: cannot assign to operator
My pyproject.toml is built like any generic pyproject.toml file
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[project]
name = "example_package_YOUR_USERNAME_HERE"
version = "0.0.2"
authors = [
{ name="Example Author", email="author@example.com" },
]
description = "A small example package"
readme = "README.md"
requires-python = ">=3.7"
classifiers = [
"Programming Language :: Python :: 3",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
]
[project.urls]
"Homepage" = "https://github.com/pypa/sampleproject"
"Bug Tracker" = "https://github.com/pypa/sampleproject/issues"
I am aware of the Python Packages guide. However, it explains how to release using git, and I'd like an equivalent of the old method.
As user @phd pointed out, there are some guides out there. However, these only mention how to upload a new package to PyPI, not a new distribution release. Also, they do not address uploads with API tokens. Am I supposed to create a new API token? Old tokens are not accessible anymore, and new ones return the following error:
ERROR HTTPError: 403 Forbidden from https://test.pypi.org/legacy/
Invalid or non-existent authentication information. See https://test.pypi.org/help/#invalid-auth for more information.
Solved:
For reference, see Trouble following packaging libraries tutorial: upload #313.
I was also confused about whether to use twine upload --repository testpypi dist/* to update a package or twine upload --repository-url URL dist/* to update an existing package. The documentation on this is not clear to me. I also encountered all sorts of issues while using API tokens for authentication. What worked for me was the following suggestion:
The correct URLs for the repositories are
https://upload.pypi.org/legacy/ and https://test.pypi.org/legacy/.
So, if you wish to update an existing package using an API token:
change the version of the project in pyproject.toml from, e.g., 0.0.1 to 0.0.2
generate a project-scoped API token on PyPI and use it for authentication
py -m build
py -m twine upload --repository-url https://upload.pypi.org/legacy/ dist/*
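For reference, the steps above as one session (a sketch assuming the Windows py launcher; pyproject.toml is declarative configuration and is never executed directly, which is why running it with py raises a SyntaxError):

```shell
py -m pip install --upgrade build twine   # build frontend and uploader
py -m build                               # writes the sdist and wheel into dist/
py -m twine upload --repository-url https://upload.pypi.org/legacy/ dist/*
# at the prompt, use __token__ as the username and the project-scoped
# API token (including its pypi- prefix) as the password
```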
I've got a Pipfile with two sources declared: one source is the global, public PyPI, while the other is a small local repository which hosts some private packages, but doesn't mirror PyPI itself. I've got this set up as follows:
[[source]]
url = "http://my.private.repo.example.com/pypi/simple"
verify_ssl = false
name = "private"
[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true
name = "pypi"
This being in place, I use both mirrors to source packages:
[packages]
requests = "*"
some_private_package = {version="*", index="private"}
My issue is that this results in a failure to resolve some dependencies. Let's say that some_private_package depends on Flask, which is available from public PyPI but isn't hosted on the private repo; building some_private_package fails because Flask can't be found on the private repo, and no attempt is made to scan PyPI for it.
Is there any way to get Pipenv to search for dependencies on both available sources?
A bit of a late answer to this: apparently the private host doesn't correctly handle a wildcard version specifier, preferring either a bare package name or a valid version specifier instead.
Explicitly pinning all packages seems to be the way to go when working with some of the self-hosted PyPI servers.
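Under that assumption, the [packages] section would pin explicit versions rather than use "*" (the version numbers below are placeholders, not real releases):

```toml
[packages]
requests = "==2.18.4"                                        # placeholder pin
some_private_package = {version="==0.0.1", index="private"}  # placeholder pin
```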
I'm having trouble uploading my package to pypi. I used to be able to just use python setup.py sdist upload -r pypi but this now causes an error:
Upload failed (400): requires: Must be a valid Python identifier.
error: Upload failed (400): requires: Must be a valid Python identifier.
I've tried a few things to get this working but everything has failed with the same error.
I removed the current dist, build and egg folders in my root directory. Then I increased my package version number by 1 micro version. I ensured my ~/.pypirc file is as it should be according to instructions:
[distutils]
index-servers =
pypi
[pypi]
username: c.welsh2
password: ...
and updated pip, twine and setuptools. I created a build using
python setup.py bdist_wheel
which created the build in /package_root/dist/*, and I tried uploading to PyPI using
twine upload dist/*
And again I get:
HTTPError: 400 Client Error: requires: Must be a valid Python identifier. for url: https://upload.pypi.org/legacy/
Does anybody know what is causing this problem?
For completeness, here is my setup file:
from distutils.core import setup
import setuptools
#version
MAJOR = 4
MINOR = 0
MICRO = 5
#=======
__version__ = '%d.%d.%d' % (MAJOR, MINOR, MICRO)
setup(
name = 'PyCoTools',
packages = ['PyCoTools'], # this must be the same as the name above
version = __version__,
description = 'A python toolbox for COPASI',
author = 'Ciaran Welsh',
requires=['lxml','argparse','pandas','numpy','scipy','matplotlib.pyplot','scipy','seaborn','sklearn'],
package_data={'PyCoTools':['*.py','Documentation/*.pdf',
'logging_config.conf',
'Documentation/*.html','Licence.txt',
'ReadMe.md',
'Examples/KholodenkoExample/*',
'Examples/BioModelsWorkflowVersion1/*',
'Scripts/*.py',
'Tests/*.py',
'Tests/*.cps',
'PyCoToolsTutorial/*.pickle',
'PyCoToolsTutorial/*.py',
'PyCoToolsTutorial/*.ipynb',
'PyCoToolsTutorial/*.html',
'PyCoToolsTutorial/*.cps']},
author_email = '--<hidden>',
##
url = 'https://pypi.python.org/pypi/PyCoTools',
keywords = ['systems biology','modelling','biological',
'networks','copasi','identifiability analysis','profile likelihood'],
license='GPL4',
install_requires=['pandas','numpy','scipy','matplotlib',
'lxml'],
long_description='''Tools for using Copasi via Python and calculating profile likelihoods. See Github page and documentation for more details''')
Turns out there was quite an unforgiving typo: I couldn't give matplotlib.pyplot to the requires argument, since the distribution is called matplotlib!
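A sketch of the corrected requires list (the duplicate 'scipy' is also dropped; 'sklearn' is kept as in the original, though the distribution is published as scikit-learn):

```python
# requires entries must be distribution names, not module paths:
# 'matplotlib.pyplot' is a module inside the 'matplotlib' distribution.
requires = ['lxml', 'argparse', 'pandas', 'numpy', 'scipy',
            'matplotlib', 'seaborn', 'sklearn']
```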
I get a 403 error saying "you are not allowed to store 'tester' package information" when I try to upload my package to PyPI.
I use the command c:\user\python3.exe setup.py register on Windows and python3 setup.py register on Linux and get the same error.
I have registered on the PyPI website and clicked the confirmation link that was sent to my email. I don't know what the fault is. What am I doing wrong here?
EDIT:
from distutils.core import setup
setup(
name = 'tester',
version = '1.0.1',
py_modules = ['tester'],
author = 'lind',
author_email = 'lind@gmail.com',
url = 'http://www.lindl.com',
description = 'A simple printer of nested lists',
)
At first I made the tester package, but as per the suggestion, I built the recursive package and then the test package. The packages built correctly (here the dist folder appears empty, but it contains the zip file inside), yet it still gives me this specific error.
Well, it is because there is already a package called tester, and it looks like you are not its owner.
You need to give your package another, unique name.
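One way to check in advance whether a name is free (a sketch; pip index is available in pip >= 21.2, and the exact output depends on your pip version):

```shell
pip index versions tester
# if this lists versions, the name is already taken on PyPI; pick a unique
# one, e.g. tester-lind, and set it as name= in setup()
```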
I have set up a pypiserver behind an nginx proxy which uses htpasswd for authentication. I am currently able to upload sdists, but I can't figure out how to download them. I want to be able to download them both when running setup.py test and when using pip. Is this possible?
[distutils]
index-servers =
private
[private]
repository = https://example.com/pypi
username = remco
password = mypass
To make it extra hard, the server currently uses an unverified SSL connection.
I tried the following setup based on http://pythonhosted.org/setuptools/setuptools.html#setuptools-package-index, but the only documentation on this is 'XXX'
#!/usr/bin/env python2.7
from setuptools import setup
setup(
name='asd',
version='0.0.1',
package_index='https://example.com/pypi/simple',
test_suite='test',
tests_require=['foo==0.0.1'])
For using your index with pip, create ~/.pip/pip.conf with this content:
[global]
index-url = https://remco:mypass@build.d-centralize.nl/pypi/simple
cert = /etc/ssl/certs/your_cert_CA.pem
A little bit of documentation on pip.conf is here, and on pypiserver here.
Perhaps you can also try using package_index='https://user:pass@example.com/pypi/simple' in setup.py.
The server certificate had to be set up properly.
For uploading, one must create a valid ~/.pypirc file:
[distutils]
index-servers = example
[example]
repository = https://example.com/pypi
username = myname
password = mypass
For installing packages, one needs to add the following section to ~/.pip/pip.conf:
[global]
extra-index-url = https://myname:mypass@example.com/pypi/simple
As knitti noted in a previous answer, it is also possible to use index-url instead of extra-index-url. This does mean that the Cheese Shop (the main PyPI index) is not used as a second server.
For using a private server with setuptools unittesting you need to add the following to your setup.py:
from setuptools import setup
setup(
...
dependency_links=[
'https://myname:mypass@example.com/pypi/packages/'
])
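With both files in place, a hypothetical session would resolve the private package from the private index and its public dependencies from PyPI (package name from the question above):

```shell
pip install some_private_package   # found via extra-index-url
python setup.py test               # tests_require resolved via dependency_links
```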