How can I re-upload a package to PyPI? - python

I uploaded a package to PyPI, but I ran into some trouble after the upload, so I deleted the package completely. When I tried to re-upload it, I got this error:
HTTP Error 400: This filename has previously been used, you should use a different version.
It seems PyPI tracks upload activity: I deleted the project and my account and uploaded again, but I can still see the previous record. Why?
How can I solve the problem?

In short, you cannot re-upload a distribution with the same filename, for stability reasons. You can read more about this issue at https://github.com/pypa/packaging-problems/issues/74.
You need to change the distribution's filename, which is usually done by increasing the version number, and upload it again.
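For example, a minimal sketch of the usual fix in setup.py (the package name and version numbers here are placeholders, not taken from the question):

from setuptools import setup

setup(
    name="your_package_name",  # placeholder name
    version="1.0.1",           # bumped from 1.0.0 so the rebuilt files get a fresh filename
)

After bumping the version, rebuild and upload as usual, e.g. with python -m build followed by twine upload dist/*.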

You can re-upload a package under the same name, as long as each upload uses a new version.
I faced a similar issue. What I did was increase the version number in setup.py and delete the folders generated by running python setup.py sdist, i.e. dist and your_package_name.egg-info, then run python setup.py sdist again to make the package ready for upload.
I think the leftover artifacts in the folders generated by sdist, i.e. dist and your_package_name.egg-info, were the problem, so you have to delete them before rebuilding.
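Concretely, the steps above look something like this (a sketch for Unix-like shells, assuming a setup.py-based build; replace your_package_name with the real distribution name):

# remove stale build artifacts from the previous release
rm -rf dist/ your_package_name.egg-info/
# bump version= in setup.py, then rebuild the source distribution
python setup.py sdist
# upload the freshly built files
twine upload dist/*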

If you are running your own local PyPI server, you can use the -o, --overwrite option, which allows overwriting existing package files:
pypi-server -p 8080 --overwrite ~/packages &

Related

How to put data in a PyPI python package and retrieve it via pip?

I am trying to package some data alongside the scripts within a package of mine: https://pypi.org/project/taxon2wikipedia/0.0.4/
The source distribution seems to contain the files that I need, but when trying to use the package I get the error message:
FileNotFoundError: [Errno 2] No such file or directory: '~/my_venv/lib/python3.8/site-packages/taxon2wikipedia/dicts/phrase_start_dict.json'
The "data" and "dicts" folders are not there in "site-packages/taxon2wikipedia", but are present in the package when I manually download it? Any suggestions on why it might be happening?
Thanks!
Edit: extra information
I already have a MANIFEST.in file with recursive-include src *.rq *.py *.jinja *json, and even after changing it to other similar options, it did not work.
I am using a pyproject.toml configuration and a mock setup.py to run setuptools.setup().
I run python3 -m build to build the package. Maybe the problem lies there?
This is the source repository by the way: https://github.com/lubianat/taxon2wikipedia
The files in PyPI are all correct, but don't seem to be downloaded when I pip install the package.
Edit 2 - Related question
Why does "pip install" not include my package_data files?
It seems to be some problem with how pip installs the package. The solution there is different, as in my case the files seem to be in the correct directory.
You need a MANIFEST.in file in your source directory to tell the packaging process which non-Python files need to come along in a built package.
In your case, a MANIFEST.in file with
recursive-include taxon2wikipedia *.json
should do the trick.
Since .whl files are zips, you can use zipinfo dist/yournewwheel.whl to see that the files are there too before pushing a release.
I came across this blog post which helped me solve the issue:
https://jwodder.github.io/kbits/posts/pypkg-data/
In the end, I was missing
[options.package_data]
* = *.json *.jinja *.rq
from my setup.cfg.
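As an aside, once the data files are installed correctly, reading them through importlib.resources avoids hard-coding site-packages paths like the one in the error above. A minimal sketch, assuming dicts is importable as a subpackage of taxon2wikipedia (the names are taken from the question):

import json
from importlib import resources

# Open a JSON data file bundled inside the installed package
# instead of building a filesystem path by hand.
with resources.open_text("taxon2wikipedia.dicts", "phrase_start_dict.json") as f:
    phrase_start_dict = json.load(f)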

Pipenv package hash does not match lock file

We have a lock file which has not changed since April 2021. Recently, we have started seeing the following error on pipenv install --deploy:
ERROR: THESE PACKAGES DO NOT MATCH THE HASHES FROM THE REQUIREMENTS FILE. If you have updated the package versions, please update the hashes. Otherwise, examine the package contents carefully; someone may have tampered with them.
gunicorn==20.1.0 from https://files.pythonhosted.org/packages/e4/dd/5b190393e6066286773a67dfcc2f9492058e9b57c4867a95f1ba5caf0a83/gunicorn-20.1.0-py3-none-any.whl (from -r /tmp/pipenv-g7_1pdnq-requirements/pipenv-d64a8p6k-hashed-reqs.txt (line 32)):
Expected sha256 e0a968b5ba15f8a328fdfd7ab1fcb5af4470c28aaf7e55df02a99bc13138e6e8
Got 9dcc4547dbb1cb284accfb15ab5667a0e5d1881cc443e0677b4882a4067a807e
We have opened an issue on the project's GitHub: https://github.com/benoitc/gunicorn/issues/2889
We believe that it would be unsafe to use this new version without confirmation it is correct and safe in case someone has maliciously updated the package in the package repository.
Is there a way we can grab the wheel file from a previous docker build and force that to be used for the time being so we can safely build with the existing version and checksum?
Thanks
Thanks to @Ouroborus for the answer:
e0... is for the .tar.gz (source) package, 9d... is for the .whl package. (See the "view hashes" links on PyPI's gunicorn files page) I'm not sure why your systems are choosing to download the wheel now when they downloaded the source previously. However, those are both valid hashes for that module and version.
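To verify a previously downloaded artifact against the lock file before forcing its use, you can hash it yourself. A small sketch (the wheel filename comes from the error above; having the file on disk locally is assumed):

import hashlib

# Compare a locally cached wheel against the hashes recorded in the lock file.
with open("gunicorn-20.1.0-py3-none-any.whl", "rb") as f:
    print(hashlib.sha256(f.read()).hexdigest())
# The wheel should print the 9dcc4547... hash; the .tar.gz sdist hashes to e0a968b5...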

Why can't I upload my own package to PyPI when my credentials are working?

I am looking to deploy a module to PyPI and this error is thrown in the console:
HTTPError: 403 Client Error: The credential associated with user 'aloisdg' isn't allowed to upload to project 'example-pkg-your-username'. See https://test.pypi.org/help/#project-name for more information. for url: https://test.pypi.org/legacy/
It is possible to reproduce the error by following step by step the tutorial in the official documentation: Packaging Python Projects.
My credentials work fine when I try to log in to the PyPI website directly.
Why can't I upload my own package?
This error means that you can't upload this package because you, as a user, are not allowed to. Why? Because it is not your package: someone already created a package with this name, so your upload is treated as an update to that existing package. You would not get this error if the original creator had added you as a maintainer of the package.
How to fix this error? Replace example-pkg-your-username with example-pkg-aloisdg (or any name absent from PyPI).
This answer was inspired by issue #4607.
You missed this step in the tutorial:
Open setup.py and enter the following content. Update the package name to include your username (for example, example-pkg-theacodes), this ensures that you have a unique package name and that your package doesn’t conflict with packages uploaded by other people following this tutorial.
Change the package name to be something unique and your upload will succeed.
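For instance, with the tutorial's setup.py, the fix is just the name field (a sketch; the username comes from the error above):

from setuptools import setup

setup(
    name="example-pkg-aloisdg",  # must be unique on Test PyPI / PyPI
    version="0.0.1",
)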

pipenv install django without re-downloading it every time?

I have limited internet access temporarily, and every time I need to start a new Django project I have to re-download Django (and other dependencies) that I have already downloaded for other projects. Is there an easier way to pipenv install django and other dependencies without downloading them all over again every time?
I read that there is a cache of these dependencies, but if that's true, my issue then becomes that I don't know how to access the cache so that the dependencies install correctly into the project. Alternatively, if there were one location in my 'downloads' folder, I could install directly from it instead of from the internet.
I'm hoping for something along the lines of:
pipenv install django==2.2.0 from=c:\downloads\dependencies
I expect that this would install from previously downloaded files without the use of an internet connection.
You can enable the pip cache via the pip config file (see the pip configuration user guide):
[global]
no-cache-dir = false
download-cache=/path/to/cache-dir (could be /usr/local/pip/cache)
(Don't forget to actually create this directory.)
The file's locations:
$HOME/.pip/pip.conf on Unix
%HOME%\pip\pip.ini on Windows
or, for virtualenvs:
$VIRTUAL_ENV/pip.conf on Unix
%VIRTUAL_ENV%\pip.ini on Windows
You can create the file if it's not there.
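For the from=c:\downloads\dependencies workflow the question asks for, plain pip can already do this with --find-links. A sketch using standard pip options (pipenv generally honors pip's PIP_* environment variables, though this varies by pipenv version; the folder path comes from the question):

# while you still have internet access, save the packages locally
pip download django==2.2.0 -d c:\downloads\dependencies
# later, install offline from that folder only
pip install --no-index --find-links=c:\downloads\dependencies django==2.2.0
# with pipenv, roughly the equivalent (Windows cmd syntax):
set PIP_NO_INDEX=1
set PIP_FIND_LINKS=c:\downloads\dependencies
pipenv install django==2.2.0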
You can create a shared pipenv that includes packages such as django, djangorestframework, psycopg2, etc., to be used among your Django projects. You can find an answer on how to create a shared pipenv here.

use a relative path in requirements.txt to install a tar.gz file with pip

We're using a requirements.txt file to store all the external modules needed. Every module but one is fetched from the internet. The remaining one is stored in a folder under the one holding the requirements.txt file.
By the way, this module can easily be installed with pip install.
I've tried using this:
file:folder/module
or this:
file:./folder/module
or even this:
folder/module
but each one throws an error.
Does anyone know which is the right way to do this?
Thanks
In the current version of pip (1.2.1), the way relative paths in a requirements file are interpreted is ambiguous and semi-broken. There is an open issue on the pip repository which explains the various problems and ambiguities in greater detail:
https://github.com/pypa/pip/issues/328
Long story short, the current implementation does not match the description in the pip documentation, so as of this writing there is no consistent and reliable way to use relative paths in requirements.txt.
THAT SAID, placing the following in my requirements.txt:
./foo/bar/mymodule
works when there is a setup.py at the top level of the mymodule directory. Note the lack of the file: protocol designation and the inclusion of the leading ./. This path is not relative to the requirements.txt file, but rather to the current working directory. Therefore it is necessary to navigate into the same directory as the requirements.txt and then run the command:
pip install -r requirements.txt
It's based on the current working directory (found with os.getcwd() if needed) and the relative path you provide in the requirements file.
Your requirements file should look like this:
fabric==1.13.1
./some_dir/some_package.whl
packaging==16.8
Note this will only work for .whl files, not .exe.
Remember to keep an eye on the pip install output for errors.
For me, only the file: directive worked. This even works with AWS SAM, i.e. sam build. Here is my requirements.txt, where englishapps is my own custom Python package that I need in AWS Lambda:
requests
file:englishapps-0.0.1-py3-none-any.whl
As mentioned before, the files are relative to the current working directory, not to the requirements.txt.
Since v10.0, requirements files support environment variables in the format ${VAR_NAME}. This can be used as a mechanism to specify a file location relative to the requirements.txt. For example:
# Set REQUIREMENTS_DIRECTORY outside of pip
${REQUIREMENTS_DIRECTORY}/folder/module
Another option is to use the environment manager called Pipenv to manage this use case.
The steps after you do the pipenv install for a new project:
pipenv install -e app/deps/fastai (-e is editable, and is optional)
then you will see the following line in your Pipfile:
fastai = {editable = true,path = "./app/deps/fastai"}
Here are similar issues:
https://github.com/pypa/pipenv/issues/209#issuecomment-337409290
https://stackoverflow.com/a/53507226/7032846
A solution that worked for me both for local and remote files (via a Windows share). Here is an example requirements.txt:
file:////REMOTE_SERVER/FOLDER/myfile.whl
