pipenv install django without re-downloading it every time? - python

I have limited internet access temporarily, and every time I need to start a new Django project I have to re-download Django (and other dependencies) that I have already downloaded for other projects. Is there an easier way to pipenv install Django and other dependencies without downloading them all over again every time?
I read that there is a cache of these dependencies, but if that's true my issue then becomes that I don't know how to access the cache so that the dependencies install correctly into the project. Alternatively, if there were one location in my 'downloads' folder, I could install directly from it instead of from the internet.
I'm hoping for something along the lines of:
pipenv install django==2.2.0 from=c:\downloads\dependencies
so that it installs from previously downloaded files without needing an internet connection.

You can enable the pip cache via the pip config file (see the pip config file user guide):
[global]
no-cache-dir = false
download-cache = /path/to/cache-dir (could be /usr/local/pip/cache)
(don't forget to actually create this directory)
The file's locations:
$HOME/.pip/pip.conf on Unix
%HOME%\pip\pip.ini on Windows
or for virtualenvs:
$VIRTUAL_ENV/pip.conf on Unix
%VIRTUAL_ENV%\pip.ini on Windows
You can create the file if it's not there.
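If the goal is to skip the network entirely, a rough sketch (assuming pipenv passes pip's standard environment variables through to pip, and using a hypothetical c:\downloads\dependencies folder) is to download the packages once and then point pip at that folder for later installs:
pip download django==2.2.0 -d c:\downloads\dependencies
set PIP_NO_INDEX=1
set PIP_FIND_LINKS=c:\downloads\dependencies
pipenv install django==2.2.0
The first command needs a connection once; after that, installs should resolve from the local folder instead of PyPI.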

You can create a shared pipenv that includes packages such as django, djangorestframework, psycopg2, etc., to be used among your Django projects. You can find an answer describing how to create a shared pipenv here.

Related

Redirect pip, setuptools, and everything related to private PyPI repository

I want all my PyPI-related queries to be run against a private server. This server hosts some company packages and proxies requests to the real PyPI server when it cannot serve the packages directly.
I am able to make pip read this repository, via export PIP_INDEX_URL='https://example.org/pypi/simple'.
However, when I try to install a package that depends on other private packages (via python setup.py install), the queries go straight to pypi.python.org.
I tried setting up .pydistutils.cfg, but it does nothing.
I tried editing setup.py to include dependency_links=['https://example.org/pypi/simple/pkgname'], but then I have to specify the full URL for each package. I do not want to do this.
I tried editing .pypirc to have pypi point to the required url. No luck here either.
Which configuration file or environment variable controls the index url for setup.py?
Edit ~/.pip/pip.conf:
[global]
trusted-host = private-server
index = http://user:password@private-server
index-url = http://user:password@private-server
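If it is setuptools itself fetching the dependencies (python setup.py install hands them off to easy_install), a hedged sketch is to also set the index in setup.cfg next to setup.py, or in ~/.pydistutils.cfg, under the [easy_install] section (the URL below is the example server from the question):
[easy_install]
index_url = https://example.org/pypi/simple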

How can I re-upload a package to PyPI?

I uploaded a package to PyPI, but I ran into some trouble after the upload, so I deleted it completely. When I tried to re-upload it, I got this error:
HTTP Error 400: This filename has previously been used, you should use a different version.
error: HTTP Error 400: This filename has previously been used, you should use a different version.
It seems PyPI tracks upload activity; I deleted the project and my account and uploaded again, but I can still see the previous record. Why?
How can I solve the problem?
In short, you cannot re-upload a distribution with the same filename, for stability reasons. You can read more about this issue at https://github.com/pypa/packaging-problems/issues/74.
You need to change the distribution's file name, usually by increasing the version number, and upload it again.
Yes, you can re-upload the package under the same name.
I faced a similar issue; what I did was increase the version number in setup.py and delete the folders generated by running python setup.py sdist, i.e. dist and your_package_name.egg-info, then run python setup.py sdist again to make the package ready for upload.
I think PyPI tracks the release from the folders generated by sdist, i.e. dist and your_package_name.egg-info, so you have to delete them.
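As a rough end-to-end sketch (the package name is a placeholder, and twine is assumed as the upload tool), the rebuild-and-re-upload sequence looks like:
rm -rf dist/ your_package_name.egg-info/
python setup.py sdist
twine upload dist/*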
If you are running your own local PyPI server, you can use the -o, --overwrite option, which allows overwriting existing package files:
pypi-server -p 8080 --overwrite ~/packages &

Check whether a python package has been installed in 'editable' (egg-link) mode or not?

Is there any way to check whether a Python package has been installed normally (pip install / setup.py install) or in editable/egg-link mode (pip install -e / setup.py develop)?
I know I could check whether the path to the package contains site-packages which would most likely mean it's a "non-editable" install, but this feels extremely dirty and I would rather avoid this.
The reason I'm trying to check this is that my application is checking for config files in various places, such as /etc/myapp.conf and ~/.myapp.conf. For developers I'd like to check in <pkgdir>/myapp.conf but since I show the list of possible locations in case no config was found, I really don't want to include the pkgdir option when the package has been installed to site-packages (since users should not create a config file in there).
pip contains code for this (it's used by pip freeze to prefix the line with -e). Since pip's API is not guaranteed to be stable, it's best to copy the code into your own application instead of importing it from pip:
import os
import sys

def dist_is_editable(dist):
    """Is distribution an editable install?"""
    for path_item in sys.path:
        egg_link = os.path.join(path_item, dist.project_name + '.egg-link')
        if os.path.isfile(egg_link):
            return True
    return False
The code is MIT-licensed so it should be safe to copy&paste into pretty much any project.
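A minimal usage sketch, assuming a hypothetical package name myapp and that pkg_resources is available:
import pkg_resources

dist = pkg_resources.get_distribution("myapp")
if dist_is_editable(dist):
    print("editable install: also look for <pkgdir>/myapp.conf")
else:
    print("regular install: only /etc/myapp.conf and ~/.myapp.conf")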

Use a filesystem directory instead of pypi behind a corporate firewall?

I am working on a product with a large number of python dependencies within a corporation that does not permit servers to contact external machines. Any attempt to circumvent this rule would be judged harshly.
The application is deployed via a batch-script (it's 32 bit windows) into a virtualenv. This batch script (ideally) should do nothing more than
# Precondition: Source code has been checked-out into myprog/src
cd myprog/src
setup.py install # <-- fails because of dependencies
myprog.exe
The problem comes with managing the dependencies - since it's impossible for the server to connect to the outside world my only solution is to have the script easy_install each of the dependencies before the setup starts, something like this:
cd myproc/deps/windows32
easy_install foo-1.2.3.egg
easy_install bar-2.3.4.egg
easy_install baz-3.4.5.egg <-- works but is annoying/wrong
cd ../../myprog/src
setup.py install
myprog.exe
What I'd like to do is make it so that the setup.py script knows where to fetch its dependencies from. Ideally this should be set as a command-line argument or environment variable; that way I'm not hard-coding the location of the dependencies into the project.
Ideally I'd like all of the eggs to be part of a 'distributions' directory: This can be on a network drive, shared on a web-server or possibly even be deployed to a local folder on each of the servers.
Can this be done?
I think what you are looking for are these options to pip: --no-index and --find-links:
--no-index
--find-links /my/local/archives
--find-links http://some.archives.com/archives
Docs are here.
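Applied to the batch script from the question, a hedged sketch (the deps path is hypothetical and is assumed to hold the dependencies as wheels or sdists rather than eggs, since pip does not install .egg files) would be:
cd myprog/src
pip install --no-index --find-links=c:\deps\windows32 .
myprog.exe
pip install . runs the project's setup.py and resolves its dependencies from the --find-links directory instead of PyPI.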

use a relative path in requirements.txt to install a tar.gz file with pip

We're using a requirements.txt file to store all the external modules needed. Every module but one is downloaded from the internet. The remaining one is stored in a folder under the one holding the requirements.txt file.
By the way, this module can easily be installed directly with pip install.
I've tried using this:
file:folder/module
or this:
file:./folder/module
or even this:
folder/module
but each of these always throws an error.
Does anyone know which is the right way to do this?
Thanks
In the current version of pip (1.2.1) the way relative paths in a requirements file are interpreted is ambiguous and semi-broken. There is an open issue on the pip repository which explains the various problems and ambiguities in greater detail:
https://github.com/pypa/pip/issues/328
Long story short the current implementation does not match the description in the pip documentation, so as of this writing there is no consistent and reliable way to use relative paths in requirements.txt.
THAT SAID, placing the following in my requirements.txt:
./foo/bar/mymodule
works when there is a setup.py at the top level of the mymodule directory. Note the lack of the file: protocol designation and the inclusion of the leading ./. This path is not relative to the requirements.txt file, but rather to the current working directory. Therefore it is necessary to navigate into the same directory as the requirements.txt and then run the command:
pip install -r requirements.txt
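For example, with a hypothetical layout like this, the install has to be run from the project root so the relative path resolves:
project/
    requirements.txt          <- contains the line ./foo/bar/mymodule
    foo/bar/mymodule/setup.py

cd project
pip install -r requirements.txt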
It's based on the current working directory (found with os.getcwd() if needed) and the relative path you provide in the requirements file.
Your requirements file should look like this:
fabric==1.13.1
./some_fir/some_package.whl
packaging==16.8
Note that this only works for .whl files, not .exe files.
Remember to keep an eye on the pip install output for errors.
For me, only the file: directive worked. This even works with AWS SAM, i.e. sam build. Here is my requirements.txt, where englishapps is my own custom Python package that I need in AWS Lambda:
requests
file:englishapps-0.0.1-py3-none-any.whl
As was mentioned before, the files are relative to the current working directory, not to the requirements.txt.
Since pip v10.0, requirements files support environment variables in the format ${VAR_NAME}. This can be used as a mechanism to specify a file location relative to the requirements.txt. For example:
# Set REQUIREMENTS_DIRECTORY outside of pip
${REQUIREMENTS_DIRECTORY}/folder/module
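A hedged usage sketch, where REQUIREMENTS_DIRECTORY is a variable you export yourself before invoking pip (the path is hypothetical):
export REQUIREMENTS_DIRECTORY=/home/me/vendored-packages
pip install -r requirements.txt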
Another option is to use the environment manager called Pipenv to manage this use case.
The steps after you do the pipenv install for a new project:
pipenv install -e app/deps/fastai (-e is editable, and is optional)
then you will see the following line in your Pipfile:
fastai = {editable = true,path = "./app/deps/fastai"}
Here are similar issues:
https://github.com/pypa/pipenv/issues/209#issuecomment-337409290
https://stackoverflow.com/a/53507226/7032846
A solution that worked for me for both local and remote files (via a Windows share). Here is an example requirements.txt:
file:////REMOTE_SERVER/FOLDER/myfile.whl
