I need to install a Python project on a machine without an internet connection. While creating the deb package I downloaded all the Python requirements with pip download and put them into the deb.
When installing the deb on the machine I get errors about Python packages not being found, and I discovered that the packages causing problems are the ones specified in the setup_requires field of the setup.py files of the included packages.
For example, the PyJWT package has setup_requires=['pytest-runner'], but pytest-runner is not downloaded by pip download, and this causes an error during installation.
My questions are:
Is there a way to have pip download all dependencies (including those in setup_requires fields)?
Is this the correct workflow for creating a deb that has to be installed offline?
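For reference, the download/install workflow I'm using looks roughly like this (directory names are placeholders):

```shell
# On the machine with internet access: fetch all pinned requirements
# into a directory that gets shipped inside the deb.
pip download -d ./vendor -r requirements.txt

# On the offline machine (e.g. from the deb's postinst script):
pip install --no-index --find-links=./vendor -r requirements.txt
```

(I have seen pip download --only-binary=:all: mentioned as a possible workaround, since installing a wheel never runs setup.py and therefore never triggers setup_requires, but not every package publishes wheels.)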
I am behind corporate firewall and want to install a python package from .bz2 file.
I verified my Python version to be 3.6.5 and downloaded the appropriate package from Anaconda Cloud.
This is how I am installing the package:
conda install path_to_.bz2file
The error I get:
I saw several examples that mention extracting the archive and running the setup.py file. I didn't find any setup.py file after extraction. Maybe it's related to PyPI, not sure.
I don't have an internet connection on this machine, so I have to look for offline installation options.
Any help is appreciated.
PyPI distributions usually come with a setup.py. Here are the steps to download and install offline.
Find the package you want to download on PyPI
Download the latest distribution to a local directory; it should be a tar.gz (tarball) file.
Open Anaconda Prompt/Terminal
cd to the tar.gz parent folder
pip install (filename)
Sometimes a package will have dependencies that would normally be installed from the internet. In this case you will need to do the same with each dependency before you can successfully run the setup.py.
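Put together, and assuming the dependencies were downloaded into the same folder, the steps look something like this (filenames are placeholders; the --no-index/--find-links flags are my addition, to keep pip from reaching out to PyPI):

```shell
cd /path/to/downloads                 # the tar.gz parent folder
# plain install: pip unpacks the tarball and runs its setup.py
pip install somepackage-1.0.tar.gz
# fully offline variant, resolving dependencies from the same folder
pip install --no-index --find-links=. somepackage-1.0.tar.gz
```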
I am trying to install a library in a virtualenv instance with pip. The library version I want (wxPython 3.0.2)
is not available on PyPI; it is only available for download from SourceForge. Thus, I have the source tarball downloaded on my machine and I am trying to install it in such a way that it will play nicely with virtualenv.
(I am on a Windows computer, running Python 2.7.)
I have tried the following:
doing a direct install: pip install wxPython-src-3.0.2.0.tar.bz2
extracting the files from the tarball to wxPython-src-3.0.2.0, then installing from the extracted directory: pip install wxPython-src-3.0.2.0
extracting the files from the tarball, then navigating into the extracted folder to the nested wxPython directory, which holds the setup.py file, and then installing from there: pip install wxPython
The last attempt seems the most promising, but I get the following traceback:
Processing \wxpython-src-3.0.2.0\wxpython
Complete output from command python setup.py egg_info:
Setuptools must be installed to build an egg
----------------------------------------
Command "python setup.py egg_info" failed with error code 1 in c:\users\__MY_USERNAME__\appdata\local\temp\pip-req-build-q0pxlt\
This is also strange, because it suggests I don't have setuptools even though I can run pip list and see version 40.6.3 installed.
Any help appreciated.
Why not install a precompiled version? There are a lot of .exe files at SF. You probably need wxPython3.0-win64-3.0.2.0-py27.exe.
Also take a look at Christoph Gohlke's collection.
If you still insist on installing from sources please bear in mind that wxPython 3 is so old it predates pip. Forget about pip.
First, you need to install wxWidgets as wxPython is just a Python wrapper for wxWidgets C++ library. Extract wxPython-src-3.0.2.0.tar.bz2 and follow instructions in wxPython-src-3.0.2.0/docs/msw/install.txt.
After compiling and installing wxWidgets compile wxPython. See wxPython-src-3.0.2.0/wxPython/docs/BUILD.txt.
My eventual solution was the easy way out: installing my package (wxPython) into the system Python as #phd suggested, and giving the virtualenv access to it via either virtualenv --system-site-packages env or deleting the "no-global-site-packages.txt" file in an existing environment folder.
Not what I expected to do, but it works so no complaints.
I am trying to install a Django app on Heroku. My app needs pyke3. The recommended way to install pyke3 is to download pyke3-1.1.1.zip from https://sourceforge.net/projects/pyke/files/pyke/1.1.1/ and then install it (into a virtualenv if desired) following the instructions at http://pyke.sourceforge.net/about_pyke/installing_pyke.html. How do I install pyke3 on Heroku? Is there a way to add this to requirements.txt, and how will Heroku know where to get the pyke3 zip file?
From pip's docs:
pip supports installing from PyPI, version control, local projects, and directly from distribution files.
So, pip supports installing packages directly from links. All you have to do is put the link to the required package in your requirements file.
To install the package pyke3-1.1.1.zip, add this link to your requirements file:
https://sourceforge.net/projects/pyke/files/pyke/1.1.1/pyke3-1.1.1.zip/download
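In requirements.txt that link just sits alongside the normal entries, for example (the Django pin here is purely illustrative):

```text
Django==1.11
https://sourceforge.net/projects/pyke/files/pyke/1.1.1/pyke3-1.1.1.zip/download
```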
I'm in a situation where I have internet access on a computer but do not have permission to install anything, and Python is missing as well.
Python, on the other hand, is installed on another computer without internet access. The two machines are on separate networks, but I can transfer files through a file server that is connected to each computer as a network drive.
My question is whether it is possible to download packages with all their dependencies without having Python and pip installed, then transfer the files and finally install them.
I simply tried to install a package downloaded from the PyPI website as a *.zip or *.tar.gz file, using cmd with:
chdir \path\to\package\file
python setup.py install
Importing that package afterwards raises errors because the dependencies are missing.
It would be totally fine to just download Anaconda, as all the packages I need are already included with the installation. But I would still have the same problem when I want to update the packages.
So I have published a conda package (link).
This package contains C extensions (generated from Cython code) that need to be compiled when the package is installed. My problem is that none of the extensions are compiled when running the install command:
conda install -c nicolashug scikit-surprise
Compiling the extensions can be done by simply running
python setup.py install
which is exactly what pip does. The package is on PyPI and works fine.
As far as I understand, this setup.py command is only called when I build the conda package using conda build: the meta.yaml file (created with conda skeleton) contains
build:
  script: python setup.py install --single-version-externally-managed --record=record.txt
But I need this to be done when the package is installed, not built.
Reading the conda docs, it looks like the install process is merely a matter of copying files:
Installing the files of a conda package into an environment can be thought of as changing the directory to an environment, and then downloading and extracting the .zip file and its dependencies
That would mean I would have to build the package for all platforms and architectures and then upload them to conda... which is impossible for me.
So, is there a way to build the package when it is installed, just like pip does?
As far as I know, there is no way to have the compilation happen on the user's machine when installing a conda package. Indeed, the whole idea of a conda package is that you do the compiling so that I don't have to on my machine, and all that's distributed is the compiled library. On Windows in particular, setting up compilers so they work properly (with Python) is a big big PITA, which is one of the biggest reasons for conda (and also wheels installed by pip).
If you don't have access to a particular OS directly, you can use Continuous Integration (CI) services, such as Appveyor (Windows), Travis CI (Linux/macOS), or CircleCI (Linux/macOS) to build packages and upload them to Anaconda cloud (or to PyPI for that matter). These services integrate directly with GitHub and other code-sharing services, and are generally free for FOSS projects. That way, you can build packages on each commit, on each tag, or some other variation that you desire.
In the end, you may save more time by setting up these services, because you won't have to provide compiler support for users who can't install a source package from PyPI.
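As a sketch, a minimal Travis CI configuration for this could look like the following; the recipe directory, the $ANACONDA_TOKEN secret, and the tag-only upload policy are all assumptions, not taken from the actual package:

```yaml
# .travis.yml -- hypothetical sketch
language: generic
os: linux
install:
  - wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O miniconda.sh
  - bash miniconda.sh -b -p "$HOME/miniconda"
  - export PATH="$HOME/miniconda/bin:$PATH"
  - conda install -y conda-build anaconda-client
script:
  - conda build conda-recipe/          # directory containing meta.yaml
deploy:
  provider: script
  script: anaconda -t "$ANACONDA_TOKEN" upload "$HOME/miniconda/conda-bld"/*/*.tar.bz2
  on:
    tags: true                         # upload only tagged builds
```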