I have code on an offline computer, so I needed to install dependency packages offline (which I only figured out how to do yesterday). Now I want to send my work to my teammates (it's a school project and half of them are first-time programmers), whether directly or through GitHub. I want to send them the packages I was using too, in case they need to do an offline installation.
Is it possible to just zip up the dependency packages as-is and send them over to unzip into the relevant directory? Where are these packages installed to? Are there other files that need to be sent? Or is there no way around running python/pip install on the wheels/tar.gz files I downloaded from PyPI (the solution I'm trying to avoid)?
You can make use of virtualenv for this purpose: https://virtualenv.pypa.io/en/latest/index.html
It will install dependencies inside a folder, so you can zip that folder and share it in many ways.
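A minimal sketch of that workflow (env and some-package are placeholder names):
pip install virtualenv
virtualenv env
env\Scripts\activate
pip install some-package
On Linux/macOS the activation step is source env/bin/activate instead; the installed packages end up under the env folder.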
The folder you are looking for is 'site-packages' or 'dist-packages'; packages placed there can be imported by Python even on a machine without a package manager.
The default directory for packages on windows is:
C:\Python\Lib\site-packages\
The default directory for packages on linux is:
/usr/lib/python2.7/dist-packages
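If you are unsure which directory your interpreter actually uses, you can ask Python itself (this works on a standard interpreter; some virtualenv setups do not provide getsitepackages):
python -c "import site; print(site.getsitepackages())"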
Related
I want to download Python libraries like NumPy, SciPy, etc. into a separate folder. I want to include that folder in the Python project so that whenever I switch to some other laptop, I don't need to install the libraries again; instead, I can import them from that folder. Is there any way?
You can easily install virtualenv.
Your libraries will be installed in a directory created by virtualenv.
https://pypi.org/project/virtualenv/
As another option, you can also use Docker.
I suggest using a virtual environment in this case. You could use pipenv so that the project has exactly the libraries you need for it to run.
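For example, a minimal pipenv workflow (numpy and myscript.py are just placeholders):
pip install pipenv
pipenv install numpy
pipenv run python myscript.py
pipenv records the dependency in a Pipfile, so the same environment can be recreated elsewhere with a plain pipenv install.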
You can do it: you can download the files statically from PyPI.
For the NumPy library, for example: https://pypi.org/project/numpy/#files
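If pip is available, it can also fetch a package together with all of its dependencies into a folder, which can then be used for offline installation (packages/ is an arbitrary folder name):
pip download numpy -d packages/
pip install --no-index --find-links packages/ numpy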
However, I would not recommend this approach, for several reasons:
The library itself has dependencies, so you would have to keep those dependencies along with the NumPy package as well.
These libraries are updated over time with newly added functionality and bug fixes, so eventually other libraries might become incompatible with the copy you bundled.
Recommended way:
Just create a requirements.txt file that lists every dependency with its version number.
Whenever you want to use your project elsewhere, install all of these libraries with the command below.
pip install -r requirements.txt
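To generate that file from an environment where the project already runs:
pip freeze > requirements.txt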
There are two major ways you can install Python libraries into a separate folder: a virtual environment or a container.
A virtual environment (venv, pipenv, etc.) is good because it is the simplest way to give your project its own set of libraries without impacting any other Python script on your system. The downside is that you really have to set up the environment (including installing the libraries) on every computer you move your script to. This can and should be automated, of course, but it has to be done either way.
A container, on the other hand, requires additional resources to build and manage, but it is exactly a box with a specific version of your script along with all the libraries and binaries it requires. There is no need to reinstall libraries when moving to a new laptop/desktop/server/cloud/whatever. For this case I would recommend Docker/Kubernetes, but it's better to start with Docker.
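As a rough illustration, a minimal Dockerfile for such a project might look like this (the base image tag and file names are assumptions):
FROM python:3.9-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "myscript.py"]
Building the image with docker build and running it with docker run then works the same on any machine with Docker installed.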
I'm in a situation where I have internet access on a computer but do not have permission to install anything, and Python is missing as well.
Python, on the other hand, is installed on another computer without internet access. Both are in separate networks, but I can transfer files through a file server which is connected to each computer as a network drive.
My question is whether it is possible to download packages with all their dependencies without having Python and pip installed, then transfer the files and finally install them.
I simply tried to install a downloaded package from the PyPI website as a *.zip or *.tar.gz file using the cmd with
chdir \path\to\package\file
python setup.py install
Importing that package afterwards raises errors because the dependencies are missing.
It would be totally fine if I just downloaded Anaconda, as all the packages I need are already included in the installation. But I would still have the same problem whenever I wanted to update the packages.
I'm not sure what the Python way is to include a library obtained from GitHub.
I'm planning to use this library: https://github.com/nim901/gfycat. Right now I just downloaded the zip, extracted it, and put it in the lib folder. I have to check this library into the repo for it to work on Heroku. Is there a way to install the lib automatically from GitHub?
Heroku has support for git-backed python dependencies via pip: https://devcenter.heroku.com/articles/python-pip#git-backed-distributions
I believe this fits your requirements better than checking the actual libraries into git. From the link above:
Anything that works with a standard pip requirements file will work as expected on Heroku.
Thanks to pip’s Git support, you can install a Python package that is hosted on a remote Git repository.
For example:
git+git://github.com/kennethreitz/requests.git
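Applied to the library in question, your requirements.txt could contain a line like the following (assuming the repository is pip-installable, i.e. it ships a setup.py):
git+https://github.com/nim901/gfycat.git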
You can add the library as a submodule of your project. This will allow you to update it like any other git repository.
Is git clone https://github.com/nim901/gfycat.git and then git pull automatic enough? If this solution fits you and you need additional instructions, I will add them.
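For example (lib/gfycat is just a suggested path):
git submodule add https://github.com/nim901/gfycat.git lib/gfycat
git submodule update --init
git submodule update --remote
The first command adds the library, the second fetches it after a fresh clone of your project, and the third pulls in upstream updates.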
From how I'm reading your question, it sounds like you're trying to install the module to your system in order to be able to import it into projects and such.
Download the zip, extract it wherever you like, and open a terminal window in the same directory. Then just run python setup.py install from within that directory, and it should install into your system-wide site-packages directory for Python.
I would recommend that you install it into its own environment managed by virtualenv (https://virtualenv.pypa.io/en/latest/), but that's not strictly necessary.
I have a Django project set up in a virtual environment. I want to turn it into a package (maybe a tar), which I will make available for download so that anyone can just download, extract, and run the project without any hassle of installing dependencies.
No, you shouldn't just simply make a tar of your django project and distribute it.
The official Django documentation has instructions on how reusable apps can be packaged and distributed. An explanation here would go beyond the scope of an answer, so please check the relevant part of the documentation:
https://docs.djangoproject.com/en/1.8/intro/reusable-apps/
You can provide the dependencies for your package, which will be then automatically installed by pip.
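A minimal setup.py along those lines might look like this (the name, version, and pinned dependency are placeholders):
from setuptools import setup, find_packages

setup(
    name='myproject',
    version='0.1',
    packages=find_packages(),
    install_requires=[
        'Django>=1.8',
    ],
)
With this in place, pip installs the listed dependencies automatically when someone installs your package.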
I'm trying to use SVN to manage my Python project.
I installed many external libraries (under a path like "C:\Python27\Lib\site-packages") on Computer A, then uploaded the project to the SVN server.
Then I used Computer B, which only has Python (v2.7) installed, and checked the project out from the SVN server.
Here comes the problem: there are no external libraries on Computer B. Is there any way to solve this? I don't want to install the external libraries on Computer B all over again!
Thanks in advance!
The normal Python way to deal with this is to use pip and requirements files. virtualenv, which lets you have multiple sets of installed packages, is also commonly used.
For example, if you have a project which depends on any version of itsdangerous and any version of Werkzeug over 0.9, you could have this requirements file:
Werkzeug>=0.9
itsdangerous
You would usually store that in a file named requirements.txt. You would then install the packages like this:
pip install -r requirements.txt
pip will find all of the needed packages that are not already installed and install them.
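Applied to your situation, the workflow could look like this (a sketch; it assumes pip is available on both machines):
pip freeze > requirements.txt
Run that on Computer A and commit the resulting requirements.txt to SVN; then, after checking out on Computer B, run pip install -r requirements.txt.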
You could actually copy the package source code from site-packages into your project folder; the project folder normally has a higher priority than site-packages.
Then you just need to check the library into your SVN.
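You can inspect the lookup order via sys.path; entries earlier in the list win, and the directory of the running script (or the current directory, for the interactive interpreter) comes first:
python -c "import sys; print(sys.path)"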