Packing external libs into a Python project - python

I'm trying to use SVN to manage my Python project.
I installed many external libs (under a path like "C:\Python27\Lib\site-packages") on Computer A, then uploaded the project to the SVN server.
Then I use Computer B, which only has Python 2.7 installed, and check the project out from the SVN server.
Here comes the problem: there are no external libs on Computer B. Is there any way to solve this? I don't want to install the external libs on Computer B again!
Thanks in advance!

The normal Python way to deal with this is to use pip and requirements files. virtualenv, which lets you have multiple sets of installed packages, is also commonly used.
For example, if you have a project which depends on any version of itsdangerous and any version of Werkzeug at or above 0.9, you could have this requirements file:
Werkzeug>=0.9
itsdangerous
You would usually store that in a file named requirements.txt. You would then install the packages like this:
pip install -r requirements.txt
pip will find all of the needed packages that are not already installed and install them.
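For completeness, a minimal sketch of combining virtualenv with a requirements file on a Windows machine like Computer B (the venv folder name and the default virtualenv layout are assumptions):
pip install virtualenv
virtualenv venv
venv\Scripts\activate
pip install -r requirements.txt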

You could actually copy the package source code from site-packages into your project folder; the project folder normally has a higher priority than site-packages on Python's import path.
Then you just need to check the library into your SVN.
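A small sketch of why this works, assuming you vendored a hypothetical pure-Python package named somelib next to your own script (compiled extension modules generally won't copy over this way):
# project layout:
#   main.py
#   somelib/__init__.py   (copied from site-packages)
import sys
print(sys.path[0])   # the script's own directory is searched before site-packages
import somelib       # resolves to the vendored copy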

Related

Artifactory: Download and unzip zip file using pip install

I created a repository on Artifactory which includes a zip containing 2 folders.
https://artifactory.healthcareit.net:443/artifactory/pi-generic-local/paymentintegrity-airflow-lib-plugins/paymentintegrity-airflow-libs-plugins-202211031330.zip
Is there a way to download that zip and extract the directories during a pip install? Basically, the developer just runs a pip install and gets the directories where needed.
I'm not looking for something in the [scripts] section, since installing these directories would be the only thing currently needed in the Pipfile (it's a weird project). Is this possible?
You can abuse the install command in setup.py to run arbitrary code, such as code that unzips the assets and installs them in the right place. I have seen this done to package C++ binaries via Python/PyPI at a place I worked for. See Post-install script with Python setuptools for hints on how to override the install command.
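A rough sketch of that approach, assuming the zip is shipped inside the source distribution; the package name, zip file name, and target directory below are made up for illustration:
from setuptools import setup
from setuptools.command.install import install
import zipfile

class UnzipAssets(install):
    def run(self):
        install.run(self)  # perform the normal install first
        # hypothetical asset bundled with the source distribution
        with zipfile.ZipFile("plugins.zip") as zf:
            zf.extractall("/opt/airflow/plugins")  # assumed target directory

setup(
    name="pi-airflow-plugins",  # hypothetical name
    version="0.1",
    cmdclass={"install": UnzipAssets},
)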
As per my observations, you have created a generic repo and are trying to fetch the packages using pip install. I would recommend creating a PyPI repository in Artifactory and then fetching the packages from it. This will help in creating the metadata that maintains all the package versions in the repository.
If you have the packages locally, push them to the local PyPI repo; when you resolve them from Artifactory, pip install will download them automatically based on your requirements.
If your requirement is to zip up multiple packages, push the archive file to Artifactory, and have Artifactory unzip it and serve the dependencies during the pip install, then this is not possible from the Artifactory side; you need to use a post-install script with Python setuptools, as mentioned above.
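For reference, resolving from an Artifactory PyPI repository would look roughly like this (the repository name pi-pypi-local and the package name are assumptions; only the host comes from the question):
pip install paymentintegrity-airflow-libs-plugins --index-url https://artifactory.healthcareit.net/artifactory/api/pypi/pi-pypi-local/simple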

Can I send others the third-party Python packages I installed?

I have code on an offline computer, so I needed to install dependency packages offline (which I only figured out how to do yesterday). Now, I want to send my work to my teammates (it's a school project and half of them are first-time programmers), whether directly or through GitHub. I want to send them the packages I was using too, just in case they need to do offline installation.
Is it possible to just zip up the dependency packages as-is and send them for my teammates to unzip to the relevant directory? Where are these packages installed to? Are there other files that need to be sent? Or is there no other way than to run the python/pip install on the wheels/tar.gz I downloaded from PyPI (the solution I'm trying to avoid)?
You can make use of virtualenv for this purpose: https://virtualenv.pypa.io/en/latest/index.html
It installs dependencies inside a folder, so you can zip it or share it in many ways.
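A minimal sketch, assuming the wheels/tar.gz files you already downloaded sit in a packages/ folder (the folder name and the requirements.txt file are assumptions):
virtualenv venv
venv/bin/pip install --no-index --find-links packages/ -r requirements.txt
# zip up the venv/ folder to share it; note that a virtualenv embeds absolute paths,
# so the receiving machines should have the same OS and Python version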
The folder you are looking for is 'site-packages' or 'dist-packages'; you can put packages there for Python to use on a machine without a package manager.
The default directory for packages on windows is:
C:\Python\Lib\site-packages\
The default directory for packages on linux is:
/usr/lib/python2.7/dist-packages
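If you're not sure which directory your interpreter actually uses, you can ask it; this relies only on the standard distutils module that ships with Python 2.7:
python -c "from distutils.sysconfig import get_python_lib; print(get_python_lib())"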

How do I include a library from GitHub in a Python project without pip?

I'm not sure what the Python way is to include another library obtained from GitHub.
I'm planning to use this library https://github.com/nim901/gfycat; right now I just downloaded the zip, extracted it, and put it in a lib folder. I have to check this library into the repo for it to work on Heroku. Is there a way to install the lib automatically from GitHub?
Heroku has support for git-backed python dependencies via pip: https://devcenter.heroku.com/articles/python-pip#git-backed-distributions
I believe this fits your requirements better than checking the actual libraries into git. From the link above:
Anything that works with a standard pip requirements file will work as expected on Heroku.
Thanks to pip’s Git support, you can install a Python package that is hosted on a remote Git repository.
For example:
git+git://github.com/kennethreitz/requests.git
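Applied to the library from the question, the requirements.txt entry would look something like this (the https form and the egg name are assumptions based on pip's VCS URL syntax):
git+https://github.com/nim901/gfycat.git#egg=gfycat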
You can add the library as a submodule of your project. This will allow you to update it like any other git repository.
Is git clone https://github.com/nim901/gfycat.git and then git pull automatic enough? If this solution fits you and you need additional instructions, I will add them.
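A quick sketch of the submodule route (the lib/gfycat path is just an example):
git submodule add https://github.com/nim901/gfycat.git lib/gfycat
git submodule update --remote lib/gfycat   # later, to pull in upstream changes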
From how I'm reading your question, it sounds like you're trying to install the module to your system in order to be able to import it into projects and such.
Download the zip, extract it to wherever, and open up a terminal window to the same directory. Then just run python setup.py install from within the directory, and it should install into your system-wide site-packages directory for python.
I would recommend that you install it into its own environment managed by virtualenv (https://virtualenv.pypa.io/en/latest/), but it's not necessary.

Is there any way to package all Python dependencies into a single executable

We are shipping our product to customer locations that may or may not have Python and other libraries installed. Can we turn our Python script into a stand-alone executable with Python and the other required libraries included? Are there any other ideas?
You can use py2exe; it does exactly what you need, and it's very easy to use. I have used it on one of my projects, which is online and used daily.
http://www.py2exe.org/
and here is their tutorial:
http://www.py2exe.org/index.cgi/Tutorial
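For reference, a minimal py2exe setup.py might look like this (myscript.py is a placeholder for your entry point; build with python setup.py py2exe):
from distutils.core import setup
import py2exe  # registers the py2exe command with distutils

setup(console=['myscript.py'])  # produces dist\myscript.exe with Python bundled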
You can deliver a package with Python and then apply one of these two methods:
Package With python + virtualenv
There are many solutions for that. One I like is virtualenv, which lets you deploy a specific configuration of a Python project (with its dependencies) on other machines.
Package With python + pip
Another way is to use pip and write a requirements.txt file at the root of your project, which contains every dependency (1 per line), for example:
django>=1.5.4
pillow
markdown
django-compressor
By running pip install -r requirements.txt in the root dir, pip will install the needed packages.
See also:
How do you use pip, virtualenv and Fabric to handle deployment?
Pip installer documentation
Virtualenv documentation

Using pip to install single-file python modules

I'm wondering if there's a way to "install" single-file python modules using pip (i.e. just have pip download the specified version of the file and copy it to site-packages).
I have a Django project that uses several 3rd-party modules which aren't proper distributions (django-thumbs and a couple others) and I want to pip freeze everything so the project can be easily installed elsewhere. I've tried just doing
pip install git+https://github.com/path/to/file.git
(and tried with the -e flag too) but pip complains that there's no setup.py file.
Edit: I should have mentioned - the reason I want to do this is so I can include the required module in a requirements.txt file, to make setting up the project on a new machine or new virtualenv easier.
pip requires a valid setup.py to install a Python package. By definition, every Python package has a setup.py... What you are trying to install isn't a package but rather a single-file module... what's wrong with doing something like:
git clone https://github.com/path/to/file.git /path/to/python/install/lib
I don't quite understand the logic behind wanting to install something that isn't a package with a package manager...
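That said, if you can add one file to a fork of the module's repository, a minimal setup.py makes the single-file module installable by pip and therefore usable in requirements.txt (the distribution and module names below are assumptions):
from setuptools import setup

setup(
    name='django-thumbs',   # hypothetical distribution name
    version='0.1',
    py_modules=['thumbs'],  # the single .py file, without the extension
)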
