How to create a common environment for teamwork in Python

I would like to create a shared virtual environment for my team. My team works in different places and everyone has their own environment, which causes a lot of problems: everyone has a different version of the libraries (Python, Robot Framework).
I thought about:
creating one common environment (I used virtualenv),
installing the prepared libraries (Python and Robot Framework) with one pip install ... command,
keeping the prepared library list in the git repository so that everyone can modify it and change library versions.
I have the first and third parts done, but I have a problem with the second: how do I create such a package of libraries so that it can be installed with one pip install command?
Should I create an environment locally, install all the libraries in it, and push it to git? Or should I package the project via setuptools (into a tar.gz)?
Unfortunately, I cannot find the answer to this question; it seems to me that none of the above solutions is optimal.

The easiest way to do this is to create a text file listing all the libraries you are using, with the command:
pip freeze > requirements.txt
This will create a file listing all the installed packages with the versions being used. To install them, ask every team member to place that requirements file in their project and run:
pip install -r requirements.txt
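A file generated this way pins exact versions, so every team member ends up with identical libraries. As a sketch, the file might contain entries like the following (the version numbers here are hypothetical):
robotframework==6.1.1
requests==2.31.0
Anyone on the team can edit this file in git to change a library version, and the next pip install -r requirements.txt picks up the change.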

With pip, you can also download your dependencies themselves. These will be .tar.gz, .whl or .zip files. Note that this can get complicated if your team uses multiple operating systems.
Here is an example which downloads the dependencies into a directory named "dependencies"; you can push this directory to git along with the requirements file.
pip freeze > req.txt
pip download -r req.txt -d dependencies
When someone clones your repository, they can install the dependencies offline with the following command.
pip install --no-index --find-links=dependencies -r req.txt
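If your team is split across operating systems, pip download can also fetch wheels for a target platform other than the one you are running on. A sketch using pip's --platform, --python-version and --only-binary flags (the platform tag and Python version below are just one common combination):
pip download -r req.txt -d dependencies --platform manylinux2014_x86_64 --python-version 3.10 --only-binary=:all:
Note that pip requires --only-binary=:all: (or --no-deps) when --platform is used, so every dependency must be available as a wheel for this to work.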

Related

Build a python package only with dependencies

Is there a way to build a Python package that contains only the dependencies needed by another project? The goal here is to install this module in a different project using a single pip install command. The dependent project installs almost 30 packages (public and private).
If such a thing is possible:
How should it be structured?
What other files are needed apart from the requirements file?
If such a thing is not possible, what are my options?
For example, when installing into an offline environment, you can use pip's download functionality: https://pip.pypa.io/en/stable/cli/pip_download/
Basically, instead of installing, you download the package contents to a directory of your choice with
pip download -r requirements.txt -d <directory>
then move that directory to the offline deployment and install from it with
pip install --no-index --find-links <directory> -r requirements.txt
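To answer the structural question directly: a dependencies-only "meta package" is possible with setuptools by declaring install_requires and shipping no code of its own. A minimal sketch of such a setup.py (the project name and version pins are hypothetical):
# setup.py for a meta package that only pulls in dependencies
from setuptools import setup

setup(
    name="myproject-deps",          # hypothetical name
    version="1.0.0",
    packages=[],                    # no code, only dependency metadata
    install_requires=[
        "requests==2.31.0",         # hypothetical pins; list your ~30 packages here
        "robotframework==6.1.1",
    ],
)
Build it with python setup.py sdist and the resulting tar.gz can be installed in another project with a single pip install; pip then resolves everything listed in install_requires (private packages still need a reachable index or a --find-links source).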

Is there any way to package python code so that another machine does not need to install all the dependencies using pip

I have installed Python on my system and wrote a simple script that uses the Jenkins GET REST API to fetch data.
I have installed all the required modules using pip. Now I want to package this script with all its dependencies and run it on another machine, without performing all the pip installation steps there.
I know we can list all the modules in requirements.txt and use pip install -r requirements.txt. But is there any way to avoid installing each dependency with pip, so that after installing Python, all other dependencies are available as soon as I run the zip file?
You can install pip dependencies to a certain directory using -t (target).
pip install -r requirements.txt -t .
That will install your pip modules into the current directory. You can then zip the whole thing and deploy it. Make sure that the environment you install the dependencies in matches your intended deployment environment; for consistency, you can run the command in a Docker container, for example.
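If you literally want to run the result as a zip file, the stdlib zipapp module can turn such a directory into a single executable archive. A sketch under the assumption that the dependencies are pure Python (C extensions cannot be imported from inside a zip), with hypothetical file names:
pip install -r requirements.txt -t myapp/    # vendor the dependencies into myapp/
cp script.py myapp/__main__.py               # zipapp executes __main__.py
python -m zipapp myapp -o myapp.pyz          # bundle everything into one archive
python myapp.pyz                             # run on any machine with a compatible Python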
I think you should use the virtualenv module, which makes your project easily deployable.
A virtual environment should be used whenever you work on any Python-based project. It is generally good to have one new virtual environment for every Python-based project you work on, so that the dependencies of every project are isolated from the system and from each other.
I came across a link which can help: Virtual Env explained
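For reference, a minimal sequence for creating and using such an environment with the stdlib venv module (virtualenv works the same way):
python -m venv .venv                  # create the environment
source .venv/bin/activate             # activate it (on Windows: .venv\Scripts\activate)
pip install -r requirements.txt       # dependencies now stay isolated in .venv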

Install python packages offline on server

I want to install some packages on a server which does not have access to the internet, so I have to download the packages elsewhere and transfer them to the server. But I do not know how to install them there.
Download all the packages you need and send them to the server where you need to install them. It doesn't matter whether they have a .whl or a .tar.gz extension. Then install them one by one using pip:
pip install path/to/package
or:
python -m pip install path/to/package
The second option is useful if you have multiple interpreters on the server (e.g. python2 and python3, or multiple versions of either of them). In such a case, replace python with the one you want to use, e.g.:
python3 -m pip install path/to/package
If you have a lot of packages, you can list them in a requirements file as you would normally do when you have access to the internet. But instead of putting the names of the packages into the file, put the paths to the packages (one path per line). When you have the file, install all the packages by typing:
python -m pip install -r requirements.txt
In the requirements file you can also mix different types of packages (.whl and .tar.gz). The only thing to take care of is to download the correct versions of the packages for the platform you have (64-bit packages for a 64-bit platform, etc.).
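As a sketch, such a path-based requirements file might look like this (file names and versions are hypothetical):
packages/requests-2.31.0-py3-none-any.whl
packages/urllib3-2.0.4.tar.gz
packages/certifi-2023.7.22-py3-none-any.whl
pip treats each line as a local path and installs the archive directly, without contacting an index.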
You can find more information regarding pip install in its documentation.
You can either download the packages from the website and run python setup.py install, or run pip install on a local directory or archive, such as:
pip install path/to/tar/ball
https://pip.pypa.io/en/stable/reference/pip_install/#usage
Download the wheel packages from https://www.lfd.uci.edu/~gohlke/pythonlibs/ . You can install the .whl packages with pip install package.whl; see installing wheels using pip for more.
Alternatively, download the package from the website and extract the tarball, then run:
python setup.py install

Python: Packages that are not relevant for project

When I create a virtualenv for a Python project, it gets "polluted" by packages that I install for my convenience (like IPython, or packages that my editor VS Code depends on, like pylint).
But these packages are not relevant for my project. So if I do pip freeze > requirements.txt, I see that only a few of the listed packages are relevant for my project.
What is the best way to clean up?
Install those packages in a global context so that I can use them in every project I begin? Or
do a pip freeze > requirements.txt, then edit the requirements file and remove the packages that are not needed?
What we do here:
First we have the project's requirement file - the one used for deployments. This is not built using pip freeze but manually edited so it only contains relevant packages.
Then we have the "dev" requirements file with packages that are only useful for development but are required to work on the project (linters, additional testing stuff, etc.).
And finally, everyone is free to maintain their own personal additional requirements (editor-related packages, etc.).
Note that with virtualenvwrapper (which really helps with development installs) you can define hooks that will install packages when you create a new virtual env.
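A common way to wire these together is pip's -r include syntax, so the dev file extends the deployment file. A sketch with hypothetical contents:
# requirements.txt - deployment packages only
requests==2.31.0
# requirements-dev.txt - development extras on top of the base file
-r requirements.txt
pylint
pytest
Developers run pip install -r requirements-dev.txt, while deployments install only requirements.txt.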
Here is an alternative to preparing requirements.txt manually: the pipreqs project generates a requirements.txt for your project based on the imports in your project's Python files.
Assuming all of your Python files are in myproject, run these commands in your terminal:
$ pip install pipreqs
$ pipreqs myproject
will generate a requirements.txt file for you.
This way, you can just pip install -r requirements.txt in your virtual environment instead of relying on pip freeze > requirements.txt, since the file will contain only the packages that your project actually uses.

How to have `pip install --editable` run sdist instead of develop?

The question Python package install using pip or easy_install from repos points out a very interesting feature of pip.
However, sometimes you just want it to install the source distribution; this is particularly true when:
you are running in a virtualenv (so you don't care about messing up the Python path, since you are deliberately doing it in an env),
you are not the developer of that particular package, and you don't want to have it "editable",
you cannot pip install package-name because the package is not in any index,
there is no tar.gz available.
Thanks for your answers!
Have you tried just omitting the --editable? If I run
pip install hg+http://bitbucket.org/carljm/django-markitup/
it clones the repo to a temporary build directory and installs normally (via setup.py install rather than setup.py develop).
Of course, if you then freeze this environment, the generated requirement will not be fulfillable. If you need that, either just use --editable (there's really not much difference; it works fine even if you don't actually need to edit the package), or run your own instance of something like chishop, upload the sdists you need to it, and use the -i or --extra-index-url option.
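As a sketch of that last approach, once your sdists are uploaded to a self-hosted index you point pip at it (the index URL is hypothetical):
pip install --extra-index-url https://pypi.example.com/simple/ django-markitup
Using -i instead of --extra-index-url replaces PyPI entirely, whereas --extra-index-url searches the extra index in addition to PyPI.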
