How to create a Python package bundled with all its dependencies [isolation]

TL;DR:
The package ColorMe depends on mypy 3.
ColorMe is installed in the project Rainbow, and Rainbow depends on mypy 2.
How do I distribute ColorMe with all of its dependencies, so that those who use ColorMe don't have to track its dependency versions?
Complete Story:
I have created a project validator (code linter, env checker, etc.) at my company, which needs to be installed as a dependency in your Python project (normally via pip or Poetry). Currently, it's packaged and distributed as a .whl file via poetry build.
It's a project that keeps growing and has a lot of dependencies, though, and it is starting to conflict with the packages already present in developers' projects.
I don't want my package to influence their projects, as it is meant to be used as a stand-alone CLI and not as source code for their project.
Is it possible to create a build, or something similar, that isolates the dependencies?
That way, when devs install my CLI tool (with everything needed for it), which depends on mypy, flake8, etc., the main project would not become dependent on those packages as well.
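One hedged sketch, assuming ColorMe exposes a console-script entry point (pipx is not part of the original question, and the flag shown is made up): installing the tool with pipx gives it its own virtual environment, so its mypy 3 never touches Rainbow's mypy 2:

pipx install colorme                                  # creates an isolated venv just for this CLI
pipx install ./dist/colorme-1.0.0-py3-none-any.whl    # or install straight from the built wheel
colorme --check .                                     # hypothetical flag; the command is on PATH, its deps stay private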

Related

Adding dependencies to deliverable package

I have a Python program that depends on other Python libraries. I have used virtualenv and pip to produce a requirements.txt for all the libs needed to run the app, thus keeping the environment clean of unnecessary libs. Things work great and I can make progress in developing the app.
This works on my machine, but the issue is that I need to package the app and deploy/distribute it to an environment where requirements.txt and pip cannot be used to simply download the dependencies. The target environment needs a fully functional application.
I'm a bit confused by all the tools Python offers, such as setuptools and distutils, since none of them seems to offer this (at least not easily).
I'm used to the Java way, with Maven/Gradle etc., where one simply states dependencies and they are added to the distributable jar/war unless explicitly stated otherwise.
The dependencies are installed inside my virtual environment, under the scripts dir. Is there some easy way to get the dependencies bundled within my app with standard tools, or do I need to roll my own?
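A hedged sketch of one jar-like route (directory and entry-point names are hypothetical): the standard-library zipapp module can bundle the application code together with its pip-installed dependencies into a single .pyz file:

pip install -r requirements.txt --target build/app    # vendor the deps into one directory
cp -r myapp build/app                                 # put the application code next to them
python -m zipapp build/app -m "myapp.main:main" -o myapp.pyz
python myapp.pyz                                      # on the target machine; only a Python interpreter is needed

One caveat: C-extension dependencies generally cannot be imported from inside a zip archive, so this works best for pure-Python stacks.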

Install python package into directory relative to site-packages

I have a Python package that is one of a collection of company Python packages. When I run
python setup.py install
I want the package to be installed to a common company directory, along with other company packages. I want this directory to be relative to the default Python install directory, e.g.,
/usr/lib/python2.7/site-packages/<company_name>/<python_package_name>
That is, I want to insert <company_name> into the installation path at install time.
I've seen ways to prefix this path, but can't seem to work out how to do what I've described.
Unfortunately, Python packaging doesn't work like that. You could probably bend it to do so, but that would be quite an effort for someone without experience in Python packaging, and even for experienced people the effort/payoff tradeoff would not make sense. You don't mention any motive for this beyond personal preference.
Instead, to keep the package installation folder well managed and human-navigable, I recommend studying the following resources:
PEP 0382 - Namespace Packages: How to create packages like companyname.foobar, companyname.moomoo
Installing packages into a virtualenv - Python packaging installation guide (official)
Scrambler: Symlink namespaced Python packages to a single folder
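A minimal sketch of the namespace-package route, in the legacy setuptools style that matches the setup.py/Python 2.7 workflow above (project names are hypothetical):

# companyname-foobar/setup.py
from setuptools import setup

setup(
    name='companyname.foobar',
    version='1.0.0',
    packages=['companyname', 'companyname.foobar'],
    namespace_packages=['companyname'],   # share the companyname/ directory with sibling packages
)

# companyname/__init__.py -- required in every distribution that shares the namespace
__import__('pkg_resources').declare_namespace(__name__)

With each company package declared this way, installs land side by side under a single importable companyname prefix, which is the supported version of the layout the question asks for.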

Cross platform package with bundled dependencies

I need an easy way to distribute a Python project with all its dependencies included. I don't want something like PyInstaller: I need to distribute the same cross-platform package for each user. Recompiling on each OS is not an option.
When copied and installed into another system, the dependencies should be picked up from within the package (and not, e.g., from the Internet). Ideally, they should be installed into an isolated environment relative to that package.
Is it possible?
For those who know Ruby:
In Ruby, with Bundler, running bundle package --all-platforms will copy all my dependencies to ./vendor/cache/ within the project folder.
Then, on another machine, I can run bundle install --deployment and it will install the dependencies from ./vendor/cache to the local ./vendor/bundle path.
I'm looking for a similar procedure, with Python.
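A hedged near-equivalent with plain pip (directory names chosen to mirror the Ruby example):

pip download -r requirements.txt -d ./vendor/cache    # like `bundle package`: cache all deps locally
# later, on the target machine:
python -m venv ./vendor/bundle
./vendor/bundle/bin/pip install --no-index --find-links=./vendor/cache -r requirements.txt

One caveat: by default pip download fetches wheels for the current platform and interpreter, so a truly cross-platform cache may need extra --platform/--only-binary flags per target, or source-only downloads.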

Include run-time dependencies in Python wheels

I'd like to distribute a whole virtualenv, or a bunch of Python wheels of exact versions with their runtime dependencies, for example:
pycurl
pycurl.so
libcurl.so
libz.so
libssl.so
libcrypto.so
libgssapi_krb5.so
libkrb5.so
libresolv.so
I suppose I could rely on the system to have libssl.so installed, but surely not libcurl.so of the correct version and probably not Kerberos.
What is the easiest way to package one library in a wheel with all of its run-time dependencies?
Or is that a fool's errand, and should I package the entire virtualenv instead?
How do I do that reliably?
P.S. Compiling on the fly is not an option; some modules are patched.
AFAIK, there is no good standard way to portably install dependencies with your package. Continuum has made conda for precisely this purpose. The numpy guys wrote their own distutils submodule in their package to install some complicated dependencies, and now at least some of them advocate conda as a solution. Unfortunately, you may have to make conda packages for some of these dependencies yourself.
If you're fine without portability, then targeting the package manager of the target machines will obviously work. Otherwise, for a portable package manager, conda is the only option I know of.
Alternatively, from your post ("compiling on the fly is not an option") it sounds like portability may not be an issue for you, in which case you could also install all the requirements to a prefix directory (most installers I've come across support a configure --prefix=/some/dir/ option). If you have a guaranteed single architecture, you could probably prefix-install all your dependencies to a single directory and pass that around like a file. The conda approach would probably be cleaner, but I've used prefix installs quite a bit and they tend to be one of the easiest solutions to get going.
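A hedged sketch of that prefix-install approach (paths are made up):

./configure --prefix=$HOME/vendor && make && make install    # repeat for each native dep (libcurl, krb5, ...)
export LD_LIBRARY_PATH=$HOME/vendor/lib:$LD_LIBRARY_PATH     # so pycurl.so resolves the bundled libs
tar czf vendor.tar.gz -C $HOME vendor                        # the directory you "pass around like a file"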
Edit:
As for conda, it is simultaneously a package manager and a "virtualenv"-like environment/Python install. While virtualenv is added on top of an existing Python install, conda takes over the whole install, so you can be more sure that all the dependencies are accounted for. Compared to pip, it is designed for adding generalized non-Python dependencies, instead of just compiling C/C++ extensions. For more info, see:
pip vs conda (also recommends buildout as a possibility)
conda as a python install
As for how to use conda for your purpose, the docs explain how to create a recipe:
Conda build framework
Building a package requires a recipe. A recipe is a flat directory which
contains the following files:
meta.yaml (metadata file)
build.sh (Unix build script which is executed using bash)
bld.bat (Windows build script which is executed using cmd)
run_test.py (optional Python test file)
patches to the source (optional, see below)
other resources, which are not included in the source and cannot be
generated by the build scripts.
The same recipe should be used to build a package on all platforms.
When building a package, the following steps are invoked:
read the metadata
download the source (into a cache)
extract the source in a source directory
apply the patches
create a build environment (build dependencies are installed here)
run the actual build script. The current working directory is the source
directory with environment variables set. The build script installs into
the build environment
do some necessary post processing steps: shebang, rpath, etc.
add conda metadata to the build environment
package up the new files in the build environment into a conda package
test the new conda package:
create a test environment with the package (and its dependencies)
run the test scripts
There are example recipes for many conda packages in the conda-recipes repo (https://github.com/continuumio/conda-recipes).
The conda skeleton command can help to make skeleton recipes for common repositories, such as PyPI (https://pypi.python.org/pypi).
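For orientation, a hedged sketch of a minimal meta.yaml (package name, versions, and the run dependency are hypothetical):

package:
  name: pycurl-bundled
  version: "7.43.0"

source:
  path: ../pycurl    # the patched source tree mentioned in the question

requirements:
  build:
    - python
    - setuptools
  run:
    - python
    - krb5           # conda packages for the native libs, built from their own recipes if needed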
Then, as a client, you would install the package similar to how you would install from pip
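Concretely, the build-and-install cycle looks something like this (recipe path hypothetical):

conda build ./pycurl-recipe               # produces a package archive in conda-bld/
conda install --use-local pycurl-bundled  # client-side install from the local build cache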
Lastly, Docker may also be interesting to you, though I haven't seen it used much for Python.
You may want to look into PEX: https://pex.readthedocs.io/en/stable/whatispex.html
'Files with the .pex extension – “PEX files” or ”.pex files” – are self-contained executable Python virtual environments. PEX files make it easy to deploy Python applications: the deployment process becomes simply scp.'
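A hedged example of producing such a file (package and entry-point names are assumed):

pex mypackage -c mycli -o mycli.pex    # bundle mypackage and its deps behind the 'mycli' console script
scp mycli.pex server:                  # deployment really is just a copy
./mycli.pex                            # runs anywhere a compatible Python interpreter exists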

Can setuptools install dependencies when packaging as .exe?

I'm the author of a pure-Python library that also aims to be convenient to use from the command line. For Windows users, it would be nice to be able to install the package from an .exe or .msi installer.
However, I cannot get the installer to install package dependencies (especially the dependency on setuptools itself, so running the software fails with an import error on pkg_resources). I don't believe that providing an easy .exe installer makes much sense if the user then needs to manually install setuptools and other libraries on top. I'd rather tell them how to add easy_install to their PATH and go that way (http://stackoverflow.com/questions/1449494/how-do-i-install-python-packages-on-windows).
I've built .exe packages in the past, but I don't remember whether that ever worked the way I'd have preferred.
It is quite common to distribute packages that have dependencies, such as yours, but I understand your wish to make installation as simple as possible.
Have a look at deployment bootstrapper, a tool dedicated to solving the problem of delivering software including its prerequisites.
Regardless of what packaging method you eventually choose, maintain your sanity by staying away from embedding MSIs inside other MSIs in any way. That simply does not work, because of transactional installation requirements and locking of the Windows Installer database.
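If you control the build end to end, one concrete option (a suggestion beyond the answer above, with hypothetical names) is pynsist, which bundles a Python interpreter and your wheel dependencies into a single Windows installer driven by an installer.cfg:

[Application]
name=MyTool
version=1.0
entry_point=mytool:main

[Python]
version=3.10.0

[Include]
pypi_wheels = requests==2.28.1

Running pynsist installer.cfg then produces an .exe installer that carries everything the tool needs, so end users never touch pip or easy_install.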
