How to distribute a Python project for personal use

So I've been working on a project.
I've reached the stage where everything works.
The only thing left for me to do is to deploy/distribute the project so that I can install and run it on my own machines with minimal effort. Since this project is for personal use only, I wasn't planning on distributing it or putting it on PyPI.
So what is the best distribution scheme given this?
Is it to install using distutils and setup.py?
My original plan was to pull the entire project from my git repository onto the host machine and automatically install the dependencies (which I understand can be done with pip and a requirements file). Does this make more sense?
I'd appreciate some guidance since I've never done this before.
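For concreteness, the workflow I have in mind would look something like this (the repository URL and names below are placeholders):

    # on the development machine: record exact dependency versions
    pip freeze > requirements.txt

    # on each target machine: clone, create an isolated environment, install
    git clone <repo-url> myproject
    cd myproject
    python -m venv .venv
    . .venv/bin/activate
    pip install -r requirements.txt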

Related

Managing python project CI/CD with Conda, Poetry and pip

I have read through a couple dozen write-ups on how to equip a modern Python project to automate linting, testing, coverage, type checking, etc. (and eventually deploy it to cloud servers, though I'm not interested in that part yet).
I am thinking about using conda as my environment manager. This would give me the advantage of being able to install non-Python packages if needed, and since the project environment is created with a specified Python version, I believe it would also replace pyenv.
Within the newly created conda environment I would use Poetry to manage dependencies: initialize its pyproject.toml and add the other Python packages I need, which would be installed from PyPI via pip.
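Roughly, the bootstrap I have in mind would be the following (the environment name and Python version are placeholder choices; I believe the virtualenvs.create setting is what makes Poetry reuse the conda environment):

    # create the project environment with a pinned Python (this is what would replace pyenv)
    conda create -n myproject python=3.11
    conda activate myproject

    # make Poetry reuse the conda environment instead of creating its own virtualenv
    poetry config virtualenvs.create false

    # initialize pyproject.toml and add dependencies, installed from PyPI
    poetry init
    poetry add pytest pytest-cov mypy black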
Did I get the above right?
To complete the "building" phase I am also looking into using pytest with pytest-cov for test coverage, mypy/pydantic for type checking, and black for formatting.
All of this should work on my local development machine, and a push to GitHub should then trigger an Action that performs the same checks, so that any contributor goes through them. So far I have managed to do this for pylint, pytest, and coverage on very simple projects with no dependencies.
Does all of this make sense? Am I missing some important component or step? For example, I'm trying to understand whether tox would help this workflow by automating testing on different Python versions, but I haven't yet come to grips with integrating this flood of (new to me) concepts. Thanks

How can I deploy a Python script and its dependencies (no Internet connection)?

I have a production PC that won't have a decent Internet connection (let's say just the bare 'apt-get install ....', and definitely no pip3).
I develop on a similar Linux OS/environment for the everyday work, but I'd like to push the result to the final machine in the most portable way possible, ideally standalone (i.e. no need to install anything).
My Python environment was initiated with Poetry: Python 3 plus half a dozen packages/libs/modules (snmp, hexdump, tftp, coloredlogs, pcap, etc.), and I try to keep it PEP 8 clean. In a few weeks I would like to copy (rsync) my work to the production machine without having to unroll the whole package-and-dependency installation.
Did I just describe Docker? It sounds like such an overkill for what I want to achieve.
I was hoping Poetry would take care of this, but even Poetry seems to be installable only through pip.
Should I list and download my packages and dependencies manually and then install them manually? (But the pip3 installation on that machine must be so old that I fear I'll hit waterfalls of dependency problems.)
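For reference, the manual route I have in mind looks roughly like this (assuming my dev box matches production's OS and Python, since wheels are platform-specific; note that poetry export may require the poetry-plugin-export plugin on recent Poetry versions):

    # on the connected dev machine: pin dependencies and download them as wheels
    poetry export -f requirements.txt --output requirements.txt
    pip download -r requirements.txt -d ./wheelhouse

    # rsync wheelhouse/ and requirements.txt to production, then install offline
    pip install --no-index --find-links ./wheelhouse -r requirements.txt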
Did I google so badly that I missed the best practice for such a common scenario?
I am eager to read your best hints!

Debian build package: Add python virtualenv into dpkg-buildpackage to be uploaded to launchpad

I would like to package a Python program and ship it in a .deb package.
For reasons (I know that in 99% of cases it is bad practice) I want to ship the program in a Python virtual environment within a Debian package.
I know I can do this using dh-virtualenv. This works great - generally no problem.
But the problem arises when I want to upload this to Launchpad. Uploading to Launchpad means uploading a source package, and in dh-virtualenv terms a source package is just the package description; the virtualenv has not been created yet.
What happens when I upload this to Launchpad is that the package will not build: the dh-virtualenv step executed during the build on Launchpad tries to install Python modules into the virtualenv, which means installing them from PyPI, and that fails because Launchpad does not allow external network access.
So basically there are two possible solutions:
Approach A
Prepare the virtualenv, somehow incorporate it into the source package, and have the dh build process simply "move" this prepared virtualenv to its final location. This could work with virtualenv --relocatable, BUT the relocation strips the UTF-8 marker at the beginning of all Python scripts, leaving every Python script in the virtualenv broken.
Approach B
Somehow cache all necessary python packages in the source package and have dh_virtualenv install from the cache instead of from PyPI.
This seems doable with pip2pi, but my experiments show that it will not install packages even though they are present in the local package index.
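That said, pip itself can install from a plain directory of downloaded archives with no index layout at all, which might sidestep pip2pi. A sketch, assuming the extra pip arguments can be passed through to dh_virtualenv (recent versions have an option for extra pip arguments, but check your version's docs):

    # vendor every dependency into the source package
    pip download -r requirements.txt -d debian/wheels

    # at build time, install from the vendored directory instead of PyPI
    pip install --no-index --find-links debian/wheels -r requirements.txt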
Both approaches seem a bit clumsy and prone to errors.
What do you think of this?
What are your experiences?
What would you recommend?

How to distribute and deploy Python 3 code with dependency isolation

I'm not happy with the way that I currently deploy Python code and I was wondering if there is a better way. First I'll explain what I'm doing, then the drawbacks:
When I develop, I use virtualenv for dependency isolation and install all libraries using pip. Python itself comes from my OS (Ubuntu).
Then I build my code into a .deb Debian package consisting of my source tree and a pip bundle of my dependencies.
Then when I deploy, I rebuild the virtualenv, source foo/bin/activate, and run my program (under Ubuntu's Upstart).
Here are the problems:
The pip bundle is pretty big and increases the size of the debian package significantly. This is not too big a deal, but it's annoying.
I have to build all the C libraries (PyMongo, BCrypt, etc.) every time I deploy. This takes a little while (a few minutes), and it's a bit lame to do this CPU-bound job on production.
Here are my constraints:
Must work on Python 3, preferably 3.2
Must have dependency isolation
Must work with libraries that use C (like PyMongo)
I've heard things about freezing, but I haven't been able to get it to work. cx_Freeze from PyPI doesn't seem to compile (on my Python, at least). The other freeze utilities don't seem to work with Python 3. How can I do this better?
Wheel is probably the best way to do this at the moment.
Create a virtualenv on the deployment machine, and deploy a wheel along with any dependencies (also built as wheels) to that virtualenv.
This solves the problems:
Having separate wheels for dependencies means you don't have to redeploy dependencies that haven't changed, cutting the size of the deployment artefact
Building big packages (such as lxml or scipy) can be done locally, and then the compiled wheel pushed to production
Also, it works fine with libraries that use C.
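Concretely, the cycle looks something like this (project and path names are placeholders):

    # on a build machine matching production (same OS and Python version):
    pip wheel . -w ./wheelhouse    # builds your project and all its deps as wheels

    # ship wheelhouse/ to production, then inside the target virtualenv:
    pip install --no-index --find-links ./wheelhouse myproject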
Have you looked at buildout (zc.buildout)? With a custom recipe you may be able to automate most of this.

Distributing python code with virtualenv?

I want to distribute some Python code, with a few external dependencies, to machines that have only core Python installed (and users who are unfamiliar with easy_install etc.).
I was wondering if virtualenv could be used for this purpose? I should be able to write some bash scripts that activate the virtualenv (with the suitable packages) and then run my code, but this seems somewhat messy, and I'm wondering if I'm reinventing the wheel?
Are there any simple solutions to distributing python code with dependencies, that ideally doesn't require sudo on client machines?
Buildout - http://pypi.python.org/pypi/zc.buildout
As a sample, look at my clean project: http://hg.jackleo.info/hyde-0.5.3-buildout-enviroment/src. Only 2 files do the magic. Moreover, the Makefile is optional; without it you'll still need bootstrap.py (the Makefile downloads it, but it runs only on Linux). buildout.cfg is the main file, where you write the dependencies and the configuration for how the project is laid out.
To get bootstrap.py, just download it from http://svn.zope.org/repos/main/zc.buildout/trunk/bootstrap/bootstrap.py
Then run python bootstrap.py and bin/buildout. I do not recommend installing buildout locally, although it is possible; just use the one that bootstrap downloads.
I must admit that buildout is not the easiest solution, but it's really powerful, so the learning is worth the time.
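For orientation, a minimal buildout.cfg is just a couple of sections (the part and egg names here are made up):

    # buildout.cfg: one part that installs the "myapp" egg and a bin/python interpreter
    [buildout]
    parts = app

    [app]
    recipe = zc.recipe.egg
    interpreter = python
    eggs = myapp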
UPDATE 2014-05-30
Since this was recently upvoted and (probably) used as an answer, I want to note a few changes.
First of all, bootstrap.py is now downloaded from GitHub: https://raw.githubusercontent.com/buildout/buildout/master/bootstrap/bootstrap.py
That hyde project would probably fail now due to breaking changes in buildout 2.
You can find better samples at http://www.buildout.org/en/latest/docs/index.html; I also suggest looking at the "collection of links related to Buildout" section, which might contain info for your project.
Secondly, I am personally more in favor of a setup.py script that can be installed using Python. More about the egg structure can be found at http://peak.telecommunity.com/DevCenter/PythonEggs, and if that looks too scary, search Google for "python egg". In my opinion it's actually simpler than buildout (and definitely easier to debug), and probably more useful, since it can be distributed more easily and installed anywhere with the help of virtualenv, or globally, whereas with buildout you have to ship all of the build scripts with the source all of the time.
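A minimal setup.py along those lines (the name, version, and requirements are illustrative):

    # setup.py: minimal sketch; metadata and requirements are illustrative
    from setuptools import setup, find_packages

    setup(
        name="myproject",
        version="0.1",
        packages=find_packages(),
        install_requires=["somelib>=1.0"],  # dependencies pulled in on install
    )

Installing it with pip install . (or python setup.py install) inside a virtualenv then gives you the dependency handling without buildout's scaffolding.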
You can use a tool like PyInstaller for this purpose. Your application will appear as a single executable on all platforms, and include dependencies. The user doesn't even need Python installed!
See as an example my logview package, which has dependencies on PyQt4 and ZeroMQ and includes distributions for Linux, Mac OS X, and Windows, all created using PyInstaller.
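The basic invocation is a one-liner (the script and application names below are placeholders):

    # bundle the script and all of its dependencies into a single executable
    pyinstaller --onefile --name myapp main.py
    # the result lands in dist/myapp (dist\myapp.exe on Windows)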
You don't want to distribute your virtualenv, if that's what you're asking. But you can use pip to create a requirements file, typically called requirements.txt, and tell your users to create a virtualenv and then run pip install -r requirements.txt, which will install all the dependencies for them.
See the pip docs for a description of the requirements file format, and the Pinax project for an example of a project that does this very well.
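For reference, the format is just one requirement per line, usually with pinned versions (the package names and versions below are only examples):

    # requirements.txt (package names and versions are illustrative)
    somelib==1.2.3
    anotherlib>=2.0,<3.0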
