How to version a Python program

I'm writing a Python program (for use in a Linux environment) and I want it to support version updates. I read somewhere that I should add a line like __version__, but I'm not really sure where or how to put it. What I want to achieve is not having to erase my whole program every time I want to install a new version of it.
thanks!

I highly recommend using setuptools instead of versioning manually. It is the de-facto standard now and very simple to use. All you need to do is create a setup.py in your project's root directory:
from setuptools import setup, find_packages

setup(
    name='your_programs_name',
    version='major.minor.patch',
    packages=find_packages(),
)
and then just run:
python setup.py sdist
and a source distribution (a .tar.gz archive) will appear under your dist folder.
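You can then install or upgrade from that archive with pip. For example (the filename depends on the name and version you set above, so adjust accordingly):
pip install dist/your_programs_name-1.0.0.tar.gz
Installing a newer archive the same way replaces the old version in place, so you never have to erase the program manually.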

What you actually want to do is make your Python program into a package using distutils.
You would create a setup.py. A simple example would look something like this:
from distutils.core import setup

setup(
    name='foo',
    version='1.0',
    py_modules=['foo'],
)
This is the most standard way to version and distribute Python code. If your code is open source, you can even register it with PyPI (a.k.a. the Cheese Shop) and install it on any machine in the world with a simple pip install mypackage.

It depends on what you want to version: single modules with independent versions, or the whole program/package.
You could theoretically add a __version__ string to every class and do dynamic imports by testing those variables.
If you want to version the main module, or the whole program, you could add a __version__ string somewhere at the top of your __init__.py file and import this string into your setup.py when generating packages. This way you wouldn't have to manually edit multiple files, and setup.py could be left mostly untouched.
Also consider using a string, not a number or tuple. See PEP 396.
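A minimal sketch of that single-source approach (the package name mypackage is just an example):
# mypackage/__init__.py
__version__ = '1.2.3'  # the single source of truth, as a string per PEP 396

# setup.py
from setuptools import setup, find_packages

# reuse the package's own version string so it lives in exactly one place
from mypackage import __version__

setup(
    name='mypackage',
    version=__version__,
    packages=find_packages(),
)
Note that importing your own package from setup.py only works cleanly if __init__.py has no third-party imports at module level.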

If you are only concerned with versions on your local machine, I would suggest becoming familiar with Git. Information about version control in Git can be found here.
If you are concerned with version control on a module that others would use, information can be found here.

Related

setuptools "eager_resources" to executable directory

I maintain a Python utility that allows bpy to be installable as a Python module. Due to the size of the source code, and the length of time it takes to download the libraries, I have chosen to provide this module as a wheel.
Unfortunately, platform differences and Blender runtime expectations makes support for this tricky at times.
Currently, one of my big goals is to get the Blender addon scripts directory to install into the correct location. The directory (simply named after the Blender API version) has to exist in the same directory as the Python executable.
Unfortunately the way that setuptools works (or at least the way that I have it configured) the 2.79 directory is not always placed as a sibling to the Python executable. It fails on Windows platforms outside of virtual environments.
However, I noticed in the setuptools documentation that you can specify eager_resources, which supposedly guarantees the location of extracted files.
https://setuptools.readthedocs.io/en/latest/setuptools.html#automatic-resource-extraction
https://setuptools.readthedocs.io/en/latest/pkg_resources.html#resource-extraction
There was a lot of hand waving and jargon in the documentation, and 0 examples. I'm really confused as to how to structure my setup.py file in order to guarantee the resource extraction. Currently, I just label the whole 2.79 directory as "scripts" in my setuptools Extension and ship it.
Is there a way to write my setup.py and package my module so as to guarantee that the 2.79 directory is placed alongside the currently running Python executable when someone runs
py -3.6.8-32 -m pip install bpy
Besides simply "hacking it in"? I was considering writing an install_requires module that would simply move it if possible, but that is messing with the user's file system and kind of hacky. However, it's the route I am going to go if this proves impossible.
Here is the original issue for anyone interested.
https://github.com/TylerGubala/blenderpy/issues/13
My build process is identical to the process described in my answer here
https://stackoverflow.com/a/51575996/6767685
Maybe try the data_files option of distutils/setuptools.
You could start by adding data_files=[('mydata', ['setup.py'],)], to your setuptools.setup function call. Build a wheel, then install it and see if you can find mydata/setup.py somewhere in your sys.prefix.
In your case the difficult part will be to compute the actual target directory (mydata in this example). It will depend on the platform (Linux, Windows, etc.), if it's in a virtual environment or not, if it's a global or local install (not actually feasible with wheels currently, see update below) and so on.
Finally of course, check that everything gets removed cleanly on uninstall. It's a bit unnecessary when working with virtual environments, but very important in case of a global installation.
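As a rough sketch of that experiment (the mydata target and shipping setup.py itself are placeholders for seeing where files land, not a fix for the 2.79 problem):
# setup.py -- illustrative only
from setuptools import setup

setup(
    name='bpy',
    version='0.0.1',
    # relative paths in data_files are resolved against the install prefix
    data_files=[
        ('mydata', ['setup.py']),
    ],
)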
Update
Looks like your use case requires a custom step at install time of your package (since the location of the binary for the Python interpreter relative to sys.prefix can not be known in advance). This can not be done currently with wheels. You have seen it yourself in this discussion.
Knowing this, my recommendation would be to follow the advice from Jan Vlcinsky in his comment for his answer to this question:
Post install script after installing a wheel.
Add an extra setuptools console entry point to your package (let's call it bpyconfigure).
Instruct the users of your package to run it immediately after installing your package (pip install bpy && bpyconfigure).
The purpose of bpyconfigure should be clearly stated (in the documentation and maybe also as a notice shown in the console right after starting bpyconfigure) since it would write into locations of the file system where pip install does not usually write.
bpyconfigure should figure out where the Python interpreter is, and where to write the extra data.
The extra data to write should be packaged as package_data, so that it can be found with pkg_resources.
Of course bpyconfigure --uninstall should be available as well!
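A sketch of what wiring up such an entry point could look like (bpyconfigure and the bpy.post_install module are hypothetical names following the suggestion above):
# in setup.py
from setuptools import setup, find_packages

setup(
    name='bpy',
    packages=find_packages(),
    # ship the extra data inside the package so pkg_resources can find it
    package_data={'bpy': ['2.79/*']},
    entry_points={
        'console_scripts': [
            # creates a bpyconfigure command alongside pip's other scripts
            'bpyconfigure = bpy.post_install:main',
        ],
    },
)
where bpy/post_install.py would define main() to locate the interpreter via sys.executable and copy the packaged data next to it.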

Difference between adding path to PYTHONPATH and installing your own module

I'm working on a python project that contains a number of routines I use repeatedly. Instead of rewriting code all the time, I just want to update my package and import it; however, it's nowhere near done and is constantly changing. I host the package on a repo so that colleagues on various machines (UNIX + Windows) can pull it into their local repos and use it.
It sounds like I have two options: either I can keep installing the package after every change, or I can just add the folder directory to my system's path. If I change the package, does it need to be reinstalled? I'm using this blog post as inspiration, but the author there doesn't stress the issue of a continuously changing package structure, so I'm not sure how to deal with this.
Also, if I wanted to split the project into multiple files and bundle it as a package, at what level in the directory structure does the PYTHONPATH need to point? To the main project directory, or the sample/ directory?
README.rst
LICENSE
setup.py
requirements.txt
sample/__init__.py
sample/core.py
sample/helpers.py
docs/conf.py
docs/index.rst
tests/test_basic.py
tests/test_advanced.py
In this example, I want to be able to just import the package itself and call the modules within it like this:
import sample
arg = sample.helpers.foo()
out = sample.core.bar(arg)
Where helpers contains a function called foo and core contains a function called bar.
PYTHONPATH is a valid way of doing this, but in my (personal) opinion it's more useful if you have a separate place where you keep your Python packages, like /opt/pythonpkgs or so.
For projects that I want installed but also need to keep developing, I use develop instead of install with setup.py:
When installing the package, don't do:
python setup.py install
Rather, do:
python setup.py develop
What this does is create a symlink/shortcut (I believe it's called an egg-link in Python) in the Python libs directory (where packages are installed) pointing to your module's directory. Hence, since it's a shortcut/symlink/egg-link, whenever you change a Python file, the change is reflected immediately the next time you import that file.
Note: Using this, if you delete the repository/directory you ran this command from, the package will cease to exist (as it's only a shortcut)
The equivalent in pip is -e (for editable):
pip install -e .
Instead of:
pip install .
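For a package that is constantly changing, the setup.py only has to exist, not keep up; something as minimal as this is enough (names are placeholders matching the layout in the question):
# setup.py at the repository root
from setuptools import setup, find_packages

setup(
    name='sample',
    version='0.0.1',  # the version matters little for editable installs
    packages=find_packages(),
)
Run pip install -e . once per machine; after that, every change your colleagues pull from the repo is picked up on the next import, with no reinstall.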

central path for python modules

I am starting to convert a lot of C stuff to Python 3.
In C I defined a directory called "Toolbox", where I put all the functions I needed in different programs, as libraries.
To use a specific library I just had to add the line
#include "/home/User/Toolbox/VectorFunctions.h"
into my source. That way I was able to use the same library in different sources.
In Python I tried to write some Toolbox functions and pull them into the source with import VectorFunctions, which works as long as the file VectorFunctions.py is in the same directory as the source.
Is there a way (I think there must be one...) of telling Python that VectorFunctions.py is located in a different directory, e.g. /home/User/Python_Toolbox?
Thanks for any comment!
What I would do is organize these toolbox functions into one installable Python package, bruno_toolbox, with its own setup.py, install it in development mode to the system site packages using python setup.py develop, and then use bruno_toolbox like any other package on the system, everywhere. Then if that package feels useful, I'd publish it to PyPI for the benefit of everyone.
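The layout for that could be as simple as this (bruno_toolbox is the answer's example name; vector_functions mirrors the question):
bruno_toolbox/
    setup.py
    bruno_toolbox/
        __init__.py
        vector_functions.py
After python setup.py develop, any source file on the system can do:
from bruno_toolbox import vector_functions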
You can use the Python path. Write this code at the beginning of your program:
import sys
sys.path.append('/home/User/Python_Toolbox')
If you have VectorFunctions.py in that folder you can import it:
import VectorFunctions

Python path issue while importing my own modules. What is the best way to go about it?

I've created python modules but they are in different directories.
/xml/xmlcreator.py
/tasklist/tasks.py
Here, tasks.py is trying to import xmlcreator, but both are in different paths. One way to do it is to include xmlcreator.py in the PYTHONPATH. But, considering that I'll be publishing the code, this doesn't seem the right way to go about it, as suggested here. Thus, how do I include xmlcreator, or rather any module that might be written by me, which would be in various directories and subdirectories?
Are you going to publish both modules separately or together in one package?
If the former, then you'll probably want to have your users install your xml module (I'd call it something else :) so that it is, by default, already on Python's path, and declare it as a dependency of the tasklist module.
If both are distributed as a bundle, then relative imports seem to be the best option, since you can control where the paths are relative to each other.
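For the bundled case, such a relative import might look like this (assuming both directories become subpackages of one top-level package, as the next answer lays out):
# inside mypackage/tasklist/tasks.py
from ..xml import xmlcreator  # up one level, then into the sibling xml subpackage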
The best way is to create subpackages in a single top-level package that you define. You then ship these together in one package. If you are using setuptools/Distribute and you want to distribute them separately, then you may also define a "namespace package" that the packages will be installed in. You don't need to use any ugly sys.path hacks.
Make a directory tree like this:
mypackage/__init__.py
mypackage/xml/__init__.py
mypackage/xml/xmlcreator.py
mypackage/tasklist/__init__.py
mypackage/tasklist/tasks.py
The __init__.py files may be empty. They define the directory to be a package that Python will search in.
Except that if you want to use namespace packages, the mypackage/__init__.py should contain:
__import__('pkg_resources').declare_namespace(__name__)
And your setup.py file contain:
...
namespace_packages=["mypackage"],
...
Then in your code:
from mypackage.xml import xmlcreator
from mypackage.tasklist import tasks
Will get them anywhere you need them. You only need to make one name globally unique in this case, the mypackage name.
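Putting the pieces together, a complete setup.py for this layout might look roughly like this (name and version are placeholders):
from setuptools import setup, find_packages

setup(
    name='mypackage',
    version='0.1',
    packages=find_packages(),
    # only needed if you later want to ship the subpackages separately
    namespace_packages=['mypackage'],
)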
For developing the code you can put the package in "develop mode", by doing
python setup.py develop --user
This will set up the local python environment to look for your package in your workspace.
When I start a new Python project, I immediately write its setup.py and declare my Python modules/packages, so that I can then just do:
python setup.py develop
and everything gets magically added to my PYTHONPATH. If you do it from a virtualenv it's even better, since you don't need to install it system-wide.
Here's more about it:
http://packages.python.org/distribute/setuptools.html#development-mode

How to package a command line Python script

I've created a python script that's intended to be used from the command line. How do I go about packaging it? This is my first python package and I've read a bit about setuptools, but I'm still not sure the best way to do this.
Solution
I ended up using setup.py with the key configurations noted below:
setup(
    ...
    entry_points="""
        [console_scripts]
        mycommand = mypackage.mymodule:main
    """,
    ...
)
Here's a good example in context.
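For completeness, the mypackage/mymodule.py side of that entry point is just an ordinary function (the names mirror the snippet above and are placeholders):
# mypackage/mymodule.py
import sys

def main():
    # the generated mycommand wrapper calls main() and exits with its return value
    print('arguments:', sys.argv[1:])
    return 0

if __name__ == '__main__':
    sys.exit(main())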
Rather than using setuptools' non-standard way of proceeding, it is possible to rely directly on distutils' setup function, using the scripts argument, as stated here: http://docs.python.org/distutils/setupscript.html#installing-scripts
from distutils.core import setup

setup(
    ...,
    scripts=['path/to/your/script'],
    ...
)
It allows you a) to stay compatible with all Python versions and b) to avoid relying on setuptools as an external dependency.
@Zach, given your clarification in your comment to @soulmerge's answer, it looks like what you need is to write a setup.py as per the instructions regarding distutils -- here in particular is how you register on PyPI, and here is how to upload to PyPI once you are registered -- and possibly (if you need some extra functionality beyond what distutils supplies on its own) add setuptools, of which easy_install is part, via the instructions here.
Last month, I have written an article answering exactly your question. You can find it here: http://gehrcke.de/2014/02/distributing-a-python-command-line-application/
There, I am using only currently recommended methods (twine, pure setuptools instead of distutils, the console_scripts key in the entry_points dictionary, ...), which work for Python 2 and 3.
What do you mean by packaging? If it is a single script to be run on a box that already has python installed, you just need to put a shebang into the first line of the file and that's it.
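For example, the script's first line would be:
#!/usr/bin/env python
followed by a chmod +x yourscript.py so the shell can execute it directly.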
If you want it to be executed under Windows or on a box without python, though, you will need something external, like pyinstaller.
If your question is about where to put configuration/data files, you'll need to work platform-dependently (like writing into the registry or the home folder), as far as I know.
For those who are beginners in Python Packaging, I suggest going through this Python Packaging Tutorial.
Note about the tutorial:
At this time, this documentation focuses on Python 2.x only, and may not be as applicable to packages targeted to Python 3.x
