Can someone point out a simple example of making a Python package using distutils that depends on other Python packages being installed, e.g. numpy (at a particular version) and scipy? I found very simple examples online, but could not find an example that depends on a known package. I want the easiest setup where such dependencies are installed for the user when the user installs the package I am defining via setup.py install. Thanks.
Distutils by itself does not install dependencies. You need to use an add-on to Distutils, like pip or plain Distribute/setuptools.
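For illustration, here is a minimal sketch of a setup.py that declares such dependencies via setuptools rather than plain distutils; the project name and the version pin are hypothetical:

from setuptools import setup

setup(
    name='mypackage',          # hypothetical project name
    version='0.1',
    packages=['mypackage'],
    install_requires=[
        'numpy>=1.6,<1.7',     # pin a particular numpy version range
        'scipy',               # any scipy version
    ],
)

With setuptools installed, python setup.py install (or pip install .) will then fetch missing dependencies from PyPI.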
The pip tool makes Python module installation a breeze -- for the most part. But when the module requires some external C libraries that are not found in "standard locations", it can cause problems. Case in point: I was trying to install the gmpy2 Python module, which needs access to the gmp, mpfr, and mpc libraries. On the system I'm using (a Linux HPC cluster running RHEL 6.9), the system-wide libraries are very old. The HPC system admin provides more up-to-date libraries, but not in /usr/lib or /usr/local/lib. My question is: can we still use pip to build the binary parts of the Python module? How do I specify the custom include and library locations? I was forced to fall back to the python setup.py approach: first using the build_ext subcommand, then invoking the install subcommand to finish the install. This works, but is rather messy.
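For reference, a sketch of that two-step approach with custom locations; the paths below are placeholders for wherever the admin put the newer libraries:

python setup.py build_ext --include-dirs=/opt/hpc/include --library-dirs=/opt/hpc/lib
python setup.py install

Depending on the pip version, the same build_ext options can sometimes be forwarded through pip with repeated --global-option flags, or picked up from the CFLAGS/LDFLAGS environment variables; whether that works is version-dependent, so treat it as something to test rather than a guarantee.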
I need to build a Python module from source. It is just my second build, and I'm a bit confused about the interaction between built packages and binaries installed through the package manager.
Do I need to uninstall the binary first?
If I don't need to, will it overwrite the installed version, or will both be available?
If it will not overwrite it, how can I import the built version into Python?
Thank you all!
P.S.: In case it matters, I'm on Fedora 24 and the package is matplotlib, installed through a setup.py.
I strongly recommend using virtualenv and building your package inside it. Is it really necessary to install via setup.py? If not, consider using pip to install your package inside the virtualenv.
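A minimal sketch of that workflow, using matplotlib from the question as the example:

virtualenv myenv              # create an isolated environment
source myenv/bin/activate     # activate it
pip install matplotlib        # installed into myenv only

Packages installed this way live inside the virtualenv, so they neither overwrite nor conflict with the version Fedora's package manager installed system-wide.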
I am packaging a Python application that depends on several C libraries through GObject introspection. I would like to make sure that, at least, the Python module from glib is installed (that is, the gi module, packaged as python-gi in Debian; I am not talking about the deprecated PyGObject module). Adding it as a regular dependency makes the install fail, since it is not on PyPI.
How should I declare this? I looked at the setuptools docs, and nothing I see quite does the trick.
Thanks.
Related question:
Bundling GTK3+ with py2exe
You cannot specify non-Python dependencies using setuptools (AFAIK, that is ...).
The install_requires keyword to setuptools.setup can specify Python-style dependencies only; it targets the Python packaging infrastructure. Python-style installer programs (pip, easy_install, or python setup.py install) resolve such dependencies using strategies that find and install Python-style packages only. One of those strategies is using a package index like PyPI.
If you want to create a package that has Debian-style dependencies, which are resolved by Debian-style installers using Debian package repos, you have to create a Debian package. There are tools that support creating Debian packages from Python projects, for example easydeb and stdeb. However, most people recommend going the extra mile and explicitly creating a proper Debian package.
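As a rough illustration of the stdeb route (assuming stdeb is installed and the project already has a working setup.py):

python setup.py --command-packages=stdeb.command bdist_deb

This produces a .deb under deb_dist/; Debian-level dependencies such as python-gi can then be declared through stdeb's configuration (a stdeb.cfg file) rather than through install_requires.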
In the general case, packaging and distributing Python projects via PyPI should be the way to go. It's platform- and distro-independent, and plays nicely with Python-specific installers like pip and tools like virtualenv or buildout. Having a dependency on PyGI would involve documenting that fact for users, as e.g. the pydbus package does in its README:
It's based on PyGI, the Python GObject Introspection bindings, which is the recommended way to use GLib from Python. Unfortunately, PyGI is not packaged on pypi, so you need to install it from your distribution's repository (usually called python-gi, python-gobject or pygobject3).
Your project could also be defensive when importing PyGI, presenting users with a digestible error message like "please sudo apt-get install python-gi" when the import fails.
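For example, a small defensive-import sketch (the message wording is just illustrative):

try:
    import gi
except ImportError:
    raise SystemExit(
        "This application requires PyGI. Please install it from your "
        "distribution's repository, e.g.: sudo apt-get install python-gi"
    )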
I have read the documentation but I don't understand.
Why do I have to use distutils to install Python modules?
Why can't I just save the modules somewhere on the Python path?
You don't have to use distutils. You can install modules manually, just like you can compile a C++ library manually (compile every implementation file, then link the .obj files) or install an application manually (compile it, put it into its own directory, add a shortcut for launching). It just gets tedious and error-prone, as every repetitive task done manually does.
Moreover, the manual steps I listed for the examples are pretty optimistic - often you want to do more. For example, PyQt adds its .ui-to-.py compiler to the path so you can invoke it via the command line.
So you end up with a stack of work that could be automated. This alone is a good argument.
Also, the devs would have to write installation instructions. With distutils etc., you only have to specify what your project consists of (and fancy extras only if you need them) - for example, you don't need to tell it to put everything in a new folder in site-packages, because it already knows that.
So in the end, it's easier for developers and for users.
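For example, a minimal distutils setup.py only declares what the project consists of (names hypothetical):

from distutils.core import setup

setup(
    name='myproject',
    version='1.0',
    py_modules=['mymodule'],  # distutils already knows this goes into site-packages
)

Running python setup.py install then copies the module to the right place for the user's Python, with no hand-written install instructions needed.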
Which Python modules? To install a Python package, if it exists on PyPI, you should do:
pip install <name_of_package>
If not, you should download the .tar.gz (or whatever archive it ships as), check that it contains a setup.py, and run it like this:
python setup.py install
Or, if you want to install it in development mode (so you can change the package and see the result without installing it again):
python setup.py develop
This is the usual way to distribute a Python package (the setup.py), and this setup.py is the one that calls distutils.
To summarize: distutils is a Python package that helps a developer create a package installer, which builds and installs a given package by just running python setup.py install.
So, basically, this is what distutils (together with its setuptools extension) does - I will list only the important things:
It tracks the dependencies of the package (with setuptools, they are installed automatically).
It copies the package modules into site-packages, or just creates a link back to the source tree if it's in develop mode.
You can create an egg of your package.
It can also run tests over your package.
You can use it to upload your package to PyPI.
If you want more detail, see http://docs.python.org/library/distutils.html. The sketch below shows how these features map onto setup.py commands.
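Roughly, those features correspond to commands like these (the ones marked "setuptools" need setuptools rather than plain distutils):

python setup.py install       # build and copy the package into site-packages
python setup.py develop       # link to the source tree instead (setuptools)
python setup.py sdist         # create a source archive for distribution
python setup.py bdist_egg     # create an egg (setuptools)
python setup.py test          # run the package's tests (setuptools)
python setup.py sdist upload  # upload a release to PyPI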
You don't have to use distutils to get your own modules working on your own machine; saving them somewhere on your Python path is sufficient.
When you decide to publish your modules for other people to use, distutils provides a standard way for them to install your modules on their machines. (The "dist" in "distutils" means distribution, as in distributing your software to others.)
I want to create a suite of interrelated packages in Python. I would like them all to live under the same top-level package, but be installable as separate components.
So, for example, installing the base package would provide mypackage, but there would be nothing in mypackage.subpackage until I install that separately.
Is this possible with distutils and pip?
What you are looking for is called "namespace packages"; see this SO question.
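A minimal sketch of the classic setuptools approach, with hypothetical names: each separately installable distribution ships the same one-line mypackage/__init__.py, and its setup.py declares the shared namespace.

# mypackage/__init__.py, identical in every distribution sharing the namespace
__import__('pkg_resources').declare_namespace(__name__)

# setup.py of the subpackage distribution
from setuptools import setup

setup(
    name='mypackage.subpackage',
    version='0.1',
    packages=['mypackage', 'mypackage.subpackage'],
    namespace_packages=['mypackage'],  # mark mypackage as a shared namespace
)

With only the base distribution installed, import mypackage works, while import mypackage.subpackage raises ImportError until the second distribution is installed as well.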