I have a single Python 3 .py module with a few dependencies (lockfile, python-daemon). Is there a simple way to package this with its dependencies so that users do not need to download and install the other modules? An all-included install is what I am trying to achieve.
I tried looking at setuptools, distribute, and distutils and ended up even more confused than when I started.
The simplest way I often see used is to put all your dependencies in a single file (usually named requirements.txt) and then ask the user to run the following command:
pip install -r requirements.txt
And here is an example of the contents of such a file (https://github.com/cenkalti/pypi-notifier/blob/master/requirements.txt):
Flask==0.10.1
Flask-Cache==0.12
Flask-SQLAlchemy==1.0
Flask-Script==0.5.3
GitHub-Flask==0.3.4
Jinja2==2.7
MarkupSafe==0.18
SQLAlchemy==0.8.2
...
cx_Freeze should do what you're looking for.
You could quite easily do this with something simple like a .zip file containing all the files; as long as all the files are extracted to the same directory, they should all work! The downfall is if the modules have lots of dependencies of their own, i.e. extra folders you would need to track down.
I also think a fair number of people/companies write their own packaging systems, so that all the modules are in one .py file that runs in the console and extracts everything to its correct place. This would require a fair amount of work though, so you may want to try to find one prebuilt. I've gone down this route and it didn't prove too taxing until I had to unzip .zips with files in them...
As another solution you could try py2exe to export everything to a single .exe file (Windows only, though).
I personally haven't used setuptools, distribute, or distutils, so I can't comment on those, unfortunately.
One other thing to bear in mind is the licence for each module; some may not be allowed to be redistributed, so check first!
py2exe is fine, but will limit you to Windows.
The best way to do this without limiting your audience, while following generally accepted best practices, is to create a requirements.txt and a setup.py file and then upload your project to GitHub. See https://github.com/sourcegraph/python-deps as a reference. The requirements.txt lists the dependencies in a simple, easy-to-read format, and you specify the commands and library modules your project installs using the scripts and py_modules options in setup.py.
Assuming your git repository is at github.com/foo/bar, your users can then do pip install git+https://github.com/foo/bar.
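For concreteness, a setup.py along those lines could look roughly like this - a sketch, where the project name, module name, and script path are placeholders, and the dependencies shown are the lockfile/python-daemon ones from the question:
from setuptools import setup

setup(
    name='bar',                    # placeholder project name
    version='0.1',
    py_modules=['bar'],            # the single .py module to install
    scripts=['bin/bar'],           # command(s) to put on the user's PATH
    install_requires=['lockfile', 'python-daemon'],  # pip fetches these automatically
)
With install_requires in place, pip resolves the dependencies on its own during the git+https install.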
Read the official Python packaging tutorial.
You create a Python package from your module with setup.py.
In setup.py you can declare which other Python packages must be installed as dependencies.
Furthermore, you can pin application-specific dependency versions with requirements.txt.
Related
For my simple C library I have a simple Python wrapper using ctypes. I am now writing the Makefile for my library. In it under the install target I have:
install -m 644 mylib.py $(shell python3 -m site | grep $(PREFIX).*packages | cut -d \' -f2)/
This is something I contrived myself. I don't want to use distutils or all those multiple Python installation "solutions" because they are too convoluted for my simple requirements: once the .so file is installed, all that is really needed for my library to be accessible from Python is for the .py interface to be in the correct importable location. So I'm figuring a simple copy should do.
Does anyone foresee any problem with what I am doing? If so, is there a better way to do this?
Thanks!
By using setuptools and uploading your source files to pypi.org, anyone in the world can install your project by simply typing pip install <yourmodule>.
Moreover, larger Python projects that happen to want to use your library just have to list its name on a single line, either in a requirements.txt file or in setup.py.
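As a rough sketch of that upload workflow (the tool choice is my suggestion, not mandated; twine is one common uploader):
python setup.py sdist    # build a source distribution into dist/
twine upload dist/*      # push it to pypi.org (requires an account there)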
If your project is not open source and you don't intend it to be accessible by "the world", the distutils solutions - no quotes needed - allow your project to be fetched from a private Git repository, or even a private Python "Cheese Shop".
If you absolutely don't care about doing things the right way, you might as well just drop a plain-English line in your README telling the user to copy your ".py" file to any folder he likes - installation won't be what Python users expect anyway.
Otherwise, just have a minimalist setup.py file and a make line that runs python setup.py install - it certainly won't hurt. (Without it, your library will still fall short of usable in larger projects that pull in several packages - all of those need their dependencies to be pip-installable.)
The minimalist setup.py file is a two-liner and can be found here:
https://docs.python.org/2/distutils/examples.html#pure-python-distribution-by-module - certainly far from "convoluted".
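For convenience, the example from that page amounts to the following (foo is the docs' placeholder module name):
from distutils.core import setup
setup(name='foo', version='1.0', py_modules=['foo'])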
I am writing a program in Python to be sent to other people who are running the same Python version; however, there are some 3rd-party modules that need to be installed to use it.
Is there a way to compile it into a .pyc (I only say .pyc because it's a Python compiled file) that has all the dependent modules inside it as well?
So they can run the program without needing to install the modules separately?
Edit:
Sorry if it wasn't clear, but I am aware of things such as cx_Freeze etc.; what I'm trying to get is just a single Python file.
So they can just type "python myapp.py" and then it will run - no installation of anything, as if all the modules' code were in my .py file.
If you are on Python 2.3 or later and your dependencies are pure Python:
If you don't want to go the setuptools or distutils route, you can provide a zip file containing the .pyc files for your code and all of its dependencies. You will have to do a little work to make any complex pathing inside the zip file available (this is not necessary if the dependencies are just lying at the root of the zip). Then just add the zip's location to your path and it should work just as if the dependency files had been installed.
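A minimal sketch of that idea, assuming a hypothetical deps.zip with the dependency modules at its root:
import sys

sys.path.insert(0, 'deps.zip')  # zip archives on sys.path are importable (zipimport)
import some_dependency          # hypothetical module sitting at the root of deps.zip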
If your dependencies include .pyds or other binary dependencies you'll probably have to fall back on distutils.
You can simply include .pyc files for the required libraries, but no - a .pyc cannot work as a container for multiple files (unless you collect all the source into one .py file and then compile it).
It sounds like what you're after is the ability for your end users to run one command, e.g. install my_custom_package_and_all_required_dependencies, and have it assemble everything it needs.
This is a perfect use case for distutils, with which you can make manifests for your own code that link out to external dependencies. If your 3rd party modules are available publicly in a standard format (they should be, and if they're not, it's pretty easy to package them yourself), then this approach has the benefit of allowing you to very easily change what versions of 3rd party libraries your code runs against (see this section of the above linked doc). If you're dead set on packaging others' code with your own, you can always include the required files in the .egg you create with distutils.
Two options:
build a package that will install the dependencies for them (I don't recommend this if the only dependencies are python packages that are installed with pip)
Use virtual environments: you use the existing Python on their system, but the Python modules are installed into the virtualenv (see the sketch after this list).
Or I suppose you could just punt, create a shell script that installs them, and tell them to run it once before they run your stuff.
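A rough sketch of the virtualenv route, assuming you ship a requirements.txt with your code:
virtualenv venv                   # create an isolated environment
. venv/bin/activate               # activate it (on Windows: venv\Scripts\activate)
pip install -r requirements.txt   # modules land in the venv, not the system Python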
I am trying to write a Python script that I can easily hand to friends without dependency problems, but am not sure how to do so. Specifically, my script relies on code from BeautifulSoup, but rather than forcing friends to install BeautifulSoup, I would rather just package the BeautifulSoup source into a Libraries/ folder in my project files and call functions from there. However, I can then no longer simply "import bs4". What is the correct way of going about this?
Thanks!
A common approach is to ship a requirements file with the project, specifying which version of which library is required. This file is (by convention) often named requirements.txt and looks something like this:
MyApp
BeautifulSoup==3.2.1
SomeOtherLib==0.9.4
YetAnother>=0.2
(The fictional file above says: I need exactly BeautifulSoup 3.2.1, SomeOtherLib 0.9.4, and any version of YetAnother greater than or equal to 0.2.)
The user of this project can then simply take your library, (create a virtualenv) and run
$ pip install -r requirements.txt
which will then fetch all the libraries and make them available either system-wide or project-wide (if a virtualenv is used). Here's a random Python project off GitHub that has a requirements file:
https://github.com/laoqiu/pypress
https://github.com/laoqiu/pypress/blob/master/requirements.txt
The nice thing about this approach is that you'll get your transitive dependencies resolved automatically. Also, if you use virtualenv, you'll get a clean separation of your projects and avoid library version collisions.
You must add Libraries/ (converted to an absolute path first) to sys.path before attempting to import anything under it.
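For example, something along these lines at the top of your entry script (the Libraries/ layout is the one you described; the rest is a sketch):
import os
import sys

# prepend the bundled Libraries/ folder, resolved to an absolute path
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), 'Libraries'))

import bs4  # now found inside Libraries/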
I want to distribute some Python code, with a few external dependencies, to machines with only core Python installed (and users who are unfamiliar with easy_install, etc.).
I was wondering if perhaps virtualenv can be used for this purpose? I should be able to write some bash scripts that trigger the virtualenv (with the suitable packages) and then run my code.. but this seems somewhat messy, and I'm wondering if I'm re-inventing the wheel?
Are there any simple solutions to distributing python code with dependencies, that ideally doesn't require sudo on client machines?
Buildout - http://pypi.python.org/pypi/zc.buildout
As a sample, look at my clean project: http://hg.jackleo.info/hyde-0.5.3-buildout-enviroment/src - it's only 2 files that do the magic. Moreover, the Makefile is optional, but without it you'll need to fetch bootstrap.py yourself (the Makefile downloads it, but the Makefile runs only on Linux). buildout.cfg is the main file, where you declare dependencies and configure how the project is laid out.
To get bootstrap.py, just download it from http://svn.zope.org/repos/main/zc.buildout/trunk/bootstrap/bootstrap.py
Then run python bootstrap.py followed by bin/buildout. I do not recommend installing buildout locally, although it is possible; just use the one bootstrap downloads.
I must admit that buildout is not the easiest solution, but it's really powerful, so it's worth the time to learn.
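For orientation, a bare-bones buildout.cfg might look roughly like this (the part name and egg list are illustrative, not taken from the hyde project):
[buildout]
parts = app

[app]
recipe = zc.recipe.egg
eggs =
    somepackage
    anotherpackage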
UPDATE 2014-05-30
Since this was recently up-voted and (probably) used as an answer, I want to point out a few changes.
First of all, buildout is now downloaded from GitHub: https://raw.githubusercontent.com/buildout/buildout/master/bootstrap/bootstrap.py
That hyde project would probably fail due to breaking changes in buildout 2.
You can find better samples at http://www.buildout.org/en/latest/docs/index.html. I also suggest looking at the "collection of links related to Buildout" part; it might contain info for your project.
Secondly, I am personally now more in favor of a setup.py script that can be installed using Python. More about the egg structure can be found at http://peak.telecommunity.com/DevCenter/PythonEggs, and if that looks too scary, search Google for "python egg". In my opinion it's actually simpler than buildout (and definitely easier to debug), as well as probably more useful, since it can be distributed more easily and installed anywhere with the help of virtualenv, or globally, whereas with buildout you have to ship all of the build scripts along with the source all of the time.
You can use a tool like PyInstaller for this purpose. Your application will appear as a single executable on all platforms, and include dependencies. The user doesn't even need Python installed!
As an example, see my logview package, which has dependencies on PyQt4 and ZeroMQ and includes distributions for Linux, Mac OS X, and Windows, all created using PyInstaller.
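The basic invocation is along these lines (myapp.py is a placeholder; --onefile asks PyInstaller for a single-file bundle):
pip install pyinstaller
pyinstaller --onefile myapp.py   # the bundled executable appears under dist/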
You don't want to distribute your virtualenv, if that's what you're asking. But you can use pip to create a requirements file - typically called requirements.txt - and tell your users to create a virtualenv and then run pip install -r requirements.txt, which will install all the dependencies for them.
See the pip docs for a description of the requirements file format, and the Pinax project for an example of a project that does this very well.
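Generating the requirements file itself is one command; doing it inside a clean virtualenv keeps the list down to what the project actually needs:
pip freeze > requirements.txt     # record the exact versions installed in your environment
pip install -r requirements.txt   # what your users then run in their own virtualenv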
I need to write, or find, a script to create a Debian package, using package python-support, from a Python package. The Python package will be pure Python without C extensions.
The Python package for testing purposes will just be a directory with an empty __init__.py file and a single Python module, package_test.py.
The packaging script must use python-support to provide the correct bytecode for possible multiple installations of Python on a target platform, i.e. v2.5 and v2.6 on Ubuntu 9.04 (Jaunty Jackalope).
Most of the advice I find while googling is just examples of nasty hacks that don't even use python-support or python-central.
I have spent hours researching this, and the best I can come up with is to hack around the script from an existing open source project, but I don't know which bits are required for what I'm doing.
Has anyone here made a Debian package out of a Python package in a reasonably non-hacky way?
I'm starting to think that it will take me more than a week to go from no knowledge of Debian packaging and python-support to getting a working script. How long has it taken others?
The right way of building a .deb package is using dpkg-buildpackage, but sometimes it is a little bit complicated. Instead you can use dpkg -b <folder>, and it will create your Debian package.
These are the basics for creating a Debian package with dpkg -b <folder> with any binary or with any kind of script that runs automatically without needing manual compilation (Python, Bash, Perl, and Ruby):
Create the files and folders in order to recreate the following structure:
ProgramName-Version/
ProgramName-Version/DEBIAN
ProgramName-Version/DEBIAN/control
ProgramName-Version/usr/
ProgramName-Version/usr/bin/
ProgramName-Version/usr/bin/your_script
The scripts placed in /usr/bin/ are directly callable from the terminal. Note that I didn't add an extension to the script. Also notice that the structure of the .deb package mirrors the structure of the program once it's installed: following this logic, if your program has a single file, you can place it directly at ProgramName-Version/usr/bin/your_script, but if you have multiple files, you should place them under ProgramName-Version/usr/share/ProgramName/ and put only one file under /usr/bin/ that calls your scripts from /usr/share/ProgramName/.
Change the ownership of the whole folder tree to root:
chown root:root -R /path/to/ProgramName-Version
Change the script's permissions:
chmod 0755 /path/to/the/script
Finally, you can run: dpkg -b /path/to/the/ProgramName-Version and your .deb package will be created! (You can also add the post/pre install scripts and everything you want. It works like a normal Debian package.)
Here is an example of the control file. You only need to copy-paste it into an empty file called "control" and put that file in the DEBIAN folder.
Package: ProgramName
Version: VERSION
Architecture: all
Maintainer: YOUR NAME <EMAIL>
Depends: python2.7, etc , etc,
Installed-Size: in_kb
Homepage: http://example.com
Description: Here you can put a one line description. This is the short Description.
Here you put the long description, indented by one space.
The full article about Debian packages can be read here.
I would take the sources of an existing Debian package, and replace the actual package in it with your package. To find a list of packages that depend on python-support, do
apt-cache rdepends python-support
Pick a package that is Architecture: all, so that it is a pure-Python package. Going through this list, I found that e.g. python-flup might be a good starting point.
To get the source of one such package, do
apt-get source <package>
To build it, do
cd <packagesrc>
dpkg-buildpackage -rfakeroot
When editing it, expect that you will only need to change the files in the debian folder; replace all references to flup with your own package name.
Once you get started, it should take you a day to complete.
I think you want http://pypi.python.org/pypi/stdeb:
stdeb produces Debian source packages from Python packages via a new distutils command, sdist_dsc. Automatic defaults are provided for the Debian package, but many aspects of the resulting package can be customized (see the customizing section, below). An additional command, bdist_deb, creates a Debian binary package, a .deb file.
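Usage is roughly as follows (per stdeb's documentation; run from the directory containing your setup.py):
pip install stdeb
python setup.py --command-packages=stdeb.command bdist_deb
# the finished .deb ends up under deb_dist/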
Most of the answers posted here are outdated, but fortunately a great Debian wiki post has been made recently, which explains the current best practices and describes how to build Debian packages for Python modules and applications.
http://wiki.debian.org/Python/Packaging
First off, there are plenty of Python packages already in Debian; you can download the source (including all the packaging) for any of them either using apt-get source or by visiting http://packages.debian.org.
You may find the following resources of use:
Debian New Maintainer's Guide
Debian Policy Manual
Debian Python Policy
Debian Python Modules Team