Scripts generated by zc.buildout using zc.recipe.egg in our <package>/bin/ directory look like this:
#! <python shebang> -S
import sys
sys.path[0:0] = [
... # some paths derived from the eggs
... # some other paths included with zc.recipe.egg `extra-path`
]
# some user initialization code from zc.recipe.egg `initialization`
# import function, call function
What I have not been able to do is find a way to programmatically prepend a path to the sys.path construction inserted in every script. Is this possible?
Why: I have a version of my python project installed globally and another version of it installed locally (off-buildout tree). I want to be able to switch between these two versions.
Note: Clearly, one can use the zc.recipe.egg/initialization property to add something like:
initialization = sys.path[0:0] = ['/add/path/to/my/eggs']
But, is there any other way? Extra points for an example!
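For reference, here is a minimal sketch of that initialization route in a buildout.cfg (the part name and the path are made up):

```
[python]
recipe = zc.recipe.egg
interpreter = python
eggs = my_project
initialization =
    sys.path[0:0] = ['/add/path/to/my/eggs']
```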
Finally, I got a working environment by creating my own buildout recipe, which you can find here: https://github.com/idiap/local.bob.recipe. The file that contains the recipe is this one: https://github.com/idiap/local.bob.recipe/blob/master/config.py. There are lots of checks specific to our software in the class constructor, plus some extra improvements, but don't be bothered by that. The "real meat (TM)" is in the install() method of that class. It goes more or less like this:
egg_link = os.path.join(self.buildout['buildout']['eggs-directory'], 'external-package.egg-link')
f = open(egg_link, 'wt')
f.write(self.options['install-directory'] + '\n')
f.close()
self.options.created(egg_link)
return self.options.created()
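For context, the full recipe class around that install() body might be sketched as follows. This is a simplified stand-in, not the actual code from the repository: a real recipe would use self.options.created() from the zc.buildout API, which is replaced here by a plain list so the sketch stays self-contained:

```python
import os

class Config(object):
    """Sketch of a buildout recipe that links in an externally built
    package by writing an .egg-link file into the eggs directory."""

    def __init__(self, buildout, name, options):
        self.buildout = buildout
        self.name = name
        self.options = options
        self._created = []  # stand-in for self.options.created()

    def install(self):
        # point an .egg-link file at the external build directory
        egg_link = os.path.join(self.buildout['buildout']['eggs-directory'],
                                'external-package.egg-link')
        with open(egg_link, 'wt') as f:
            f.write(self.options['install-directory'] + '\n')
        self._created.append(egg_link)
        return self._created

    update = install
```

Setuptools (and hence buildout) follows .egg-link files to the directory they name, which is what makes the external build visible to the generated scripts.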
This will do the trick. My external (CMake-based) package now only has to create the right .egg-info file alongside the python package(s) it builds. Then, using the above recipe, I can tie in a specific package installation like this:
[buildout]
parts = external_package python
develop = .
eggs = my_project
       external_package
       recipe.as.above

[external_package]
recipe = recipe.as.above:config
install-directory = ../path/to/my/local/package/build

[python]
recipe = zc.recipe.egg
interpreter = python
eggs = ${buildout:eggs}
If you wish to switch installations, just change the install-directory property above. If you wish to use the default installation available system-wide, just remove the recipe.as.above constructions from your buildout.cfg file altogether. Buildout will find the global installation without requiring any extra configuration. Uninstallation will work properly as well, so switching between builds will just work.
Here is a fully working buildout .cfg file that we use here: https://github.com/idiap/bob.project.example/blob/master/localbob.cfg
The question is: is there an easier way to achieve the same without this external recipe?
Well, what you miss is probably the most useful buildout extension, mr.developer.
Typically the package, let's say foo.bar will be in some repo, let's say git.
Your buildout will look like
[buildout]
extensions = mr.developer
[sources]
foo.bar = git git@github.com:foo/foo.bar.git
If you don't have your package in a repo, you can use fs instead of git; have a look at the documentation for details.
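For instance, a filesystem-based source entry might look like this (assuming the checkout lives under the sources directory, typically src/, with the name foo.bar):

```
[sources]
foo.bar = fs foo.bar
```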
Activating the "local" version is done by
./bin/develop a foo.bar
Deactivating by
./bin/develop d foo.bar
There are quite a few other things you can do with mr.developer, do check it out!
Related
I have had a very frustrating start learning Plone development. I would like to develop a dexterity-based content type for Plone 4. I'm an experienced python developer with some knowledge of Zope and Grok, but rather new to buildout. That said, I read "Professional Plone 4 Development" by Martin Aspeli, but quite a lot of the version information in the book seems to be outdated.
Using buildout I was able to get a Plone instance up and running. ZopeSkel is installed but when I try to create a new package, I get an error like this:
**************************************************************************
** Your new package supports local commands. To access them, change
** directories into the 'src' directory inside your new package.
** From there, you will be able to run the command `paster add
** --list` to see the local commands available for this package.
**************************************************************************
ERROR: No egg-info directory found (looked in ./domma.voucher/./domma.voucher.egg-info, ./domma.voucher/bootstrap.py/domma.voucher.egg-info, ./domma.voucher/bootstrap.pyo/domma.voucher.egg-info, ./domma.voucher/buildout.cfg/domma.voucher.egg-info, ./domma.voucher/CHANGES.txt/domma.voucher.egg-info, ./domma.voucher/CONTRIBUTORS.txt/domma.voucher.egg-info, ./domma.voucher/docs/domma.voucher.egg-info, ./domma.voucher/domma/domma.voucher.egg-info, ./domma.voucher/README.txt/domma.voucher.egg-info, ./domma.voucher/setup.cfg/domma.voucher.egg-info, ./domma.voucher/setup.py/domma.voucher.egg-info, ./domma.voucher/src/domma.voucher.egg-info)
If I try to run paster from within the given directory, it tells me that the command "add" is not known. I tried different versions of ZopeSkel and tried the raw plone templates and also zopeskel.dexterity. The output changes slightly depending on version and template, but the result remains the same.
Obviously Plone development seems to be very sensitive to version changes, which makes older documentation quite useless. http://plone.org/products/dexterity/documentation/manual/developer-manual tells me that it was last updated 1114 days ago.
Could somebody give me a starting point to develop a very simple dexterity content type for Plone 4 which really works?
For what it's worth, whilst there are a few newer versions of some packages, Professional Plone 4 Development is current with Plone 4.1. I would suggest you use it, and start from its sample code. Don't try to arbitrarily upgrade things until you know you have a working starting point, and you should be OK.
http://pigeonflight.blogspot.com/2012/01/dexterity-development-quickstart-using.html offers a nice quickstart. The most current Dexterity docs are at http://dexterity-developer-manual.readthedocs.org/en/latest/index.html. Yes, this is a little bit of a moving target, documentation-wise, not so much due to Dexterity, which is stable and in production, but mainly because Zopeskel is under heavy development/modernization right now. Sorry about that.
From https://github.com/collective/templer.plone/blob/master/README.txt:
Templer cannot coexist with old ZopeSkel in the same buildout, or Python virtualenv.
Otherwise you will encounter the following error when trying to create packages:
IOError: No egg-info directory found (looked in ./mycompany.content/./mycompany.content.egg-info, ....
Templer is the latest incarnation of ZopeSkel (version 3). I am not sure which version of ZopeSkel you have, or whether you have mixed versions installed in buildout or a virtualenv, but a conflicting installation of ZopeSkel is likely the culprit.
I would start from scratch: recreate the virtualenv and just install the latest version of ZopeSkel 2 via buildout. ZopeSkel 3 (Templer) is still in heavy development and not all templates have been migrated.
I was able to create a new Plone 4.1.4 site with a new Dexterity content-type using this buildout. This should not be an official answer but pasting the configuration to a volatile service like pastebin is not an option for permanent documentation.
# buildout.cfg file for Plone 4 development work
# - for production installations please use http://plone.org/download
# Each part has more information about its recipe on PyPI
# http://pypi.python.org/pypi
# ... just search by the recipe name
[buildout]
parts =
    instance
    zopepy
    i18ndude
    zopeskel
    test
#    omelette

extends =
    http://dist.plone.org/release/4.1-latest/versions.cfg
    http://good-py.appspot.com/release/dexterity/1.2.1?plone=4.1.4

# Add additional egg download sources here. dist.plone.org contains archives
# of Plone packages.
find-links =
    http://dist.plone.org/release/4.1-latest
    http://dist.plone.org/thirdparty

extensions =
    mr.developer
    buildout.dumppickedversions

sources = sources
versions = versions

auto-checkout =
    nva.borrow

# Create bin/instance command to manage Zope start up and shutdown
[instance]
recipe = plone.recipe.zope2instance
user = admin:admin
http-address = 16080
debug-mode = off
verbose-security = on
blob-storage = var/blobstorage
zope-conf-additional = %import sauna.reload

eggs =
    Pillow
    Plone
    nva.borrow
    sauna.reload
    plone.app.dexterity

# Some pre-Plone 3.3 packages may need you to register the package name here
# in order for their configure.zcml to be run (http://plone.org/products/plone/roadmap/247)
# - this is never required for packages in the Products namespace (Products.*)
zcml =
#    nva.borrow
    sauna.reload

# the zopepy command allows you to execute Python scripts using a PYTHONPATH
# including all the configured eggs
[zopepy]
recipe = zc.recipe.egg
eggs = ${instance:eggs}
interpreter = zopepy
scripts = zopepy

# create bin/i18ndude command
[i18ndude]
unzip = true
recipe = zc.recipe.egg
eggs = i18ndude

# create bin/test command
[test]
recipe = zc.recipe.testrunner
defaults = ['--auto-color', '--auto-progress']
eggs =
    ${instance:eggs}

# create ZopeSkel and paster commands with dexterity support
[zopeskel]
recipe = zc.recipe.egg
eggs =
    ZopeSkel<=2.99
    PasteScript
    zopeskel.dexterity<=2.99
    ${instance:eggs}

# symlinks all Python source code to the parts/omelette folder when buildout
# is run. Windows users will need to install additional software for this part
# to build correctly. See http://pypi.python.org/pypi/collective.recipe.omelette
# for relevant details.
# [omelette]
# recipe = collective.recipe.omelette
# eggs = ${instance:eggs}

# Put your mr.developer managed source code repositories here, see
# http://pypi.python.org/pypi/mr.developer for details on the format of
# this part
[sources]
nva.borrow = svn https://novareto.googlecode.com/svn/nva.borrow/trunk

# Version pins for new style products go here - this section extends the one
# provided in http://dist.plone.org/release/
[versions]
SITUATION:
I have a python library, which is controlled by git and bundled with distutils/setuptools. I want to automatically generate the version number based on git tags, both for setup.py sdist and similar commands, and for the library itself.
For the first task I can use git describe or alike solutions (see How can I get the version defined in setup.py (setuptools) in my package?).
And when, for example, I am at tag '0.1' and call 'setup.py sdist', I get 'mylib-0.1.tar.gz'; or 'mylib-0.1-3-abcd.tar.gz' if I altered the code after tagging. This is fine.
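The mapping from git describe output to such an archive version can be sketched in a few lines of string handling (the function name and the exact output format are my own, for illustration only):

```python
def version_from_describe(desc):
    """Turn `git describe` output such as '0.1' or '0.1-3-gabcd'
    into a version string such as '0.1' or '0.1-3-abcd'."""
    parts = desc.split('-')
    if len(parts) == 1:
        # we are exactly on a tag
        return parts[0]
    tag, distance, ghash = parts[0], parts[1], parts[2]
    # drop the 'g' prefix git adds to the abbreviated hash
    return '%s-%s-%s' % (tag, distance, ghash.lstrip('g'))
```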
THE PROBLEM IS:
The problem comes when I want to have this version number available for the library itself, so it could send it in User-Agent HTTP header as 'mylib/0.1-3-adcd'.
If I add a setup.py version command as in How can I get the version defined in setup.py (setuptools) in my package?, then this version.py is generated AFTER the tag is made, since it uses the tag as its value. But in this case I need to make one more commit after the version tag to make the code consistent, which in turn requires a new tag for further bundling.
THE QUESTION IS:
How to break this circle of dependencies (generate-commit-tag-generate-commit-tag-...)?
You could also reverse the dependency: put the version in mylib/__init__.py, parse that file in setup.py to get the version parameter, and use git tag $(setup.py --version) on the command line to create your tag.
git tag -a v$(python setup.py --version) -m 'description of version'
Is there anything more complicated you want to do that I haven’t understood?
A classic issue when toying with keyword expansion ;)
The key is to realize that your tag is part of the release management process, not part of the development (and its version control) process.
In other words, you cannot include release management data in the development repository, because of the loop you illustrate in your question.
You need, when generating the package (which is the "release management part"), to write that information in a file that your library will look for and use (if said file exists) for its User-Agent HTTP header.
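A sketch of what that could look like on the library side follows. The file name, function names, and the fallback value are all assumptions, not an established convention:

```python
import os

def get_version(version_file, default='unknown'):
    """Return the version written into version_file at release time,
    or the fallback if the file was never generated (e.g. when running
    straight from a development checkout)."""
    try:
        with open(version_file) as f:
            return f.read().strip()
    except IOError:
        return default

def user_agent(version_file, name='mylib'):
    """Build a User-Agent value such as 'mylib/0.1-3-adcd'."""
    return '%s/%s' % (name, get_version(version_file))
```

The release script writes the file (from git describe, for instance) just before packaging, and the packaged library simply reads it at runtime.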
Since this topic is still alive and sometimes gets to search results, I would like to mention another solution which first appeared in 2012 and now is more or less usable:
https://github.com/warner/python-versioneer
It works in a different way than all the solutions mentioned: you add git tags manually, and the library (and setup.py) reads the tags and builds the version string dynamically.
The version string includes the latest tag, the distance from that tag, the current commit hash, "dirtiness", and some other info. It has a few different version formats.
But it still has no branch name support for so-called "custom builds"; and the commit distance can sometimes be confusing when two branches are based on the same commit, so it is better to tag & release only one selected branch (master).
Eric's idea was the simple way to go; just in case it is useful, here is the code I used (Flask's team did it this way):

import re
import ast
from setuptools import setup

_version_re = re.compile(r'__version__\s+=\s+(.*)')

with open('app_name/__init__.py', 'rb') as f:
    version = str(ast.literal_eval(_version_re.search(
        f.read().decode('utf-8')).group(1)))

setup(
    name='app-name',
    version=version,
    .....
)
If you found versioneer excessively convoluted, you can try bump2version.
Just add the simple bumpversion configuration file in the root of your library. This file indicates where in your repository there are strings storing the version number. Then, to update the version in all indicated places for a minor release, just type:
bumpversion minor
Use patch or major if you want to release a patch or a major.
That is not all bumpversion can do. There are other flags and config options, such as automatically tagging the repository, for which you can check the official documentation.
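As an illustration, a minimal .bumpversion.cfg might look like this (the version number and file paths are made-up placeholders):

```
[bumpversion]
current_version = 0.1.3
commit = True
tag = True

[bumpversion:file:setup.py]

[bumpversion:file:mylib/__init__.py]
```

With commit and tag enabled, each bumpversion run rewrites the version strings in the listed files, commits the change, and tags the commit in one step.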
Following OGHaza's solution in a similar SO question, I keep a file _version.py that I parse in setup.py. With the version string from there, I git tag in setup.py. Then I set the setup version variable to a combination of the version string plus the git commit hash. Here is the relevant part of setup.py:
from setuptools import setup, find_packages
from codecs import open
from os import path
import subprocess
import re, os

here = path.abspath(path.dirname(__file__))

VERSIONFILE = os.path.join(here, "_version.py")
verstrline = open(VERSIONFILE, "rt").read()
VSRE = r"^__version__ = ['\"]([^'\"]*)['\"]"
mo = re.search(VSRE, verstrline, re.M)
if mo:
    verstr = mo.group(1)
else:
    raise RuntimeError("Unable to find version string in %s." % (VERSIONFILE,))

if os.path.exists(os.path.join(here, '.git')):
    # note: check_output wants a list of arguments and returns bytes
    cmd = ['git', 'rev-parse', '--verify', '--short', 'HEAD']
    git_hash = subprocess.check_output(cmd).decode().strip()
    # tag git, unless the tag already exists
    gitverstr = 'v' + verstr
    tags = subprocess.check_output(['git', 'tag']).decode().split()
    if gitverstr not in tags:
        cmd = ['git', 'tag', '-a', gitverstr, git_hash,
               '-m', 'tagged by setup.py to %s' % verstr]
        subprocess.check_output(cmd)
    # use the git hash in the setup
    verstr += ', git hash: %s' % git_hash

setup(
    name='a_package',
    version = verstr,
    ....
As was mentioned in another answer, this is related to the release process and not to the development process; as such it is not a git issue in itself, but more a question of how your release workflow is organized.
A very simple variant is to use this:
python setup.py egg_info -b ".`date '+%Y%m%d'`git`git rev-parse --short HEAD`" build sdist
The portion between the quotes is up for customization, however I tried to follow the typical Fedora/RedHat package names.
Of note, even though egg_info implies a relation to .egg, it is actually used throughout the toolchain, for example for bdist_wheel as well, and has to be specified at the beginning.
In general, your pre-release and post-release versions should live outside setup.py or any kind of imported version.py. The topic of versioning and egg_info is covered in detail here.
Example:
v1.3.4dev.20200813gitabcdef0
The v1.3.4 is in setup.py or any other variation you would like
The dev and 20200813gitabcdef0 is generated during the build process (example above)
None of the files generated during the build are checked into git (usually they are filtered by default in .gitignore); sometimes there is a separate "deployment" repository, or similar, completely separate from the source one
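The suffix built by the shell snippet above can be sketched in Python as well (the function name is mine; the format just mirrors the example):

```python
import datetime

def build_suffix(commit, date=None):
    """Build an `egg_info -b` style suffix such as '.20200813gitabcdef0'
    from a short git hash and an optional YYYYMMDD date string."""
    if date is None:
        date = datetime.date.today().strftime('%Y%m%d')
    return '.%sgit%s' % (date, commit)
```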
A more complex way would be to encode your release workflow in a Makefile, which is outside the scope of this question; however, a good source of inspiration can be found here and here. You will find good correspondences between Makefile targets and setup.py commands.
I have a project which I build using SCons (and MinGW/gcc depending on the platform). This project depends on several other libraries (lets call them libfoo and libbar) which can be installed on different places for different users.
Currently, my SConstruct file embeds hard-coded paths to those libraries (say, something like C:\libfoo).
Now, I'd like to add a configuration option to my SConstruct file so that a user who installed libfoo at another location (say C:\custom_path\libfoo) can do something like:
> scons --configure --libfoo-prefix=C:\custom_path\libfoo
Or:
> scons --configure
scons: Reading SConscript files ...
scons: done reading SConscript files.
### Environment configuration ###
Please enter location of 'libfoo' ("C:\libfoo"): C:\custom_path\libfoo
Please enter location of 'libbar' ("C:\libbar"): C:\custom_path\libbar
### Configuration over ###
Once chosen, those configuration options should be written to some file and reread automatically every time scons runs.
Does scons provide such a mechanism? How would I achieve this behavior? I don't exactly master Python, so even obvious (but complete) solutions are welcome.
Thanks.
SCons has a feature called "Variables". You can set it up so that it reads from command line argument variables pretty easily. So in your case you would do something like this from the command line:
scons LIBFOO=C:\custom_path\libfoo
... and the variable would be remembered between runs. So next time you just run scons and it uses the previous value of LIBFOO.
In code you use it like so:
# read variables from the cache, a user's custom.py file or command line
# arguments
var = Variables(['variables.cache', 'custom.py'], ARGUMENTS)

# add a path variable
var.AddVariables(PathVariable('LIBFOO',
                              'where the foo library is installed',
                              r'C:\default\libfoo', PathVariable.PathIsDir))

env = Environment(variables=var)
env.Program('test', 'main.c', LIBPATH='$LIBFOO')

# save variables to a file
var.Save('variables.cache', env)
If you really wanted to use "--" style options then you could combine the above with the AddOption function, but it is more complicated.
This SO question talks about the issues involved in getting values out of the Variables object without passing them through an Environment.
I am messing around with the combination of buildout and virtualenv to set up an isolated development environment in python that allows reproducible builds.
There is a recipe for buildout that lets you integrate virtualenv into buildout:
tl.buildout_virtual_python
With this my buildout.cfg looks like this:
[buildout]
develop = .
parts = script
        virtualpython

[virtualpython]
recipe = tl.buildout_virtual_python
headers = true
executable-name = vp
site-packages = false

[script]
recipe = zc.recipe.egg:scripts
eggs = foo
python = virtualpython
This will deploy two executables into ./bin/:
vp
script
When I execute vp, I get an interactive, isolated python dialog, as expected (can't load any packages from the system).
What I would expect now is that if I run
./bin/script
the isolated python interpreter is used. But it isn't: the script is not isolated the way "vp" is (meaning I can import libraries from the system level). However, I can run:
./bin/vp ./bin/script
This will run the script in an isolated environment, as I wished. But there must be a way to specify this without chaining commands; otherwise buildout only solves half of the problems I hoped it would :)
Thanks for your help!
Patrick
You don't need virtualenv: buildout already provides an isolated environment, just like virtualenv.
As an example, look at the files buildout generates in the bin directory. They'll have something like:
import sys
sys.path[0:0] = [
    '/some/thing1.egg',
    # and other things
    ]
So the sys.path gets completely replaced with what buildout wants to have on the path: the same isolation method as virtualenv.
zc.buildout 2.0 and later does not provide the isolated environment anymore.
But virtualenv 1.9 and later provides complete isolation (including to not install setuptools).
Thus the easiest way to get a buildout in a completely controlled environment is to run the following steps (here, for example, for Python 2.7):
cd /path/to/buildout
rm ./bin/python
/path/to/virtualenv-2.7 --no-setuptools --no-site-packages --clear .
./bin/python2.7 bootstrap.py
./bin/buildout
Preconditions:
bootstrap.py has to be a recent one matching the buildout version you are using. You'll find the latest at http://downloads.buildout.org/2/
if there are any version pins in your buildout, ensure they do not pin buildout itself, or recipes/extensions, to versions not compatible with zc.buildout 2 or later.
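For example, a pin like the following (version number made up) in your [versions] section would hold buildout back on the 1.x series and break this setup, so it should be removed or relaxed:

```
[versions]
zc.buildout = 1.7.1
```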
I had issues running buildout using bootstrap on an Ubuntu server, so since then I have used virtualenv and buildout together: simply create a virtualenv and install buildout in it. This way only virtualenv has to be installed into the system (in theory1).
$ virtualenv [options_you_might_need] virtual
$ source virtual/bin/activate
$ pip install zc.buildout
$ buildout -c <buildout.cfg>
Also tell buildout to put its scripts into the virtual/bin/ directory; that way the scripts appear on $PATH.
[buildout]
bin-directory = ${buildout:directory}/virtual/bin
...
1: In practice you will probably need to install eggs that require compilation at the system level, such as mysql or memcache.
I've never used that recipe before, but the first thing I would try is this:
[buildout]
develop = .
parts = script
        virtualpython

[virtualpython]
recipe = tl.buildout_virtual_python
headers = true
executable-name = vp
site-packages = false

[script]
recipe = zc.recipe.egg:scripts
eggs = foo
python = virtualpython
interpreter = vp
If that doesn't work, you can usually open up the scripts (in this case vp and script) in a text editor and see the Python paths that they're using. If you're on windows there will usually be a file called <script_name>-script.py. In this case, that would be vp-script.py and script-script.py.
I didn't want to install python modules using easy_install, symlinks in site-packages or PYTHONPATH.
So, I am trying something where the system-wide setup is done once, and then any application installation is done locally. Note, the root password is required only once here.
First create a symlink .../pythonX.Y/site-packages/mymodules -> /home/me/lib/python_related
So, I create a directory called
/home/me/lib/python_related/
In there:
/home/me/lib/python_related
/home/me/lib/python_related/__init__.py
/home/me/lib/python_related/django_related/
/home/me/lib/python_related/django_related/core
/home/me/lib/python_related/django_related/core/Django1.0
/home/me/lib/python_related/django_related/core/Django1.1
/home/me/lib/python_related/django_related/core/mycurrent_django -> Django1.1/django
/home/me/lib/python_related/django_related/apps
/home/me/lib/python_related/django_related/apps/tagging
/home/me/lib/python_related/django_related/apps/tagging/django-tagging-0.2
/home/me/lib/python_related/django_related/apps/tagging/django-tagging-0.3
/home/me/lib/python_related/django_related/apps/tagging/mycurrent_tagging -> django-tagging-0.3
Now, here is the content of:
/home/me/lib/python_related/__init__.py
==========================================
import sys, os

# tell us where you keep all your modules; this didn't work as it gave me
# the location of the site-packages
#PYTHON_MODULE_PATH = os.path.dirname(__file__)
PYTHON_MODULE_PATH = "/home/me/libs/python_bucket"

def run_cmd(cmd):
    """
    Given a command line, run the command and return its output as a
    list of lines.
    """
    output = []
    phdl = os.popen(cmd)
    while 1:
        line = phdl.readline()
        if line == "":
            break
        output.append(line.replace("\n", ""))
    return output

def install():
    """
    A cheesy way of installing and managing your python apps locally without
    a need to install them in the site-packages. All you'd need is to install
    the directory containing this file in the site-packages and that's it.
    Anytime you have a python package you want to install, just put it in a
    proper sub-directory and make a symlink to that directory called
    mycurrent_xyz and you are done (e.g. mycurrent_django,
    mycurrent_tagging, etc.)
    """
    # quote the pattern so the shell does not expand it before find sees it
    cmd = "find %s -name 'mycurrent_*'" % PYTHON_MODULE_PATH
    modules_to_be_installed = run_cmd(cmd)
    sys.path += modules_to_be_installed

install()
=======================================================
Now, in any new python project, just import mymodules and that pulls in any apps that you have in the above directory with the proper symbolic link. This way you can have multiple copies of apps and just point mycurrent_xyz at the one you want to use.
Now here is the question: is this a good way of doing it?
Have a look at virtualenv.
It may do what you are after.