I would like to gather the many libraries I have made while working on my projects into some kind of container, so that I can easily use any of them in future projects of mine. It is pretty clear to me how to do this, except for one part.
I am assuming that every service will have its own config file (for instance, the Cache service will have a config file with cache host, port, and so on). Now the problem is: when I want to use this container in an arbitrary project, I will have to make assumptions about the project directory structure to know where to find these config files.
For instance, one might assume that alongside my library there is a config folder where I will find the config files of my services. However, this might conflict with the project's directory structure (e.g. the project might already have its own config directory).
So, all in all, my question is: is there a safe, standard way to ship a library which expects to find some config files someplace, or for which example config files are shipped along with the library itself?
Well, you should not keep config files, or anything that you want to modify, along with your code in Python (or in any language, actually). Each OS has folders for that purpose.
Either it's system wide, and on Unix it's /etc, or it's per user, and then it's in ~/.config. You have the Library folders on OS X, and I'm sure there's something alike for Windows beyond \Windows\SYSTEM32 😉.
What that means is that the path to your configuration files shall not be considered relative to your code at any point. Never. Ever.
You can include some assets in a Python package using the MANIFEST.in, but, as they'll be within your Python package, you shall assume you won't have rights to write there (installed by admin, run by user).
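Such an asset can then be read (read-only) from wherever the package ends up installed, e.g. with pkg_resources. A sketch, where "mylib" and "data/defaults.ini" are made-up names:

import pkg_resources

# Load a bundled, read-only asset from inside the installed package.
raw = pkg_resources.resource_string("mylib", "data/defaults.ini")
print(raw.decode("utf-8"))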
You can also specify some of those assets to be installed at specific places using setup.py's data_files directive; they will be installed relative to sys.prefix.
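A minimal sketch of that directive (the package and file names are hypothetical):

from setuptools import setup

setup(
    name="mylib",
    version="0.1",
    packages=["mylib"],
    # Each tuple: (target directory relative to sys.prefix, [source files]).
    data_files=[
        ("etc/mylib", ["conf/mylib.ini.example"]),
    ],
)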
Common practice is to provide example configuration files via a link in the documentation, or, better, to generate those files when the application first starts.
Also, another trend for desktops is to use the XDG Base Directory specification to decide where to look for, or where to place, your configuration files.
To sum it up:
make a list of default paths your code expects to find the configuration in (see the sketch after this list),
make it possible to specify the path to the configuration manually at the command line: python foo.py --config bar.ini,
write a feature for your tool to generate the configuration (with a series of questions),
deploy your default configurations to standard places (XDG paths, $prefix/etc…)
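For illustration, a minimal lookup combining those points might look like this (the application name "foo" and file name "foo.ini" are placeholders):

import argparse
import os

def candidate_paths(cli_path=None):
    # Highest priority: an explicit --config argument.
    if cli_path:
        yield cli_path
    # Per-user location, honouring the XDG Base Directory spec.
    xdg_home = os.environ.get("XDG_CONFIG_HOME",
                              os.path.expanduser("~/.config"))
    yield os.path.join(xdg_home, "foo", "foo.ini")
    # System-wide fallback.
    yield "/etc/foo/foo.ini"

def find_config(cli_path=None):
    for path in candidate_paths(cli_path):
        if os.path.isfile(path):
            return path
    return None

parser = argparse.ArgumentParser()
parser.add_argument("--config")
print(find_config(parser.parse_args().config))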
Related
For the last two days I have been struggling with the following question:
Given an absolute path (string) to a directory (inside my file system or on a server, it doesn't matter), determine if this dir contains a valid Django project.
First, I thought of looking for the manage.py underneath, but what if some user omits or renames this file?
Secondly, I thought of locating the settings module, but I want the project root, and what if the settings module is 2 or more levels deep?
Third, I thought of locating the (standard) BASE_DIR name inside settings, but what if the user has not defined it or has renamed it?
Is there a way to properly identify a directory as a valid Django project? Am I missing something?
One way you could try is searching/reading the .py files in the directory and matching, with a regex, a pattern describing the distinctive Django main function and package names.
Might work, but... eh
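Roughly like this (a sketch only; the pattern below just looks for django imports and will miss unusual layouts):

import os
import re

DJANGO_HINT = re.compile(r"^\s*(from|import)\s+django\b", re.MULTILINE)

def looks_like_django(root):
    # Walk the tree and flag the first .py file that imports django.
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.endswith(".py"):
                path = os.path.join(dirpath, name)
                with open(path, errors="ignore") as f:
                    if DJANGO_HINT.search(f.read()):
                        return True
    return False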
As bruno desthuilliers mentioned in the comments, there is no fail-safe solution.
But it depends on what you need it for. I mean, how strict does it have to be (or how good)? I have seen the use of PROJECT_DIR instead of BASE_DIR a few times. And as for the settings, I've seen a single settings.py file as well as several settings files within a settings directory.
The main challenge is that Django doesn't have many hard naming rules for files/modules. You can pretty much name your files whatever you like, as long as you put the pieces together properly.
If I'm not mistaken, the only hard rule Django has is the models.py file: a Django app must have a file named models.py. But your Django project doesn't necessarily need to have any app installed.
If you only need a good-enough solution, I would say manage.py is a good candidate. You could double-check by opening the file and seeing if there's a django import in it. If there's no manage.py, check if there's a requirements.txt and whether Django is listed there. If there's no requirements.txt, check for a directory named requirements and look at the text files within.
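A sketch of that cascade (heuristic only, checking just the conventional file names mentioned above):

import os

def probably_django(root):
    # 1. manage.py containing a django import is the strongest hint.
    manage = os.path.join(root, "manage.py")
    if os.path.isfile(manage):
        with open(manage, errors="ignore") as f:
            if "django" in f.read():
                return True
    # 2. Fall back to requirements.txt listing Django.
    reqs = os.path.join(root, "requirements.txt")
    if os.path.isfile(reqs):
        with open(reqs, errors="ignore") as f:
            if any(line.strip().lower().startswith("django") for line in f):
                return True
    # 3. Finally, scan text files in a requirements/ directory.
    req_dir = os.path.join(root, "requirements")
    if os.path.isdir(req_dir):
        for name in os.listdir(req_dir):
            if name.endswith(".txt"):
                with open(os.path.join(req_dir, name), errors="ignore") as f:
                    if "django" in f.read().lower():
                        return True
    return False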
Something you could do to find some patterns is browsing the repositories tagged with the "django" tag on GitHub. Maybe use their API and clone the repositories (it will only list the top 2000 repos) to your local machine or to a server. Clone some Rails, JavaScript, etc. repos as well and build your code step by step. Just search for a manage.py; if the result is satisfactory, that's it. If not, add a couple more rules and test against the cloned repositories until you find a good enough solution for your problem.
I am creating a Python package whose features include logging certain actions in a database. Thus I would like to store the database in a location such as ~/.package-name/database.sqlite. Additionally, ~/.package-name could be a directory to hold configuration files.
What is the best practice for doing this? I am using setuptools to handle package installation. I imagine that within one of my modules, I would have code that checks for the existence of the database file and config file(s), creating them if necessary.
Reading the documentation, I found that it states
you can’t actually install data files to some arbitrary location on a user’s machine; this is a feature, not a bug. You can always include a script in your distribution that extracts and copies your documentation or data files to a user-specified location, at their discretion.
It seems that I cannot create the location ~/.package-name using setuptools. So should I create this directory the first time the user runs the program by checking for the directory and invoking a script or function?
Is there a standard sort of example I might look at? I had some trouble searching for my problem.
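For concreteness, the sort of first-run check I have in mind would be something like this (a sketch; the paths are the placeholders from above):

import os
import sqlite3

APP_DIR = os.path.expanduser("~/.package-name")
DB_PATH = os.path.join(APP_DIR, "database.sqlite")

def ensure_app_dir():
    # Create ~/.package-name on first run if it doesn't exist yet.
    if not os.path.isdir(APP_DIR):
        os.makedirs(APP_DIR)
    # Create an empty database file the same way.
    if not os.path.exists(DB_PATH):
        sqlite3.connect(DB_PATH).close()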
There are Python tools, like check-manifest, to verify that all the files under your VCS are also included in your MANIFEST.in. And releasing helpers like zest.releaser recommend that you use them.
I think files in tests or docs directories are never used directly from the Python package. Usually services like Read the Docs or Travis CI are going to access those files, and they get them from the VCS, not from the package. I have also seen packages including .travis.yml files, which makes even less sense to me.
What is the advantage of including all the files in the python package?
I have a python module which generates large data files which I want to cache on disk for future use. The cache is likely to end up some hundreds of MB for a normal user, but save a lot of computation time.
The files aren't distributed with the module, but are generated the first time the code is run with a given set of parameters.
So far I've just been using this as a single-file module myself, putting the data files in a hardcoded path relative to the module (data/). But I now need to distribute this module in a Python package with distutils and I was wondering if there is a standard way to do that.
I was thinking of something like the compiled cache of scipy.weave, but wondering if there is a more modern, supported way of doing it. On *nix platforms I would expect it to go in ~/.something, but I'm not sure what the Windows equivalent would be. Also, this should be configurable so that users can point it somewhere else if that's more convenient, or to share the cache dir between users. How should such a config file work? Where should it go?
Or should I just have it as an install option, either through a config file next to setup.py or set by manually editing setup.py, and then hard-code the directory in the module before installation?
Any pointers gratefully received...
You can use the standard library module ConfigParser to parse an ini file (or .rc file depending on your culture). To find the file, os.path.expanduser is a useful function that does the right thing on all platforms for paths like "~/.mytoolrc". To let the user override the location of things, you can use environment variables via os.environ.
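Put together, that might look something like this (a sketch; "mytool", the "~/.mytoolrc" file, and the MYTOOL_CACHE_DIR variable are invented names):

import os
try:
    from configparser import ConfigParser   # Python 3
except ImportError:
    from ConfigParser import ConfigParser   # Python 2, as named above

def cache_dir():
    # An environment variable lets users override everything.
    override = os.environ.get("MYTOOL_CACHE_DIR")
    if override:
        return override
    # Otherwise read the rc file; expanduser handles ~ on all platforms.
    parser = ConfigParser()
    parser.read(os.path.expanduser("~/.mytoolrc"))  # missing file is ignored
    if parser.has_option("cache", "dir"):
        return os.path.expanduser(parser.get("cache", "dir"))
    # Fall back to a dotted directory in the user's home.
    return os.path.expanduser("~/.mytool/cache")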
There is an emerging standard in the free OS world: http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html
This module can help you on Windows and Mac OS X, but it seems to be broken with respect to the XDG Base Dir Spec: http://pypi.python.org/pypi/appdirs
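Using it is roughly this (the app name is a placeholder):

from appdirs import user_cache_dir, user_config_dir

# Platform-appropriate per-user directories for a hypothetical app.
print(user_cache_dir("mytool"))   # e.g. ~/Library/Caches/mytool on OS X
print(user_config_dir("mytool"))  # config directory for the current platform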
This should be a common scenario, but I could not find any relevant post yet...
I plan to deploy a Python library (I guess the same applies to regular applications) which makes use of some images and other resource files. What is the standard location for such items? I imagine, for project Foo, the choices would be
Have a resources directory in the source repository and then move the files to /usr/share/foo/
Place resources directly inside the python package that goes under /usr/lib/python-<version>/foo/
Any suggestions?
Edit: As suggested, clarifying that the main platform this will be running on is Linux.
This question is somewhat incomplete, because a proper answer would depend on the underlying operating system, as each has its own modus operandi. On Linux (and most Unix-based OSs), for example, /usr/share/foo or /usr/local/share/foo would be the standard. On OS X you can do the same, but I would think "/Library/Application Support/Foo" would be the place to put such things (although that's usually for storing settings and whatnot), though if you're writing libraries following the "Framework" idea, all the resources would be included in "/Library/Frameworks/Foo.framework"... Apps on OS X, on the other hand, should keep all their resources within the Resources directory inside Foo.app.
We put non .py files in /opt/foo/foo-1.2/...
Except, of course, for static media that is served by Apache, that goes to /var/www/html/foo/foo-1.1/media/...
Except, of course, for customer-specific configuration files. They go to /var/opt/customer/foo/...
Those follow the Linux standards as I understand them.
We try to stay away from /usr/lib/ and /lib kinds of locations because those feel like they're part of the distribution. We lean toward /opt and /var because they're clearly separated from the linux distro directories.
The standard location is where your standard libs go. But it doesn't sound to me, from what you've written, like you'll want your Python lib there. I think you should try out Virtualenv.
If you don't want to go through all the trouble (well, it really just amounts to sudo easy_install virtualenv for you), you could try to just dump your Python lib in any dir in your ~/ and add something along the lines of
import sys
# Make the library importable by adding its directory to the search path.
sys.path.append('/full/path/to/your/lib/goes/here')
to any given application that uses your lib.
Please bear in mind that the examples given are for test purposes only. For anything live-ish, I would recommend that you use distutils. Examples of use are given here.