Python ConfigParser - usage across modules - python

I'm trying to understand the best way to implement Python's ConfigParser so that elements are accessible to multiple program modules. I'm using:
import ConfigParser
import os.path
config = ConfigParser.RawConfigParser()
config.add_section('paths')
homedir = os.path.expanduser('~')
config.set('paths', 'user', os.path.join(homedir, 'users'))
[snip]
config.read(configfile)
In the last line the configfile variable is referenced. This has to be passed to the Config module so the user can override a default config file. How should I implement this in a module/class in such a manner that other modules can use config.get(spam, eggs)?

Let's say the code you have written in your question is inside a module configmodule.py. Then other modules get access to it in the following way:
from configmodule import config
config.get(spam, eggs)
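To handle the user-supplied override mentioned in the question, one approach is to wrap the setup in a small function inside that module. A minimal sketch, assuming the module is called configmodule.py; the load() helper and the DEFAULT_CONFIG path are made-up names, not part of the question:

# configmodule.py -- sketch
import ConfigParser
import os.path

DEFAULT_CONFIG = os.path.expanduser('~/.myapp.conf')  # hypothetical default location

config = ConfigParser.RawConfigParser()

def load(configfile=None):
    # in-code defaults first
    config.add_section('paths')
    homedir = os.path.expanduser('~')
    config.set('paths', 'user', os.path.join(homedir, 'users'))
    # then the file (user-supplied path, or the default) overrides them
    config.read(configfile or DEFAULT_CONFIG)

Any other module can then do:

from configmodule import config, load
load('/path/the/user/passed.conf')   # or just load() for the default
print(config.get('paths', 'user'))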


import a dictionary built at run-time

My code is structured as follows:
main.py
utils.py
blah.py
The main module uses argparse to read in the location of a configurations yaml file which is then loaded as a dictionary. Is there a way for utils and blah to import this built-up dictionary?
Edit: I tried using from main import config (config being the dictionary I built) but I get ImportError: cannot import name 'config' from 'main'
Edit2: Main imports the other 2 modules - apologies for leaving out this very important detail
I would recommend making another file, say, globals.py. Import this in main, utils, and blah, and set properties in it to be recalled by the other modules. For example:
globals.py
configs = {}
main.py
import globals
import yaml
...
with open('user/entered/path.yml') as f:
    user_configs = yaml.safe_load(f)
globals.configs.update(user_configs)  # modifies the global `configs` dict in place
utils.py
import globals
...
# need to use one of the configs for something:
try:
    relevant_config = globals.configs['relevant_config']
except KeyError:
    print("User did not input the config field 'relevant_config'")
All modules will be able to see the same globals instance, thus allowing you to use what are effectively global variables across your program.
You could simply keep configs as a global variable in main.py and have utils.py and blah.py import main, but having a designated module for this is cleaner and clearer than having other modules import the main module.
Just do
import main
and use it as
main.dictionary
That should do it!
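A short sketch of how that looks from utils.py; the get_setting() helper and the dictionary attribute name are only illustrative, and the lookup is done inside a function so it happens after main has finished building the dictionary:

# utils.py -- sketch
import main   # plain module import; avoids "cannot import name 'config'"

def get_setting(key):
    # resolved at call time, once main has populated main.dictionary
    return main.dictionary[key]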

Question of understanding Modules/Packages

So I'm new to Python and Flask, and I'm currently playing around with some CRUD statements in Flask/Python.
I want to know if I fully understand what's going on, but I'm a little bit unsure about the following topic: modules, packages and imports.
I want to connect to my SQLite database with Flask. Doing so, I have to do some imports:
import os
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
The first thing after the imports is to set a base directory (basedir):
basedir = os.path.abspath(os.path.dirname(__file__))
And regarding those steps I have some questions:
Question:
import os
from flask import Flask
Does the first import ("import os") mean that I'm only using a Module called "os"? It's a standalone .py file including a class, some attributes and methods, right?
Does the second import ("from flask import Flask") mean that I'm using the package "flask" and importing the module "Flask"? If, e.g., there were another import like "render_template", does that mean I'm using another module, or is it a method from the module "Flask"?
Second question:
basedir = os.path.abspath(os.path.dirname(__file__))
I'd like to understand this code. First of all, I'm declaring a variable called basedir. Then I am going to set the value of that variable to the absolute path for the current .py-script. Now to the single steps:
os => means that I'm using the already imported module "os", right?
path => means that I'm using an attribute from that module?
abspath => means that I'm using a method within the "os" module called "abspath(value)"?
The next thing would probably be clear once I get answers to the points above:
(os.path.dirname(__file__))
__file__ => that's a built-in Python attribute, right?
Does the first import ("import os") mean that I'm only using a Module called "os"?
As the statement implies, you're importing the OS module, so you can use the functions in the os module in your python script.
So, now you can make os.function() statements in your script. The OS module is installed with Python by default. Here is info on the os module.
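For instance (a tiny illustration, not taken from the question):

import os

print(os.getcwd())                # the current working directory
print(os.path.join('app', 'db'))  # 'app/db' (or 'app\\db' on Windows)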
Does the second import ("from flask import Flask") mean that I'm using the package "flask" and importing the module "Flask"? If, e.g., there were another import like "render_template", does that mean I'm using another module, or is it a method from the module "Flask"?
This can be confusing since the package and the thing being imported have almost the same name. You're only importing the name Flask (the application class) from the flask package, not everything the package defines. (render_template, likewise, is another function in the flask package, not a method of Flask.)
This can be done for multiple reasons. One is to simplify calling the imported name. Another is to keep your namespace small, since you're only bringing in the names you actually need.
os => means that I'm using the already imported module "os", right? path => means that I'm using an attribute from that module? abspath => means that I'm using a method within the "os" module called "abspath(value)"?
Exactly, read the docs for an explanation by the developers of the module.
__file__
Here is an explanation of how __file__ is used in Python.
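As a quick illustration of how the pieces fit together (assuming the snippet is saved in a .py file and run as a script):

import os

print(__file__)                                    # e.g. app.py or /full/path/app.py
print(os.path.abspath(os.path.dirname(__file__)))  # always the absolute directory of the script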
I'm going to answer the first question. Basically, when you do a plain import, Python imports the entire module with all of its functions. For example, when you import math you can use math.ceil and the other functions. However, when you say from math import ceil you only get that specific function, and you call it directly as ceil(2.7).
For further details read up here
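A short sketch of the difference:

import math            # binds the module; call its functions as math.something
from math import ceil  # binds only the single name ceil

print(math.ceil(2.7))  # rounds 2.7 up to 3
print(ceil(2.7))       # same result, called without the math. prefix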

Adding Command Line Arguments to Python Twisted

I am still new to Python so keep that in mind when reading this.
I have been hacking away at an existing Python script that was originally "put" together by a few different people.
The script was originally designed to load its 'configuration' using a module named "conf/config.py", which is basically Python code:
SETTING_NAME='setting value'
I've modified this to instead read its settings from a configuration file using ConfigParser:
import ConfigParser
config_file_parser = ConfigParser.ConfigParser()
CONFIG_FILE_NAME = "/etc/service_settings.conf"
config_file_parser.readfp(open(r'' + CONFIG_FILE_NAME))
SETTING_NAME = config_file_parser.get('Basic', 'SETTING_NAME')
The problem I am having is how to specify the configuration file to use. Currently I have managed to get it working (somewhat) by having multiple TAC files and setting the CONFIG_FILE_NAME variable there, using another module to hold the variable value. For example, I have a module "conf/ConfigLoader.py":
global CONFIG_FILE_NAME
Then my TAC file has:
import conf.ConfigLoader as ConfigLoader
ConfigLoader.CONFIG_FILE_NAME = '/etc/service_settings.conf'
So the conf/config.py module now looks like:
import ConfigLoader
config_file_parser = ConfigParser.ConfigParser()
config_file_parser.readfp(open(r'' + ConfigLoader.CONFIG_FILE_NAME))
It works, but it requires managing two files instead of a single conf file. I attempted to use the "usage.Options" feature as described on http://twistedmatrix.com/documents/current/core/howto/options.html. So I have twisted/plugins/Options.py
from twisted.python import usage

global CONFIG_FILE_NAME

class Options(usage.Options):
    optParameters = [['conf', 'c', 'tidepool.conf', 'Configuration File']]

# Get config
config = Options()
config.parseOptions()
CONFIG_FILE_NAME = config.opts['conf']
That does not work at all. Any tips?
I don't know if I understood your problem.
If you want to load the configuration from multiple locations you could pass a list of filenames to the configparser: https://docs.python.org/2/library/configparser.html#ConfigParser.RawConfigParser.read
If you were trying to make a generic configuration manager, you could create a class or a function that receives the filename, or you could set the configuration file name in an environment variable and read that variable in your script with something like os.environ.get('CONFIG_FILE_NAME').
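A rough sketch of both suggestions, reusing the names from the question; the exact paths and the CONFIG_FILE_NAME environment variable are assumptions:

import os
import ConfigParser

# let an environment variable override the default location
config_file = os.environ.get('CONFIG_FILE_NAME', '/etc/service_settings.conf')

config_file_parser = ConfigParser.ConfigParser()
# read() also accepts a list of paths and silently skips missing files,
# with later files overriding earlier ones, e.g.
# config_file_parser.read(['/etc/service_settings.conf', config_file])
config_file_parser.read(config_file)

SETTING_NAME = config_file_parser.get('Basic', 'SETTING_NAME')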

Python - Importing a global/site-packages module rather than the file of the same name in the local directory

I'm using python and virtualenv/pip. I have a module installed via pip called test_utils (it's django-test-utils). Inside one of my django apps, I want to import that module. However I also have another file test_utils.py in the same directory. If I go import test_utils, then it will import this local file.
Is it possible to make python use a non-local / non-relative / global import? I suppose I can just rename my test_utils.py, but I'm curious.
You can switch the search order by changing sys.path:
import sys

del sys.path[0]      # drop the local directory from the front of the search path...
sys.path.append('')  # ...and add the current directory back at the end
This puts the current directory after the system search path, so local files won't shadow standard modules.
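Put together in context (a sketch; the first entry is saved and re-appended so local imports still work, just searched last):

import sys

local_dir = sys.path[0]      # the script's own directory
del sys.path[0]
sys.path.append(local_dir)   # now searched after site-packages

import test_utils            # resolves to the installed django-test-utils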
My problem was even more elaborate:
importing a global/site-packages module from a file with the same name
Working on aero (the pm recycler) I wanted access to the pip API, in particular pip.commands.search.SearchCommand, from my adapter class Pip in the source file pip.py.
In this case trying to modify sys.path is useless; I even went as far as wiping sys.path completely and adding the folder .../site-packages/pip...egg/ as the only item in sys.path, and still no luck.
I would still get:
print pip.__package__
# 'aero.adapters'
I found two options that did eventually work for me, they should work equally well for you:
Using __builtin__.__import__(), the built-in function:
global_pip = __import__('pip.commands.search', {}, {}, ['SearchCommand'], -1)
SearchCommand = global_pip.SearchCommand
Reading the documentation, though, suggests using the following method instead.
Using importlib.import_module(), the convenience wrapper around __import__():
The documentation explains that importlib in 2.7 is a minor subset of the Python 3.1 functionality, included to help ease the transition from 2.7 to 3.x.
from importlib import import_module
SearchCommand = import_module('pip.commands.search').SearchCommand
Both options get the job done, but import_module() definitely feels more Pythonic if you ask me; would you agree?
nJoy!
I was able to force python to import the global one with
from __future__ import absolute_import
at the beginning of the file (this is the default in python 3.0)
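A sketch of the effect, assuming the file doing the import lives inside a package:

from __future__ import absolute_import   # already the default from Python 3 on

import test_utils                         # the installed site-packages copy
from . import test_utils as local_utils   # the sibling test_utils.py, if still needed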
You could reset your sys.path:
import sys

first = sys.path[0]            # usually the directory of the running script
sys.path = sys.path[1:]        # drop it so the local test_utils.py is skipped
import test_utils              # picks up the installed (global) module
sys.path = [first] + sys.path  # restore the original search order
The first entry of sys.path is "always" (as in "by default"; see the Python docs) the directory containing the running script, so if you remove it you will do a global import.
Since my test_utils was in a django project, I was able to go from ..test_utils import ... to import the global one.
Though, in the first place, I would always try to keep the name of a local file from matching any global module name. That said, an easy workaround that avoids modifying sys.path is to import the global module in some other file and then import it from there.
Remember, that file must live in a different folder than the file whose name matches the global module.
For example.
./project/root/workarounds/global_imports.py
import test_utils as tutil
and then in
./project/root/mycode/test_utils.py
from project.root.workarounds.global_imports import tutil
# tutil is global test_utils
# you can also do
from project.root.workarounds.global_imports import test_utils

Is it possible to 'import * from DIRECTORY', then somehow (anyhow) iterate over the loaded modules?

Let me explain the use case...
In a simple python web application framework designed for Google App Engine, I'd like to have my models loaded automatically from a 'models' directory, so all that's needed to add a model to the application is to place a file user.py (for example), which contains a class called 'User', in the 'models/' directory.
Being GAE, I can't read from the file system so I can't just read the filenames that way, but it seems to me that I must be able to 'import * from models' (or some equivalent), and retrieve at least a list of module names that were loaded, so I can subject them to further processing logic.
To be clear, I want this to be done WITHOUT having to maintain a separate list of these module names for the application to read from.
You can read from the filesystem in GAE just fine; you just can't write to the filesystem.
from models import * will only import modules listed in __all__ in models/__init__.py; there's no automatic way to import all modules in a package if they're not declared to be part of the package. You just need to read the directory (which you can do) and __import__() everything in it.
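A rough sketch of that approach, assuming a models package sits next to the current file; the loaded list is just an illustration of collecting the results for further processing:

import os

models_dir = os.path.join(os.path.dirname(__file__), 'models')
loaded = []
for filename in os.listdir(models_dir):
    if filename.endswith('.py') and filename != '__init__.py':
        name = filename[:-3]
        module = __import__('models.' + name, fromlist=[name])
        loaded.append(module)   # e.g. look up a class named after the module later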
As explained in the Python tutorial, you cannot load all .py files from a directory unless you list them manually in the list named __all__ in the file __init__.py. One of the reasons why this is impossible is that it would not work well on case-insensitive file systems -- Python would not know in which case the module names should be used.
Let me start by saying that I'm not familiar with Google App Engine, but the following code demonstrates how to import all python files from a directory. In this case, I am importing files from my 'example' directory, which contains one file, 'dynamic_file.py'.
import os
import imp
import glob

def extract_module_names(python_files):
    module_names = []
    for py_file in python_files:
        module_name = (os.path.basename(py_file))[:-3]
        module_names.append(module_name)
    return module_names

def load_modules(modules, py_files):
    module_count = len(modules)
    for i in range(0, module_count):
        globals()[modules[i]] = imp.load_source(modules[i], py_files[i])

if __name__ == "__main__":
    python_files = glob.glob('example/*.py')
    module_names = extract_module_names(python_files)
    load_modules(module_names, python_files)
    dynamic_file.my_func()
Also, if you wish to iterate over these modules, you could modify the load_modules function to return a list of the loaded module objects by appending the 'imp.load_source(..)' call to a list.
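For example, a sketch of that modification (the loop at the end is only illustrative):

def load_modules(modules, py_files):
    loaded = []
    for name, path in zip(modules, py_files):
        module = imp.load_source(name, path)
        globals()[name] = module
        loaded.append(module)
    return loaded

# iterate over whatever was loaded from the directory
for module in load_modules(module_names, python_files):
    print(module.__name__)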
Hope it helps.
