Sharing variables across new object instances - Python

Background:
I am writing a module to set up an embedded system. In this context I need to load some modules and apply some system settings.
Context:
I have a parent class holding some general code (loading the config file, building the SSH connection, etc.) used by several child classes. One of them is the module class, which sets up a module and therefore uses, among other things, the SSH connection and the configuration file.
My goal is to share the configuration file and the connection with the next module to be set up. For the connection, it's just wasteful to build and tear it down every time; for the configuration file, changes during setup can lead to undefined behavior.
Research / approaches:
I tried using class variables; however, they aren't carried over when initiating a new module object.
Further, I tried using global variables, but since the parent class and the child classes are in different files, this won't work. (Yes, I could put them all in one file, but that would be a mess.) Using a getter function from the file where I defined the global variable didn't work either.
I am aware of the 'builtin' solution from "How to make a cross-module variable?", but I feel this would be a bit of overkill...
Finally, I could keep the config file and the connection in a central script and pass them to each of the instances, but this would lead to loads of dependencies, and I don't think it's a good solution.
So here is a bit of code with an example method to get some file paths. The code is set up according to approach 1 (class variables).
An example config file:

Files:
    Core:
        Backend:
            - 'file_1'
            - 'file_2'
Local:
    File_path:
        Backend: '/the/path/to'
The parent class in setup_class.py:
import os
import abc

import yaml


class setup(object):
    __metaclass__ = abc.ABCMeta

    configuration = []

    def check_for_configuration(self, config_file):
        if not self.configuration:
            with open(config_file, "r") as config:
                self.configuration = yaml.safe_load(config)

    def get_configuration(self):
        return self.configuration

    def _make_file_list(self, path, names):
        full_file_path = []
        for file_name in names:
            temp_path = path + '/' + file_name
            temp_path = temp_path.split('/')
            full_file_path.append(os.path.join(*temp_path))
        return full_file_path

    @abc.abstractmethod
    def install(self):
        raise NotImplementedError
The module class in module_class.py:
from setup_class import setup


class module(setup):
    def __init__(self, module_name, config_file=''):
        self.name = module_name
        self.check_for_configuration(config_file)

    def install(self):
        self._backend()

    def _backend(self):
        files = self._make_file_list(
            self.configuration['Local']['File_path']['Backend'],
            self.configuration['Files'][self.name]['Backend'])
        if files:
            print(files)
And finally a test script:
from module_class import module

Analysis = module('Analysis', "./example.yml")
Core = module('Core', "./example.yml")
Core.install()
Now, when running the code, the config file is loaded every time a new module object is initiated. I would like to avoid this. Are there approaches I have not considered? What's the neatest way to achieve this?

Save your global values in a global dict, and refer to that inside your module.
cache = {}

class Cache(object):
    def __init__(self):
        global cache
        self.cache = cache
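For what it's worth, the class-variable approach from the question is very close to working already. The catch is that self.configuration = yaml.safe_load(config) creates an instance attribute that shadows the shared class attribute, so the class-level value stays empty and every new instance reloads the file. Assigning to the class instead makes the cache stick. A minimal sketch, reusing the names from the question's setup class:

import yaml

class setup(object):
    configuration = None  # shared by all instances and subclasses

    def check_for_configuration(self, config_file):
        if setup.configuration is None:
            with open(config_file, "r") as config:
                # assign to the class, not the instance, so later
                # instances see the already-loaded configuration
                setup.configuration = yaml.safe_load(config)

With that change, the second module instance sees the configuration loaded by the first one and skips the file read.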

How can I redirect module imports with modern Python?

I am maintaining a Python package in which I did some restructuring. Now I want to support clients who still do from my_package.old_subpackage.foo import Foo instead of the new from my_package.new_subpackage.foo import Foo, without explicitly reintroducing many files that do the forwarding. (old_subpackage still exists, but no longer contains foo.py.)
I have learned that there are "loaders" and "finders", and my impression was that I should implement a loader for my purpose, but I only managed to implement a finder so far:
import importlib.util
import sys

RENAMED_PACKAGES = {
    'my_package.old_subpackage.foo': 'my_package.new_subpackage.foo',
}

# TODO: ideally, we would not just implement a "finder", but also a "loader"
# (using the importlib.util.module_for_loader decorator); this would enable us
# to get module contents that also pass identity checks

class RenamedFinder:
    @classmethod
    def find_spec(cls, fullname, path, target=None):
        renamed = RENAMED_PACKAGES.get(fullname)
        if renamed is not None:
            sys.stderr.write(
                f'WARNING: {fullname} was renamed to {renamed}; '
                f'please adapt import accordingly!\n')
            return importlib.util.find_spec(renamed)
        return None

sys.meta_path.append(RenamedFinder())
https://docs.python.org/3.5/library/importlib.html#importlib.util.module_for_loader and related functionality, however, seem to be deprecated. I know it's not a very pythonic thing I am trying to achieve, but I would be glad to learn that it's achievable.
On import of your package's __init__.py, you can place whatever objects you want into sys.modules; the values you put there will be returned by import statements:
import sys

from . import new_package
from .new_package import module1, module2

sys.modules["my_lib.old_package"] = new_package
sys.modules["my_lib.old_package.module1"] = module1
sys.modules["my_lib.old_package.module2"] = module2
If someone now uses import my_lib.old_package or import my_lib.old_package.module1 they will obtain a reference to my_lib.new_package.module1. Since the import machinery already finds the keys in the sys.modules dictionary, it never even begins looking for the old files.
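Assuming the aliasing above was done in my_lib/__init__.py, a quick check in client code (hypothetical, with the module names from the snippet) shows that identity checks pass:

import my_lib.new_package.module1 as new_mod
import my_lib.old_package.module1 as old_mod

# both dotted paths resolve to the very same module object
assert new_mod is old_mod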
If you want to avoid importing all the submodules immediately, you can emulate a bit of lazy loading by placing a module with a __getattr__ in sys.modules:
from types import ModuleType
import importlib
import sys

class LazyModule(ModuleType):
    def __init__(self, name, mod_name):
        super().__init__(name)
        self.__mod_name = mod_name  # name of the module to import lazily

    def __getattr__(self, attr):
        if "_lazy_module" not in self.__dict__:
            # the real module is imported on first attribute access
            self._lazy_module = importlib.import_module(self.__mod_name, package="my_lib")
        return getattr(self._lazy_module, attr)

sys.modules["my_lib.old_package"] = LazyModule("my_lib.old_package", "my_lib.new_package")
In the __init__ file of the old package, have it import from the newer modules.
Old (package.oldpkg):

foo = __import__("Path to new module")

New (package.newpkg):

class foo:
    bar = "thing"

so package.oldpkg.foo.bar is the same as package.newpkg.foo.bar.
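For instance, a from-import plus a sys.modules entry in the old package's __init__.py covers both import styles. A sketch using the names from the question above, assuming my_package/new_subpackage/foo.py exists:

# my_package/old_subpackage/__init__.py
import sys

# re-export the moved module under its old attribute name
from my_package.new_subpackage import foo

# also register the old dotted path, so that
# "from my_package.old_subpackage.foo import Foo" keeps working
sys.modules["my_package.old_subpackage.foo"] = foo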
Hope this helps!
I think that this is what you are looking for:
import importlib
import importlib.util
import sys

RENAMED_PACKAGES = {
    'my_package.old_subpackage.foo': 'my_package.new_subpackage.foo',
}

class RenamedFinder:
    @classmethod
    def find_spec(cls, fullname, path, target=None):
        renamed = RENAMED_PACKAGES.get(fullname)
        if renamed is not None:
            sys.stderr.write(
                f'WARNING: {fullname} was renamed to {renamed}; '
                f'please adapt import accordingly!\n')
            spec = importlib.util.find_spec(renamed)
            spec.loader = cls
            return spec
        return None

    @staticmethod
    def create_module(spec):
        return importlib.import_module(spec.name)

    @staticmethod
    def exec_module(module):
        pass

sys.meta_path.append(RenamedFinder())
Still, IMO the approach that manipulates sys.modules is preferable: it is more readable, more explicit, and gives you much more control. That might become useful in later versions of your package, when my_package.new_subpackage.foo starts to diverge from my_package.old_subpackage.foo while you still need to provide the old one for backward compatibility; for that reason you might need to keep the code of both anyway.
Consolidate all the old package names into my_package.
Old packages (old_package):
- image_processing (class): will be deleted and replaced by super_image_processing
- text_recognition (class): will be deleted and replaced by better_text_recognition
- foo (variable): will be moved to better_text_recognition
- still_there (class): will not move
New packages:
- super_image_processing
- better_text_recognition
Redirector (class of my_package):
class old_package:
    image_processing = super_image_processing  # will be replaced
    text_recognition = better_text_recognition  # will be replaced
Your main new module (my_package):
# imports here

class super_image_processing:
    def its(gets, even, better):
        pass

class better_text_recognition:
    def now(better, than, ever):
        pass

class old_package:
    # links
    image_processing = super_image_processing
    text_recognition = better_text_recognition
    still_there = __import__("path to unchanged module")
This allows you to delete some files and keep the rest. If you want to redirect variables you would do:
class super_image_processing:
    def its(gets, even, better):
        pass

class better_text_recognition:
    def now(better, than, ever):
        pass

class old_package:
    # links
    image_processing = super_image_processing
    text_recognition = better_text_recognition
    foo = text_recognition.foo
    still_there = __import__("path to unchanged module")
Would this work?

Is there a "proper" way to load config files using ConfigParser? Or, is there an equivalent to logging.getLogger for config files?

I have a particularly large python project that has modules upon modules and classes that call other classes and so on. It's an organized mess.
I want to be able to just read the config file once and get/set key-value pairs out of it from any part of my program.
Right now, my setup looks like this: I have a module with these functions:
import configparser
import os
from pathlib import Path

def initialize():
    config_file = configparser.ConfigParser()
    config_file_path = os.path.join(Path(__file__).resolve().parents[2], 'config.ini')
    try:
        config_file.read(config_file_path)
    except FileNotFoundError:
        raise FileNotFoundError
    else:
        return config_file, config_file_path

def get_config_data(section, key):
    config_file, _ = initialize()
    return config_file[section][key]

def set_config_data(section, key, value):
    config_file, config_file_path = initialize()
    config_file.set(section, key, str(value))
    with open(config_file_path, 'w') as f:
        config_file.write(f)
Whenever I need a config key-value pair, I just import the module as CFG and use CFG.get_config_data(KEY, VALUE), which means initialize runs every single time I need something. I don't think that's ideal (or is it? I genuinely don't know).
Is there a "proper and standard" method for reading config files in large python projects? Something that I can just import and get in the beginning? Or there's nothing wrong with my set-up as it is?

Use project config variables across different Python scripts

I am working on a project with multiple directories, each containing a number of Python scripts, and it involves the use of certain key parameters that I pass in via a YAML config file.
Currently the method used is, I'd say, naive: the YAML is simply parsed into a Python dictionary, which is then imported in other scripts where the values are accessed.
From what I could find, there are:
- the Abseil library, which can be used for accessing flags across different scripts, but is cumbersome to use;
- an approach using a class (preferably a singleton) that holds all global variables and exposes an instance of itself to other scripts.
I wanted to ask: is there any other library that can be used for this purpose? And what is the most pythonic methodology to deal with it?
Any help would be really appreciated. Thanks!
To make global values accessible across modules I use the singleton class method.
The code listed below is also in my GitHub Gist: https://gist.github.com/auphofBSF/278206afff675cd30377f4894a5b2b1d
My generic GlobalValues singleton class and its usage are as follows. The class is located in a subdirectory below the main script; in the example of use attached below, I place the GlobalValues class in a file globals.py in the folder myClasses.
class GlobalValues:
    """
    A singleton class to serve the GlobalValues.

    USAGE (first time):
        from myClasses.globals import GlobalValues
        global_values = GlobalValues()
        global_values.<new value> = ...
        ... = global_values.<value>

    USAGE (second and n'th time, in the same module or other modules):
        NB: adjust `from myClasses.globals` depending on the relative path to this module.
        from myClasses.globals import GlobalValues
        global_values = GlobalValues.get_instance()
        global_values.<new value> = ...
        ... = global_values.<value>
    """

    __instance = None

    DEFAULT_LOG_LEVEL = "CRITICAL"

    @staticmethod
    def get_instance():
        """Static access method."""
        if GlobalValues.__instance is None:
            GlobalValues()
        return GlobalValues.__instance

    def __init__(self):
        """Virtually private constructor."""
        if GlobalValues.__instance is not None:
            raise Exception("This class is a singleton! Once created, use "
                            "global_values = GlobalValues.get_instance()")
        GlobalValues.__instance = self
My example of use is as follows.
Example file layout:

<exampleRootDir>
    Example_GlobalValues_Main.py  # THIS is the main
    myClasses                     # a folder
        globals.py                # for the singleton class GlobalValues
        exampleSubModule.py       # demonstrates use in submodules
Example_GlobalValues_Main.py:

print(
    """
    ----------------------------------------------------------
    Example of using a singleton class as a global value store
    The files in this example are in these folders
    file structure:
    <exampleRootDir>
        Example_GlobalValues_Main.py  # THIS is the main
        myClasses                     # a folder
            globals.py                # for the singleton class GlobalValues
            exampleSubModule.py       # demonstrates use in submodules
    -----------------------------------------------------------
    """
)

from myClasses.globals import GlobalValues

globalvalues = GlobalValues()  # the only place an instance of GlobalValues is created

print(f"MAIN: global DEFAULT_LOG_LEVEL is {globalvalues.DEFAULT_LOG_LEVEL}")
globalvalues.DEFAULT_LOG_LEVEL = "DEBUG"
print(f"MAIN: global DEFAULT_LOG_LEVEL is now {globalvalues.DEFAULT_LOG_LEVEL}")

# add a new global value:
globalvalues.NEW_VALUE = "hello"

# demonstrate using global values in another module
from myClasses import exampleSubModule

print(f"MAIN: globalvalues after opening exampleSubModule are now {vars(globalvalues)}")
print("----------------- Completed -------------------------------")
exampleSubModule.py is as follows and is located in the myClasses folder
"""
Example SubModule using the GlobalValues Singleton Class
"""
# observe where the globals module is in relation to this module . = same directory
from .globals import GlobalValues
# get the singleton instance of GlobalValues, cannot instantiate a new instance
exampleSubModule_globalvalues = GlobalValues.get_instance()
print(f"exampleSubModule: values in GlobalValues are: {vars(exampleSubModule_globalvalues)}")
#Change a value
exampleSubModule_globalvalues.NEW_VALUE = "greetings from exampleSubModule"
#add a new value
exampleSubModule_globalvalues.SUBMODULE = "exampleSubModule"

Python: generate secure temporary file name

In the context of writing unit tests for a backend class, I need a secure way to generate a temporary file name. My current approach is:
import os
import tempfile

fp = tempfile.NamedTemporaryFile(delete=False)
fp.close()
with Backend(fp.name) as backend:
    ...  # run the test
os.unlink(fp.name)
This is a bit awkward. Does there exist a standard library context manager which allows achieving the same with:

with TempFileName() as name:
    with Backend(name) as backend:
        ...  # run the test
Current Solution
It appears that no pre-made context manager exists. I am now using:
import os
import tempfile

class TemporaryBackend(object):
    def __init__(self):
        self.fp = tempfile.NamedTemporaryFile(delete=False)
        self.fp.close()
        self.backend = Backend(self.fp.name)

    def __enter__(self):
        return self.backend

    def __exit__(self, exc_type, exc_value, traceback):
        self.backend.close()
        os.unlink(self.fp.name)

Which can then be used with:

with TemporaryBackend() as backend:
    ...  # run the test
Rather than creating a temporary file, create a temporary directory which only you have access to. Once you have that, you can simply use an arbitrary string as the name of a file in that directory.
import os
import tempfile

d = tempfile.mkdtemp()
tmp_name = "somefile.txt"
with Backend(os.path.join(d, tmp_name)) as backend:
    ...  # run test
os.remove(os.path.join(d, tmp_name))  # if necessary
os.rmdir(d)
Depending on your needs, you may just want a random string of characters:

import random
import string

with Backend(''.join(random.sample(string.ascii_lowercase, 8))) as backend:
    ...  # run test
The creation of unique file names relies on the ability of file systems to grant exclusive access to files, so one has to create a file, not only a file name.
Another way to have a place where you can create files safely is to create a temporary directory and put your files inside it. This would be my preferred way for test cases.
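Since Python 3.2 the standard library also ships tempfile.TemporaryDirectory, which gives the temporary-directory approach the context-manager shape the question asks for. A minimal sketch (Backend is the question's class):

import os
import tempfile

with tempfile.TemporaryDirectory() as d:
    # any file name inside the private directory is safe to use
    with Backend(os.path.join(d, "somefile.txt")) as backend:
        ...  # run the test
# the directory and everything in it is removed when the block exits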

Dynamically reload a class definition in Python

I've written an IRC bot using Twisted and now I've gotten to the point where I want to be able to dynamically reload functionality.
In my main program, I do from bots.google import GoogleBot and I've looked at how to use reload to reload modules, but I still can't figure out how to do dynamic re-importing of classes.
So, given a Python class, how do I dynamically reload the class definition?
Reload is unreliable and has many corner cases where it may fail. It is suitable for reloading simple, self-contained scripts. If you want to dynamically reload your code without restarting, consider using forkloop instead:
http://opensourcehacker.com/2011/11/08/sauna-reload-the-most-awesomely-named-python-package-ever/
You cannot reload the module using reload(module) when using the from X import Y form. You'd have to do something like reload(sys.modules['module']) in that case.
This might not necessarily be the best way to do what you want, but it works!
import bots.google

from twisted.words.protocols import irc

class BotClass(irc.IRCClient):
    def __init__(self):
        global plugins
        plugins = [bots.google.GoogleBot()]

    def privmsg(self, user, channel, msg):
        global plugins
        parts = msg.split(' ')
        trigger = parts[0]
        if trigger == '!reload':
            reload(bots.google)
            plugins = [bots.google.GoogleBot()]
            print("Successfully reloaded plugins")
I figured it out, here's the code I use:
import re

def reimport_class(self, cls):
    """
    Reload and reimport class "cls". Return the new definition of the class.
    """
    # Get the fully qualified name of the class.
    from twisted.python import reflect
    full_path = reflect.qual(cls)

    # Naively parse the module name and class name.
    # (Can be done much better...)
    match = re.match(r'(.*)\.([^\.]+)', full_path)
    module_name = match.group(1)
    class_name = match.group(2)

    # This is where the good stuff happens.
    mod = __import__(module_name, fromlist=[class_name])
    reload(mod)

    # The (reloaded definition of the) class itself is returned.
    return getattr(mod, class_name)
Better yet, subprocess the plugins, then supervise the subprocess; when the files change, reload the plugins process.
Edit: cleaned up.
You can use sys.modules to dynamically reload modules based on user input.
Say that you have a folder with multiple plugins such as:
module/
    cmdtest.py
    urltitle.py
    ...
You can use sys.modules in this way to load/reload modules based on user input:

import sys

module_name = 'module.' + userinput
if module_name in sys.modules:
    reload(sys.modules[module_name])
else:
    print('Module not loaded. Cannot reload.')
    try:
        module = __import__(module_name)
        module = sys.modules[module_name]
    except ImportError:
        print('Error when trying to load %s' % userinput)
When you do a from ... import ... it binds the object into the local namespace, so all you need to do is re-import it. However, since the module is already loaded, re-importing will just give you the same version of the class, so you need to reload the module too. So this should do it:
import bots.google
from bots.google import GoogleBot

...
# do stuff
...

reload(bots.google)  # reload the module object itself
from bots.google import GoogleBot  # re-bind the name to the reloaded class
If for some reason you don't know the module name, you can get it from GoogleBot.__module__.
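As a side note, reload is a builtin only in Python 2; in Python 3 the same pattern uses importlib.reload. A minimal sketch with the module names from above:

import importlib

import bots.google
from bots.google import GoogleBot

# ... after editing bots/google.py:
importlib.reload(bots.google)
from bots.google import GoogleBot  # re-bind the name to the reloaded class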
import sys

def reload_class(class_obj):
    module_name = class_obj.__module__
    module = sys.modules[module_name]
    pycfile = module.__file__
    modulepath = pycfile.replace(".pyc", ".py")
    code = open(modulepath, 'r').read()
    # compile first, to check the new source for syntax errors before reloading
    compile(code, module_name, "exec")
    module = reload(module)
    return getattr(module, class_obj.__name__)

There is a lot of error checking you can do on this; if you're using global variables, you will probably have to figure out what happens then.
