I'm making a Notepad program for Linux in Python. I want a main module that other modules can use to fetch and store shared data.
I've tried implementing a singleton class in that main module.
But whenever I import it in other modules, it keeps getting re-initialized, wiping the data inside the class.
MainModule.py
class Database(object):
    __instance = None

    def __new__(cls):
        if cls.__instance is None:
            cls.__instance = object.__new__(cls)
            cls.__instance.name = "Database"
        return cls.__instance

    def __init__(self):
        self.__title = "initial"
SubModule.py
from MainModule import Database

class example(object):
    def __init__(self):
        self.database = Database()
I changed self.__title to "untitled", but after importing this module from other modules, I just get the initial value back.
How do I make it initialize only once, even when it is instantiated from other modules?
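For reference, a common fix (a sketch, not part of the original question) is to move the one-time setup into `__new__`, because Python calls `__init__` on every `Database()` call even when `__new__` returns the cached instance. The plain `title` attribute here stands in for the question's name-mangled `__title`:

```python
class Database(object):
    __instance = None

    def __new__(cls):
        if cls.__instance is None:
            cls.__instance = object.__new__(cls)
            cls.__instance.name = "Database"
            # one-time initialization lives here, not in __init__,
            # so re-instantiating never resets the stored data
            cls.__instance.title = "initial"
        return cls.__instance


db_a = Database()
db_a.title = "untitled"
db_b = Database()  # returns the same instance; title is preserved
```

An equivalent approach is to keep `__init__` but guard it with a flag (e.g. `if getattr(self, '_initialized', False): return`), so the initialization body runs only on the first call.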
I have a number of objects that share a particular property. Let's call this property "Application". At the beginning of my program, I do not know the value of Application. At startup, I will run a routine that reveals the value of Application. After this point, the value of Application will never change. It is convenient for my objects to know the value of Application. So my desired architecture is:
Create a Parent Class that holds the Application property. Since there are many different objects that want to point to Application, I figured making them Children of another object would be convenient so all Child objects inherently point to Application.
Provide functionality such that I set Application once, and from then on every Child instance points to Application without my having to tell it explicitly.
Below is an example that appears to work as I want. ParentClass defines a class method called set_application that the calling program must call at some point before instantiating ParentClass. ChildClass then inherits the value of Application.
I am new to Python, so I'm curious if there are better ways to do this than what I have designed.
class ParentClass:
    @classmethod
    def set_application(cls, value):
        cls.Application = value

    def __init__(self):
        attr = list(__class__.__dict__.keys())
        if 'Application' in attr:
            self.Application = __class__.Application
        else:
            raise ValueError('Application must be set by using set_application method')

class ChildClass(ParentClass):
    def __init__(self, height):
        self.height = height
        super().__init__()

    def print(self):
        return vars(self)
Running this code:
Child = ChildClass(3)
gives:
ValueError: Application must be set by using set_application method
But running this:
ParentClass.set_application('ABC')
Child = ChildClass(3)
Child.print()
gives:
{'height': 3, 'Application': 'ABC'}
You can just do this:
class Foo:
    application = ""

    def __init__(self):
        pass

if __name__ == "__main__":
    Foo.application = "whatever"
    foo = Foo()
    print(foo.application)  # whatever
And if you want the inheriting classes to know application, too, you get that for free:
class Foo:
    application = ""

    def __init__(self):
        pass

class SubFoo(Foo):
    def __init__(self):
        pass

if __name__ == "__main__":
    Foo.application = "whatever"
    foo = Foo()
    print(foo.application)  # whatever

    subfoo = SubFoo()
    print(subfoo.application)  # whatever
Another way I can think of would be to use a factory object:
class FooFactory:
    def __init__(self, application):
        self.application = application

    def new_instance(self):
        return Foo(application=self.application)

class Foo:
    def __init__(self, application):
        self.application = application

if __name__ == "__main__":
    application = "whatever"
    # instantiate the factory
    factory = FooFactory(application)
    # then use that factory to create new instances of Foo
    foo_0 = factory.new_instance()
    foo_1 = factory.new_instance()
TL;DR: What's the difference between Dependency Injection and the Singleton Pattern if the injected object is a Singleton?
I'm getting mixed results for how to resolve the design problem I am currently facing.
I would like to have a configuration that is application-wide, so that different objects can alter it.
I thought to resolve this using a Singleton:
class ConfigMeta(type):
    _instance = None

    def __call__(cls, *args, **kwargs):
        if not cls._instance:
            cls._instance = super().__call__(*args, **kwargs)
        return cls._instance

class Config(metaclass=ConfigMeta):
    def __init__(self) -> None:
        pass
But searching has shown this to be error-prone and considered bad practice (when managing class state). Just about every other post suggests using Dependency Injection, but they confuse me about how they do it. They all state "your implementation can be a Singleton, but inject it into other objects in their constructors".
That would be something along the lines of:
# foo.py
from config import Config

class Foo:
    def __init__(self):
        self.config = Config()

# bar.py
from config import Config

class Bar:
    def __init__(self):
        self.config = Config()
However, each one of those self.config refers to the same instance. Hence my confusion...
How is this considered Dependency Injection and not Singleton Pattern?
If it is considered Dependency Injection, what would it look like as just Singleton Pattern?
With Dependency Injection (DI) you leave it to the DI system to resolve how to get a specific object. You just declare what kind of object you require. This is complementary to the Singleton Pattern, where the whole application is served by a single instance of a specific type. So for example:
class Config:
    pass

config = Config()  # singleton

class Foo:
    def __init__(self):
        self.config = config  # Foo fetches the global singleton itself
Here the Foo class itself handles the logic of obtaining a Config object. If that object had dependencies of its own, Foo would have to sort those out as well.
With Dependency Injection, on the other hand, there is a central unit to handle this sort of thing. The user class just declares what objects it requires. For example:
from functools import partialmethod

class DI:
    config = Config()

    @classmethod
    def get_config_singleton(cls):
        return cls.config

    @classmethod
    def get_config(cls):
        return Config()

    @classmethod
    def inject(cls, func):
        # The DI system chooses what to use here:
        return partialmethod(func, config=cls.get_config())

class Foo:
    @DI.inject  # it's up to the DI system to resolve the declared dependencies
    def __init__(self, config: Config):  # declare one dependency: `config`
        self.config = config
Dependency injection means passing the already-initialized object, in this case the config, to the constructor.
The code in your question isn't using dependency injection, since the constructor __init__ doesn't receive the config as an argument; you're only using the singleton pattern here.
See more info here about dependency injection in Python.
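As a minimal sketch of the distinction (the class and variable names here are assumed, not taken from the original answer), constructor injection applied to the code above would look like this:

```python
class Config:
    def __init__(self):
        self.settings = {"debug": True}

class Foo:
    def __init__(self, config):
        # the caller supplies the dependency; Foo never builds it
        self.config = config

class Bar:
    def __init__(self, config):
        self.config = config

# composition root: build the instance once, inject it everywhere
config = Config()
foo = Foo(config)
bar = Bar(config)
```

Whether `config` is a singleton is now a decision made at the composition root, not inside Foo or Bar; the classes themselves no longer care.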
I have defined a Python context-manager class and a Test class in a file:
class Test(object):
    pass

class MyContext(object):
    def __init__(self):
        self._vars = []

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        pass
In another file using that context:
from somewhere import Test, MyContext
with MyContext() as ctx:
mytest = Test()
What I want to achieve is that when I exit the context, I am aware of the mytest instance that was created, and it gets added to ctx._vars = [<instance of Test>].
I don't want a ctx.add_var(mytest) method; I want those Test instances to be added to the ctx instance automatically.
That is possible, using Python's introspection capabilities, but you have to be aware this is not what the with block was created for.
I agree it is a useful syntax construction that can be "deviated" to do things like what you want: annotate the objects created inside a code block in a "registry".
Before showing how to do that with a context manager, consider whether a class body would suffice. Using a class body this way also deviates from its primary purpose, but you get your "registry" for free:
from somewhere import Test, MyContext

class ctx:
    mytest = Test()

vars = [v for v in ctx.__dict__.values() if isinstance(v, Test)]  # filter out __module__ etc.
In order to do that with a context manager, you have to inspect the local variables at the start and at the end of the with block. While that is not hard to do, it would not cover all instances of Test created, because if the code looks like this:

    mytests = []
    with MyContext() as ctx:
        mytests.append(Test())

no new variable is created, so code tracking the local variables would find nothing. Code could be written to look recursively into container variables such as dictionaries and lists, but Test instances could still be added to a container referenced by a global variable, or by a variable in another module.
It turns out that a reliable way to track Test instances is to instrument the Test class itself to record new instances in a registry. That is far easier and less dependent on "local variable introspection" tricks.
The code for that is somewhat like:
class Test(object):
    pass

class MyContext(object):
    def __init__(self, *args):
        self.vars = []
        self.track = args
        self.original_new = {}

    def patch(self, cls_to_patch):
        cls_new = getattr(cls_to_patch, "__new__")
        if "__new__" in cls_to_patch.__dict__:
            self.original_new[cls_to_patch] = cls_new

        def patched_new(cls, *args, **kwargs):
            instance = cls_new(cls, *args, **kwargs)
            self.vars.append(instance)
            return instance

        cls_to_patch.__new__ = patched_new

    def restore(self, cls):
        if cls in self.original_new:
            # the class had its very own __new__ prior to patching
            cls.__new__ = self.original_new[cls]
        else:
            # just remove the wrapping __new__; this restores access to the superclass __new__
            del cls.__new__

    def __enter__(self):
        for cls in self.track:
            self.patch(cls)
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        for cls in self.track:
            self.restore(cls)
...
from somewhere import Test, MyContext

with MyContext(Test) as ctx:
    mytest = Test()
I'm facing some difficulties unit testing my project, mainly due to the fact that the controllers reference a singleton produced by a factory.
A simple demonstration of this problem would be:
databasefactory.py
class DataBaseFactory(object):
    # Let's imagine we support a number of databases. The client implementations all give us a similar interface to use.
    # This is a singleton throughout the whole application
    _database_client = None

    @classmethod
    def get_database_client(cls):
        # type: () -> DataBaseClientInterFace
        if not cls._database_client:
            cls._database_client = DataBaseClient()
        return cls._database_client

class DataBaseClientInterFace(object):
    def get(self, key):
        # type: (any) -> any
        raise NotImplementedError()

    def set(self, key, value):
        # type: (any, any) -> any
        raise NotImplementedError()

class DataBaseClient(DataBaseClientInterFace):
    # Mock some real-world database - the unittest mocking should provide another client
    _real_world_data = {}

    def get(self, key):
        return self._real_world_data[key]

    def set(self, key, value):
        self._real_world_data[key] = value
        return value
model.py
from .databasefactory import DataBaseFactory

class DataModel(object):
    # The database type never changes, so it's a constant
    DATA_BASE_CLIENT = DataBaseFactory.get_database_client()

    def __init__(self, model_name):
        self.model_name = model_name

    def save(self):
        # type: () -> None
        """
        Save the current model into the database
        """
        key = self.get_model_key()
        data = vars(self)
        self.DATA_BASE_CLIENT.set(key, data)

    @classmethod
    def load(cls):
        # type: () -> DataModel
        """
        Load the model
        """
        key = cls.get_model_key()
        data = cls.DATA_BASE_CLIENT.get(key)
        return cls(**data)

    @staticmethod
    def get_model_key():
        return 'model_test'
datacontroller.py
from .databasefactory import DataBaseFactory
from .model import DataModel

class DataBaseController(object):
    """
    Does some stuff with the database
    """
    # Also needs the database client. This is the same instance as on DataModel
    DATA_BASE_CLIENT = DataBaseFactory.get_database_client()
    _special_key = 'not_model_key'

    @staticmethod
    def save_a_model():
        a_model = DataModel('test')
        a_model.save()

    @staticmethod
    def load_a_model():
        a_model = DataModel.load()
        return a_model

    @classmethod
    def get_some_special_key(cls):
        return cls.DATA_BASE_CLIENT.get(cls._special_key)

    @classmethod
    def set_some_special_key(cls):
        return cls.DATA_BASE_CLIENT.set(cls._special_key, 1)
And finally the unittest itself:
test_simple.py
import unittest
from .databasefactory import DataBaseClientInterFace
from .datacontroller import DataBaseController
from .model import DataModel

class MockedDataBaseClient(DataBaseClientInterFace):
    _mocked_data = {DataBaseController._special_key: 2,
                    DataModel.get_model_key(): {'model_name': 'mocked_test'}}

    def get(self, key):
        return self._mocked_data[key]

    def set(self, key, value):
        self._mocked_data[key] = value
        return value

class SimpleOne(unittest.TestCase):
    def test_controller(self):
        """
        I want to mock the singleton instance referenced in both DataBaseController and DataModel.
        As DataBaseController imports DataModel, both classes have the DATA_BASE_CLIENT attribute instantiated with the factory result.
        """
        # Initially it'll throw a KeyError
        with self.assertRaises(KeyError):
            DataBaseController.get_some_special_key()

        # It's no use to just change DATA_BASE_CLIENT on DataBaseController, as DataModel still points to the real implementation
        DataBaseController.DATA_BASE_CLIENT = MockedDataBaseClient()
        self.assertEqual(DataBaseController.get_some_special_key(), 2)

        # Will fail, as DataModel still uses the real implementation.
        # I'd like to mock DATA_BASE_CLIENT for both classes without explicitly inserting a new class.
        # The project I'm working on has a number of these constants, which makes it a real hassle to inject a new one.
        # There has to be a better way to tackle this issue
        model = DataBaseController.load_a_model()
The moment the unittest imports the DataBaseController, DataModel is imported through the DataBaseController module.
This means that both DATA_BASE_CLIENT class variables are instantiated.
Even if my factory could detect that it is running inside a unittest, it would not matter, as the import happens outside the unittest.
My question is: is there a way to mock this singleton and replace across the whole application at once?
Replacing the cached instance on the factory is not an option as the references in the classes point to the old object.
It might be a design flaw to put these singleton instances in class variables in the first place, but I'd rather retrieve a class variable than call the factory for the singleton every time.
In your use case, a single module is in charge of providing the singleton to the whole application, so I would try to inject the mock into that module before it is used by anything else. The problem is that the mock cannot be fully constructed before the other classes are declared. A possible way out is to construct the singleton in two passes: the first pass does not depend on anything; that minimal object is then used to construct the classes; and finally its internal dictionary is populated. The code could be:
import unittest
from .databasefactory import DataBaseClientInterFace

class MockedDataBaseClient(DataBaseClientInterFace):
    _mocked_data = {}  # no dependence outside databasefactory

    def get(self, key):
        return self._mocked_data[key]

    def set(self, key, value):
        self._mocked_data[key] = value
        return value

# inject the mock into DataBaseFactory
from .databasefactory import DataBaseFactory
DataBaseFactory._database_client = MockedDataBaseClient()

# use the empty mock to construct the other classes
from .datacontroller import DataBaseController
from .model import DataModel

# and populate the mock
DataBaseFactory._database_client._mocked_data.update(
    {DataBaseController._special_key: 2,
     DataModel.get_model_key(): {'model_name': 'mocked_test'}})

class SimpleOne(unittest.TestCase):
    def test_controller(self):
        """
        I want to mock the singleton instance referenced in both DataBaseController and DataModel.
        As DataBaseController imports DataModel, both classes have the DATA_BASE_CLIENT attribute instantiated with the factory result.
        """
        self.assertEqual(DataBaseController.get_some_special_key(), 2)
        model = DataBaseController.load_a_model()
        self.assertEqual('mocked_test', model.model_name)
But beware: this assumes that the test procedure does not load model.py or datacontroller.py before test_simple.py.
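Another option, not from the original answer, is to patch each class attribute only for the duration of the test with unittest.mock.patch.object, which restores the original value automatically. The stand-in classes below mimic the shape of the ones above (the names and the FakeClient helper are assumptions for illustration):

```python
import unittest
from unittest import mock

# Minimal stand-ins for the classes above (assumed, for illustration only)
class DataModel:
    DATA_BASE_CLIENT = None  # imagine the real client here

class DataBaseController:
    DATA_BASE_CLIENT = None  # same shared instance in the real code

class FakeClient:
    def __init__(self, data):
        self._data = data

    def get(self, key):
        return self._data[key]

class PatchedTest(unittest.TestCase):
    def test_both_patched(self):
        fake = FakeClient({'not_model_key': 2})
        # patch.object swaps the attribute on each class and restores it afterwards
        with mock.patch.object(DataModel, 'DATA_BASE_CLIENT', fake), \
             mock.patch.object(DataBaseController, 'DATA_BASE_CLIENT', fake):
            self.assertIs(DataModel.DATA_BASE_CLIENT,
                          DataBaseController.DATA_BASE_CLIENT)
            self.assertEqual(
                DataBaseController.DATA_BASE_CLIENT.get('not_model_key'), 2)
        # outside the with-block the original attributes are restored
        self.assertIsNone(DataModel.DATA_BASE_CLIENT)
```

The downside is that every class holding the constant must be patched individually, which is exactly the hassle the question mentions; the two-pass injection above avoids that by replacing the singleton before the classes are ever constructed.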
I have one module containing a singleton class. In that module I want to import another module, which is not a singleton. Is it possible to import it?
When I import that module and run the singleton module, I get an error that the singleton class is not defined.
For example:
first_file.py:
class first(object):
    def __init__(self):
        print "first class"
second_file.py
from Libs.first_file import *  # here Libs is my folder/module

@singleton
class second(self):
    def __init__(self):
        print "Second class"
When I ran:

    python second_file.py

I got the error NameError: name 'second_file()' is not defined, but when I commented out the import, the second_file module worked as expected.
Thanks,
Maybe you should provide more detail, for example the implementation of your @singleton decorator.
I tested the code you provided: I added the decorator myself, and changed class second(self): to class second(first):, as follows:
first_file.py:
class first(object):
    def __init__(self):
        print "first class"
second_file.py:
from first_file import *

def singleton(class_):
    instances = {}
    def getinstance(*args, **kwargs):
        if class_ not in instances:
            instances[class_] = class_(*args, **kwargs)
        return instances[class_]
    return getinstance

@singleton
class second(first):
    def __init__(self):
        print "Second class"

if __name__ == '__main__':
    test = second()
    test1 = second()
When I ran python second_file.py:
Second class
[Finished in 0.0s]
It seems to be OK.
Maybe I'm not getting the point of your problem, but I hope it helps.