Is this Dependency Injection and/or Singleton Pattern? - python

TL;DR: What's the difference between Dependency Injection and the Singleton Pattern if the injected object is a Singleton?
I'm getting mixed results for how to resolve the design problem I am currently facing.
I would like to have a configuration that is application wide, so that different objects can read and alter the configuration.
I thought to resolve this using a Singleton:
class ConfigMeta(type):
    _instance = None

    def __call__(cls, *args, **kwargs):
        if not cls._instance:
            cls._instance = super().__call__(*args, **kwargs)
        return cls._instance

class Config(metaclass=ConfigMeta):
    def __init__(self) -> None:
        pass
But searching has shown this to be prone to errors and considered bad practice (when managing class state). Just about every other post suggests using Dependency Injection, but they confuse me on how they do it. They all state "your implementation can be a Singleton, but inject it into other objects in their constructors".
That would be something along the lines of:
# foo.py
from config import Config

class Foo:
    def __init__(self):
        self.config = Config()

# bar.py
from config import Config

class Bar:
    def __init__(self):
        self.config = Config()
However, each one of those self.config refers to the same instance. Hence my confusion...
How is this considered Dependency Injection and not Singleton Pattern?
If it is considered Dependency Injection, what would it look like as just Singleton Pattern?

With Dependency Injection (DI) you leave it to the DI system to resolve how to get a specific object. You just declare what kind of object you require. This is complementary to the Singleton Pattern, where the whole application is served by a single instance of a specific type. So for example:
class Config:
    pass

config = Config()  # singleton

class Foo:
    def __init__(self):
        self.config = config  # refers to the global singleton instance
Here the Foo class itself handles the logic of how to get a Config object. Imagine this object has dependencies of its own; then those also need to be sorted out by Foo.
With Dependency Injection on the other hand there is a central unit to handle these sort of things. The user class just declares what object it requires. For example:
class DI:
    config = Config()

    @classmethod
    def get_config_singleton(cls):
        return cls.config

    @classmethod
    def get_config(cls):
        return Config()

    @classmethod
    def inject(cls, func):
        from functools import partialmethod
        # The DI system chooses what to use here:
        return partialmethod(func, config=cls.get_config())
class Foo:
    @DI.inject  # it's up to the DI system to resolve the declared dependencies
    def __init__(self, config: Config):  # declare one dependency, `config`
        self.config = config

Dependency injection means providing the constructor with the already-initialized object, in this case the config.
The code in your question isn't using dependency injection, since the constructor __init__ doesn't receive the config as an argument; you're only using the singleton pattern here.
See more info here about dependency injection in Python.
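To make the distinction concrete: with constructor injection, the class never builds or looks up its dependency itself; the caller hands it in. A minimal sketch (the names and the `debug` flag are illustrative, not from the question):

```python
class Config:
    def __init__(self, debug=False):
        self.debug = debug

class Foo:
    def __init__(self, config):
        # Foo declares its dependency; the caller decides which instance to pass.
        self.config = config

# Composition root: build the shared instance once and hand it to everyone.
shared_config = Config(debug=True)
foo = Foo(shared_config)
bar = Foo(shared_config)
assert foo.config is bar.config  # same object, but no singleton machinery needed
```

Whether the injected object happens to be the only instance in the application is then a decision made in one place (the composition root), not baked into every consumer.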

Type hints and code completion. How to specialize a generic class with types

In C++ I would use templates. I'm checking whether the hints work using the PyCharm auto-completion feature.
I'm working on a "generalized" FSM library (Finite-State Machine). I would like to preserve the type-hints that I had before. They worked when I had the FSM "tailored" to a specific subject.
FSM is just an example. It can be anything that is generic and by specializing it I would like to pass the specializing types down to the generic class.
The FSM looks as follows:
There is the owner, the subject that uses the FSM
There are the states, which share a common base state
There is the state manager that holds the current state
The lib (simplified):
class BaseState:
    def __init__(self, owner: "???", state_mgr: "StateManager"):
        self.owner = owner
        self.state_mgr = state_mgr

class StateManager:
    def __init__(self, owner, base_state_type):
        self.__owner = owner
        self.__base_state_type = base_state_type
        self.current_state: Type[base_state_type] = base_state_type(self.__owner, self)  # Doesn't work

    def transition_to(self, next_state_type):
        assert issubclass(next_state_type, self.__base_state_type)
        self.current_state = next_state_type(self.__owner, self)
The new subject that uses the lib:
class FileSender_BaseState(fsm.BaseState):
    def on_event(self, msg):
        pass

class FileSender_Idle(FileSender_BaseState):
    def on_event(self, msg):
        self.owner.foo()

class FileSender_Sending(FileSender_BaseState):
    def on_event(self, msg):
        pass

class FileSender:
    def __init__(self):
        self._state_mgr = fsm.StateManager(self, FileSender_BaseState)
        self._state_mgr.transition_to(FileSender_Idle)
        self._state_mgr.current_state.on_event()

    def foo(self):
        pass
How to pass to the BaseState that in this scenario the owner is of type FileSender?
How to pass to the StateManager that the current_state is of type FileSender_BaseState?
(optional) How to "type-hint" the next_state_type parameter of the transition_to() method so that it is a (sub)type of FileSender_BaseState
For the first point, I have tried to be more precise with the fsm.BaseState initialization, but it doesn't improve.
class FileSender_BaseState(fsm.BaseState):
    def __init__(self, owner: "FileSender", state_mgr: "fsm.StateManager"):  # Specifying the owner type didn't improve anything
        super().__init__(owner, state_mgr)
For the second point, I've tried to use the typing.Type with the argument that holds the type of the owner, but it also doesn't work, surprisingly
class StateManager:
    def __init__(self, owner, base_state_type):
        self.current_state: Type[base_state_type] = base_state_type()
The third attempt failed miserably because I cannot do
def transition_to(self, next_state_type: typing.Type["self.__base_state_type"]):  # Unresolved reference 'self', of course
I would expect PyCharm auto-completion to behave like this:
class FileSender_Idle(FileSender_BaseState):
    def on_event(self, msg):
        self.owner.  # propose the foo() method
        self._state_mgr.current_state.  # propose the on_event() method
Here is how it was working before: https://pastebin.com/MekPqbnJ
I think we might run into limitations of the currently available type system here.
The way I see it, you essentially want two generic classes here, that depend on each other.
BaseState should be generic in terms of its owner. This would allow the type checker to infer the owner's interface by binding an owner type variable in the BaseState constructor.
And StateManager should be generic in terms of its state (type) and, by extension, also its owner. This would at least allow the state's interface to be inferred after initialization.
But type variables do not allow unspecified generic types as upper bounds/constraints (see discussion here). In this case, that would be the proper way to express that the StateManager is parameterized by its state type, which in turn is parameterized by its owner, and that that owner is bound by the owner of the state manager.
If it were possible, I would do something like this:
from __future__ import annotations
from typing import Generic, TypeVar

OwnerT = TypeVar("OwnerT")
StateT = TypeVar("StateT", bound="BaseState[OwnerT]")  # not valid

class BaseState(Generic[OwnerT]):
    def __init__(self, owner: OwnerT, state_mgr: StateManager[OwnerT]) -> None:
        self.owner = owner
        self.state_mgr = state_mgr

class StateManager(Generic[OwnerT, StateT]):
    __owner: OwnerT
    __base_state_type: type[StateT[OwnerT]]
    current_state: StateT[OwnerT]

    def __init__(self, owner: OwnerT, base_state_type: type[StateT[OwnerT]]) -> None:
        self.__owner = owner
        self.__base_state_type = base_state_type
        self.current_state = base_state_type(self.__owner, self)
    ...
Alas, the type system has no expression for this.
Workaround
I think the closest thing to what you are trying to achieve could be something like this:
from __future__ import annotations
from typing import Generic, TypeVar

OwnerT = TypeVar("OwnerT")
StateT = TypeVar("StateT")

class BaseState(Generic[OwnerT]):
    def __init__(self, owner: OwnerT, state_mgr: StateManager[OwnerT, StateT]) -> None:
        self.owner = owner
        self.state_mgr = state_mgr

class StateManager(Generic[OwnerT, StateT]):
    _owner: OwnerT
    _base_state_type: type[StateT]
    current_state: StateT

    def __init__(self, owner: OwnerT, base_state_type: type[StateT]) -> None:
        self._owner = owner
        self._base_state_type = base_state_type
        self.current_state = base_state_type(self._owner, self)  # type: ignore[call-arg]

    def transition_to(self, next_state_type: type[StateT]) -> None:
        assert issubclass(next_state_type, self._base_state_type)
        self.current_state = next_state_type(self._owner, self)  # type: ignore[call-arg]

class FileSenderBaseState(BaseState["FileSender"]):
    def on_event(self, msg: object) -> None:
        pass

class FileSenderIdle(FileSenderBaseState):
    def on_event(self, msg: object) -> None:
        self.owner.foo()

class FileSenderSending(FileSenderBaseState):
    def on_event(self, msg: object) -> None:
        pass

class FileSender:
    def __init__(self) -> None:
        self._state_mgr = StateManager(self, FileSenderBaseState)
        self._state_mgr.transition_to(FileSenderIdle)
        # Type checker still infers current state as `FileSenderBaseState`
        assert isinstance(self._state_mgr.current_state, FileSenderIdle)
        self._state_mgr.current_state.on_event(...)

    def foo(self) -> None:
        pass
Caveats
Since we have no upper bound on StateT, type[StateT] is effectively treated by the type checker as type[object], which raises an error when we call it with arguments (because the object constructor does not take arguments). Thus, we use a specific type: ignore directive whenever we instantiate type[StateT].
The second limitation is that there is not really a way to let the type checker know that the current_state type on an instance of StateManager has changed without explicitly setting that attribute on that instance or asserting the type explicitly. Since your transition_to method changes the type, this will remain opaque to the outside. The only workaround I can think of is that assert statement.
Other than that, this setup satisfies mypy and should give you the desired auto-suggestions from your IDE.
If you add reveal_type(self._state_mgr._owner) and reveal_type(self._state_mgr.current_state) at the end of that FileSender.__init__ method, mypy will output the following:
Revealed type is "FileSender"
Revealed type is "FileSenderIdle"
Depending on what your goals are, it may be worth reconsidering the entire design. It is hard to comment on that without more context though. I hope this illustrates at least a few concepts and tricks you can use for type safety.
I would suggest keeping this question open, i.e. not accepting my answer (if you were so inclined in the first place) in the hopes that someone comes along with a better idea.

How to access the variables created within a `with` statement

I have defined a python context class and a Test class in a file:
class Test(object):
    pass

class MyContext(object):
    def __init__(self):
        self._vars = []

    def __enter__(self):
        pass

    def __exit__(self, ....):
        pass
In another file using that context:
from somewhere import Test, MyContext

with MyContext() as ctx:
    mytest = Test()
So what I want to achieve is that when I exit the context, I want to be aware of the mytest instance created and add it in the ctx._vars = [<instance of Test >].
I don't want to have a ctx.add_var(mytest) method, I want those Test instances to be added automatically to the ctx instance.
That can be done using Python's introspection capabilities, but you have to be aware this is not what the with context block was designed for.
I agree it is a useful syntax construction that can be "deviated" to do things like what you want: annotate the objects created inside a code block in a "registry".
Before showing how to do that with a context manager consider if a class body would not suffice you. Using a class body this way also deviates from its primary purpose, but you have your "registry" for free:
from somewhere import Test, MyContext

class ctx:
    mytest = Test()

vars = ctx.__dict__.values()
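As a runnable illustration of the class-body trick (note the class dict also contains dunder entries such as __module__ and __qualname__, which need filtering out):

```python
class Test:
    pass

# A class body used purely as a "registry block": every name bound inside it
# ends up in the class __dict__.
class ctx:
    mytest = Test()
    other = Test()

# Keep only the Test instances; vars(ctx) also holds __module__, __qualname__, etc.
registered = [v for v in vars(ctx).values() if isinstance(v, Test)]
assert len(registered) == 2
```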
In order to do that with a context manager, you have to inspect the local variables at the start and at the end of the with block. While that is not hard to do, it would not cover all instances of Test created, because if the code is like this:
mytests = []

with MyContext() as ctx:
    mytests.append(Test())
No new variable is created, so code tracking the local variables would not find anything. Code could be written to look recursively into variables holding containers, such as dictionaries and lists, but then Test instances could be added to a container referenced by a global variable, or by a variable in another module.
It turns out that a reliable way to track Test instances is to instrument the Test class itself to annotate new instances in a registry. That is far easier and less dependent on "local variable introspection" tricks.
The code for that is somewhat like:
class Test(object):
    pass

class MyContext(object):
    def __init__(self, *args):
        self.vars = []
        self.track = args
        self.original_new = {}

    def patch(self, cls_to_patch):
        cls_new = getattr(cls_to_patch, "__new__")
        if "__new__" in cls_to_patch.__dict__:
            self.original_new[cls_to_patch] = cls_new

        def patched_new(cls, *args, **kwargs):
            instance = cls_new(cls, *args, **kwargs)
            self.vars.append(instance)
            return instance

        cls_to_patch.__new__ = patched_new

    def restore(self, cls):
        if cls in self.original_new:
            # class had a very own __new__ prior to patching
            cls.__new__ = self.original_new[cls]
        else:
            # just remove the wrapped __new__; this restores access to the superclass __new__
            del cls.__new__

    def __enter__(self):
        for cls in self.track:
            self.patch(cls)
        return self

    def __exit__(self, ....):
        for cls in self.track:
            self.restore(cls)
...
from somewhere import Test, MyContext

with MyContext(Test) as ctx:
    mytest = Test()
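A condensed, self-contained variant of the same idea, so the behavior can be verified end to end (class and attribute names are shortened for the sketch):

```python
class Test:
    pass

class Tracker:
    """Minimal variant of MyContext: records every instance of the tracked
    class that is created inside the with block."""

    def __init__(self, cls):
        self.cls = cls
        self.vars = []

    def __enter__(self):
        def patched_new(cls, *args, **kwargs):
            instance = object.__new__(cls)
            self.vars.append(instance)
            return instance

        self.cls.__new__ = patched_new
        return self

    def __exit__(self, *exc):
        # Removing the wrapper restores access to object.__new__.
        del self.cls.__new__

with Tracker(Test) as ctx:
    a = Test()
    b = Test()

assert ctx.vars == [a, b]
assert Test() not in ctx.vars  # instances created after the block are not tracked
```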

Python Implementing a module that is shared by all other modules

I'm making a Notepad program for linux with python. I want to implement a main module so that other modules can fetch and set data to that module.
I've tried implementing singleton class in the main module.
But whenever I import it in other modules, it keeps getting initialized, ruining the data inside the class.
MainModule.py
class Database(object):
    __instance = None

    def __new__(cls):
        if cls.__instance == None:
            cls.__instance = object.__new__(cls)
            cls.__instance.name = "Database"
        return cls.__instance

    def __init__(self):
        self.__title = "initial"
SubModule.py
from MainModule import Database
class example(object):
def __init__(self):
self.database = Database()
I changed self.__title to "untitled", but after importing this module in other modules, I just get the initial value again.
How do I make it so that it initialize only once even after getting instantiated in other modules?
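No answer to this appears in the thread, but for what it's worth: __new__ returning a cached instance does not stop __init__ from running again on every Database() call, and that re-run is what resets the data. One way to guard the one-time setup (illustrative sketch, using a plain attribute instead of the name-mangled __title):

```python
class Database:
    __instance = None

    def __new__(cls):
        if cls.__instance is None:
            cls.__instance = super().__new__(cls)
        return cls.__instance

    def __init__(self):
        # __init__ runs on *every* Database() call, even when __new__
        # returned the cached instance -- so guard the one-time setup.
        if not hasattr(self, "title"):
            self.title = "initial"

db1 = Database()
db1.title = "untitled"
db2 = Database()  # __init__ runs again, but the guard skips the reset
assert db1 is db2
assert db2.title == "untitled"
```

Note also that importing the module in several places does not re-run it: Python caches modules in sys.modules, so the repeated resets come from __init__, not from the imports.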

How to inject mock objects into instance attributes in Python unit testing

I am learning Python, and I'm wondering if there is a mechanism to "inject" an object (a fake object, in my case) into the class under test without explicitly adding it in the constructor/setter.
## source file
class MyBusinessClass():
    def __init__(self):
        self.__engine = RepperEngine()

    def doSomething(self):
        ## bla bla ...
        success

## test file
## fake I'd like to inject
class MyBusinessClassFake():
    def __init__(self):
        pass

    def myPrint(self):
        print("Hello from Mock !!!!")

class Test(unittest.TestCase):
    ## is there an automatic mechanism to inject MyBusinessClassFake
    ## into MyBusinessClass without constructor/setter?
    def test_XXXXX_whenYYYYYY(self):
        unit = MyBusinessClass()
        unit.doSomething()
        self.assertTrue(.....)
In my test I'd like to "inject" the object "engine" without passing it in the constructor. I've tried a few options (e.g. @patch ...) without success.
IOC is not needed in Python. Here's a Pythonic approach.
class MyBusinessClass(object):
    def __init__(self, engine=None):
        self._engine = engine or RepperEngine()
        # Note: _engine doesn't exist until the constructor is called.

    def doSomething(self):
        ## bla bla ...
        success
class Test(unittest.TestCase):
    def test_XXXXX_whenYYYYYY(self):
        mock_engine = mock.create_autospec(RepperEngine)
        unit = MyBusinessClass(mock_engine)
        unit.doSomething()
        self.assertTrue(.....)
You could also stub out the class to bypass the constructor:
class MyBusinessClassFake(MyBusinessClass):
    def __init__(self):  # we bypass super's init here
        self._engine = None
Then in your setup
def setUp(self):
    self.helper = MyBusinessClassFake()
Now in your test you can use a context manager.
def test_XXXXX_whenYYYYYY(self):
    with mock.patch.object(self.helper, '_engine', autospec=True) as mock_eng:
        ...
If you want to inject it without using the constructor, then you can add it as a class attribute:
class MyBusinessClass():
    _engine = None

    def __init__(self):
        self._engine = RepperEngine()
Now stub to bypass __init__:
class MyBusinessClassFake(MyBusinessClass):
    def __init__(self):
        pass
Now you can simply assign the value:
unit = MyBusinessClassFake()
unit._engine = mock.create_autospec(RepperEngine)
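Putting that together into a runnable snippet (with a trivial stand-in for RepperEngine, whose definition isn't shown in the question):

```python
from unittest import mock

class RepperEngine:
    # Trivial stand-in: the real engine's definition isn't in the question.
    def rep(self):
        return "real"

class MyBusinessClass:
    def __init__(self):
        self._engine = RepperEngine()

    def doSomething(self):
        return self._engine.rep()

class MyBusinessClassFake(MyBusinessClass):
    def __init__(self):  # bypass super().__init__ so no real engine is built
        pass

unit = MyBusinessClassFake()
unit._engine = mock.create_autospec(RepperEngine, instance=True)
unit._engine.rep.return_value = "mocked"
assert unit.doSomething() == "mocked"
```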
After years of using Python without any DI autowiring framework and Java with Spring, I've come to realize that plain simple Python code often doesn't need frameworks for dependency injection without autowiring (autowiring is what Guice and Spring do in Java); i.e., just doing something like this is enough:
class Foo:
    def __init__(self, dep=None):  # great for unit testing!
        self.dep = dep or Dep()  # callers don't have to care about this either
        ...
This is pure dependency injection (quite simple) but without magical frameworks for automatically injecting them for you (i.e., autowiring) and without Inversion of Control.
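A runnable sketch of that default-argument style (Dep, FakeDep, and Service are made-up placeholder names):

```python
class Dep:
    def value(self):
        return "real"

class FakeDep:
    def value(self):
        return "fake"

class Service:
    def __init__(self, dep=None):
        # The caller may inject a replacement; otherwise the default is built.
        self.dep = dep or Dep()

assert Service().dep.value() == "real"
assert Service(FakeDep()).dep.value() == "fake"  # injection, e.g. in a test
```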
Unlike @Dan, I disagree that Python doesn't need IoC: Inversion of Control is a simple concept where the framework takes away control of something, often to provide abstraction and remove boilerplate code. When you use template classes, this is IoC. Whether IoC is good or bad depends entirely on how the framework implements it.
That said, Dependency Injection is a simple concept that doesn't require IoC. Autowiring DI does.
As I dealt with bigger applications, the simplistic approach wasn't cutting it anymore: there was too much boilerplate code, and the key advantage of DI was missing: changing the implementation of something once and having it reflected in all classes that depend on it. If many pieces of your application care about how a certain dependency is initialized and you change that initialization, or want to swap classes, you have to go piece by piece changing it. With a DI framework that would be way easier.
So I came up with injectable, a micro-framework that doesn't feel non-Pythonic and yet provides first-class dependency injection autowiring.
Under the motto Dependency Injection for Humans™ this is what it looks like:
# some_service.py
class SomeService:
    @autowired
    def __init__(
        self,
        database: Autowired(Database),
        message_brokers: Autowired(List[Broker]),
    ):
        pending = database.retrieve_pending_messages()
        for broker in message_brokers:
            broker.send_pending(pending)

# database.py
@injectable
class Database:
    ...

# message_broker.py
class MessageBroker(ABC):
    def send_pending(messages):
        ...

# kafka_producer.py
@injectable
class KafkaProducer(MessageBroker):
    ...

# sqs_producer.py
@injectable
class SQSProducer(MessageBroker):
    ...
Be explicit, use a setter. Why not?
class MyBusinessClass:
    def __init__(self):
        self._engine = RepperEngine()

from unittest.mock import create_autospec

def test_something():
    mock_engine = create_autospec(RepperEngine, instance=True)
    object_under_test = MyBusinessClass()
    object_under_test._engine = mock_engine
Your question seems somewhat unclear, but there is nothing preventing you from using class inheritance to override the original methods. In that case the derived class would just look like this:
class MyBusinessClassFake(MyBusinessClass):
    def __init__(self):
        pass

Sphinx document module properties

I have a module that should have a @property; I solved this by setting a class as the module. I got the idea from this answer: Lazy module variables--can it be done?
I wanted this to be repeatable and easy to use so I made a metaclass for it. This works like a charm.
The problem is that when using Sphinx to generate documentation properties don't get documented. Everything else is documented as expected. I have no idea how to fix this, maybe this is a problem with Sphinx?
The module:
import sys
import types

class ClassAsModule(type):
    def __new__(cls, name, bases, attrs):
        # Make sure the name of the class is the module name.
        name = attrs.pop('__module__')
        # Create a class.
        cls = type.__new__(cls, name, bases, attrs)
        # Instantiate the class and register it.
        sys.modules[name] = cls = cls(name)
        # Update the dict so dir works properly.
        cls.__dict__.update(attrs)

class TestClass(types.ModuleType):
    """TestClass docstring."""
    __metaclass__ = ClassAsModule

    @property
    def some_property(self):
        """Property docstring."""
        pass

    def meth():
        """meth doc"""
        pass
And a copy-paste to generate/view Sphinx documentation:
sphinx-apidoc . -o doc --full
sphinx-build doc html
xdg-open html/module.html
The most essential part is to document the class' properties. Bonus points to also document original module members.
EDIT: The class should be documented as the module it is in. The class is used this way and should thus appear this way in Sphinx.
Example of desired output:
Module Foo
TestClass docstring.
some_property
Property docstring.
meth()
meth doc
EDIT 2: I found something that may aid in finding a solution. When having a regular module foo with the following content:
#: Property of foo
prop = 'test'
Sphinx documents this like:
foo.prop = 'test'
Property of foo
The same works if prop is an attribute of a class. I haven't figured out why it doesn't work in my special case.
Here's my understanding.
The theory is: making your class act like a module in this (a bit hacky) way makes Sphinx think that it doesn't need to parse properties from it, because properties are a class-level concept and modules aren't expected to have them. So, for Sphinx, TestClass is a module.
First of all, to make sure that the culprit is the code for making a class act like a module - let's remove it:
class ClassAsModule(type):
    pass
we'll see in docs:
package Package
script Module
class package.script.ClassAsModule
Bases: type
class package.script.TestClass
Bases: module
TestClass docstring.
meth()
meth doc
some_property
Property docstring.
As you see, sphinx read the property without any problems. Nothing special here.
A possible solution for your problem is to avoid the @property decorator and replace it with a call to the property class constructor. E.g.:
import sys
import types

class ClassAsModule(type):
    def __new__(cls, name, bases, attrs):
        # Make sure the name of the class is the module name.
        name = attrs.pop('__module__')
        # Create a class.
        cls = type.__new__(cls, name, bases, attrs)
        # Instantiate the class and register it.
        sys.modules[name] = cls = cls(name)
        # Update the dict so dir works properly.
        cls.__dict__.update(attrs)

class TestClass(types.ModuleType):
    """TestClass docstring."""
    __metaclass__ = ClassAsModule

    def get_some_property(self):
        """Property docstring."""
        pass

    some_property = property(get_some_property)

    def meth(self):
        """meth doc"""
        pass
For this code sphinx generates:
package Package
script Module
TestClass docstring.
package.script.get_some_property(self)
Property docstring.
package.script.meth(self)
meth doc
Maybe the answer is a piece of nonsense, but I hope it'll point you in the right direction.
The way I've found that works best is to keep the file contents the same as if you were writing a regular module, then at the end replace the embryonic module in sys.modules:
"""Module docstring. """
import sys
import types
def _some_property(self):
pass
some_property = property(_some_property)
"""Property docstring."""
def meth():
"""meth doc"""
pass
def _make_class_module(name):
mod = sys.modules[name]
cls = type('ClassModule', (types.ModuleType,), mod.__dict__)
clsmod = cls(name)
clsmod.__dict__.update(mod.__dict__)
clsmod.__wrapped__ = mod
sys.modules[name] = clsmod
_make_class_module(__name__)
Text documentation:
mymod Module
************
Module docstring.
mymod.meth()
meth doc
mymod.some_property = None
Property docstring.
For the version of Sphinx I'm using (v1.1.3), it looks like you have to apply the property constructor explicitly (you can't use it as a decorator), and the docstring has to go in the file at the top level, on the line after the constructor call that creates the property (it doesn't work as a docstring inside the property getter). The source is still fairly readable, though.
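The module-swap itself can be exercised without Sphinx; a minimal self-contained check (the module name demomod and the greeting property are made up for this sketch):

```python
import sys
import types

def _greeting(self):
    return "hello"

# Build a throwaway module, then swap in a ModuleType subclass carrying a
# property, mirroring what _make_class_module does for a real module.
mod = types.ModuleType("demomod")
cls = type("ClassModule", (types.ModuleType,), {"greeting": property(_greeting)})
clsmod = cls("demomod")
clsmod.__dict__.update(mod.__dict__)
sys.modules["demomod"] = clsmod

import demomod  # resolves to the swapped-in instance via sys.modules
assert demomod.greeting == "hello"  # attribute access triggers the property
```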
