Python injection - python

I am in the following situation: I have a Service class, which uses a Logic component, which in turn uses a Storage object.
So:
Service ---> Logic ---> Storage
The arrows represent a 'uses' relation.
Of course I am trying to use dependency injection, and I am using the injector package (https://github.com/alecthomas/injector).
The objective here is to have different versions of the Service instances, each using a different Logic instance, where the Logic instances differ in the Storage instance they use.
I think it's something that is presented here.
Now, the point is the following: I have not understood how to apply injection across several levels. Here's a little example:
from injector import inject, Injector, Module, Key

# My example
# Standard service using a ConsumerLogic which is requesting a Storage object
class Storage:
    pass

class Storage2:
    pass

class Logic:
    def __init__(self, storage):
        self.storage = storage

class Service:
    def __init__(self, logic):
        self.logic = logic
By using a binding Key, I have worked something out:
from injector import inject, Injector, Module, Key

# My example
# Standard service using a ConsumerLogic which is requesting a Storage object
LogicKey = Key('Logic')

class Storage:
    pass

class Storage2:
    pass

class Logic:
    @inject
    def __init__(self, storage: Storage):
        self.storage = storage

class Service:
    @inject(logic=LogicKey)
    def __init__(self, logic):
        self.logic = logic

class Configuration(Module):
    def configure(self, binder):
        binder.bind(
            LogicKey,
            to=Logic(storage=Storage())
        )
And to get a Service instance I can do:
inj = Injector()
inj.get(Service)
obtaining that specific version of the service.
How can I make the get method return different versions of the Service? I have not been able to work something out with the Module approach, but I must be missing something.
Thanks in advance for help.
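For reference, one way to get different versions out of inj.get, sketched here under the assumption of a recent injector release where @inject works with constructor type hints: bind Storage differently in separate Module subclasses and build one Injector per configuration. (In this sketch Storage2 subclasses Storage only so it can satisfy the same binding.)

from injector import Injector, Module, inject

class Storage:
    pass

class Storage2(Storage):  # subclassing only so it can satisfy the Storage binding
    pass

class Logic:
    @inject
    def __init__(self, storage: Storage):
        self.storage = storage

class Service:
    @inject
    def __init__(self, logic: Logic):
        self.logic = logic

class DefaultConfiguration(Module):
    def configure(self, binder):
        binder.bind(Storage, to=Storage)

class AlternateConfiguration(Module):
    def configure(self, binder):
        binder.bind(Storage, to=Storage2)

# Each Injector carries its own configuration, so each returns a
# different flavour of Service.
service_a = Injector([DefaultConfiguration()]).get(Service)
service_b = Injector([AlternateConfiguration()]).get(Service)
assert isinstance(service_a.logic.storage, Storage)
assert isinstance(service_b.logic.storage, Storage2)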

Related

Dependency Injection problem with FastAPI on Python

Good day! Please tell me how you can solve the following problem in Python + FastAPI.
There is a test project:
app/main.py - main file
app/routes/users.py - set of API methods
app/repos/factory.py - repository factory
app/repos/user_repository.py - repositories
app/handlers/factory.py - handler factory
app/handlers/users.py - handlers
app/domain/user.py - data class
The main and routes structure is the same as in the example https://fastapi.tiangolo.com/tutorial/bigger-applications/
In the routes/users.py file:
from fastapi import APIRouter, Depends
from ..handlers import factory

router = APIRouter()

@router.get("/users/", tags=["users"])
def read_users(handler=Depends(factory.get_handler)):
    return handler.get_all()
In the handlers/factory.py:
from fastapi import Depends
from .users import UserHandler1

def get_handler(handler=Depends(UserHandler1)):
    return handler
In the handlers/users.py:
from fastapi import Depends
from ..repos import factory

class UserHandler1:
    def __init__(self):
        pass

    def get_all(self, repo=Depends(factory.get_repo)):
        return repo.get_all()
repos/factory.py:
from fastapi import Depends
from ..repos.user_repository import UserRepository

def get_repo(repo=Depends(UserRepository)):
    return repo
repos/user_repository.py:
from ..domain.user import User

class UserRepository:
    def __init__(self):
        pass

    def get_all(self):
        return [User(1, 'A'), User(2, 'B'), User(3, 'C')]
domain/user.py:
class User:
    id: int
    name: str

    def __init__(self, id, name):
        self.id = id
        self.name = name
Then I run the server: hypercorn app.main:app --reload
Then I try to call the API method http://127.0.0.1:8000/users/ and get the error AttributeError: 'Depends' object has no attribute 'get_all'.
If you remove the handlers layer and do this, then everything will work.
routes/users.py:
from fastapi import APIRouter, Depends
from ..repos import factory

router = APIRouter()

@router.get("/users/", tags=["users"])
def read_users(repo=Depends(factory.get_repo)):
    return repo.get_all()
It also works if you completely remove all Depends and create
UserRepository and UserHandler1 directly in factories.
Question 1: How do I use "Depends" in this case and why doesn't it work?
In general, the factory does not look like a good solution to this problem. I saw an example of a DI implementation using multiple inheritance, but to me it amounts to the same thing as the factory method.
I also tried to use the Pinject library, but it requires building a graph up front, which then needs to be stored somewhere so the API handlers can access it.
Question 2 (more important): How can Dependency Injection be applied in this case?
The __call__ method must be implemented in the class.
As noted in the comments, a dependency can be anything that is a callable and thus a class as well. The only caveat in the latter case is that the class will only be initialized (i.e. only the __init__(...) function will be called).
So, in order to have a class as dependency, as in the example of https://fastapi.tiangolo.com/tutorial/dependencies/classes-as-dependencies/#shortcut you just need to call the target functions within the init and set the values as attributes of the class.
# repos/user_repository.py
from ..domain.user import User

class UserRepository:
    def __init__(self):
        self.get_all()

    def get_all(self):
        self.users = [User(1, 'A'), User(2, 'B'), User(3, 'C')]
# repos/factory.py
from fastapi import Depends
from ..repos.user_repository import UserRepository

def get_repo(repo=Depends(UserRepository)):
    print(repo.users)  # This will print the list of users
    return repo
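The same classes-as-dependencies idea can be applied to the handler layer from the question (a sketch of my own, not part of the original answer): let UserHandler1 receive the repository through its __init__, so FastAPI resolves the sub-dependency when the handler itself is injected.

# handlers/users.py (sketch)
from fastapi import Depends
from ..repos import factory

class UserHandler1:
    def __init__(self, repo=Depends(factory.get_repo)):
        # FastAPI resolves get_repo when UserHandler1 is used as a dependency,
        # so the instance already holds a usable repository here.
        self.repo = repo

    def get_all(self):
        return self.repo.get_all()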
QUESTION 2
NB: This is a modelling question. Here I propose what I believe is suitable from my point of view; it does not necessarily have to be the best or simplest approach.
Answering your second question, I would not advise such complex dependencies. If the dependencies live at the router level, you can simply add them to the router using the dependencies=[...] parameter, providing a list of dependency classes/functions (see the sketch below).
Alternatively, you could declare all of the dependencies as function parameters of the endpoint, as you did for the factory. This method tends to lead to big chunks of code being copied and pasted, so I advise the router-level approach above.
If you need to process the data parameters, then you add them to the request and access them from within the endpoint. See FastAPI get user ID from API key for a minimal example.
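As an illustration of the router-level approach mentioned above (a sketch using the question's get_repo factory; FastAPI accepts a dependencies=[...] list both on APIRouter and on individual path operations):

# routes/users.py (sketch)
from fastapi import APIRouter, Depends
from ..repos import factory

# Dependencies listed here run for every route on this router;
# their return values are discarded, so they suit checks and setup.
router = APIRouter(dependencies=[Depends(factory.get_repo)])

@router.get("/users/", tags=["users"])
def read_users(repo=Depends(factory.get_repo)):
    # Declare the dependency as a parameter wherever its return value is needed.
    return repo.get_all()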

Decorator for SecurityManager in flask-appbuilder for Superset

I'm trying to add custom user-information retrieval from OAuth in Superset, which is built on top of flask-appbuilder.
Official doc provides following information:
Decorate your method with the SecurityManager oauth_user_info_getter
decorator. Make your method accept the exact parameters as on this
example, and then return a dictionary with the retrieved user
information.
http://flask-appbuilder.readthedocs.io/en/latest/security.html#authentication-oauth
The example in the doc also does not help much, as the decorator is only shown in the comments.
Where do I put a custom decorator in Superset? I've put it in superset_config.py, but it didn't work for me.
The approach that I use boils down to the following:
# For superset version >= 0.25.0
from superset.security import SupersetSecurityManager

class CustomSecurityManager(SupersetSecurityManager):
    def __init__(self, appbuilder):
        super(CustomSecurityManager, self).__init__(appbuilder)

    def whatever_you_want_to_override(self, *args, **kwargs):
        # Your implementation here
        pass

CUSTOM_SECURITY_MANAGER = CustomSecurityManager

# For superset version < 0.25.0
from flask_appbuilder.security.sqla.manager import SecurityManager

class CustomSecurityManager(SecurityManager):
    def __init__(self, appbuilder):
        super(CustomSecurityManager, self).__init__(appbuilder)

    def whatever_you_want_to_override(self, *args, **kwargs):
        # Your implementation here
        pass

CUSTOM_SECURITY_MANAGER = CustomSecurityManager
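For the OAuth use case specifically, the method to override in the custom manager is oauth_user_info (a sketch based on the Superset/Flask-AppBuilder docs; the provider name, the call that fetches the profile, and the field values below are illustrative and depend on your OAuth provider and version):

# superset_config.py (sketch, superset >= 0.25.0)
from superset.security import SupersetSecurityManager

class CustomSecurityManager(SupersetSecurityManager):
    def oauth_user_info(self, provider, response=None):
        if provider == 'my_provider':  # hypothetical provider name
            # Fetch the user profile from your provider here, e.g. via
            # self.appbuilder.sm.oauth_remotes[provider], then map it to
            # the dictionary flask-appbuilder expects:
            return {
                'username': '...',
                'email': '...',
                'first_name': '...',
                'last_name': '...',
            }
        return {}

CUSTOM_SECURITY_MANAGER = CustomSecurityManager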

How can I refactor classes in Python?

I have test code written in Python, using classes.
The test environment has two types of hosts - app hosts, where applications run, and storage hosts, where storage components run.
I have two classes, each representing the type of host:
class AppHost_Class(object):
    def __init__(self, ip_address):
        # etc.
        ...

    # This method handles interfacing with the application
    def application_service(self):
        ...

    # This method handles the virtual storage component
    def virtual_storage(self):
        ...

    # This method handles caching
    def cache_handling(self):
        ...

class Storage_Server_Class(object):
    def __init__(self, ip_address):
        ...

    # This method handles interfacing with the storage process
    def storage_handling(self):
        ...

    # This method handles interfacing with the disk handling processes
    def disk_handling(self):
        ...
The problem is that the topology can change.
Topology #1 is this:
- Application Host runs
  * Application processes
  * Virtual storage processes
  * Cache processes
- Storage Host runs
  * Storage processes
  * Disk handling processes
My current test code handles Topology #1
However, we also want to support another topology (Topology #2):
- Application Host runs
  * Application processes
- Storage Host runs
  * Virtual storage processes
  * Cache processes
  * Storage processes
  * Disk handling processes
How can I refactor the classes so that for Topology 1 the classes and their methods stay the same, but for Topology 2 the Storage_Server_Class gets some of the methods from the AppHost_Class?
I was thinking of making a child class like this:
class Both_Class(AppHost_Class, Storage_Server_Class):
But I don't want to do this because I don't want the application_service method to be available to Both_Class.
Is there a way to just map a few methods in AppHost_Class into the Storage_Server_Class?
Here is an example of a class B that shares exactly one method defined in class A:
class A:
    def a1(self):
        pass

    def a2(self):
        pass

class B:
    def __init__(self, instance_of_a):
        self.a2 = instance_of_a.a2

a = A()
B(a)
It sounds to me like you want three base classes. One for App stuff, one for VirtualStorage (and cache) stuff and one for Storage (and disk) stuff. Then you can make child classes for your two topologies that mix the desired methods together.
For topology 1, you have a class that inherits from both the App and the VirtualStorage base classes (and you use the Storage base class unmodified). For topology 2, you create a class that inherits from the VirtualStorage and the Storage base classes and use the App base class unmodified.
Example code:
class App:
    def do_app_stuff(self):
        pass

class VirtualStorage:
    def do_virtual_storage_stuff(self):
        pass

class Storage:
    def do_storage_stuff(self):
        pass

# topology 1
class Top1App(App, VirtualStorage):
    pass

Top1Storage = Storage

# topology 2
Top2App = App

class Top2Storage(VirtualStorage, Storage):
    pass
You might not need the aliased names for the base classes you're using directly in the different topologies; I just threw those in to make it look extra nice.
Split the methods up into three classes, then combine them as needed.
# class NetworkObject(object):  # Python 2.7
class NetworkObject:
    def __init__(self, ip_address):
        self.ip_address = ip_address

class AppHost(NetworkObject):
    def application_service(self):
        print('app service', self.ip_address)

class Storage_Server(NetworkObject):
    def storage_handling(self):
        print('storage handler', self.ip_address)

    def disk_handling(self):
        print('disk handler', self.ip_address)

class Foo(object):
    def virtual_storage(self):
        print('virtual storage', self.ip_address)

    def cache_handling(self):
        print('cache handling', self.ip_address)
topology_1, topology_2 = True, False

# Topology 1
if topology_1:
    class AppHost_Class(AppHost, Foo):
        pass

    class Storage_Server_Class(Storage_Server):
        pass

# Topology 2
if topology_2:
    class AppHost_Class(AppHost):
        pass

    class Storage_Server_Class(Storage_Server, Foo):
        pass
Another option would be to define the two classes with the methods they will always include,
# class NetworkObject(object):  # Python 2.7
class NetworkObject:
    def __init__(self, ip_address):
        self.ip_address = ip_address

class A(NetworkObject):
    def application_service(self):
        print('app service', self.ip_address)

class B(NetworkObject):
    def storage_handling(self):
        print('storage handler', self.ip_address)

    def disk_handling(self):
        print('disk handler', self.ip_address)
... define methods you would like to mix and match
def virtual_storage(self):
    print('virtual storage', self.ip_address)

def cache_handling(self):
    print('cache handling', self.ip_address)
... and conditionally add the methods to the classes
topology = 1

if topology == 1:
    A.virtual_storage = virtual_storage
    A.cache_handling = cache_handling

if topology == 2:
    B.virtual_storage = virtual_storage
    B.cache_handling = cache_handling
You may want to define the extra methods in the parent/base class but have them raise an exception unless a topology has been applied:
# class NetworkObject(object):  # Python 2.7
class NetworkObject:
    def __init__(self, ip_address):
        self.ip_address = ip_address

    def virtual_storage(self):
        raise NotImplementedError

    def cache_handling(self):
        raise NotImplementedError
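A brief usage sketch of that last variant, tying it to the module-level functions defined earlier (in practice you would assign the functions onto the A or B subclasses as above; NetworkObject is used here only to keep the example short):

host = NetworkObject('10.0.0.1')
try:
    host.virtual_storage()            # base-class fallback raises
except NotImplementedError:
    print('no topology applied yet')

NetworkObject.virtual_storage = virtual_storage  # module-level function defined earlier
host.virtual_storage()                # now prints: virtual storage 10.0.0.1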

Inheriting setUp method Python Unittest

I have a question regarding unittest with Python. Let's say I have a Docker container set up that handles a specific API endpoint (let's say users, e.g. my_site/users/etc/etc/etc). There are quite a few different layers that are broken up and handled for this container: classes that handle the actual call and response, a logic layer, and a data layer. I want to write tests around the specific calls (just checking for status codes).
There are a lot of different classes that act as handlers for the given endpoints. There are a few things that I would have to set up differently for each one; however, each one inherits from Application and uses some methods from it. I want to use a setUp class for my unittest so I don't have to re-establish this each time. Any advice will help. So far I've mainly seen that inheritance is a bad idea with testing; however, I only want to use it for setUp. Here's an example:
class SetUpClass(unittest.TestCase):
    def setUp(self):
        self._some_data = data_set.FirstOne()
        self._another_data_set = data_set.SecondOne()

    def get_app(self):
        config = Config()
        return Application(config,
                           first_one=self._some_data,
                           second_one=self._another_data_set)

class TestFirstHandler(SetUpClass, unittest.TestCase):
    def setUp(self):
        new_var = something

    def tearDown(self):
        pass

    def test_this_handler(self):
        # This specific handler needs the application to function
        # but I don't want to define it in this test class
        res = self.fetch('some_url/users')
        self.assertEqual(res.code, 200)

class TestSecondHandler(SetUpClass, unittest.TestCase):
    def setUp(self):
        different_var_thats_specific_to_this_handler = something_else

    def tearDown(self):
        pass

    def test_this_handler(self):
        # This specific handler needs the application to function
        # but I don't want to define it in this test class
        res = self.fetch('some_url/users/account/?something_custom={}'.format('WOW'))
        self.assertEqual(res.code, 200)
Thanks again!!
As mentioned in the comments, you just need to learn how to use super(). You also don't need to repeat TestCase in the list of base classes.
Here's the simple version for Python 3:
class TestFirstHandler(SetUpClass):
    def setUp(self):
        super().setUp()
        new_var = something

    def tearDown(self):  # Easier to not declare this if it's empty.
        super().tearDown()

    def test_this_handler(self):
        # This specific handler needs the application to function
        # but I don't want to define it in this test class
        res = self.fetch('some_url/users')
        self.assertEqual(res.code, 200)
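On Python 2 (which some of the other snippets above still target), the only difference in this pattern is the explicit form of the super() call:

class TestFirstHandler(SetUpClass):
    def setUp(self):
        # Python 2 requires naming the class and instance explicitly.
        super(TestFirstHandler, self).setUp()
        new_var = something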

How to inject mock objects into instance attributes in Python unit testing

I am learning Python.
I'm wondering if there is a mechanism to "inject" an object (a fake object, in my case) into the class under test without explicitly adding it in the constructor/setter.
## source file
class MyBusinessClass():
    def __init__(self):
        self.__engine = RepperEngine()

    def doSomething(self):
        ## bla bla ...
        success

## test file
## fake I'd like to inject
class MyBusinessClassFake():
    def __init__(self):
        pass

    def myPrint(self):
        print("Hello from Mock !!!!")

class Test(unittest.TestCase):
    ## is there an automatic mechanism to inject MyBusinessClassFake
    ## into MyBusinessClass without constructor/setter?
    def test_XXXXX_whenYYYYYY(self):
        unit = MyBusinessClass()
        unit.doSomething()
        self.assertTrue(.....)
In my test I'd like to "inject" the "engine" object without passing it in the constructor. I've tried a few options (e.g. @patch) without success.
IOC is not needed in Python. Here's a Pythonic approach.
class MyBusinessClass(object):
    def __init__(self, engine=None):
        self._engine = engine or RepperEngine()
        # Note: _engine doesn't exist until constructor is called.

    def doSomething(self):
        ## bla bla ...
        success
class Test(unittest.TestCase):
    def test_XXXXX_whenYYYYYY(self):
        mock_engine = mock.create_autospec(RepperEngine)
        unit = MyBusinessClass(mock_engine)
        unit.doSomething()
        self.assertTrue(.....)
You could also stub out the class to bypass the constructor:
class MyBusinessClassFake(MyBusinessClass):
    def __init__(self):  # we bypass super's init here
        self._engine = None
Then in your setup
def setUp(self):
    self.helper = MyBusinessClassFake()
Now in your test you can use a context manager.
def test_XXXXX_whenYYYYYY(self):
    with mock.patch.object(self.helper, '_engine', autospec=True) as mock_eng:
        ...
If you want to inject it without using the constructor, then you can add it as a class attribute.
class MyBusinessClass():
    _engine = None

    def __init__(self):
        self._engine = RepperEngine()
Now stub to bypass __init__:
class MyBusinessClassFake(MyBusinessClass):
    def __init__(self):
        pass
Now you can simply assign the value:
unit = MyBusinessClassFake()
unit._engine = mock.create_autospec(RepperEngine)
After years of using Python without any DI autowiring framework, and Java with Spring, I've come to realize that plain simple Python code often doesn't need a framework for dependency injection without autowiring (autowiring is what Guice and Spring do in Java), i.e., just doing something like this is enough:
class Foo:
    def __init__(self, dep=None):  # great for unit testing!
        self.dep = dep or Dep()    # callers don't need to care about this either
        ...
This is pure dependency injection (quite simple) but without magical frameworks for automatically injecting them for you (i.e., autowiring) and without Inversion of Control.
Unlike @Dan, I disagree that Python doesn't need IoC: Inversion of Control simply means that the framework takes control of something away from you, often to provide abstraction and remove boilerplate code. When you use template classes, that is IoC. Whether IoC is good or bad depends entirely on how the framework implements it.
That said, Dependency Injection is a simple concept that doesn't require IoC; autowiring DI does.
As I dealt with bigger applications, the simplistic approach wasn't cutting it anymore: there was too much boilerplate code, and the key advantage of DI was missing: changing the implementation of something once and having it reflected in all classes that depend on it. If many pieces of your application care about how a certain dependency is initialized and you change that initialization, or want to swap classes, you have to go through them piece by piece. With a DI framework that is much easier.
So I came up with injectable, a micro-framework that doesn't feel un-Pythonic and yet provides first-class dependency injection autowiring.
Under the motto Dependency Injection for Humans™ this is what it looks like:
# some_service.py
class SomeService:
    @autowired
    def __init__(
        self,
        database: Autowired(Database),
        message_brokers: Autowired(List[MessageBroker]),
    ):
        pending = database.retrieve_pending_messages()
        for broker in message_brokers:
            broker.send_pending(pending)

# database.py
@injectable
class Database:
    ...

# message_broker.py
class MessageBroker(ABC):
    def send_pending(self, messages):
        ...

# kafka_producer.py
@injectable
class KafkaProducer(MessageBroker):
    ...

# sqs_producer.py
@injectable
class SQSProducer(MessageBroker):
    ...
Be explicit, use a setter. Why not?
class MyBusinessClass:
    def __init__(self):
        self._engine = RepperEngine()

from unittest.mock import create_autospec

def test_something():
    mock_engine = create_autospec(RepperEngine, instance=True)
    object_under_test = MyBusinessClass()
    object_under_test._engine = mock_engine
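If you want the setter to be literal rather than a plain attribute assignment, a minimal sketch could look like this (set_engine is a name introduced here for illustration, not part of the original code):

class MyBusinessClass:
    def __init__(self):
        self._engine = RepperEngine()

    def set_engine(self, engine):
        # Explicit seam for swapping the engine in tests.
        self._engine = engine

def test_something():
    mock_engine = create_autospec(RepperEngine, instance=True)
    object_under_test = MyBusinessClass()
    object_under_test.set_engine(mock_engine)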
Your question seems somewhat unclear, but there is nothing preventing you from using class inheritance to override the original methods. In that case the derived class would just look like this:
class MyBusinessClassFake(MyBusinessClass):
    def __init__(self):
        pass
