How to implement the observer pattern the way Django signals implements it - Python

There's a design issue in a problem I'm currently working on: a certain observable needs to emit values whenever it reaches a computation milestone, so I need values emitted intermediately. I discovered that Django implements something similar with signals, which is just an implementation of the observer pattern. I really wanted to implement it the way Django signals is implemented, because observers subscribing to an observable (e.g. after saving a model to the db, i.e. the post_save signal) are very much decoupled, in the sense that the observing function only needs to be annotated with the @receiver decorator.
So I took a look at Django's source code and tried creating a simple version of the signal mechanism. I created a Signal class in one module, along with the decorator:
class Signal:
    def __init__(self):
        self.observers = []

    def connect(self, receiver):
        self.observers.append(receiver.__name__)

    def send(self, sender, **named):
        return [
            (receiver, mem_dec(signal=self, sender=sender, **named))
            for receiver in self.observers
        ]
class Child(Signal):
    def __init__(self):
        super().__init__()

def mem_dec(signal, **kwargs):
    def wrapper(func):
        signal.connect(func, **kwargs)
        return func
    return wrapper

if __name__ == "__main__":
    send_signal = Child()
    send_signal.send(sender="class_child", a="henry")
And I placed the decorated function in another module to serve as the observer:
from .signals import mem_dec
from .signals import Child

@mem_dec(Child, sender=Child)
def hello(a, **kwargs):
    print(f'hey {a}')
I noticed that my implementation doesn't work, which I expected, but the issue is that this is what I could reproduce from the source code. I'm very confused about how they implemented their send() and connect() mechanism. All I want to achieve is for the Signal subclass to call all decorated functions. I made the Child class the signal and the sender at the same time because I'm just trying to reproduce the simplest example.
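For comparison, here is a minimal working sketch of the same mechanism (not Django's actual implementation, which also handles weak references, sender filtering, and dispatch UIDs). The two key fixes over the version above: connect() must store the callable itself rather than its __name__, and send() must call each stored receiver. Note also that the module containing the decorated function has to be imported before send() runs, which is why Django wires receivers up during app loading.

class Signal:
    def __init__(self):
        self.observers = []

    def connect(self, receiver):
        # keep the callable itself, not receiver.__name__
        self.observers.append(receiver)

    def send(self, sender, **named):
        # invoke every connected receiver, collecting (receiver, result) pairs
        return [(receiver, receiver(signal=self, sender=sender, **named))
                for receiver in self.observers]

def mem_dec(signal, **kwargs):
    def wrapper(func):
        signal.connect(func, **kwargs)
        return func
    return wrapper

post_compute = Signal()  # hypothetical milestone signal

@mem_dec(post_compute)
def hello(sender, a, **kwargs):
    print(f'hey {a}')

post_compute.send(sender="class_child", a="henry")  # prints: hey henry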

Related

Extending functionality of a Python library class which is part of a structure

I am working with the Python canmatrix library (well, presently my Python3 fork) which provides a set of classes for an in-memory description of CAN network messages as well as scripts for importing and exporting to and from on-disk representations (various standard CAN description file formats).
I am writing a PyQt application using the canmatrix library and would like to add some minor additional functionality to the bottom-level Signal class. Note that a CanMatrix organizes its member Frames, which in turn organize their member Signals. The whole structure is created by an import script which reads a file. I would like to retain the import script and sub-member finder functions of each layer, but add an extra 'value' member to the Signal class, as well as getters/setters that can trigger Qt signals (not related to the canmatrix Signal objects).
It seems that standard inheritance approaches would require me to subclass every class in the library and override every function which creates the library Signal to use mine instead. Ditto for the import functions. This just seems horribly excessive to add non-intrusive functionality to a library.
I have tried inheriting and replacing the library class with my inherited one (with and without the pass-through constructor) but the import still creates library classes, not mine. I forget if I copied this from this other answer or not, but it's the same structure as referenced there.
class Signal(QObject, canmatrix.Signal):
    _my_signal = pyqtSignal(int)

    def __init__(self, *args, **kwargs):
        canmatrix.Signal.__init__(self, *args, **kwargs)
        # TODO: what about QObject
        print('boo')

    def connect(self, target):
        self._my_signal.connect(target)

    def set_value(self, value):
        self._my_value = value
        self._my_signal.emit(value)

canmatrix.Signal = Signal
print('overwritten')
Is there a direct error in my attempt here?
Am I doing this all wrong and need to go find some (other) design pattern?
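A likely reason the rebinding has no effect (a hypothetical single-file illustration, not canmatrix's actual code): if the import script uses a from-import, it binds its own name to the class object at import time, and later reassigning the module attribute cannot retroactively change that reference.

import types

core = types.ModuleType('core')  # stands in for the library module

class LibSignal(object):
    pass

core.Signal = LibSignal

# the importer effectively did `from core import Signal` and kept its own reference
importer_signal_ref = core.Signal

class MySignal(LibSignal):
    pass

core.Signal = MySignal                   # rebinds the module attribute only

print(importer_signal_ref is MySignal)   # False: the importer still has the old class
print(core.Signal is MySignal)           # True

If that is what is happening, the rebinding would have to occur before the importer module is first imported, or target the name the importer actually uses.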
My next attempt involved shadowing each instance of the library class. For any instance of the library class that I want to add the functionality to I must construct one of my objects which will associate itself with the library-class object. Then, with an extra layer, I can get from either object to the other.
class Signal(QObject):
    _my_signal = pyqtSignal(int)

    def __init__(self, signal):
        signal.signal = self
        self.signal = signal
        # TODO: what about QObject parameters
        QObject.__init__(self)
        self.value = None

    def connect(self, target):
        self._my_signal.connect(target)

    def set_value(self, value):
        self.value = value
        self._my_signal.emit(value)
The extra layer is annoying (library_signal.signal.set_value() rather than library_signal.set_value()) and the mutual references seem like they may keep both objects from ever getting cleaned up.
This does run and function, but I suspect there's still a better way.
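On the cleanup worry: CPython's cycle collector can reclaim mutually referencing objects, so the pair will not necessarily leak, but if it is still a concern, one side can hold a weak reference. A minimal sketch under that assumption (PyQt5 names; the wrapper owns the library object and the back-reference is weak):

import weakref
from PyQt5.QtCore import QObject, pyqtSignal  # assuming PyQt5

class SignalWrapper(QObject):
    _my_signal = pyqtSignal(int)

    def __init__(self, signal):
        QObject.__init__(self)
        self.signal = signal                 # strong: wrapper keeps the library object alive
        signal.signal = weakref.proxy(self)  # weak: back-reference does not pin the wrapper
        self.value = None

    def set_value(self, value):
        self.value = value
        self._my_signal.emit(value)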

Django: how to use the ``receiver`` decorator on a class instead of on a function

Using Django signals' receiver decorator, I have the following function.
@receiver(post_save)
def action_signal(sender, instance, created, **kwargs):
    pass
Is it possible to use the receiver decorator on a class instead of on a function? The reason for this is that I would like to have an __init__ method etc.
i.e. How can I get something like this to work...
class ActionSignals:
    def __init__(self):
        self.stuff

    @receiver(post_save)
    def action_signal(self, sender, instance, created, **kwargs):
        print(self.stuff)
Using the receiver decorator on a class method doesn't really make sense. When do you expect the object to be instantiated and the __init__ method to run?
You can use the manual method for connecting signals, instead of the receiver decorator.
First instantiate the object:
action_signal = ActionSignals()
Then you can use the connect method to connect the signal:
post_save.connect(action_signal.action_signal)
Adding a class-based signal handler
You can do this:
class ActionSignals(object):
    def __init__(self, *args, **kwargs):
        # ...
        ...

    def __call__(self, *args, **kwargs):
        print(self.stuff)
Then to connect the signal handler:
handler = ActionSignals()
post_save.connect(handler)
This makes use of python's "magic" __call__ method that allows you to use an instance of a class as a function.
Avoiding duplicates
Be careful about where you are adding the handlers in your code, as you might create duplicates. For example if you were to put the second bit of code in the module root, it would add a handler every time the module is imported.
To avoid this you can do the following:
post_save.connect(handler, dispatch_uid="my_unique_identifier")
As @Alasdair pointed out, you can add handlers in AppConfig.ready() (and this is the recommended place to do it), although generally it can be done anywhere if you take care not to create undesired duplicates.
See "Where should this code live?" in this doc.

How do I design this procedural code as class based (object oriented)?

I'm a beginner-intermediate, self-taught Python developer.
In most of the projects I have completed, I can see the following procedure repeating. I don't have any professional coding experience outside home projects, and I think the code below is not very professional: it is not reusable, and rather than fitting together in one container, it is a set of loosely coupled functions spread across different modules.
def get_query():
    # returns the query string
    pass

def make_request(query):
    # makes and returns the request with query
    pass

def make_api_call(request):
    # calls the api and returns response
    pass

def process_response(response):
    # process the response and returns the details
    pass

def populate_database(details):
    # populates the database with the details and returns the status of population
    pass

def log_status(status):
    # logs the status so that developer knows whats happening
    pass

query = get_query()
request = make_request(query)
response = make_api_call(request)
details = process_response(response)
status = populate_database(details)
log_status(status)
How do I design this procedure as a class-based (object-oriented) design?
If I understand correctly, you want this group of functions to be reusable. A good approach would be to create an abstract base class with these methods, as shown below:
from abc import ABCMeta

class Generic(object):
    __metaclass__ = ABCMeta

    def get_query(self):
        # returns the query string
        pass

    def make_request(self, query):
        # makes and returns the request with query
        pass

    def make_api_call(self, request):
        # calls the api and returns response
        pass

    def process_response(self, response):
        # process the response and returns the details
        pass

    def populate_database(self, details):
        # populates the database with the details and returns the status of population
        pass

    def log_status(self, status):
        # logs the status so that developer knows whats happening
        pass
Now whenever you need to use any of these methods in your project, inherit your class from this abstract class.
class SampleUsage(Generic):
    def process_data(self):
        # In any of your methods you can call these generic functions
        self.get_query()
And then you can create an object to actually get the results you want.
obj = SampleUsage()
obj.process_data()
You may have several classes here. To name a few, Query, Request, Response, Database, Logger
Some of your functions may map as follows:
get_query -> Query.make() constructor or class method
make_request -> Request.make(query) constructor or class method
make_api_call -> Request.make_api_call()
process_response -> Response.process()
populate_database -> Database.populate()
log_status -> Logger.log() (consider using the logging module)
You have to think about your application and design it as an interaction of cooperating objects. This is just a starting point for you to partition the functionality of the application between the classes.
Some of these classes may be singletons, meaning they are instantiated only once at the beginning of the application and accessed everywhere else. Database and Logger fit that role.
Here are some skeleton definitions:
class Query(object):
    @classmethod
    def make(cls, *args, **kwargs):
        pass

class Request(object):
    @classmethod
    def make(cls, query):
        pass

    def make_api_call(self, *args, **kwargs):
        # possibly return Response
        pass

class Response(object):
    def process_response(self):
        pass

class Database(object):
    _the_db = None

    @classmethod
    def get_db(cls):
        # Simple man's singleton
        if not cls._the_db:
            cls._the_db = Database()
        return cls._the_db

    def populate(self):
        pass

class Logger(object):
    def log(self):
        # consider using logging module
        pass
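To make the interaction concrete, here is a hypothetical wiring of these skeletons that mirrors the original procedure (the make() constructors are stubs returning None, so this only illustrates the intended call flow):

def run_pipeline():
    query = Query.make()
    request = Request.make(query)
    response = request.make_api_call()
    details = response.process_response()       # details would feed populate()
    Database.get_db().populate()
    Logger().log()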
I think what your question lacks is a sense of purpose. You don't switch perfectly fine procedural code to object-oriented code without a reason. Depending on the reason, there are several ways to do it. As this problem is quite a common one, there are some common techniques that are known to work well for some common reasons.
So, let's assume you encapsulate the main procedure in an object. What are your needs?
Allow re-using the procedure, possibly overriding some parts? See the template method pattern below.
Allow dynamically altering the behavior of the procedure at runtime depending on external factors? Look into the Strategy pattern.
Allow dynamically altering the behavior of the procedure at runtime depending on internal factors? For example, if some request may switch the procedure into "maintenance mode"? Look into the State pattern.
I'll just describe the template method pattern, which looks the closest to Marty's concerns. I cut down the example to 3 steps so it's easier to explain, but I made you a fully working example gist.
The template method
You want to provide a way to re-use the procedure while allowing some well-defined parts to be overridden? Let's create an empty, fill-in-the-blanks-style template:
class BaseRequestProcessor(object):
    def get_query(self):
        raise NotImplementedError()

    def process_query(self, query):
        raise NotImplementedError()

    def log_status(self, status):
        raise NotImplementedError()

    def process(self):  # main procedure
        query = self.get_query()
        status = self.process_query(query)
        self.log_status(status)

    __call__ = process  # allow "calling" the request processor
We have our basic template. Let's create some template fillers:
class DemoQueryReader(object):
    def get_query(self):
        return 'this is a query'

class HelloQueryProcessor(object):
    def process_query(self, query):
        return 'Hello World, {}!'.format(query)

class StdoutLogProcessor(object):
    def log_status(self, status):
        print(status)
Now build a full request processor from the bits we want. This is where the pieces come together:
class DemonstrationProcessor(DemoQueryReader, HelloQueryProcessor, StdoutLogProcessor, BaseRequestProcessor):
    pass
Demonstrating in the console:
>>> from marty_example import DemonstrationProcessor
>>> processor = DemonstrationProcessor()
>>> processor()
Hello World, this is a query!
This is the most pedantic example you can build. You could supply default implementations when that makes sense (doing nothing, mostly). And you can group together overrides, should that make sense.
The point is, you made your process a template, allowing easy override of chosen details, while still being in control of the overall workflow. This is a form of inversion of control.
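For instance, a sketch of a do-nothing default (the class name is mine, not the original answer's), so subclasses only override the steps they care about:

class NullLogProcessor(object):
    def log_status(self, status):
        pass  # default: discard the status; override to actually log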
You can also save each class in its own Python file named after the class, or create external modules with groups of functions, organizing them into modules depending on what they do. Some modules will contain only one function; others will contain many.

subclassing Celery Task for a `ClassTask` mixin

Prefacing my question with the fact that I'm new to Celery and this (1) may have been answered somewhere else (if so, I couldn't find the answer) or (2) there may be a better way to accomplish my objective than what I'm directly asking.
Also, I am aware of celery.contrib.methods, but task_method does not quite accomplish what I am looking for.
My Objective
I would like to create a class mixin that turns a whole class into a Celery task. For example, a mixin represented by something like the code below (which right now does not run):
from celery import Task

class ClassTaskMixin(Task):
    @classmethod
    def enqueue(cls, *args, **kwargs):
        cls.delay(*args, **kwargs)

    def run(self, *args, **kwargs):
        Obj = type(self.name, (), {})
        Obj(*args, **kwargs).run_now()

    def run_now(self):
        raise NotImplementedError()
Unlike when using task_method, I do not want to fully instantiate the class before the task is queued and .delay() is called. Rather, I want to simply hand off the class name along with any relevant initialization parameters to the async process. The async process would then fully instantiate the class using the class name and the given initialization parameters, and then call some method (say .run_now(), for example) on the instantiated object.
Example Use Case
Constructing and sending email asynchronously would be an example use for the mixin I need.
class WelcomeEmail(EmailBase, ClassTaskMixin):
    def __init__(self, recipient_address, template_name, template_context):
        self.recipient_address = recipient_address
        self.template_name = template_name
        self.template_context = template_context

    def send(self):
        self.render_templates()
        self.construct_mime()
        self.archive_to_db()
        self.send_smtp_email()

    def run_now(self):
        self.send()
The above code would send an email in an async Celery process by calling WelcomeEmail.enqueue(recipient_address, template_name, template_context). Sending the email synchronously in-process would be accomplished by calling WelcomeEmail(recipient_address, template_name, template_context).send().
Questions
Is there any reason that what I'm trying to do is very, very wrong within the Celery framework?
Is there a better way to structure the mixin to make it more Celery-onic than what I've proposed (better attribute names, different method structure, etc.)?
What am I missing to make the mixin functional in a use case as I've described?
Apparently this issue isn't hugely interesting to a lot of people, but... I've accomplished what I set out to do.
See pull request https://github.com/celery/celery/pull/1897 for details.
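For readers who don't want to dig through the pull request, here is a rough sketch of one way to approximate the hand-off described in the question. This is not the approach taken in the PR; it sidesteps subclassing Task entirely by routing everything through one generic task that re-imports the class inside the worker (names are illustrative):

import importlib

from celery import Celery

app = Celery('emails')  # broker configuration omitted

@app.task
def run_class_task(dotted_path, *args, **kwargs):
    # re-instantiate the target class inside the worker and run it
    module_name, _, class_name = dotted_path.rpartition('.')
    cls = getattr(importlib.import_module(module_name), class_name)
    cls(*args, **kwargs).run_now()

class ClassTaskMixin(object):
    @classmethod
    def enqueue(cls, *args, **kwargs):
        # hand off only the import path and the init args;
        # nothing is instantiated on the enqueuing side
        dotted_path = '{}.{}'.format(cls.__module__, cls.__qualname__)
        run_class_task.delay(dotted_path, *args, **kwargs)

With this in place, WelcomeEmail.enqueue(recipient_address, template_name, template_context) reads the same at the call site as in the question.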

Scanning for thread violations with Tkinter

We are just about to finish a very large update to our application, which is built with Python 2.5 and Tkinter, and sadly the following error has crept in:
alloc: invalid block: 06807CE7: 1 0 0
This application has requested the Runtime to terminate it in an unusual way.
Please contact the application's support team for more information.
We've seen this before, and it is usually a Tcl interpreter error caused when a non-GUI thread tries to access Tk via Tkinter in any way (Tk not being thread-safe). The error pops up on application close, after the Python interpreter is finished with our code. This error is very hard to reproduce, and I'm thinking I will have to scan all threads in the system to see if they access Tk when they shouldn't.
I'm looking for a magic Python trick to help with this. All Tkinter widgets we use are first subclassed and inherit from our own Widget base class.
With this in mind, I'm looking for a way to add the following check to the beginning of every method in the widget subclasses:
import thread

if thread.get_ident() != TKINTER_GUI_THREAD_ID:
    assert 0, "Invalid thread accessing Tkinter!"
Decorators come to mind as a partial solution; however, I do not want to add them manually to each method. Is there a way I can add the decorator to all methods of a class that inherits from our Widget base class? Or is there a better way to do all this? Or does anyone have more info about this error?
I don't know if your approach is good, as I don't know Tkinter.
But here's a sample of how to decorate all class methods using a metaclass.
import functools

# This is the decorator
def my_decorator(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print 'calling', func.__name__, 'from decorator'
        return func(*args, **kwargs)
    return wrapper

# This is the metaclass
class DecorateMeta(type):
    def __new__(cls, name, bases, attrs):
        for key in attrs:
            # Skip special methods, e.g. __init__
            if not key.startswith('__') and callable(attrs[key]):
                attrs[key] = my_decorator(attrs[key])
        return super(DecorateMeta, cls).__new__(cls, name, bases, attrs)

# This is a sample class that uses the metaclass
class MyClass(object):
    __metaclass__ = DecorateMeta

    def __init__(self):
        print 'in __init__()'

    def test(self):
        print 'in test()'

obj = MyClass()
obj.test()
The metaclass overrides the class creation. It loops through all the attributes of the class being created and decorates all callable attributes that have a "regular" name with my_decorator.
I went with a slightly easier method. I used the __getattribute__ method. The code is as follows:
def __getattribute__(self, name):
    import ApplicationInfo
    import thread, traceback

    if ApplicationInfo.main_loop_thread_id != thread.get_ident():
        print "Thread GUI violation"
        traceback.print_stack()

    return object.__getattribute__(self, name)
And sure enough we found one obscure place where we were accessing state from within TK while not being in the main GUI thread.
Although I must admit I need to review my python, feeling noobish looking at your example.
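For context, a sketch of where such a hook would live, assuming the shared Widget base class mentioned in the question (ApplicationInfo is the asker's own module and hypothetical here; Python 2 syntax to match the rest of this thread):

import thread
import traceback

import ApplicationInfo  # hypothetical: records main_loop_thread_id at startup

class Widget(object):
    # base class that all the application's Tkinter widget wrappers inherit from
    def __getattribute__(self, name):
        # flag any attribute access coming from a non-GUI thread
        if ApplicationInfo.main_loop_thread_id != thread.get_ident():
            print "Thread GUI violation"
            traceback.print_stack()
        return object.__getattribute__(self, name)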
