Change decorator to implement in new module - python

I have a class with a field spent_times. spent_times is a list, and all methods of this class write some information to it which is valuable for logging.
I also have a decorator which calculates the execution time of every function and writes it to spent_times.
This is the implementation of my decorator:

import logging
from timeit import default_timer as timer

def timing(message):
    def wrap(function):
        def called(*args, **kwargs):
            time_start = timer()
            result = function(*args, **kwargs)
            spent_time = round(timer() - time_start, 5)
            if not args:
                return result, spent_time
            obj = args[0]
            if hasattr(obj, "spent_times"):
                obj.spent_times.append("{}={:.5f}".format(message, spent_time))
                return result
            else:
                logging.warning('Object has no spent_times attribute to record to!')
                return result
        return called
    return wrap
As you can see, my decorator checks whether the called function has a self argument.
If it does, I can append the needed info to the spent_times list on the spot; if it does not, the decorator returns the time spent on execution together with the function's result.
I use this decorator in one single module. The second case (when no self is found) applies to other functions in this module which do not belong to the class where the spent_times list is defined, but which I call from inside my class, so I end up with, for example, the following structure:
This is the declaration of an "outer" function:

def calc_users(requests, priority):
    # .....
And inside my class I call it and update my spent_times list this way:

response, spent_time = calc_users(requests, priority)
self.class_obj.spent_times.append("user_calculation={:.5f}".format(spent_time))

which is not very nice, but at least it works.
Now I have moved a few methods of my class into a different, new module, and I would like to use the same timing decorator there.
Can someone help me implement timing in the new module? I do not know how to update my spent_times list now.
These two modules will run at the same time, and I cannot create an object of the class and pass it as an argument to the new module, because (as far as I understand it) there would then be two objects and spent_times would not be updated correctly.
Maybe there is a way to pass a reference to spent_times somehow, but I do not want to change the arguments of my functions in the new module, since I think that would break the single-responsibility principle (the decorator is responsible for logging, the function for its action).
So how can I improve the decorator, or how can I pass the spent_times list to the new module?
Any help will be greatly appreciated!
P.S.
Maybe make spent_times a global variable? (in the very worst case)

A global list would work, but you can also use a class and create a singleton by deleting the class after instantiation. This prevents anyone from creating another instance:
# mymodule.py
from timeit import default_timer as timer

class Timing(object):
    def __init__(self):
        self.spent_times = []

    def __call__(self, message):
        def wrap(function):
            def called(*args, **kwargs):
                time_start = timer()
                result = function(*args, **kwargs)
                spent_time = round(timer() - time_start, 5)
                self.spent_times.append("{}={:.5f}".format(message, spent_time))
                return result
            return called
        return wrap

timing = Timing()
del Timing  # prevent another instance
Now import it in another module:

from mymodule import timing

@timing('Hello')
def add(a, b):
    return a + b
The special method __call__ makes an instance of a class behave like a function, i.e. it is callable with ().
The advantage is that you can use self.attr instead of a global variable.
Deleting the class after instantiation prevents the creation of another instance. This is called a singleton. Now all your timings end up in the same list, no matter how often you use timing as a decorator.
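To illustrate, here is a self-contained sketch of the pattern; the two decorated functions stand in for functions that would normally live in separate modules, each doing `from mymodule import timing`:

```python
from timeit import default_timer as timer

class Timing(object):
    """All timings collect in the single shared instance's list."""
    def __init__(self):
        self.spent_times = []

    def __call__(self, message):
        def wrap(function):
            def called(*args, **kwargs):
                time_start = timer()
                result = function(*args, **kwargs)
                spent_time = round(timer() - time_start, 5)
                self.spent_times.append("{}={:.5f}".format(message, spent_time))
                return result
            return called
        return wrap

timing = Timing()

# These would normally sit in two different modules.
@timing('add')
def add(a, b):
    return a + b

@timing('mult')
def mult(a, b):
    return a * b

add(1, 2)
mult(3, 4)
print(timing.spent_times)  # e.g. ['add=0.00000', 'mult=0.00000']
```

Both entries land in the same spent_times list because every module shares the one timing instance.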


If a decorator is applied every time the function is called, why not simply add the new functionality to the function itself?

I'm learning to use decorators at the moment, but I'm struggling to wrap my head around their purpose and utility.
I initially thought they provided a convenient way to add extra functionality to an existing function (e.g. func()) without changing its source code. But if the additional functionality is executed on every subsequent call to func(), why wouldn't you just save the time/space/complexity and add the new functionality to func() directly?
E.g. say I wanted to make a function print whenever it is executed. Wouldn't this:

def add(*args):
    out = sum(args)
    print("Function 'add' executed.")
    return out

create the exact same function as below, with far less code/complexity?
def log(func):
    def wrapper(*args, **kwargs):
        out = func(*args, **kwargs)
        print(f"Function '{func.__name__}' executed.")
        return out
    return wrapper

@log
def add(*args):
    return sum(args)
Off the top of my head, the only cases I can think of where the latter could be preferable are when you're importing a generalised decorator to modify a function defined in a separate script, or when you're applying the same wrapper to many functions and saving space by putting it in its own function (in which case it would seem more reasonable to just write a regular old function and call it inside the others).
EDIT
Perhaps a better way to formulate this question:
Which of the following is preferable, and why?
def logger():
    # New functionality here
    return

def func1(*args):
    # func1 functionality
    logger()
    return

def func2(*args):
    # func2 functionality
    logger()
    return
Or
def logger(func):
    def wrapper(*args, **kwargs):
        out = func(*args, **kwargs)
        # New functionality here
        return out
    return wrapper

@logger
def func1(*args):
    # func1 functionality
    return

@logger
def func2(*args):
    # func2 functionality
    return
It promotes code reuse and separation of concerns.
To take your argument to the logical extreme, why use functions at all? Why not just have one giant main? A decorator is just a higher-order function and provides a lot of the same benefits that "traditional" functions do; they just solve a slightly different set of problems.
In your example, what if you wanted to change your log implementation to use the logging package instead of print? You would have to find every single function where you copy-pasted the logging behavior to change each implementation. Changing a single decorator's implementation would save you a lot of time making changes (and fixing bugs that arise from making those changes).
Decorators are typically used for behavior (in the decorator function) that wraps or modifies another set of behavior (in the decorated function). Some concrete examples could include:
- Start a timer when the decorated function starts, stop it when the function returns, and log the total runtime.
- Inspect the function's arguments, mutate some inputs, inject new arguments, or mutate the function's return value (see functools.cache).
- Catch and handle certain types of exceptions raised from inside the decorated function.
- Register the current function with some other object as a callback (see Flask).
- Run the decorated function within a temporary working directory, and clean up the directory when the function returns.
As many others have stated, you can do all of the above without decorators. But all of these cases could be made cleaner and easier to maintain with the help of decorators.
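As a sketch of the first of those cases, a minimal timing decorator; functools.wraps keeps the wrapped function's name and docstring intact:

```python
import functools
import time

def timed(func):
    """Log the total runtime of each call to the decorated function."""
    @functools.wraps(func)  # preserve func.__name__ and __doc__
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print("%s took %.6f s" % (func.__name__, elapsed))
        return result
    return wrapper

@timed
def slow_sum(n):
    return sum(range(n))

slow_sum(1000)
```

Adding or removing the timing behaviour is now a one-line change per function, and changing how timings are reported is a change in one place.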
One benefit is minimizing clutter and repetition in your function implementations. Consider:
@log
def add(*args):
    return sum(args)

@log
def mult(*args):
    return math.prod(args)
vs:
def add(*args):
    out = sum(args)
    print("Function add executed.")
    return out

def mult(*args):
    out = math.prod(args)
    print("Function mult executed.")
    return out
and imagine that repetition over, say, a hundred functions in a large codebase.
If you kept the log function and used it without decorator syntax you might have something like:
def _add(*args):
    return sum(args)

def _mult(*args):
    return math.prod(args)

add = log(_add)
mult = log(_mult)
which isn't the worst thing in the world, but it'd be annoying for a reader having to bounce through a level of indirection each time they try to look up a function's implementation.
The most important factor in the benefit of decorators is the DRY principle.
"Don't Repeat Yourself" as a principle lends itself to creating easy-to-understand, easy-to-write code. Python's decorators are a fantastic example of a feature that minimises unnecessary code repetition:
Consider the @dataclass decorator. In short, it makes classes which store only instance attributes easier to write, as shown in the following example:
class Person:
    def __init__(self, name, age, gender):
        self.name = name
        self.age = age
        self.gender = gender
versus
@dataclass
class Person:
    name: str
    age: int
    gender: str
The important thing to realise about decorators, however, is that creating the @dataclass decorator and writing the second (better) implementation of Person DO INDEED take more time than just writing the first Person implementation.
Very critically, however, the difference emerges when you write a second or third data-oriented class! At that point, having written the @dataclass decorator, the creation of every single class is sped up by removing the boilerplate.
This example generalises to all decorators: writing a @log decorator is slow for one function, but worth it to log 100 different functions.
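For reference, the @dataclass version above runs as-is on Python 3.7+, and the generated __init__, __repr__ and __eq__ come for free:

```python
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    age: int
    gender: str

p = Person('Ada', 36, 'f')
print(p)                            # Person(name='Ada', age=36, gender='f')
print(p == Person('Ada', 36, 'f'))  # True: __eq__ is generated too
```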

How to run a method before/after all class function calls with arguments passed?

There are some interesting ways to run a method before every method in a class in questions such as Python: Do something for any method of a class?
However that solution doesn't let us pass arguments.
There's a decorator solution on Catch "before/after function call" events for all functions in class but I don't want to have to go back and decorate all my classes.
Is there a way to run a pre/post operation that's dependent on the arguments passed for every invocation of an object's method?
Example:
class Stuff(object):
    def do_stuff(self, stuff):
        print(stuff)

a = Stuff()
a.do_stuff('foobar')

Desired output:

"Pre operation for foobar"
"foobar"
"Post operation for foobar"
So I figured it out after a lot of experimentation.
Basically, in the metaclass's __new__ you can iterate through every method in the class namespace and swap each method out for a new version that runs the before logic, the function itself, and then the after logic. Here's a sample:
class TestMeta(type):
    def __new__(mcl, name, bases, nmspc):
        def replaced_fnc(fn):
            def new_test(*args, **kwargs):
                # do whatever for before function run
                result = fn(*args, **kwargs)
                # do whatever for after function run
                return result
            return new_test
        for i in nmspc:
            if callable(nmspc[i]):
                nmspc[i] = replaced_fnc(nmspc[i])
        return super(TestMeta, mcl).__new__(mcl, name, bases, nmspc)
Note that if you use this code as-is, it will run the pre/post operations for __init__ and other builtin methods as well.
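If you want to leave __init__ and other dunder methods alone, one variation (a sketch, shown with the Python 3 metaclass= syntax rather than the __metaclass__ attribute) is to filter on the attribute name; the calls counter here is only a stand-in for the real pre/post logic:

```python
class TestMeta(type):
    def __new__(mcl, name, bases, nmspc):
        def replaced_fnc(fn):
            def new_test(*args, **kwargs):
                new_test.calls += 1  # stand-in for the before/after logic
                return fn(*args, **kwargs)
            new_test.calls = 0
            return new_test
        for key, value in list(nmspc.items()):
            # skip __init__, __module__, __qualname__ and other dunders
            if callable(value) and not key.startswith('__'):
                nmspc[key] = replaced_fnc(value)
        return super().__new__(mcl, name, bases, nmspc)

class Stuff(metaclass=TestMeta):
    def do_stuff(self, text):
        return text.upper()

s = Stuff()    # __init__ is untouched, so nothing is counted here
s.do_stuff('foobar')
print(Stuff.__dict__['do_stuff'].calls)  # 1
```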

Get attribute of a property in python

I have a decorator which simply caches return values (called @cached in my example) and I wish to use it in conjunction with @property. This works just fine normally. The problem I am facing occurs when I try to use an expire attribute added by @cached.
def cached(f):
    cache = [None]
    def inner(*args, **kwargs):
        if cache[0] is None:
            cache[0] = f(*args, **kwargs)
        return cache[0]
    def expire():
        cache = None
    inner.expire = expire
    return inner
class Example(object):
    @property
    @cached
    def something_expensive(self):
        print("expensive")
        return "hello"

e = Example()
e.something_expensive
e.something_expensive.expire()  # AttributeError: 'str' object has no attribute 'expire'
How am I able to get access to the expire function that was added to the function, after the function is replaced by @property? I understand why this doesn't work; I am interested in a way of working around the problem.
Some restrictions:
I cannot change the @cached decorator; it's in a library I don't control.
I would really rather not remove @property, because I want to call expire in my unit tests, and properties make my code much nicer to use.
One solution that I think is rather bad (because in reality I have a lot of properties that I want to do this for):
class Example(object):
    @cached
    def _something_expensive(self):
        return "hello"

    @property
    def something_expensive(self):
        return self._something_expensive()
You can access it using the class dictionary:
type(e).__dict__['something_expensive'].fget.expire()
In general e.something_expensive is equivalent to:
type(e).__dict__['something_expensive'].__get__(e, type(e))
For more details read up: Descriptor HowTo Guide
Note that inside the expire function you're not setting cache from the outer cached function to None; you're simply creating a new local variable. You may want to do something like this:

def expire():
    del cache[:]
    cache.append(None)

In Python 3 it's even easier to update cache, using the nonlocal keyword.
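A sketch of what that Python 3 variant could look like; the calls list is only there to show the cache working:

```python
def cached(f):
    """Python 3 take on the cached decorator, using nonlocal."""
    cache = None

    def inner(*args, **kwargs):
        nonlocal cache
        if cache is None:
            cache = f(*args, **kwargs)
        return cache

    def expire():
        nonlocal cache  # rebinds the enclosing cache, not a new local
        cache = None

    inner.expire = expire
    return inner

calls = []

@cached
def expensive():
    calls.append(1)  # record each real computation
    return "hello"

expensive()
expensive()
print(len(calls))  # 1: the second call was served from the cache
expensive.expire()
expensive()
print(len(calls))  # 2: expire() forced a recomputation
```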

Python __metaclass__ inheritance issue

My issue is that I am using a metaclass to wrap certain class methods in a timer for logging purposes.
For example:
import sys
import time

class MyMeta(type):
    @staticmethod
    def time_method(method):
        def __wrapper(self, *args, **kwargs):
            start = time.time()
            result = method(self, *args, **kwargs)
            finish = time.time()
            sys.stdout.write('instancemethod %s took %0.3f s.\n' % (
                method.__name__, (finish - start)))
            return result
        return __wrapper

    def __new__(cls, name, bases, attrs):
        for attr in ['__init__', 'run']:
            if attr not in attrs:
                continue
            attrs[attr] = cls.time_method(attrs[attr])
        return super(MyMeta, cls).__new__(cls, name, bases, attrs)
The problem I'm having is that my wrapper runs for every __init__, even though I really only want it for the class I am currently instantiating. The same goes for any method I want to time. I don't want the timing to run on any inherited methods UNLESS they aren't being overridden.
class MyClass0(object):
    __metaclass__ = MyMeta

    def __init__(self):
        pass

    def run(self):
        sys.stdout.write('running')
        return True

class MyClass1(MyClass0):
    def __init__(self):          # I want this timed
        MyClass0.__init__(self)  # But not this.
        pass

    ''' I need the inherited 'run' to be timed. '''
I've tried a few things but so far I've had no success.
Guard the timing code with an attribute. That way, only the outermost decorated method on an object will actually get timed.
@staticmethod
def time_method(method):
    def __wrapper(self, *args, **kwargs):
        if hasattr(self, '_being_timed'):
            # We're being timed already; just run the method
            return method(self, *args, **kwargs)
        else:
            # Not timed yet; run the timing code
            self._being_timed = True  # remember we're being timed
            try:
                start = time.time()
                result = method(self, *args, **kwargs)
                finish = time.time()
                sys.stdout.write('instancemethod %s took %0.3f s.\n' % (
                    method.__name__, (finish - start)))
                return result
            finally:
                # Done timing, reset to original state
                del self._being_timed
    return __wrapper
Timing only the outermost method is slightly different than “not timing inherited methods unless they aren't being overridden”, but I believe it solves your problem.
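The same guard works outside the metaclass too; here is a self-contained sketch applying it as a plain decorator (the class and method names are illustrative):

```python
import sys
import time

def time_method(method):
    """Time only the outermost decorated call on a given instance."""
    def __wrapper(self, *args, **kwargs):
        if hasattr(self, '_being_timed'):
            return method(self, *args, **kwargs)  # nested call: no timing
        self._being_timed = True
        try:
            start = time.time()
            result = method(self, *args, **kwargs)
            finish = time.time()
            sys.stdout.write('instancemethod %s took %0.3f s.\n' % (
                method.__name__, (finish - start)))
            return result
        finally:
            del self._being_timed
    return __wrapper

class MyClass1(object):
    @time_method
    def __init__(self):
        self.setup()  # decorated too, but the guard keeps it untimed

    @time_method
    def setup(self):
        self.ready = True

obj = MyClass1()  # prints one timing line, for __init__ only
```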
I'm not sure this has anything to do with multiple inheritance.
The trouble is that any subclass of MyClass0 has to be an instance of the same metaclass, which means MyClass1 gets created with MyMeta.__new__, so its methods get processed and wrapped in the timing code.
Effectively, what you need is for MyClass0.__init__ to somehow return something different in the two following circumstances:
- When called directly (instantiating MyClass0 directly, or when MyClass1 doesn't override it), it needs to return the timed method.
- When called within a subclass definition, it needs to return the original untimed method.
This is impossible, since MyClass0.__init__ doesn't know why it's being called.
I see three options:
1. Make the metaclass more complex. It can check through the base classes to see if they're already instances of the metaclass; if so, it can make a new copy of them that removes the timed wrapper from the methods that are present in the class being constructed. You don't want to mutate the base classes directly, as that would affect all uses of them (including when they're instantiated directly, or when they're subclassed by other classes that override different methods). A downside is that it really screws up the isinstance relationships; unless you construct the slight variations on the base classes by creating new subclasses of them (ugh!) and cache all the variations so you never construct duplicates (ugh!), you completely void the natural assumption that two classes share a base class (they may only share a template from which two completely independent base classes were generated).
2. Make the timing code more complex. Have a start_timing and a stop_timing method; if start_timing is called when the method is already being timed you just increment a counter, and stop_timing decrements the counter and only stops timing when it hits zero. Be careful with timed methods that call other timed methods; you'll need separate counters per method name.
3. Give up on metaclasses and just use a decorator explicitly on the methods you want timed, with some way of getting at the undecorated method so that overriding definitions can call it. This will involve a couple of lines of boilerplate per use; that will quite possibly add up to fewer lines of code than either of the other two options.
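A sketch of that third option: functools.wraps already records the undecorated function as __wrapped__, which gives overriding definitions a way to call the original without re-timing:

```python
import functools
import time

def timed(method):
    """Explicit opt-in timing decorator."""
    @functools.wraps(method)  # also sets wrapper.__wrapped__ = method
    def wrapper(self, *args, **kwargs):
        start = time.time()
        result = method(self, *args, **kwargs)
        print('%s took %0.3f s' % (method.__name__, time.time() - start))
        return result
    return wrapper

class Base(object):
    @timed
    def __init__(self):
        self.base_ready = True

class Child(Base):
    @timed
    def __init__(self):                  # timed once, here
        Base.__init__.__wrapped__(self)  # call the untimed original
        self.child_ready = True

c = Child()  # prints a single timing line, for Child.__init__
```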

How to decorate an object method?

I need to decorate an object's method. It needs to happen at runtime because the decorators applied to the object depend on the arguments the user gave when calling the program (arguments supplied with argv), so the same object could be decorated 3 times, 2 times, or not at all.
Here is some context: the program is a puzzle solver whose main behaviour is to find a solution for the puzzle automatically, by which I mean without user intervention. And here is where the decoration comes into play: one of the things I want to do is draw a graph of what happened during the execution, but only when the flag --draw-graph is used.
Here is what I've tried:
class GraphDecorator(object):
    def __init__(self, wrappee):
        self.wrappee = wrappee

    def method(self):
        # do my stuff here
        self.wrappee.method()
        # do more of stuff here

    def __getattr__(self, attr):
        return getattr(self.wrappee, attr)
And why it did NOT work:
It did not work because of the way I built the application. When a method that did not exist in my decorator class was called, it fell back to the implementation of the decorated class. The problem is that the application always starts by invoking the method run, which did not need to be decorated, so the undecorated fallback was used, and from inside that undecorated form only undecorated methods were ever called. What I needed was to replace the method on the object, not to proxy it:
# function responsible for replacing the undecorated form with the decorated one
def graphDecorator(obj):
    old_method = obj.method
    def method(self):
        # do my stuff here
        old_method()
        # do more of my stuff
    setattr(obj, 'method', method)  # replace with the decorated form
And here is my problem: the decorated form does not receive self when it is called, resulting in a TypeError because of the wrong number of arguments.
The problem was that I couldn't use func(self) as a method. The reason is that setattr() does not bind the function, so the function acts like a static method, not an instance method. Thanks to the introspective nature of Python, I was able to come up with this solution:
def decorator(obj):
    old_func = obj.func  # can't call 'by name' because of recursion
    def decorated_func(self):
        # do my stuff here
        old_func()  # does not need to pass obj
        # do some other stuff here
    # here is the magic: this gets the type of a 'normal method' of a class
    method = type(obj.func)
    # this binds the method to the object, so self is passed by default
    obj.func = method(decorated_func, obj)
I think this is the best way to decorate an object's method at runtime, though it would be nice to find a way to call method() directly, without the line method = type(obj.func) (the same type is also available as types.MethodType).
You might want to use __getattribute__ instead of __getattr__ (the latter being only called if "standard" lookup fails):
class GraphDecorator(object):
    def __init__(self, wrappee):
        self.__wrappee = wrappee

    def method(self):
        # do my stuff here
        self.__wrappee.method()
        # do more of stuff here

    def __getattribute__(self, name):
        try:
            wrappee = object.__getattribute__(self, "_GraphDecorator__wrappee")
            return getattr(wrappee, name)
        except AttributeError:
            return object.__getattribute__(self, name)
I need to decorate a object's method. It needs to be at runtime because the decorators applied on the object depends on the arguments that the user gave when calling the program (arguments supplied with argv), so a same object could be decorated 3 times, 2 times, or not be decorated at all.
The above is unfortunately incorrect, and what you are trying to do is unnecessary.
You can do this at runtime like so. Example:
import sys

args = sys.argv[1:]

class MyClass(object):
    pass

if args[0] == '--decorateWithFoo':
    MyClass = decoratorFoo(MyClass)
if args[1] == '--decorateWithBar':
    MyClass = decoratorBar(MyClass)
The syntax:

@deco
define something

is the same thing as:

define something
something = deco(something)
You could also make a decorator factory: @makeDecorator(command_line_arguments).
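For illustration, a sketch of such a factory (the option name and the traced attribute are made up):

```python
def makeDecorator(options):
    """Build a class decorator from (hypothetical) command-line options."""
    def decorator(cls):
        if '--trace' in options:
            cls.traced = True  # stand-in for real wrapping logic
        return cls
    return decorator

# Normally these options would come from sys.argv[1:]
options = ['--trace']

@makeDecorator(options)
class MyClass(object):
    pass

print(MyClass.traced)  # True
```

The factory call runs at class-definition time, so the decoration applied depends on whatever options were parsed before the class statement executes.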
"It needs to be at runtime because the decorators applied on the object depends on the arguments that the user gave when calling the program"
Then don't use decorators. Decorators are only syntactic support for wrappers; you can just as well use normal function/method calls instead.
