Celery: how to set the status of a Task - python

I have defined a Celery task like this:
@app.task()
def my_task():
    # Do things...

I'm using Flower, so I want to see the final state of the task, set according to some rules of my own:

if condition_1:
    return task_status_success
elif condition_2:
    return task_status_fail
How can I do this?
I've seen some people do something like this:
class AbstractTask(Task):
    abstract = True

    def __init__(self):
        self.last_error_log = ErrorLog(logger)
        Task.__init__(self)

    def _task_error(self, message):
        logger.error(message)
        self.update_state(state=states.FAILURE)
        raise Exception(message)
But that method seems to define classes as Tasks, not as functions.
Any help on how to manually set the state of a Celery task defined as a function?

To use the method you saw that uses an abstract class, you just need to pass the class as base to your decorator:

@app.task(base=AbstractTask, bind=True)
def my_task(self):
    pass

bind=True will allow you to use self to access the members of your base class.
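For reference, here is a minimal, runnable sketch of the idea; the broker URL and the condition arguments are only assumptions for illustration, while update_state and the built-in states come straight from Celery:

from celery import Celery, Task, states

app = Celery('tasks', broker='redis://localhost:6379')  # assumed broker setup

class AbstractTask(Task):
    abstract = True

    def task_error(self, message):
        # Record the FAILURE state, then abort the task.
        self.update_state(state=states.FAILURE)
        raise Exception(message)

@app.task(base=AbstractTask, bind=True)
def my_task(self, condition_1=False, condition_2=False):
    if condition_1:
        return 'ok'                          # returning normally ends in SUCCESS
    elif condition_2:
        self.task_error('condition_2 hit')   # ends in FAILURE, visible in Flower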

Related

Is it possible for a Python class attribute to act as a decorator?

I'm trying to follow the Celery Based Background Tasks guide to create a Celery setup for a simple application.
In my task.py:
from celery import Celery

def make_celery(app):
    celery = Celery(app.import_name, backend=app.config['CELERY_RESULT_BACKEND'],
                    broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task

    class ContextTask(TaskBase):
        abstract = True

        def __call__(self, *args, **kwargs):
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)

    celery.Task = ContextTask
    return celery
This method works in the app.py of the main Flask application.

from flask import Flask

flask_app = Flask(__name__)
flask_app.config.update(
    CELERY_BROKER_URL='redis://localhost:6379',
    CELERY_RESULT_BACKEND='redis://localhost:6379'
)
celery = make_celery(flask_app)

@celery.task()
def add_together(a, b):
    return a + b
My use case is that I want to create another module, helpers.py, where I can define a collection of asynchronous classes, to separate the Celery-based methods and keep things modular.
What I did is import the task.py module into another module, helpers.py, in order to create a class AsyncMail to handle email background work.
from task import make_celery

class AsyncMail(object):
    def __init__(self, app):
        """
        :param app: An instance of a flask application.
        """
        self.celery = make_celery(app)

    def send(self, msg):
        print(msg)
Now how can I use the self.celery attribute as a decorator for any method of the class?

@celery.task()
def send(self, msg):
    print(msg)

If it's impossible, what alternatives are there to achieve this?
You can't do what you're trying to do. At the time the class is being defined, there is no self, much less self.celery, to call, so you can't use @self.celery. Even if you had some kind of time machine, there could be 38 different AsyncMail instances created, and which one's self.celery would you want here?
Before getting into how you could do what you want, are you sure you want to? Do you actually want each AsyncMail object to have it own separate Celery? Normally you only have one per app, which is why normally this doesn't come up.
If you really wanted to, you could give each instance decorated methods after you have an object to decorate them with. But it's going to be ugly.
def __init__(self, app):
    self.celery = make_celery(app)

    # We need to get the function off the class, not the bound method off self
    send = type(self).send
    # Then we decorate it manually; this is all @self.celery.task does
    send = self.celery.task(send)
    # Then we manually bind it as a method
    send = send.__get__(self)
    # And now we can store it as an instance attribute, shadowing the class's
    self.send = send
Or, if you prefer to put it all together in one line:
self.send = self.celery.task(type(self).send).__get__(self)
For Python 2, the "function off the class" is actually an unbound method, and IIRC you have to call __get__(self, type(self)) to turn it into a bound method at the end, but otherwise it should all be the same.
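For what it's worth, the more conventional layout the answer alludes to (one Celery instance per application, created at module level and imported wherever tasks are defined) avoids the problem entirely; a sketch, with module names assumed:

# celery_app.py (assumed module name)
from flask import Flask
from task import make_celery

flask_app = Flask(__name__)
flask_app.config.update(
    CELERY_BROKER_URL='redis://localhost:6379',
    CELERY_RESULT_BACKEND='redis://localhost:6379'
)
celery = make_celery(flask_app)

# helpers.py
from celery_app import celery

@celery.task()
def send(msg):
    print(msg)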

Inheriting setUp method Python Unittest

I have a question regarding unittest with Python. Let's say I have a Docker container set up that handles a specific API endpoint (let's say users, e.g. my_site/users/etc/etc/etc). There are quite a few different layers that are broken up and handled for this container: classes that handle the actual call and response, a logic layer, and a data layer. I want to write tests around the specific calls (just checking for status codes).
There are a lot of different classes that act as handlers for the given endpoints. There are a few things I would have to set up differently for each one; however, each one inherits from Application and uses some methods from it. I want a setUp class for my unittest so I don't have to re-establish this each time. Any advice will help. So far I've mainly seen that inheritance is a bad idea with testing; however, I only want to use it for setUp. Here's an example:
class SetUpClass(unittest.TestCase):
    def setUp(self):
        self._some_data = data_set.FirstOne()
        self._another_data_set = data_set.SecondOne()

    def get_app(self):
        config = Config()
        return Application(config,
                           first_one=self._some_data,
                           second_one=self._another_data_set)

class TestFirstHandler(SetUpClass, unittest.TestCase):
    def setUp(self):
        new_var = something

    def tearDown(self):
        pass

    def test_this_handler(self):
        # This specific handler needs the application to function
        # but I don't want to define it in this test class
        res = self.fetch('some_url/users')
        self.assertEqual(res.code, 200)

class TestSecondHandler(SetUpClass, unittest.TestCase):
    def setUp(self):
        different_var_thats_specific_to_this_handler = something_else

    def tearDown(self):
        pass

    def test_this_handler(self):
        # This specific handler needs the application to function
        # but I don't want to define it in this test class
        res = self.fetch('some_url/users/account/?something_custom={}'.format('WOW'))
        self.assertEqual(res.code, 200)
Thanks again!!
As mentioned in the comments, you just need to learn how to use super(). You also don't need to repeat TestCase in the list of base classes.
Here's the simple version for Python 3:
class TestFirstHandler(SetUpClass):
    def setUp(self):
        super().setUp()
        new_var = something

    def tearDown(self):  # Easier to not declare this if it's empty.
        super().tearDown()

    def test_this_handler(self):
        # This specific handler needs the application to function
        # but I don't want to define it in this test class
        res = self.fetch('some_url/users')
        self.assertEqual(res.code, 200)
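If the same tests also have to run under Python 2, the only difference is the explicit form of the super() call; a sketch of the same setUp:

class TestFirstHandler(SetUpClass):
    def setUp(self):
        # Python 2 needs the class and instance spelled out
        super(TestFirstHandler, self).setUp()
        new_var = something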

routing class based tasks in celery based on class inheritance

Suppose I have the following Celery tasks defined as classes:

from celery import Task

class BaseTask(Task):
    abstract = True

    def run(self):
        pass  # not important

class SpecificTask(BaseTask):
    def run(self):
        pass  # not important

Is there an easy way to set routing for all tasks inheriting from the BaseTask class? Ideally I would like to do something like:

CELERY_ROUTES = {
    'project.tasks.BaseTask': {'queue': 'notify'},
}
Unfortunately this doesn't seem to work the way I intended.
Thanks for any advice.
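One possible approach (a sketch only, assuming Celery 3.x-style routers and that class-based tasks are registered as instances in the app's task registry) is a custom router that resolves the task name and checks its type:

from celery import current_app
from project.tasks import BaseTask

class InheritanceRouter(object):
    # Routers of this style receive the *name* of the task, so look up
    # the registered task instance first and inspect its class.
    def route_for_task(self, task, args=None, kwargs=None):
        task_obj = current_app.tasks.get(task)
        if task_obj is not None and isinstance(task_obj, BaseTask):
            return {'queue': 'notify'}
        return None  # fall back to default routing

CELERY_ROUTES = (InheritanceRouter(),)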

How to execute BaseClass method before it gets overridden by DerivedClass method in Python

I am almost sure there is a proper term for what I want to do, but since I'm not familiar with it, I will try to describe the whole idea explicitly. What I have is a collection of classes that all inherit from one base class. The classes consist almost entirely of different methods that are relevant within each class only. However, there are several methods that share a similar name, general functionality and even some logic, although their implementation still differs for the most part. So what I want to know is whether it's possible to create a method in the base class that executes some logic common to all the methods and then continues execution in the class-specific method. Hopefully that makes sense, but I will try to give a basic example of what I want.
So consider a base class that looks something like that:
class App(object):
    def __init__(self, testName):
        self.localLog = logging.getLogger(testName)

    def access(self):
        LOGIC_SHARED
And an example of a derived class:
class App1(App):
    def __init__(self, testName):
        # ...
        super(App1, self).__init__(testName)

    def access(self):
        LOGIC_SPECIFIC
So what I'd like to achieve is for the LOGIC_SHARED part in the base class access method to be executed when calling the access method of any App class, before executing the LOGIC_SPECIFIC part, which is (as the name says) specific to each derived class's access method.
If that makes any difference, the LOGIC_SHARED mostly consists of logging and maintenance tasks.
Hope that is clear enough and the idea makes sense.
NOTE 1:
There are class-specific parameters which are being used in the LOGIC_SHARED section.
NOTE 2:
It is important to implement that behavior using only Python built-in functions and modules.
NOTE 3:
The LOGIC_SHARED part looks something like this:
try:
    self.localLog.info("Checking the actual link for %s", self.application)
    self.link = self.checkLink(self.application)
    self.localLog.info("Actual link found!: %s", self.link)
except:
    self.localLog.info("No links found. Going to use the default link: %s", self.link)
So, there are plenty of specific class instance attributes that I use and I'm not sure how to use these attributes from the base class.
Sure, just put the specific logic in its own "private" method, which can be overridden by the derived classes, and leave access in the base class (this is essentially the Template Method pattern).
class Base(object):
    def access(self):
        # Shared logic 1
        self._specific_logic()
        # Shared logic 2

    def _specific_logic(self):
        # Nothing special to do in the base class
        pass
        # Or you could even raise an exception instead:
        # raise Exception('Called access on Base class instance')

class DerivedA(Base):
    # overrides Base implementation
    def _specific_logic(self):
        pass  # DerivedA specific logic

class DerivedB(Base):
    # overrides Base implementation
    def _specific_logic(self):
        pass  # DerivedB specific logic

def test():
    x = Base()
    x.access()  # Shared logic 1
                # Shared logic 2

    a = DerivedA()
    a.access()  # Shared logic 1
                # Derived A specific logic
                # Shared logic 2

    b = DerivedB()
    b.access()  # Shared logic 1
                # Derived B specific logic
                # Shared logic 2
The easiest way to do what you want is to simply call the parent class's access method inside the child's access method.
class App(object):
    def __init__(self, testName):
        self.localLog = logging.getLogger(testName)

    def access(self):
        LOGIC_SHARED

class App1(App):
    def __init__(self, testName):
        super(App1, self).__init__(testName)

    def access(self):
        App.access(self)
        # or, equivalently, use super:
        # super(App1, self).access()
However, your shared functionality is mostly logging and maintenance. Unless there is a pressing reason to put this inside the parent class, you may want to consider refactoring the shared functionality into a decorator function. This is particularly useful if you want to reuse similar logging and maintenance functionality for a range of methods inside your class.
You can read more about function decorators here: http://www.artima.com/weblogs/viewpost.jsp?thread=240808, or here on Stack Overflow: How to make a chain of function decorators?.
def logged(method):
    def decorated_method(self, *args, **kwargs):
        LOGIC_SHARED
        return method(self, *args, **kwargs)
    return decorated_method
Remember that in Python, functions are first-class objects. That means you can take a function and pass it as a parameter to another function. A decorator function makes use of this: it takes another function as a parameter (here called method) and then creates a new function (here called decorated_method) that takes the place of the original function.
Your App1 class then would look like this:
class App1(App):
    @logged
    def access(self):
        LOGIC_SPECIFIC
This really is shorthand for this:
class App1(App):
    def access(self):
        LOGIC_SPECIFIC
    access = logged(access)
I would find this more elegant than adding methods to the superclass to capture shared functionality.
If I understand this comment correctly (How to execute BaseClass method before it gets overridden by DerivedClass method in Python), you want additional arguments passed to the parent class to be usable in the derived class.
Based on Jonathon Reinhart's answer, here's how you could do it:
class Base(object):
    def access(self,
               param1, param2,  # first common parameters
               *args,           # second positional parameters
               **kwargs         # third keyword arguments
               ):
        # Shared logic 1
        self._specific_logic(param1, param2, *args, **kwargs)
        # Shared logic 2

    def _specific_logic(self, param1, param2, *args, **kwargs):
        # Nothing special to do in the base class
        pass
        # Or you could even raise an exception instead:
        # raise Exception('Called access on Base class instance')

class DerivedA(Base):
    # overrides Base implementation
    def _specific_logic(self, param1, param2, param3):
        pass  # DerivedA specific logic

class DerivedB(Base):
    # overrides Base implementation
    def _specific_logic(self, param1, param2, param4):
        pass  # DerivedB specific logic

def test():
    x = Base()

    a = DerivedA()
    a.access("param1", "param2", "param3")  # Shared logic 1
                                            # Derived A specific logic
                                            # Shared logic 2

    b = DerivedB()
    b.access("param1", "param2", param4="param4")  # Shared logic 1
                                                   # Derived B specific logic
                                                   # Shared logic 2
I personally prefer Jonathon Reinhart's answer, but seeing as you seem to want more options, here are two more. I would probably never use the metaclass one, as cool as it is, but I might consider the second one with decorators.
With Metaclasses
This method uses a metaclass for the base class that will force the base class's access method to be called first, without having a separate private function, and without having to explicitly call super or anything like that. End result: no extra work/code goes into inheriting classes.
Plus, it works like maaaagiiiiic </spongebob>
Below is the code that will do this. Here http://dbgr.cc/W you can step through the code live and see how it works:
#!/usr/bin/env python

class ForceBaseClassFirst(type):
    def __new__(cls, name, bases, attrs):
        print("Creating class '%s'" % name)

        def wrap_function(fn_name, base_fn, other_fn):
            def new_fn(*args, **kwargs):
                print("calling base '%s' function" % fn_name)
                base_fn(*args, **kwargs)
                print("calling other '%s' function" % fn_name)
                other_fn(*args, **kwargs)
            new_fn.__name__ = "wrapped_%s" % fn_name
            return new_fn

        if name != "BaseClass":
            print("setting attrs['access'] to wrapped function")
            attrs["access"] = wrap_function(
                "access",
                getattr(bases[0], "access", lambda: None),
                attrs.setdefault("access", lambda: None)
            )
        return type.__new__(cls, name, bases, attrs)

class BaseClass(object):
    __metaclass__ = ForceBaseClassFirst

    def access(self):
        print("in BaseClass access function")

class OtherClass(BaseClass):
    def access(self):
        print("in OtherClass access function")

print("OtherClass attributes:")
for k, v in OtherClass.__dict__.iteritems():
    print("%15s: %r" % (k, v))

o = OtherClass()
print("Calling access on OtherClass instance")
print("-------------------------------------")
o.access()
This uses a metaclass to replace OtherClass's access function with a function that wraps a call to BaseClass's access function and a call to OtherClass's access function. See the best explanation of metaclasses here https://stackoverflow.com/a/6581949.
Stepping through the code should really help you understand the order of things.
With Decorators
This functionality could also easily be put into a decorator, as shown below. Again, a steppable/debuggable/runnable version of the code below can be found here http://dbgr.cc/0
#!/usr/bin/env python

def superfy(some_func):
    def wrapped(self, *args, **kwargs):
        # NOTE might need to be changed when dealing with
        # multiple inheritance
        base_fn = getattr(self.__class__.__bases__[0], some_func.__name__, lambda *args, **kwargs: None)
        # bind the parent class' function and call it
        base_fn.__get__(self, self.__class__)(*args, **kwargs)
        # call the child class' function
        some_func(self, *args, **kwargs)
    wrapped.__name__ = "superfy(%s)" % some_func.__name__
    return wrapped

class BaseClass(object):
    def access(self):
        print("in BaseClass access function")

class OtherClass(BaseClass):
    @superfy
    def access(self):
        print("in OtherClass access function")

print("OtherClass attributes")
print("----------------------")
for k, v in OtherClass.__dict__.iteritems():
    print("%15s: %r" % (k, v))
print("")

o = OtherClass()
print("Calling access on OtherClass instance")
print("-------------------------------------")
o.access()
The decorator above retrieves the BaseClass' function of the same name, and calls that first before calling the OtherClass' function.
Maybe this simple approach can help.
class App:
    def __init__(self, testName):
        self.localLog = logging.getLogger(testName)
        self.application = None
        self.link = None

    def access(self):
        print('There is something BaseClass must do')
        print('The application is ', self.application)
        print('The link is ', self.link)

class App1(App):
    def __init__(self, testName):
        # ...
        super(App1, self).__init__(testName)

    def access(self):
        self.application = 'Application created by App1'
        self.link = 'Link created by App1'
        super(App1, self).access()
        print('There is something App1 must do')

class App2(App):
    def __init__(self, testName):
        # ...
        super(App2, self).__init__(testName)

    def access(self):
        self.application = 'Application created by App2'
        self.link = 'Link created by App2'
        super(App2, self).access()
        print('There is something App2 must do')
and the test result:
>>>
>>> app = App('Baseclass')
>>> app.access()
There is something BaseClass must do
The application is None
The link is None
>>> app1 = App1('App1 test')
>>> app1.access()
There is something BaseClass must do
The application is Application created by App1
The link is Link created by App1
There is something App1 must do
>>> app2 = App2('App2 text')
>>> app2.access()
There is something BaseClass must do
The application is Application created by App2
The link is Link created by App2
There is something App2 must do
>>>
Adding a combine function, we can combine two functions and execute them one after the other, as below:
def combine(*fun):
    def new(*s):
        for i in fun:
            i(*s)
    return new

class base():
    def x(self, i):
        print 'i', i

class derived(base):
    def x(self, i):
        print 'i*i', i * i
    x = combine(base.x, x)

new_obj = derived()
new_obj.x(3)
Output below:
i 3
i*i 9
It need not be a single-level hierarchy; it can have any number of levels of nesting.
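For reference, the same idea in Python 3 syntax (a sketch; only the print statements change):

def combine(*fun):
    def new(*s):
        for i in fun:
            i(*s)
    return new

class base():
    def x(self, i):
        print('i', i)

class derived(base):
    def x(self, i):
        print('i*i', i * i)
    x = combine(base.x, x)

new_obj = derived()
new_obj.x(3)  # prints "i 3" then "i*i 9"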

Python decorators and class inheritance

I'm trying to use decorators in order to manage the way users may or may not access resources within a web application (running on Google App Engine). Please note that I'm not allowing users to log in with their Google accounts, so setting specific access rights to specific routes within app.yaml is not an option.
I used the following resources :
- Bruce Eckel's guide to decorators
- SO : get-class-in-python-decorator2
- SO : python-decorators-and-inheritance
- SO : get-class-in-python-decorator
However I'm still a bit confused...
Here's my code! In the following example, current_user is a @property method which belongs to the RequestHandler class. It returns a User(db.Model) object stored in the datastore, with a level IntProperty().
class FoobarController(RequestHandler):
    # Access decorator
    def requiredLevel(required_level):
        def wrap(func):
            def f(self, *args):
                if self.current_user.level >= required_level:
                    func(self, *args)
                else:
                    raise Exception('Insufficient level to access this resource')
            return f
        return wrap

    @requiredLevel(100)
    def get(self, someparameters):
        # do stuff here...

    @requiredLevel(200)
    def post(self):
        # do something else here...
However, my application uses different controllers for different kinds of resources. In order to use the @requiredLevel decorator within all subclasses, I need to move it to the parent class (RequestHandler):
class RequestHandler(webapp.RequestHandler):
    # Access decorator
    def requiredLevel(required_level):
        # See code above
My idea is to access the decorator in all controller subclasses using the following code:

class FoobarController(RequestHandler):
    @RequestHandler.requiredLevel(100)
    def get(self):
        # do stuff here...
I think I just reached the limit of my knowledge about decorators and class inheritance :). Any thoughts ?
Your original code, with two small tweaks, should also work. A class-based approach seems rather heavy-weight for such a simple decorator:
class RequestHandler(webapp.RequestHandler):
    # The decorator is now a class method.
    @classmethod
    def requiredLevel(klass, required_level):  # Note the 'klass' argument, similar to 'self' on an instance method
        def wrap(func):
            def f(self, *args):
                if self.current_user.level >= required_level:
                    func(self, *args)
                else:
                    raise Exception('Insufficient level to access this resource')
            return f
        return wrap

class FoobarController(RequestHandler):
    @RequestHandler.requiredLevel(100)
    def get(self, someparameters):
        # do stuff here...

    @RequestHandler.requiredLevel(200)
    def post(self):
        # do something else here...
Alternately, you could use a @staticmethod instead:

class RequestHandler(webapp.RequestHandler):
    # The decorator is now a static method.
    @staticmethod
    def requiredLevel(required_level):  # No default argument required...
The reason the original code didn't work is that requiredLevel was assumed to be an instance method, which isn't going to be available at class-declaration time (when you were decorating the other methods), nor will it be available from the class object (putting the decorator on your RequestHandler base class is an excellent idea, and the resulting decorator call is nicely self-documenting).
You might be interested to read the documentation about @classmethod and @staticmethod.
Also, a little bit of boilerplate I like to put in my decorators:
@staticmethod
def requiredLevel(required_level):
    def wrap(func):
        def f(self, *args):
            if self.current_user.level >= required_level:
                func(self, *args)
            else:
                raise Exception('Insufficient level to access this resource')
        # This will maintain the function name and documentation of the wrapped function.
        # Very helpful when debugging or checking the docs from the python shell:
        f.__doc__ = func.__doc__
        f.__name__ = func.__name__
        return f
    return wrap
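The standard library can do that bookkeeping for you: functools.wraps copies __name__, __doc__ and friends onto the wrapper. A sketch of the same decorator using it:

import functools

class RequestHandler(webapp.RequestHandler):
    @staticmethod
    def requiredLevel(required_level):
        def wrap(func):
            @functools.wraps(func)  # preserves func's name and docstring on the wrapper
            def f(self, *args):
                if self.current_user.level >= required_level:
                    func(self, *args)
                else:
                    raise Exception('Insufficient level to access this resource')
            return f
        return wrap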
After digging through StackOverflow, and carefully reading Bruce Eckel's guide to decorators, I think I found a possible solution.
It involves implementing the decorator as a class in the parent class:
class RequestHandler(webapp.RequestHandler):
    # Decorator class:
    class requiredLevel(object):
        def __init__(self, required_level):
            self.required_level = required_level

        def __call__(self, f):
            def wrapped_f(*f_args):
                if f_args[0].current_user.level >= self.required_level:
                    return f(*f_args)
                else:
                    raise Exception('User has insufficient level to access this resource')
            return wrapped_f
This does the job! Using f_args[0] seems a bit dirty to me; I'll edit this answer if I find something prettier.
Then you can decorate methods in subclasses the following way:

class FooController(RequestHandler):
    @RequestHandler.requiredLevel(100)
    def get(self, id):
        # Do something here

    @RequestHandler.requiredLevel(250)
    def post(self):
        # Do some stuff here

class BarController(RequestHandler):
    @RequestHandler.requiredLevel(500)
    def get(self, id):
        # Do something here
Feel free to comment or propose an enhancement.
