routing class based tasks in celery based on class inheritance - python

Suppose I have the following Celery tasks defined as classes:
from celery import Task

class BaseTask(Task):
    abstract = True

    def run(self, *args, **kwargs):
        pass  # not important

class SpecificTask(BaseTask):
    def run(self, *args, **kwargs):
        pass  # not important
Is there an easy way to set routing for all tasks inheriting from the BaseTask class? Ideally I would like to do something like:
CELERY_ROUTES = {
    'project.tasks.BaseTask': {'queue': 'notify'},
}
Unfortunately this doesn't seem to work as I intended.
Thanks for advice.

Related

Imported python class instance inheritance

Currently I'm inheriting from an external class and importing another one locally:
from locust.contrib.fasthttp import FastHttpUser
from local_helper import TaskDetails

class User_1(FastHttpUser):
    @task
    def iteration_task(self):
        self.client.get(url)
I want to offload self.client.get(url) to TaskDetails. What I want to achieve is something like this:
td = TaskDetails()

class User_1(FastHttpUser):
    @task
    td.client_get(url)  # with included FastHttpUser methods from User_1 class
is it possible to do something like this?
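A common way to get this is to pass the user instance into the helper so it can reuse the user's client. The sketch below substitutes a plain FakeClient for FastHttpUser (an assumption, to keep it self-contained and runnable without locust), but the delegation works the same way:

```python
class FakeClient:
    """Stand-in for the HTTP client a FastHttpUser provides."""
    def get(self, url):
        return 'GET {}'.format(url)

class TaskDetails:
    """Helper that borrows the calling user's client instead of owning one."""
    def client_get(self, user, url):
        return user.client.get(url)

td = TaskDetails()

class User_1:  # in real code: class User_1(FastHttpUser)
    def __init__(self):
        self.client = FakeClient()

    def iteration_task(self):  # in real code, decorated with @task
        return td.client_get(self, '/users')
```

The helper stays free of locust imports; it only needs whatever attributes (here, .client) the caller hands it.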

Is it possible for a Python class attribute to be used as a decorator?

I'm trying to follow the Celery Based Background Tasks guide to create the Celery settings for a simple application.
In my task.py:
from celery import Celery

def make_celery(app):
    celery = Celery(app.import_name, backend=app.config['CELERY_RESULT_BACKEND'],
                    broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task

    class ContextTask(TaskBase):
        abstract = True

        def __call__(self, *args, **kwargs):
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)

    celery.Task = ContextTask
    return celery
This works in the app.py of the main Flask application:
from flask import Flask

flask_app = Flask(__name__)
flask_app.config.update(
    CELERY_BROKER_URL='redis://localhost:6379',
    CELERY_RESULT_BACKEND='redis://localhost:6379'
)
celery = make_celery(flask_app)

@celery.task()
def add_together(a, b):
    return a + b
My use case is that I want to create another module, helpers.py, where I can define a collection of asynchronous classes, to separate the Celery-based methods and keep them modular.
What I did is import make_celery from task.py into the other module, helpers.py, to create a class AsyncMail that handles email work in the background:
from task import make_celery

class AsyncMail(object):
    def __init__(self, app):
        """
        :param app: An instance of a Flask application.
        """
        self.celery = make_celery(app)

    def send(self, msg):
        print(msg)
Now, how can I access the self.celery attribute to use it as a decorator on the class's methods?
@celery.task()
def send(self, msg):
    print(msg)
If that's impossible, what alternatives could achieve this?
You can't do what you're trying to do. At the time the class is being defined, there is no self, much less self.celery, so you can't use @self.celery. Even if you had some kind of time machine, there could be 38 different AsyncMail instances created, and which one's self.celery would you want here?
Before getting into how you could do what you want, are you sure you want to? Do you actually want each AsyncMail object to have its own separate Celery? Normally you only have one per app, which is why this normally doesn't come up.
If you really wanted to, you could give each instance decorated methods after you have an object to decorate them with. But it's going to be ugly.
def __init__(self, app):
    self.celery = make_celery(app)
    # We need to get the function off the class, not the bound method off self
    send = type(self).send
    # Then we decorate it manually; this is all @self.celery.task does
    send = self.celery.task(send)
    # Then we manually bind it as a method
    send = send.__get__(self)
    # And now we can store it as an instance attribute, shadowing the class's
    self.send = send
Or, if you prefer to put it all together in one line:
self.send = self.celery.task(type(self).send).__get__(self)
For Python 2, the "function off the class" is actually an unbound method, and IIRC you have to call __get__(self, type(self)) to turn it into a bound method at the end, but otherwise it should all be the same.
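The same decorate-then-bind trick can be demonstrated with a plain decorator in place of Celery (trace below is a hypothetical stand-in for self.celery.task), which makes it easy to check what __get__ is doing:

```python
def trace(fn):
    # Stand-in for self.celery.task: wraps the function and tags its result.
    def wrapper(*args, **kwargs):
        return ('traced', fn(*args, **kwargs))
    return wrapper

class Mailer:
    def __init__(self):
        # Pull the plain function off the class, decorate it, bind it to
        # this instance, and shadow the class attribute.
        self.send = trace(type(self).send).__get__(self)

    def send(self, msg):
        return msg

m = Mailer()
m.send('hi')  # ('traced', 'hi')
```

The class attribute Mailer.send stays undecorated; only the instance attribute is the wrapped, bound version.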

Register class-based task in Celery 4

I am using Celery version 4.0.2.
Compared to previous versions of Celery, it seems that class-based tasks are no longer registered automatically (even with auto-discovery configured).
However, I cannot even register a class-based task manually.
According to the Celery change log:
http://docs.celeryproject.org/en/latest/changelog.html#version-4-0-1
since version 4.0.1 it should be possible to register the task manually:
from celery import Celery, Task

app = Celery()

class CustomTask(Task):
    def run(self):
        return 'hello'

app.register_task(CustomTask())
But this does not seem to work. Does anyone know how to achieve this?
I tried a few suggestions which are being discussed (apart from integrating a custom task loader mentioned in https://github.com/celery/celery/issues/3744):
Register Celery Class-based Task
https://github.com/celery/celery/issues/3615
https://github.com/celery/celery/issues/3744
Almost there! You need to call delay() on the task that you registered.
This would work:
from celery import Celery, Task

app = Celery()

class CustomTask(Task):
    def run(self):
        return 'hello'

task = CustomTask()
app.register_task(task)
task.delay()
If you need the shared_task decorator:
from celery import Task, shared_task

class CustomTask(Task):
    def process(self):
        return 'hello'

@shared_task(bind=True, base=CustomTask)
def custom(self):
    self.process()
process is a custom name that starts the task (the decorator overrides the run method)
bind=True binds the function to the task instance
base=CustomTask sets the base class for the task

Inheriting setUp method Python Unittest

I have a question regarding unittest with Python! Let's say that I have a docker container set up that handles a specific api endpoint (let's say users, ex: my_site/users/etc/etc/etc). There are quite a few different layers that are broken up and handled for this container: classes that handle the actual call and response, a logic layer, a data layer. I want to write tests around the specific calls (just checking for status codes).
There are a lot of different classes that act as Handlers for the given endpoints. There are a few things that I would have to set up differently per one; however, each one inherits from Application and uses some methods from it. I want to write a setUp class for my unittest so I don't have to re-establish this each time. Any advice will help. So far I've mainly seen that inheritance is a bad idea with testing; however, I am only wanting to use this for setUp. Here's an example:
import unittest

class SetUpClass(unittest.TestCase):
    def setUp(self):
        self._some_data = data_set.FirstOne()
        self._another_data_set = data_set.SecondOne()

    def get_app(self):
        config = Config()
        return Application(config,
                           first_one=self._some_data,
                           second_one=self._another_data_set)
class TestFirstHandler(SetUpClass, unittest.TestCase):
    def setUp(self):
        new_var = something

    def tearDown(self):
        pass

    def test_this_handler(self):
        # This specific handler needs the application to function
        # but I don't want to define it in this test class
        res = self.fetch('some_url/users')
        self.assertEqual(res.code, 200)

class TestSecondHandler(SetUpClass, unittest.TestCase):
    def setUp(self):
        different_var_thats_specific_to_this_handler = something_else

    def tearDown(self):
        pass

    def test_this_handler(self):
        # This specific handler needs the application to function
        # but I don't want to define it in this test class
        res = self.fetch('some_url/users/account/?something_custom={}'.format('WOW'))
        self.assertEqual(res.code, 200)
Thanks again!!
As mentioned in the comments, you just need to learn how to use super(). You also don't need to repeat TestCase in the list of base classes.
Here's the simple version for Python 3:
class TestFirstHandler(SetUpClass):
    def setUp(self):
        super().setUp()
        new_var = something

    def tearDown(self):  # Easier to not declare this if it's empty.
        super().tearDown()

    def test_this_handler(self):
        # This specific handler needs the application to function
        # but I don't want to define it in this test class
        res = self.fetch('some_url/users')
        self.assertEqual(res.code, 200)
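A minimal runnable sketch (with placeholder string fixtures standing in for the real data_set and Application objects) shows that calling super().setUp() keeps the base class's attributes available in the subclass:

```python
import unittest

class SetUpClass(unittest.TestCase):
    def setUp(self):
        self._some_data = 'first'   # placeholder for data_set.FirstOne()

class TestFirstHandler(SetUpClass):
    def setUp(self):
        super().setUp()             # without this, _some_data would be missing
        self.new_var = 'extra'

    def test_fixtures(self):
        self.assertEqual(self._some_data, 'first')
        self.assertEqual(self.new_var, 'extra')

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestFirstHandler)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

If you comment out the super().setUp() line, test_fixtures fails with an AttributeError, which is exactly the symptom of an overridden setUp that forgets to chain up.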

Celery: how to set the status of a Task

I have defined a Celery task like this:
@app.task()
def my_task():
    pass  # Do things...
I'm using Flower, so I want to see the final state of the task, according to some rules created by me:
if condition_1:
    return task_status_success
elif condition_2:
    return task_status_fail
How can I do this?
I've seen some people do something like this:
class AbstractTask(Task):
    abstract = True

    def __init__(self):
        self.last_error_log = ErrorLog(logger)
        Task.__init__(self)

    def _task_error(self, message):
        logger.error(message)
        self.update_state(state=states.FAILURE)
        raise Exception(message)
But that approach defines tasks as classes, not as functions.
Any help on how to manually set the state of a Celery task defined as a function?
To use the approach you saw with an abstract class, you just need to pass the class as base to your decorator:
@app.task(base=AbstractTask, bind=True)
def my_task(self):
    pass
bind=True will allow you to use self to access the members of your class.
