How to run celery in django tests using PyCharm

I'm trying to test my Django application functionalities.
Some of my tasks are using Celery.
How can I run celery in my test environment using PyCharm?

I assume that you're using the shared_task decorator and running the tasks as function_name.delay()
This can be handled by adding a conditional that runs the task synchronously when running locally (or in a test) and uses Celery in production. This also lets you run the server locally without Celery, which can be easier to debug and maintain.
PRODUCTION = False

if PRODUCTION:
    some_task.delay()  # enqueue on the Celery broker
else:
    some_task()  # run synchronously, e.g. in tests
One could also create a decorator that does the same thing, which is prettier, but perhaps more complicated to maintain?
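A rough sketch of that decorator idea (run_or_delay and the PRODUCTION flag are invented names for illustration, not an established API):

from celery import shared_task

PRODUCTION = False  # in practice, read this from settings

def run_or_delay(task):
    # Wrap a Celery task so that calling it enqueues it in production
    # and runs it synchronously everywhere else.
    def wrapper(*args, **kwargs):
        if PRODUCTION:
            return task.delay(*args, **kwargs)
        return task(*args, **kwargs)
    return wrapper

@run_or_delay
@shared_task
def some_task():
    ...

Callers then always write some_task() and the wrapper decides how it runs.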

To do unit tests without testing the Celery stuff, you may want to mock your @shared_task decorator:
from unittest import mock

@mock.patch('celery.shared_task', lambda *args, **kwargs: lambda func: func)
def test_celery_unit():
    from tasks import mytask  # import after patching so the decorator is a no-op
    result = mytask()  # can be called just like a plain function

Using the mock library will solve your problem.

Related

Django celery redis remove a specific periodic task from queue

There is a specific periodic task that needs to be removed from the message queue. I am using Redis and Celery here.
tasks.py
@periodic_task(run_every=crontab(minute='*/6'))
def task_abcd():
    """
    some operations here
    """
There are other periodic tasks in the project as well, but I need this specific task to stop from now on.
As explained in this answer, will the following code work?
@periodic_task(run_every=crontab(minute='*/6'))
def task_abcd():
    pass
In this example the periodic task schedule is defined directly in code, meaning it is hard-coded and cannot be altered dynamically without a code change and a re-deploy.
The provided code, with the task logic deleted or with a simple return at the beginning, will work, but it is not really an answer to the question: the task will still run on schedule; there is simply no code left for it to execute.
Also, it is recommended NOT to use @periodic_task, since it is deprecated:
"""Deprecated decorator, please use :setting:`beat_schedule`."""
First, change the method from a @periodic_task to a regular Celery @task, and since you are using Django it is better to go straight for @shared_task:
from celery import shared_task

@shared_task
def task_abcd():
    ...
Now this is just an ordinary Celery task, which needs to be called explicitly, or it can be run periodically if added to the Celery beat schedule.
For production, and if using multiple workers, it is not recommended to run the Celery worker with embedded beat (-B); run a separate instance of the celery beat scheduler.
The schedule can be specified in celery.py or in the Django project settings (settings.py).
This is still not very dynamic, as the app needs to be restarted to re-read the settings.
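As a minimal sketch of a static schedule defined in settings.py (the dotted task path is a placeholder, and the CELERY_ prefix assumes the app was configured with config_from_object('django.conf:settings', namespace='CELERY')):

from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'task-abcd-every-6-minutes': {
        'task': 'myapp.tasks.task_abcd',  # placeholder dotted path
        'schedule': crontab(minute='*/6'),
    },
}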
Then, use the Database Scheduler (from django-celery-beat), which allows creating schedules dynamically: which tasks need to run, when, and with what arguments. It even provides nice Django admin views for administration!
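With django-celery-beat installed, for example, a schedule can be created, disabled, or deleted at runtime; the task path below is again a placeholder:

from django_celery_beat.models import CrontabSchedule, PeriodicTask

# Create (or reuse) a crontab entry and attach a periodic task to it.
schedule, _ = CrontabSchedule.objects.get_or_create(minute='*/6')
PeriodicTask.objects.create(
    crontab=schedule,
    name='run task_abcd every 6 minutes',
    task='myapp.tasks.task_abcd',  # placeholder dotted path
)

Beat then has to be started with the database scheduler, e.g. celery -A proj beat --scheduler django_celery_beat.schedulers:DatabaseScheduler (proj being a placeholder for your project module).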
That code will work, but I'd go for something that doesn't force you to update your code every time you need to disable or enable the task.
What you could do is use a configurable variable whose value could come from an admin panel, a configuration file, or wherever you want, and use it to return before your code runs if the task is disabled.
For instance:
@periodic_task(run_every=crontab(minute='*/6'))
def task_abcd():
    config = load_config_for_task_abcd()
    if not config.is_enabled:
        return
    # some operations here
In this way, even if your task is scheduled, its operations won't be executed.
If you simply want to remove the periodic task, have you tried removing the function and then restarting your Celery service? You can restart your Redis service as well as your Django server for good measure.
Make sure that the function you removed is not referenced anywhere else.

Django does not exit after finishing tests when using threading module

I am doing integration tests using django.test.SimpleTestCase.
After running python manage.py test, the tests run successfully and the terminal hangs with the message:
---------------------------
Ran 5 tests in 1.365s
OK
The problem is that currently I go back to the terminal using CTRL+C, but I want to have automated tests in my CI/CD pipeline.
Did I do something wrong in the way I executed the tests? Or is this behaviour normal? In that case, is there a way in Bash to programmatically execute the tests and then exit?
EDIT:
After analysing my app in depth, I was able to identify what was causing that behaviour. I am using threading in a way like the following in my views.py:
import threading

def __pooling():
    wait_time = 10
    call_remote_server()
    threading.Timer(wait_time, __pooling).start()

__pooling()
Basically I need my application to do something from time to time, asynchronously.
Should I change the way I am doing the pooling? Or should I disable it (how?) during the tests?
Basically I need my application to do something from time to time, asynchronously. Should I change the way I am doing the pooling?
I don't fully understand your needs, but a more traditional approach would be to schedule a task (probably a management command) outside of Django itself. An OS-level scheduler like cron or Windows Task Scheduler, something like APScheduler, or a task queue like Celery would all be reasonable choices.
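A rough sketch of the management-command approach (the command name poll_remote is invented, and call_remote_server is assumed to exist somewhere in your project):

# myapp/management/commands/poll_remote.py
from django.core.management.base import BaseCommand

from myapp.remote import call_remote_server  # placeholder import

class Command(BaseCommand):
    help = 'Call the remote server once; let an external scheduler repeat it.'

    def handle(self, *args, **options):
        call_remote_server()

A crontab line such as */10 * * * * /path/to/venv/bin/python /path/to/manage.py poll_remote then replaces the in-process timer, and nothing keeps the test runner alive.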
Or should I disable it (how?) during the tests?
I don't recommend continuing to use your __pooling() function as it exists today. In my opinion this kind of thing doesn't belong in your views.py. But if you want to keep it, something like
from django.conf import settings

if not settings.DEBUG:
    __pooling()
might help. Your __pooling() function would only be called when DEBUG is falsy, as it should be in production. (If it is also falsy in your CI environment you could choose another existing setting, or add something to your settings.py specifically to control this.)
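For example, with a hypothetical custom flag (ENABLE_POLLING is an invented name, not a built-in Django setting):

# settings.py
ENABLE_POLLING = True  # set to False in your test/CI settings

# views.py
from django.conf import settings

if getattr(settings, 'ENABLE_POLLING', False):
    __pooling()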

Difference between different ways to create celery task

I am very confused by looking at the different ways of creating a Celery task. On the surface they all work the same, so can someone explain the difference between these?
1.
from myproject.tasks import app

@app.task
def foo():
    pass

2.
from celery import task

@task
def foo():
    pass

3.
from celery import shared_task

@shared_task
def foo():
    pass
I know from a little bit of googling that the difference between the 1st and 3rd one is that shared_task is used when you don't have a concrete app instance. Can someone elaborate more on that, and when is the second one used?
Don't use #2 unless you are using Celery v3. If you are using Celery v4, use #1.
Use #3 in instances where you are writing a reusable library or Django app. For example, if you are writing an open source set of tasks that allow you to manage AWS EC2 instances using Celery, you would use shared_task so that the tasks could be run on Celery, but you would leave it to the person using your library to configure Celery for themselves.
Use #1 if you are writing for your own project and there is no concern for re-use.

What is the purpose of Celery's "autodiscover_tasks" function?

I'm wondering what the purpose of Celery's autodiscover_tasks function is. I am using Celery 4.1.2 with Django 2.1.4.
The Celery documentation refers to imports:
foo.tasks and bar.tasks being imported
But I can't comprehend how this works.
All the examples I found on GitHub, including this one from the official Celery repo, rely on manually importing the tasks (i.e. from demoapp.tasks import add, mul, xsum) even when calling the autodiscover_tasks function when booting the worker.
I guess this is how Python works; you can't access classes "globally" like in Ruby, for example.
Then once again, what is this function for? I'm no expert at Celery and maybe I am missing something. The only thing I see is the name of the discovered tasks when launching the Celery worker; is that all this function is supposed to do?
Thanks for your inputs,
When using Celery with Django, the autodiscover_tasks function registers all decorated tasks found in the tasks module of each INSTALLED_APPS entry. E.g.,
if your INSTALLED_APPS setting included app1, app2, and app3, Celery would automatically register any decorated tasks that could be found by looking at app1.tasks, app2.tasks, and app3.tasks.
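This is typically wired up once in the project's celery.py; the sketch below follows the standard Django/Celery setup, with proj as a placeholder project name:

# proj/celery.py
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
app.config_from_object('django.conf:settings', namespace='CELERY')

# Scan every app in INSTALLED_APPS for a tasks module and register
# the decorated tasks found there, so the worker needs no manual imports.
app.autodiscover_tasks()

The manual imports you saw in the examples exist so the calling code can reference the task objects; registration on the worker side is handled by autodiscovery.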

How to test code that creates Celery tasks?

I've read Testing with Celery but I'm still a bit confused. I want to test code that generates a Celery task by running the task manually and explicitly, something like:
def test_something(self):
    do_something_that_generates_a_celery_task()
    assert_state_before_task_runs()
    run_task()
    assert_state_after_task_runs()
I don't want to entirely mock up the creation of the task but at the same time I don't care about testing the task being picked up by a Celery worker. I'm assuming Celery works.
The actual context in which I'm trying to do this is a Django application where there's some code that takes too long to run in a request, so, it's delegated to background jobs.
In test mode, use CELERY_TASK_ALWAYS_EAGER = True. With this setting, task.delay() executes the task synchronously in the current process instead of sending it to a worker. You can set it in your settings.py in Django if you have followed the default guide for django-celery configuration.
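A minimal sketch of enabling it only while tests run (the sys.argv check assumes tests are launched via manage.py test):

# settings.py
import sys

if 'test' in sys.argv:
    CELERY_TASK_ALWAYS_EAGER = True
    CELERY_TASK_EAGER_PROPAGATES = True  # re-raise exceptions from eager tasks

With eager mode on, the test in the question can trigger the task-creating code and assert the resulting state immediately, because .delay() runs the task before returning.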
