Execute a callback upon task end? - python

Celery: an app sends a task to be executed:
r = task.delay()
Is it possible to execute code in the app's address space when the task ends, other than by polling r?

There is no built-in way in Celery. Generally speaking, you will have to come up with your own way for the task to send a notification back to your app if you want the callback to run in the calling application's address space.
Here are some patterns you can use (a minimal sketch of the general pattern follows the list):
The general pattern
Using Django Channels
Using Firebase
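A minimal sketch of the general pattern, assuming Redis is available and using its pub/sub as the notification channel (the channel name, payload, and do_work function are illustrative, not part of the original answer):

# worker side: publish a notification when the task finishes
import json
import redis
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task(bind=True)
def task(self):
    result = do_work()  # hypothetical work function
    redis.Redis().publish('task_done', json.dumps({'task_id': self.request.id}))
    return result

# app side: a listener runs the callback in the app's address space
def listen_for_completions(callback):
    pubsub = redis.Redis().pubsub()
    pubsub.subscribe('task_done')
    for message in pubsub.listen():
        if message['type'] == 'message':
            callback(json.loads(message['data']))

Start listen_for_completions(my_callback) in a background thread of the web app process, and the callback fires inside the app whenever a worker publishes a completion message.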

Related

How to create a real-time application using Python?

I have a Python backend using Celery and Redis to serve long-running tasks. The app has a front end built in Vue.js. For these long-running tasks, I need to provide real-time updates to the user about the status of each task. The approach I'm thinking of is to poll the status endpoint using the setTimeout function in my JS code. Is there a better approach to handle this kind of use case?
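For reference, the polling approach only needs a small status endpoint on the backend; a minimal Flask sketch (the route and names are illustrative, and a configured Celery result backend is assumed):

from celery.result import AsyncResult
from flask import Flask, jsonify

flask_app = Flask(__name__)

@flask_app.route('/status/<task_id>')
def task_status(task_id):
    result = AsyncResult(task_id)  # requires a result backend (e.g. Redis)
    # result.info holds the meta dict for custom states such as PROGRESS
    return jsonify({'state': result.state, 'info': result.info})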

Interrupt a backend function and wait for a frontend response

I'm looking for a good solution to implement a handshake between a Python backend server and a React frontend connected through a WebSocket.
The frontend allows the user to upload a file and then send it to the backend for processing. Now it might be possible that the processing encounters some issues and requires the user's confirmation to proceed or cancel, and that's where I'm stuck.
My current implementation has different "endpoints" in the backend which then call different function implementations, and a queue which is continuously processed, with its content (messages) sent to the frontend. But these are always complete actions: they either succeed or fail, and the returned message reflects that. I have no system in place to interrupt a running task (e.g. file processing), send a request to the frontend, and then wait for the response before the function continues.
Is there a design pattern or common approach for this kind of problem?
How long does the processing take? A good solution may be to set up a message broker like RabbitMQ and create a queue for this process. In the front end, add a panel that shows the state of the process, which runs as an async task; if it hits an issue, let the user know and ask what to do (a minimal sketch of the waiting step follows).
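One way to implement the "pause and ask" step, as a sketch only: the task parks itself and polls a key that a confirmation endpoint sets once the user answers. Redis and all names here are illustrative assumptions, not part of the original answer:

import json
import time
import redis

r = redis.Redis()

def process_file(task_id, data):
    if has_issues(data):  # hypothetical check
        # publish the question for the frontend panel to display
        r.set('question:%s' % task_id, json.dumps({'issue': 'ambiguous header row'}))
        # block until the confirmation endpoint stores the user's answer
        while r.get('answer:%s' % task_id) is None:
            time.sleep(1)
        answer = json.loads(r.get('answer:%s' % task_id))
        if not answer['proceed']:
            return 'cancelled'
    return finish_processing(data)  # hypothetical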

Can I use a celery task's send_event instead of update_state for state updates?

While working on an asynchronous task queue for a webserver (built with Python and Flask), I was looking for a way to get the server to actually perform some work once a task update comes in. There is a function to fetch a task's state on the client side (celery.result.AsyncResult.get), and one to send updates on the worker side (celery.app.task.update_state).
But this requires a result backend to be configured. That is not a problem per se, but I came across Celery events (https://docs.celeryproject.org/en/stable/userguide/monitoring.html#real-time-processing).
These apparently allow you to omit the result backend. On the worker side, this requires using the celery.app.task.send_event function.
I do not need to send the result of a task to the client (it is a file on a shared volume) or store it in a database, but I would like to receive progress updates (percentage) from the tasks. Is the event system a good alternative to update_state()?
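For comparison, a minimal sketch of both mechanisms side by side (the broker URL, event type name, and do_step are illustrative; a monitor process is needed to capture events):

from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task(bind=True)
def long_job(self):
    for i in range(100):
        do_step(i)  # hypothetical unit of work
        # update_state: requires a result backend; read via AsyncResult(...).info
        self.update_state(state='PROGRESS', meta={'percent': i + 1})
        # send_event: no result backend needed; a monitor captures it instead
        self.send_event('task-progress', percent=i + 1)

# monitor side, following the real-time processing example in the Celery docs
def monitor(app):
    def on_progress(event):
        print('progress: %s%%' % event['percent'])
    with app.connection() as connection:
        recv = app.events.Receiver(connection,
                                   handlers={'task-progress': on_progress})
        recv.capture(limit=None, timeout=None, wakeup=True)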

Send mails async in web.py

I'm trying to solve the problem of sending mail (or running any long task) in a web.py project. What I want is to start sending the mail and return the HTTP response right away, but the sending takes a long time. Is there any solution?
Example:
import web
# ... some settings, urls, etc. ...

class Index:
    def GET(self):
        # task: the sending takes a long time and blocks the response
        sending_mail()
        return 'response'
I found many examples of async tasks, but I think that if I put this task in the background and return 'response', it will fail.
You could get away with sending email in a separate thread (you can spawn one when you need to send an email):
import threading

# spawn a short-lived thread per email
threading.Thread(target=sending_mail).start()
However, the all-around best (and standard) solution would be to use an asynchronous task processor, such as Celery. In your web thread, simply create a new task, and Celery will asynchronously execute it.
There is no reason why "returning response" would fail when using a message queue, unless your response depends on the email being sent prior to sending the response (but in that case, you have an architectural problem).
Moving the sending_email() task to a background queue would be the best solution. This would allow you to return the response immediately and get the results of the sending_email task later on.
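A minimal sketch of that pattern (the broker URL is an assumption; the task body stands in for the real SMTP work):

# tasks.py
from celery import Celery

app = Celery('mailer', broker='redis://localhost:6379/0')

@app.task
def sending_mail():
    pass  # the slow SMTP work goes here

# in the web.py handler, enqueue instead of calling directly
class Index:
    def GET(self):
        sending_mail.delay()  # returns immediately; a worker does the sending
        return 'response'

Run a worker alongside the web process with: celery -A tasks worker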
Let me also suggest taking a look at RQ. It is a lightweight alternative to Celery that I find easier to get up and running. I have used it in the past for sending emails in the background, and it didn't disappoint.
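A minimal RQ sketch, assuming a local Redis instance and a separate `rq worker` process:

from redis import Redis
from rq import Queue

q = Queue(connection=Redis())

class Index:
    def GET(self):
        q.enqueue(sending_mail)  # picked up by an `rq worker` process
        return 'response'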

Do I need celery when I am using gevent?

I am working on a Django web app that has functions (e.g. sync_files()) that take a long time to return. When I use gevent, my app does not block while sync_files() runs, and other clients can connect and interact with the web app just fine.
My goal is to keep the web app responsive to other clients and not block. I do not expect a zillion users to connect (perhaps 20 connections at most), and I am not trying to build the next Twitter. My app runs on a VPS, so I need something lightweight.
So in the case above, is it redundant to use Celery when I am using gevent? Is there a specific advantage to using Celery? I'd prefer not to use Celery, since it is yet another service that would be running on my machine.
edit: I found out that Celery can run its worker pool on gevent, which leaves me a little more unsure about the relationship between gevent and Celery.
In short, you do need Celery.
Even if you use gevent and have concurrency, the problem becomes the request timeout. Say your task takes 10 minutes to run, while the typical request timeout is up to about a minute. If you trigger the task directly within a view, the server will start processing it, but after a minute the client (browser) will probably drop the connection because it thinks the server has gone away. Since you cannot be sure what happens when the connection closes, your data can become corrupt. Celery solves this by triggering a background process that handles the task independently of the view: the user gets the view response right away, and at the same time the server starts processing the task. That is the correct pattern for any scenario that requires a lot of processing.
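A minimal sketch of that pattern in Django (all names are illustrative). Note that the worker itself can still run on gevent (celery -A proj worker -P gevent), so the two tools are complementary rather than redundant:

# tasks.py
from celery import shared_task

@shared_task
def sync_files():
    pass  # the long-running work

# views.py
from django.http import JsonResponse
from .tasks import sync_files

def start_sync(request):
    result = sync_files.delay()  # enqueue and return immediately
    return JsonResponse({'task_id': result.id})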
