I'm working on a Django application that processes AJAX requests. The code looks like this:
def handler(request):
    # save the data to the database
    return HttpResponse(...)  # some JSON
The problem is that I want to further process the data the user has submitted, but only after responding to the user with the success information. So is there a way to send the response without using return? Or is there any other way to do that?
In general, you should use Celery for this type of task: it's the standard for asynchronous processing.
You can learn more about it here:
Django with celery
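A minimal sketch of that pattern, assuming a working Celery setup (the task name and module layout here are illustrative, not from the original post):

# tasks.py
from celery import shared_task

@shared_task
def process_submission(data):
    # the further processing happens here, after the response has been sent
    ...

# views.py
from django.http import JsonResponse
from .tasks import process_submission

def handler(request):
    # save the data to the database as before, then queue the follow-up
    # work instead of doing it inside the request
    process_submission.delay(request.POST.dict())
    return JsonResponse({'status': 'ok'})  # the user gets this immediately

delay() only puts the job on the queue, so the response goes out right away while a Celery worker picks the work up.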
My problem is basically that I am currently building a customized management system based on Django (3.1) and Python (3.7.9), in which I pull data from a third-party tool. The tool does not give me webhooks for all of the data that I want for visualization and analysis.
The webhook gives me only bits of information, and I have to perform a GET request to their API to fetch the remaining details if they are not already in my database. They require a successful response to the webhook within 5 seconds, otherwise a retry is triggered.
If I do the GET request inside the webhook function, the 5-second limit is exceeded. The solutions I came up with were Django middleware and Django triggers, and I am a bit confused about which would best suit my problem.
Note: I cannot lower the Django version, as I have to use async functions.
This would be a good use case for a task scheduler like Celery.
Django-triggers is an interface to the Celery scheduler, so it might be a good fit.
Keep in mind, Celery has to run as a separate process next to Django.
Another popular task scheduler is rq-scheduler.
It offers a simple implementation using Redis as a message queue. Note that load-balanced/multi-instance applications are not easily set up with RQ.
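For the webhook itself, a rough sketch of this acknowledge-first approach with Celery (the view, task, URL, and payload fields are all illustrative):

# views.py
import json
from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt
from .tasks import fetch_details

@csrf_exempt
def webhook(request):
    payload = json.loads(request.body)
    fetch_details.delay(payload)  # defer the slow GET request
    return JsonResponse({'received': True})  # respond well inside 5 seconds

# tasks.py
import requests
from celery import shared_task

@shared_task
def fetch_details(payload):
    # fetch the remaining details from the third-party API; the URL and
    # parameters are placeholders
    resp = requests.get('https://thirdparty.example/api/details',
                        params={'id': payload.get('id')}, timeout=30)
    resp.raise_for_status()
    # ...store the fetched details in the database

The worker runs as its own process next to Django, started with something like celery -A yourproject worker.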
I am using Python 3.6.1 with Flask, with the goal being to load a webpage, display live loading information via websockets, then once the loading is complete redirect or otherwise allow the user to do something.
I have the first two steps working: I can load the page fine, and the websocket interface (running on a separate thread using SocketIO) updates properly with the needed data. But how can I make something happen once the functions doing the loading are finished? To my understanding, once I return a webpage in Flask it is simply static, and there's no easy way to change it.
Specific code examples aren't necessary, I'm just looking for ideas or resources. Thanks.
I'm making a Flask app and I was wondering if I could render a template for a route, but redirect the user after a function is complete. I'm currently using Python 2.7. Here is my example:
from time import sleep
from flask import Flask, render_template, redirect

app = Flask(__name__)

@app.route('/loading/matched')
def match():
    time_match()
    return render_template('matched.html')

def time_match():
    # match two users based on time
    sleep(3)  # pretend to be doing work
    return redirect('/loading/generation')
I don't know where to begin. Is there a library I should use?
This sounds more like a client-side thing to me? Do you want something like a loading bar?
You could provide an AJAX route which initiates the heavy workload on the server side, while the client shows some progress. Once the workload is finished, you render a template which then gets loaded via AJAX.
For async workloads you could look into Celery, which is a great library for that. It can even do the work on a separate server...
More sources: Integration in Flask
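A rough sketch of that AJAX-plus-polling flow (the Celery wiring, broker URL, and route names are assumptions for illustration):

from celery import Celery
from celery.result import AsyncResult
from flask import Flask, jsonify

app = Flask(__name__)
celery = Celery(app.name, broker='redis://localhost:6379/0',
                backend='redis://localhost:6379/0')

@celery.task
def heavy_work():
    ...  # the slow server-side workload goes here
    return 'done'

@app.route('/start')
def start():
    task = heavy_work.delay()
    return jsonify({'task_id': task.id})  # the client keeps this id

@app.route('/status/<task_id>')
def status(task_id):
    # the client polls this route to drive its progress display, then
    # loads the rendered result via AJAX once the state is SUCCESS
    result = AsyncResult(task_id, app=celery)
    return jsonify({'state': result.state})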
Is there a way to copy the request to a celery task in Flask in such a manner that the task executes inside the request context which initiated the task?
I need to access the Flask-Security current user in a Celery task, but since the task runs outside the request context, I cannot do that. I need additional information from the request as well, so just forwarding the current user to the task would not do the trick.
My task does inserts on the database. It needs the current user to save the id of the user who created the row. Passing the user object to the task would solve that part. However, the application logic is such that every insert/delete/update is logged via a before-flush event, which records the user who made the modification, their IP, the original URL, the data being inserted, and so on.
The log event is done, like I said, before flush, and it works in 99% of scenarios.
But when I have one lengthy task which I want to run as a Celery task, the request data is not available, nor is the current user (since it is outside the original request context).
There is no out-of-the-box way to pass the request or current_user objects to the Celery task, as they are not serializable. But people have worked around this by creating a wrapper that calls the Celery task inside a request context.
The Celery task in a Flask request context blog post explores this topic in detail, as do the gist by Xion and the one derived from it by aviaryan, which implement the request-context wrapper.
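A minimal sketch of the wrapper idea (not the gists' exact code: the captured fields and names are illustrative, and the user id still has to be passed along explicitly because current_user itself is not serializable):

from celery import Celery
from flask import Flask, request

app = Flask(__name__)
celery = Celery(app.name, broker='redis://localhost:6379/0')

def enqueue_with_context(task):
    # capture the serializable bits of the current request so the
    # worker can rebuild a request context from them
    ctx = {
        'path': request.path,
        'method': request.method,
        'remote_addr': request.remote_addr,
        # add the current user's id here if the task needs it
    }
    return task.delay(ctx)

@celery.task
def do_insert(ctx):
    # rebuild a request context so the before-flush logging code sees
    # the original path, method, and client IP
    with app.test_request_context(ctx['path'], method=ctx['method'],
                                  environ_base={'REMOTE_ADDR': ctx['remote_addr']}):
        ...  # perform the insert; the flush-event logger runs in here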
So I'm currently working on adding a recommendation engine to a Django project and need to do some heavy processing (in an external module) for one of my view functions. This significantly slows down page load time because I have to load in some data, transform it, perform my calculations based on parameters sent by the request, and then return the suggestions to the view. This has to be done every time the view is loaded.
I was wondering if there was some way I could have the recommender module load and transform the data in memory, and then wait for parameters to be sent from the view, have calculations run on those parameters and then send it back to the view.
Any help would be greatly appreciated.
Celery is a task queue that really excels at this sort of thing.
It would allow you to do something like:
user makes request to view
view starts an async task that does the heavy lifting, then returns to the user immediately
you can poll from JavaScript to see if your task is done and load the results when it is (see the sketch after this list)
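Roughly, in Django with Celery (the task and view names are made up for illustration):

# tasks.py
from celery import shared_task

@shared_task
def recommend(params):
    # the heavy lifting: load and transform the data, compute suggestions
    return {'suggestions': []}  # placeholder result

# views.py
from celery.result import AsyncResult
from django.http import JsonResponse
from .tasks import recommend

def start_recommendations(request):
    task = recommend.delay(request.GET.dict())
    return JsonResponse({'task_id': task.id})  # returns immediately

def recommendation_status(request, task_id):
    result = AsyncResult(task_id)
    if result.ready():
        return JsonResponse({'done': True, 'result': result.get()})
    return JsonResponse({'done': False})  # the page polls until done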
Might not quite be the flow you're looking for, but Celery is definitely worth checking out.
Celery has a great Django package too, and it's extremely easy to use.
Rereading your question, I think it would also be possible to create a local webservice around your recommendation engine. On startup it can load all the data into memory; then you can just make requests to it from your Django app.
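A bare-bones sketch of that local-service idea, using Flask purely as an example:

from flask import Flask, jsonify, request

app = Flask(__name__)

# load and transform the data once, when the process starts
data = {}  # placeholder: read and preprocess the real dataset here

@app.route('/recommend')
def recommend():
    # compute suggestions from the in-memory data and the query parameters
    suggestions = []  # placeholder for the actual calculation
    return jsonify({'suggestions': suggestions})

if __name__ == '__main__':
    # the Django view can then call http://localhost:5001/recommend
    app.run(port=5001)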