What should I use: Django non-global middlewares or Django triggers? - python

My problem is that I am building a customized management system based on Django (3.1) and Python (3.7.9) in which I pull data from a third-party tool. The tool does not send webhooks for every piece of data I need for visualization and analysis.
The webhook only gives me partial information, and I have to perform a GET request to their API to fetch the remaining details if they are not already in my database. They require a successful webhook response within 5 seconds, otherwise a retry is triggered.
If I perform the GET request inside the webhook view, the 5-second limit is exceeded. The solutions I came up with are Django middleware or Django triggers, but I am a bit confused about which one would suit my problem best.
Note: I cannot lower the Django version because I have to use async functions.

This would be a good use case for a task scheduler like Celery.
Django-triggers is an interface to the Celery scheduler, so it might be a good fit.
Keep in mind that Celery has to run as a separate process next to Django.
Another popular task scheduler is rq-scheduler.
It offers a simple implementation using Redis as a message queue. Note that load-balanced/multi-instance applications are not easily set up with RQ.
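For the webhook case in the question above, a minimal sketch of how this could look with Celery: the webhook view only enqueues the slow GET request and returns right away, and a worker does the rest. The task name fetch_details, the third-party URL, and the commented-out Item model are placeholders, and it assumes a Celery app and worker are already configured next to Django.

# tasks.py -- minimal sketch; URL, task name and model are placeholders
import requests
from celery import shared_task

@shared_task(bind=True, max_retries=3, default_retry_delay=10)
def fetch_details(self, item_id):
    """Fetch the remaining fields from the third-party API and store them."""
    try:
        resp = requests.get(f"https://thirdparty.example.com/api/items/{item_id}", timeout=10)
        resp.raise_for_status()
    except requests.RequestException as exc:
        raise self.retry(exc=exc)  # back off and let Celery try again later
    # Item.objects.update_or_create(external_id=item_id, defaults=resp.json())

# views.py -- the webhook only enqueues work and responds immediately
from django.http import JsonResponse

def webhook(request):
    item_id = request.POST.get("id")
    fetch_details.delay(item_id)  # non-blocking; the worker performs the GET
    return JsonResponse({"status": "accepted"})  # well within the 5-second limit

The same shape works with rq-scheduler or django-triggers; the important part is that the webhook response does not wait for the third-party API.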

Related

Python Flask: How to add requests to a queue and rearrange the tasks in the queue before executing

A bit of context:
I want to write an algorithm that accepts tickets from the client, sorts them by some constraints, handles them, and replies back to the client with the results.
I did some research and thought a Python REST API would be a good idea, but as I explored it I found out that it is usually built to handle one request at a time.
Is there a way to add tasks (REST API requests) to a queue, sort them, execute them with workers, and reply back to clients once processing is done?
I can suggest three ways to do that.
1. Use a database to store the request content, the constraint, and a status of 'pending'. Later, when you want to trigger processing of the requests, just retrieve them sorted by your constraint and update the status to 'processed'.
2. You can use a Redis task queue with Flask (a rough sketch of this option follows below). See this article: https://realpython.com/flask-by-example-implementing-a-redis-task-queue/
3. You can also use the Celery module with Flask. See the documentation: https://flask.palletsprojects.com/en/1.1.x/patterns/celery/
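A rough sketch of the second option using RQ (Redis Queue) with Flask. The handle_ticket function, the routes, and the ticket fields are placeholders, and it assumes a Redis server plus an rq worker process are running.

# Rough RQ sketch; handle_ticket, routes and ticket fields are placeholders
from flask import Flask, jsonify, request
from redis import Redis
from rq import Queue

app = Flask(__name__)
queue = Queue(connection=Redis())  # needs a running Redis server and an "rq worker"

def handle_ticket(ticket):
    # the real sorting/processing logic goes here
    return {"ticket_id": ticket.get("id"), "result": "done"}

@app.route("/tickets", methods=["POST"])
def submit_ticket():
    ticket = request.get_json()
    job = queue.enqueue(handle_ticket, ticket)  # picked up asynchronously by a worker
    return jsonify({"job_id": job.get_id()}), 202

@app.route("/tickets/<job_id>", methods=["GET"])
def ticket_status(job_id):
    job = queue.fetch_job(job_id)
    if job is None:
        return jsonify({"error": "unknown job"}), 404
    return jsonify({"finished": job.is_finished, "result": job.result})

If you need to reorder work before it runs, the first option (storing pending requests in a database and pulling them out sorted by your constraint) is usually easier than rearranging an in-flight queue.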

Long-running request in Flask with React

I am building an application in Flask API and React.
The first page of the app presents the user with an upload file form. The user selects a file (700 MB) and clicks upload.
Once this is done, the backend:
Takes the file and unzips it
Runs an ML model
Returns JSON containing the right data
When this is over, React gets the JSON and renders a new page.
These three steps take more than 10 minutes, so I get a 500 error, which I believe is due to the long request timing out.
I would like to know if there is a way to set timeout=None.
I looked for some answers and they suggest using Celery. However, I am not sure if this is the right approach for my task.
I second @TheIncorrigible's suggestion to solve this with some kind of event-driven architecture; what you are doing is a web worker architecture.
Your problem reminds me of the AWS service called Control Tower, where launching the landing zone takes more than 10 minutes and AWS handles it gracefully. When you try to launch it, a banner says it is in progress and would take 1 hour. In the console log I noticed they were using Promises (I am not exactly sure how they achieve this or how long it can handle).
Maybe you could try using Promises in React for asynchronous computations. I am no expert, but it looks like you can achieve this with them. You may watch a short video for a basic understanding.
There is also SignalR, which allows server code to send asynchronous notifications to client-side web applications. You can check whether it can be applied in your case (see the SignalR in Python discussion).
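If Celery does turn out to be the right tool, the usual pattern is not to remove the timeout but to accept the upload, return a job id immediately, and have React poll a status endpoint until the result is ready. A loose sketch; the broker/backend URLs, routes, and the task body are placeholders.

# Loose sketch: accept the upload, offload the work, let React poll for the result.
from celery import Celery
from flask import Flask, jsonify, request

app = Flask(__name__)
celery = Celery(__name__, broker="redis://localhost:6379/0",
                backend="redis://localhost:6379/0")  # assumed broker/backend

@celery.task
def process_upload(path):
    # unzip the file, run the ML model, build the result JSON ...
    return {"status": "ok", "path": path}

@app.route("/upload", methods=["POST"])
def upload():
    f = request.files["file"]
    path = "/tmp/" + f.filename  # placeholder storage location
    f.save(path)
    task = process_upload.delay(path)  # hand off to the worker, don't block
    return jsonify({"task_id": task.id}), 202

@app.route("/result/<task_id>")
def result(task_id):
    res = process_upload.AsyncResult(task_id)
    if res.ready():
        return jsonify({"state": res.state, "data": res.result})
    return jsonify({"state": res.state}), 202  # React keeps polling until done

On the React side this maps naturally onto Promises: fetch /upload, then poll /result/<task_id> on an interval until the state is SUCCESS.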

Django: how best to perform API requests for large jobs

I need some direction as to how to achieve the following functionality using Django.
I want my application to enable multiple users to submit jobs to make calls to an API.
Each user job will require multiple API calls and will store the results in a db or a file.
Each user should be able to submit multiple jobs.
In case of a failure, such as the network being blocked or the API not returning results, I want the application to pause for a while and then resume completing that job.
Basically, I want the application to pick up from where it left off.
Any ideas on how I could implement this, any technologies such as Celery I should be looking at, or even an open-source project where I can learn how to do this, would be a great help.
You can do this with RabbitMQ and Celery.
This post might be helpful.
https://medium.com/@ffreitasalves/executing-time-consuming-tasks-asynchronously-with-django-and-celery-8578eebab356
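As a loose sketch of the pause-and-resume behaviour with Celery's built-in retry support: the API calls, the commented-out result model, and the retry numbers are placeholders.

# Sketch of a resumable job task; API calls, model and retry settings are placeholders
import requests
from celery import shared_task

@shared_task(
    bind=True,
    autoretry_for=(requests.RequestException,),  # retry on network/API failures
    retry_backoff=60,                            # wait, then retry with exponential backoff
    retry_kwargs={"max_retries": 10},
)
def run_job(self, job_id, api_urls):
    for url in api_urls:
        # Skip calls that already succeeded so a retried task resumes where it left off.
        # if ApiResult.objects.filter(job_id=job_id, url=url).exists():
        #     continue
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        # ApiResult.objects.create(job_id=job_id, url=url, payload=resp.json())

Each submitted job becomes one run_job call queued with run_job.delay(...); because completed API calls are recorded, a retried task picks up roughly where it left off.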

integrating django with aiohttp/asyncio

I want to integrate Django with aiohttp/asyncio for asynchronous programming and for websocket handling.
I know Django has Celery and django-channels for asynchronous tasks and a websocket server respectively, but aiohttp has both an asynchronous server and websockets built in, and I found that framework more scalable and easier than Celery/django-channels while writing a web-scraping function (I don't know whether web scraping is possible in Celery, I haven't tried it yet).
And it also supports async and await perfectly.
But my question is: how can we use both Django and aiohttp in one project? Instead of using Django's development server, can we use the aiohttp server to serve the site?
And can we integrate Django with aiohttp functions? For example, if I want to scrape a website from user-submitted input, can I use await calls in my function while fetching the website and then save the result to my Django database, or pass the function's results to another Django function?
I would also like to know the disadvantages of integrating them, if any.
And when posting your answer, could you please include a practical example of the integration instead of just suggesting libraries on GitHub?
Maybe it is time to consider Django >= 4.1, which already has built-in asynchronous support.
From the docs:
Django has support for writing asynchronous (“async”) views, along with an entirely async-enabled request stack if you are running under ASGI. Async views will still work under WSGI, but with performance penalties, and without the ability to have efficient long-running requests.
We’re still working on async support for the ORM and other parts of Django. You can expect to see this in future releases. For now, you can use the sync_to_async() adapter to interact with the sync parts of Django. There is also a whole range of async-native Python libraries that you can integrate with.
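As a rough illustration (not from the docs), an async Django view can await an aiohttp request directly and reach the still-synchronous ORM through sync_to_async; the scraped URL and the commented-out Page model are placeholders.

# Rough sketch of an async Django view (Django >= 3.1 under ASGI);
# the default URL and the Page model are placeholders.
import aiohttp
from asgiref.sync import sync_to_async
from django.http import JsonResponse

async def scrape(request):
    url = request.GET.get("url", "https://example.com")
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            body = await resp.text()
    # The ORM is synchronous, so wrap it:
    # await sync_to_async(Page.objects.create)(url=url, html=body)
    return JsonResponse({"url": url, "length": len(body)})

Run it under an ASGI server (for example uvicorn or daphne) rather than a WSGI setup to get the full benefit of the async request stack.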

How can I have a Python module run asynchronously and receive calls from other modules?

So I'm currently working on adding a recommendation engine to a Django project and need to do some heavy processing (in an external module) for one of my view functions. This significantly slows down page load time because I have to load in some data, transform it, perform my calculations based on parameters sent by the request, and then return the suggestions to the view. This has to be done every time the view is loaded.
I was wondering if there was some way I could have the recommender module load and transform the data in memory, and then wait for parameters to be sent from the view, have calculations run on those parameters and then send it back to the view.
Any help would be greatly appreciated.
Celery is a task queue that really excels at this sort of thing.
It would allow you to do something like:
user makes request to view
view starts an async task that does the heavy lifting, then returns to the user immediately
you can poll from JavaScript to see if your task is done and load the results when it is
It might not be exactly the flow you're looking for, but Celery is definitely worth checking out.
Celery has a great Django package too, and it is extremely easy to use.
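A loose sketch of that flow in Django with Celery; the task body, the view names, and the polling URL are placeholders.

# Loose sketch of the start-task-then-poll flow; task body and views are placeholders
from celery import shared_task
from django.http import JsonResponse

@shared_task
def recommend(user_id, params):
    # load data, transform it, run the heavy calculation ...
    return {"user_id": user_id, "suggestions": []}

def start_recommendation(request):
    task = recommend.delay(request.user.id, dict(request.GET))
    return JsonResponse({"task_id": task.id})  # returns to the user immediately

def recommendation_status(request, task_id):
    result = recommend.AsyncResult(task_id)
    if result.ready():
        return JsonResponse({"done": True, "data": result.result})
    return JsonResponse({"done": False})  # JavaScript keeps polling this endpoint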
Rereading your question, I think it would also be possible to create a local web service around your recommendation engine. On startup it can load all the data into memory, and then you can just make requests to it from your Django app.
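For that variant, a tiny sketch of a standalone service (Flask here purely as an example) that loads and transforms the data once at startup and then answers recommendation requests over HTTP; the loading and scoring functions are placeholder stubs.

# recommender_service.py -- standalone sketch; loading and scoring are placeholder stubs
from flask import Flask, jsonify, request

def load_and_transform_data():
    # Placeholder: read files / fit the model once at process start.
    return {"matrix": [[0.1, 0.9], [0.8, 0.2]]}

def score(data, user_id):
    # Placeholder: compute suggestions for this user from the in-memory data.
    return {"user_id": user_id, "suggestions": [1, 2, 3]}

app = Flask(__name__)
DATA = load_and_transform_data()  # loaded once, stays in memory between requests

@app.route("/recommend")
def recommend_route():
    user_id = request.args.get("user_id")
    return jsonify(score(DATA, user_id))

if __name__ == "__main__":
    app.run(port=5001)

The Django view then just calls requests.get("http://localhost:5001/recommend", params={"user_id": user.id}) instead of recomputing everything on each page load.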
