Send PMs in Tandem (Parallel) using Django - python

Problem
Please tell me how to run Django tasks in parallel. I have 3 accounts sending 3 forms at the same time, one form per account, and I want the forms to be sent simultaneously in the background. Currently I'm sending the forms in the background with Celery; I just want them to run in parallel instead of one after another in a queue.
Info
When I send a form, it uses Selenium to send private messages to many users listed in a CSV file.
Without Celery
It works great without Celery, and the forms are sent in parallel, but I'm trying to move them to the background because of heavy load on the server.
I tried running the server with Daphne (ASGI instead of WSGI), but it didn't help.

Related

How to run python in backend of flask web app?

I'm building a web app using Flask. I've made a completely separate Python program and I want to run it in the backend of the web app. It is basically a program that takes in inputs, makes some calculations about scheduling, and sends the user an email.
I don't know where or how to integrate it into the web app.
Normally one would implement this with a message queue that passes jobs to a backend worker, or with a database that persists these calculation jobs for another backend program to pick up.
If you want to stay with Python, a task queue called Celery integrates well with Flask.

How to transfer Django's messages to Angular app

I am building a Django 2.2 + Node + Angular 8 app. Django is used to run a couple of simple scrapers when the user clicks the Search button. I want the user to be notified that the scraping process started successfully (if that's the case) and that the scraping is finished, or that some errors occurred. Some intermediate statuses are also desirable, but not mandatory. I thought about using django.contrib.messages, but I'm not sure how to make my Angular app receive them.
Can anybody advise me on this problem?
P.S.: not sure if it is important - I want to use Angular's snackbar to notify the user about scraping statuses.
There is only one common way to push messages from Django to Angular (from server to client): create a WebSocket.
Check out this tutorial to create a WebSocket that initiates a connection between Angular and Django, pushes notifications from Django to Angular, and finally closes the WebSocket.
In your case:
Open the WebSocket when starting the task
Send notifications about the task (Django -> NG)
Send a final notification and close the WebSocket
For other ideas check out this Medium article: Do you really need websockets?

Django and celery on different servers and celery being able to send a callback to django once a task gets completed

I have a Django project where I am using Celery with RabbitMQ to perform a set of async tasks. The setup I have planned goes like this:
Django app running on one server.
Celery workers and rabbitmq running from another server.
My initial issue: how do I access Django models from the Celery tasks sitting on another server?
And assuming I am not able to access the Django models, is there a way, once a task completes, to send a callback to the Django application passing values, so that I can update Django's database based on the values passed?
Concerning your first question, accessing Django models from the workers' server: your Django app must be available on both Server A (serving users) and Server B (hosting the Celery workers).
Concerning your second question, updating the database based on the values. Do you mean the result of the async task? If so, then you have two options:
You can just save whatever you need to save from within the task itself, assuming you have access to the database.
You could use a results backend (one of which is through the Django ORM) as mentioned in the official documentation of Celery about Keeping Results
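For the second option, the Django ORM results backend comes from the django-celery-results package (assuming you're willing to add that dependency and run its migrations); the configuration is roughly:

```python
# settings.py -- assumes django-celery-results is installed and migrated
INSTALLED_APPS = [
    # ... your existing apps ...
    "django_celery_results",
]
CELERY_RESULT_BACKEND = "django-db"  # task results stored via the Django ORM
```

With this in place, the Django side can read any task's result out of the database by task id, without the worker needing to call back explicitly.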
I've used the following set up on my application:
Task is initiated from Django - information is extracted from the model instance and passed to the task as a dictionary. NB: this is more future-proof, as Celery 4 defaults to JSON encoding.
Remote server runs task and creates a dictionary of results
Remote server then calls an update task that is only listened for by a worker on the Django server.
Django worker reads the results dictionary and updates the model.
The Django worker listens on a separate queue, though this isn't strictly necessary. The results backend isn't used - the data needed is just passed to the task.

Async Tasks for Django and Gunicorn

I have a use case where I have to send an email to the user from my views. Currently the user who submitted the form will not receive an HTTP response until the email has been sent, and I do not want to make the user wait on send_mail. So I want to send the mail asynchronously, without caring about email errors. I am using Celery for sending mail asynchronously, but I have read that it may be overkill for a simple task like this. How can I achieve this without using Celery?
I'm assuming you don't want to wait because you are using an external service (outside of your control) for sending email. If that's the case, then set up a local SMTP server as a relay. Many services such as Amazon SES, SendGrid, and Mandrill/Mailchimp have directions on how to do it. The application then only has to wait on delivery to localhost (which should be fast and is within your control), and the final delivery is forwarded on asynchronously to the request/response cycle. SMTP servers are already built to handle delivery failures with retries, which is what you might otherwise gain by moving to Celery.
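With a local relay in place, Django itself needs no async machinery - send_mail just hands the message to localhost and returns. A sketch of the settings, assuming the relay listens on the default port 25 (adjust to however your Postfix/relay is actually configured):

```python
# settings.py -- outgoing mail goes to a local SMTP relay, which forwards
# it on asynchronously; the host and port here are typical defaults.
EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
EMAIL_HOST = "localhost"
EMAIL_PORT = 25
```

The view's call to send_mail now only blocks for the local handoff, and the relay deals with the slow external delivery and any retries.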

Automatic page update on App Engine

I have an application that uses Python on App Engine. There is a service that updates the status of users; if an admin has a page open, I need it to update in real time. I know that App Engine has cron and task queues - what would be the correct way to handle this? Should I set an update flag in the models that triggers JavaScript?
The Channel API can be used to send real-time(ish) data to clients without the need for clients to poll the server.
