I have a task to build a task server, and I have decided to use Celery.
My idea is this:
Build a Celery worker server on Machine 1.
A web cluster consisting of several web servers running Django.
There are some tasks that the Django websites have to tell the Celery server on Machine 1 to run.
For example:
When a new user registers, the Django code should somehow call the Celery worker to send an email.
I have read the Celery documentation, but I cannot find anything that shows how to send a "send email" task to Machine 1 and ask Machine 1 to send the email.
Any ideas?
Thank you very much.
When you use Django and Celery together, take a look at django-celery:
http://docs.celeryproject.org:8000/en/latest/django/
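For the concrete "send email" case in the question, a minimal sketch could look like the following. The broker URL, module layout, and task name (send_registration_email) are assumptions, not details from the question, and it assumes Machine 1 has Django's email settings configured.

```python
# tasks.py -- lives on Machine 1, where the Celery worker runs.
from celery import Celery
from django.core.mail import send_mail  # assumes Django settings are configured on Machine 1

# Hypothetical broker URL; both the web servers and Machine 1 must be able to reach it.
app = Celery('tasks', broker='amqp://guest@broker-host//')

@app.task
def send_registration_email(user_email):
    # Executed by the worker on Machine 1, which actually sends the mail.
    send_mail(
        subject='Welcome!',
        message='Thanks for registering.',
        from_email='noreply@example.com',
        recipient_list=[user_email],
    )
```

On the Django side, the view only publishes a message to the broker and returns immediately, e.g. `send_registration_email.delay(new_user.email)`; the worker on Machine 1 picks it up and sends the email.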
Problem
Please tell me how to run Django tasks in parallel. For example, I have 3 accounts sending 3 forms at the same time, one form per account, and I want the forms to be sent simultaneously in the background. Currently, I'm sending the forms with Celery in the background; I just want them to run in parallel instead of in a queue.
Info
When I send a form, it sends private messages using Selenium to many users from a CSV file.
Without Celery
It works great without Celery, and the forms send in parallel, but I'm trying to get them to work in the background due to heavy load on the server.
I tried running the server with Daphne (ASGI instead of WSGI), but that didn't help.
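For context, one way such parallel dispatch is commonly expressed with Celery is a group, shown in the sketch below; the task name (send_form) and the concurrency setting are hypothetical, since the question doesn't include the actual code.

```python
# A sketch of dispatching several form-sending tasks at once with a Celery group.
from celery import group
from myapp.tasks import send_form  # hypothetical import path and task

def send_all_forms(account_ids):
    # group() publishes every task immediately; they run in parallel as long as
    # the worker has enough concurrency, e.g. `celery -A proj worker --concurrency=3`.
    job = group(send_form.s(account_id) for account_id in account_ids)
    job.apply_async()
```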
I am having trouble finding an answer to this question.
I am looking for a framework where I can send a Python function to a REST endpoint and have it send a response upon completion.
If there is no pre-canned solution, what would be a good set of packages to use to do this?
Maybe you could try using SocketIO with Celery.
SocketIO is an option for creating a channel between clients (web, mobile apps, etc.) and the server.
When a specific message is received (you can create your own protocol), you can call a task in a message queue using the Celery framework with Redis or RabbitMQ.
You may also need another library called Kombu to be able to send a message over the SocketIO connection.
You can find some tips at this link: http://python-socketio.readthedocs.io/en/latest/
I hope this helps.
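A rough sketch of this idea, assuming python-socketio on the server and a Redis broker; the event name, task name, and payload shape are all hypothetical:

```python
import socketio
from celery import Celery

sio = socketio.Server()
celery_app = Celery('worker', broker='redis://localhost:6379/0')  # broker URL is an assumption

@celery_app.task
def process_message(payload):
    # The heavy work runs in the Celery worker, not in the socket handler.
    print('processing', payload)

@sio.on('client_message')          # hypothetical event name
def handle_client_message(sid, data):
    # Hand the work off to the queue and acknowledge immediately.
    process_message.delay(data)
    sio.emit('ack', {'status': 'queued'}, room=sid)
```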
I have a Django project where I am using Celery with RabbitMQ to perform a set of async tasks. The setup I have planned goes like this:
Django app running on one server.
Celery workers and RabbitMQ running on another server.
My initial issue is: how do I access Django models from the Celery tasks running on another server?
And assuming I am not able to access the Django models, is there a way, once a task completes, to send a callback to the Django application passing values, so that I can update Django's database based on the values passed?
Concerning your first question, accessing Django models from the workers' server:
Your Django app must be available on both Server A (serving users) and Server B (hosting the Celery workers).
Concerning your second question, updating the database based on the values: do you mean the result of the async task? If so, you have two options:
You can just save whatever you need to save from within the task itself, assuming you have access to the database.
You could use a result backend (one option is through the Django ORM), as mentioned in the official Celery documentation on Keeping Results.
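A minimal sketch of the first option, assuming the Django project (and its database settings) is installed on Server B as well; the model, field names, and helper function are hypothetical:

```python
from celery import shared_task
from myapp.models import Report  # hypothetical model

@shared_task
def generate_report(report_id):
    result = do_heavy_work(report_id)  # hypothetical helper doing the actual work
    # The worker writes straight to the shared database via the Django ORM.
    Report.objects.filter(pk=report_id).update(status='done', payload=result)
```

For the second option, one possibility is django-celery-results with CELERY_RESULT_BACKEND = 'django-db', which stores each task's return value in the Django database instead.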
I've used the following setup in my application:
The task is initiated from Django; information is extracted from the model instance and passed to the task as a dictionary. NB: this is more future-proof, as Celery 4 will default to JSON encoding.
The remote server runs the task and creates a dictionary of results.
The remote server then calls an update task that is only listened for by a worker on the Django server.
The Django worker reads the results dictionary and updates the model.
The Django worker listens on a separate queue, though this isn't strictly necessary. The results backend isn't used; the data needed is just passed to the task.
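A sketch of that callback pattern; the task names, queue name, and model are hypothetical, and it assumes the Django-side worker is started with something like `celery -A proj worker -Q django_updates`:

```python
from celery import shared_task

@shared_task
def run_remote_job(data):
    # Runs on the remote worker; `data` is a plain dict extracted from the model instance.
    results = {'instance_id': data['instance_id'], 'outcome': 'ok'}
    # Call back to the worker sitting next to Django on its dedicated queue.
    update_model_from_results.apply_async(args=[results], queue='django_updates')

@shared_task
def update_model_from_results(results):
    # Only the worker on the Django server consumes 'django_updates',
    # so it can safely update the model through the ORM here.
    from myapp.models import MyModel  # hypothetical model
    MyModel.objects.filter(pk=results['instance_id']).update(status=results['outcome'])
```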
In my Django project, I am using Celery to run a periodic task that checks a URL that responds with JSON and updates my database with some elements from that JSON.
Since requests to the URL are limited, the total process of updating the whole database with my task takes about 40 minutes, and I will run the task every 2 hours.
If I open a view of my Django project that also requests information from the database while the task is running asynchronously in the background, will I run into any problems?
When requesting information from your database you are reading it, and in your Celery task you are writing data into it. Writes happen one at a time, but you can read as many times as you want, as reading does not lock the database.
The only time you are going to run into issues using the database with Celery is when you use the database as the broker for Celery, because it will continuously poll the database for tasks. If you use a normal broker, you should not have issues.
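A minimal settings sketch of that advice, assuming the usual `app.config_from_object('django.conf:settings', namespace='CELERY')` setup; host names and credentials are placeholders:

```python
# Use a real message broker (RabbitMQ here) rather than the database,
# so Celery never polls the database for tasks.
CELERY_BROKER_URL = 'amqp://user:password@rabbitmq-host:5672//'
# Optionally store task results in the Django database (via django-celery-results);
# this is separate from the broker and does not involve polling for tasks.
# CELERY_RESULT_BACKEND = 'django-db'
```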
I have a use case where I have to send an email to a user in my views. The user who submitted the form will not receive an HTTP response until the email has been sent. I do not want to make the user wait on send_mail, so I want to send the mail asynchronously without worrying about email errors. I am currently using Celery to send mail asynchronously, but I have read that it may be overkill for simple tasks like this. How can I achieve this without using Celery?
I'm assuming you don't want to wait because you are using an external service (outside of your control) for sending email. If that's the case, then set up a local SMTP server as a relay. Many services, such as Amazon SES, SendGrid, and Mandrill/Mailchimp, have directions on how to do it. The application will only have to wait on delivery to localhost (which should be fast and is within your control). The final delivery is forwarded on asynchronously with respect to the request/response. SMTP servers are already built to handle delivery failures with retries, which is what you might otherwise gain by moving to Celery.
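A minimal sketch of the Django side of this setup, assuming a local relay (e.g. Postfix forwarding to SES/SendGrid) listening on localhost port 25:

```python
# settings.py
EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
EMAIL_HOST = 'localhost'   # the local relay accepts the message quickly...
EMAIL_PORT = 25
# ...and forwards it to the external provider in the background, retrying on failure,
# so send_mail() returns as soon as localhost has accepted the message.
```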