I have a Django app where users upload many images. What I want is to trigger a management command, using the call_command function, once a new object is created, but I want to do it asynchronously: I don't want to keep the user waiting for the management command to finish. Yes, I am aware of third-party services like Celery, but I want to keep it simple. I also know that I can schedule a cron job, but I want the change to be reflected instantly. Is there any other way to do this?
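One lightweight way to do this, hinted at by the question itself, is to fire call_command from a plain background thread inside a post_save signal. A minimal sketch, where the Image model, the process_images command, and its image_id option are all hypothetical; note a daemon thread dies with the worker process and nothing retries failures:

import threading

from django.core.management import call_command
from django.db.models.signals import post_save
from django.dispatch import receiver

from .models import Image  # hypothetical model


@receiver(post_save, sender=Image)
def trigger_command(sender, instance, created, **kwargs):
    if not created:
        return
    # Run the management command in a daemon thread so the request
    # returns immediately instead of waiting for the command to finish.
    threading.Thread(
        target=call_command,
        args=("process_images",),          # hypothetical command name
        kwargs={"image_id": instance.pk},  # hypothetical command option
        daemon=True,
    ).start()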
I have two servers: a primary server that provides a REST API to accept data from users and maintains a product details list. This server is also responsible for sharing the product list (a subset of the product data) with a secondary server as soon as a product is updated or created.
Also note that the secondary URL depends on the product details; it is not a fixed server.
The primary server is written in Django. I have used Django model DB signals to catch product update, create, and delete events.
Now the problem is that I don't want to block my primary server's REST call while it pushes details to the secondary server. I need some scheduling mechanism for that, i.e. a task that populates the data in the background without blocking the current thread.
I found that Python's asyncio module comes with a function, run_in_executor, and it has been working so far (see the sketch below). But I don't know its side effects in a Django app running under a WSGI server. Can anyone explain, or suggest an alternative?
I found Django Channels, but it needs extra infrastructure, such as a separately running worker and a Redis cache.
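For reference, the run_in_executor pattern described above boils down to submitting the callable to a thread pool; under a sync WSGI server there is no running event loop, so a plain ThreadPoolExecutor achieves the same thing with fewer moving parts. A hedged sketch, where the Product model, the secondary_url() and as_payload() helpers, and pushing data over plain HTTP with requests are all assumptions:

from concurrent.futures import ThreadPoolExecutor

import requests  # assumption: product data is pushed over plain HTTP

from django.db.models.signals import post_save
from django.dispatch import receiver

from .models import Product  # hypothetical model

# Module-level pool: threads are reused across requests, but they die
# with the WSGI worker process, so in-flight pushes are lost on restart.
_executor = ThreadPoolExecutor(max_workers=4)


def _push(url, payload):
    try:
        requests.post(url, json=payload, timeout=10)
    except requests.RequestException:
        # Nothing is retrying this; log and move on in a real setup.
        pass


@receiver(post_save, sender=Product)
def share_product(sender, instance, **kwargs):
    # Hypothetical helpers: the secondary URL is derived from the product.
    _executor.submit(_push, instance.secondary_url(), instance.as_payload())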
You should use Celery for running tasks asynchronously or in the background.
Celery is a task queue with batteries included. It's easy to use, so you can get started without learning the full complexity of the problem it solves.
You can get more information on Celery from http://docs.celeryproject.org/en/latest/getting-started/first-steps-with-celery.html#first-steps
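From that first-steps guide, the minimal shape looks roughly like this; the Redis broker URL and the task body are placeholders:

# tasks.py
from celery import Celery

# Assumes a Redis broker on localhost; any broker Celery supports works.
app = Celery("tasks", broker="redis://localhost:6379/0")


@app.task
def sync_product(product_id):
    ...  # push the product to the secondary server


# Caller side: returns immediately, a Celery worker does the work.
# sync_product.delay(42)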
I've got a Django project with a simple form to take in users' details. I want a Python bot running in the background, constantly checking the Django database for any changes. Is Celery the right tool for this job? Any other solutions? Thank you.
I don't think Celery is really what you want here. Celery is primarily for moving tasks that don't need to be dealt with in the same process to a separate worker, such as sending registration emails.
For this situation I'd be inclined to use Django's signals to trigger the required functionality whenever the appropriate changes are made to the database. For instance, if it needed to be triggered when a particular type of object was created, such as a new user, then you might use the post_save signal of the user model.
The bot would be in a separate process, but it's not too hard to communicate between processes using Redis. Just have the signal publish a message to Redis, and have the bot listen for that message and carry out the required action on that event.
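A minimal sketch of that signal-to-Redis handoff using redis-py, where the UserDetails model and the channel name are assumptions:

# signals.py (Django side)
import json

import redis

from django.db.models.signals import post_save
from django.dispatch import receiver

from .models import UserDetails  # hypothetical model

r = redis.Redis()


@receiver(post_save, sender=UserDetails)
def announce_change(sender, instance, created, **kwargs):
    # Publish a small event; the bot decides what to do with it.
    r.publish("user-changes", json.dumps({"pk": instance.pk, "created": created}))


# bot.py (separate process)
import json

import redis

r = redis.Redis()
p = r.pubsub()
p.subscribe("user-changes")
for message in p.listen():
    if message["type"] == "message":
        event = json.loads(message["data"])
        # ...carry out the bot's action for this event...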
I don't have the details of your needs, but there are a few ways to achieve such things:
The constantly-checking approach:
A crontab entry that launches your Python script every minute.
As you said, you could use Celery beat to achieve what a crontab would do, within your Python environment.
The "on change" approach:
Probably the best, if you have control of the Django project: have your script run on form validation/save. For this, you can add a Celery task, run the Python script, or use Django signals (see the sketch below).
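For the "on change" approach without Celery, one hedged sketch is a post_save signal that launches the external script as a detached child process; the Submission model and the script path are placeholders:

import subprocess
import sys

from django.db.models.signals import post_save
from django.dispatch import receiver

from .models import Submission  # hypothetical model


@receiver(post_save, sender=Submission)
def run_script(sender, instance, **kwargs):
    # Popen returns immediately; the script runs as a separate process
    # and the web request is not blocked while it works.
    subprocess.Popen([sys.executable, "/path/to/your_script.py", str(instance.pk)])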
My current project contains quite a few custom commands inside an app that act as listeners on a bus, and each task is blocking, meaning each one has to run in its own process.
[bus]
consume_pay_transaction_completed
consume_pay_transaction_declined
consume_pay_transaction_failed
This makes development and testing difficult, because I have to run each command individually to test the workflow.
I am wondering how easy it would be to write a master command that treats the other ones as slaves, monitors their health, and respawns them if necessary. Are there any existing utilities/libraries in Django or Python to help me write a 'start_all' command?
[bus]
consume_pay_transaction_completed
consume_pay_transaction_declined
consume_pay_transaction_failed
start_all
The start_all command could be done with call_command.
Monitoring their health and respawning them if necessary sounds like a job for something like Celery.
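A minimal sketch of what such a start_all command might look like, using multiprocessing plus call_command with a naive respawn loop; the five-second poll interval and the restart policy are assumptions, not an established pattern:

# bus/management/commands/start_all.py
import multiprocessing
import time

from django.core.management import call_command
from django.core.management.base import BaseCommand

LISTENERS = [
    "consume_pay_transaction_completed",
    "consume_pay_transaction_declined",
    "consume_pay_transaction_failed",
]


def _run(name):
    call_command(name)


class Command(BaseCommand):
    help = "Run all bus listeners in child processes, respawning any that die."

    def handle(self, *args, **options):
        procs = {}
        while True:
            for name in LISTENERS:
                proc = procs.get(name)
                if proc is None or not proc.is_alive():
                    if proc is not None:
                        self.stderr.write("%s died, respawning" % name)
                    proc = multiprocessing.Process(target=_run, args=(name,))
                    proc.start()
                    procs[name] = proc
            time.sleep(5)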
I have code that deletes an API token when executed. Now I want it to execute after some delay, let's say two weeks. Any ideas or directions on how to implement this?
My code:
authtoken = models.UserApiToken.objects.get(api_token=token)
authtoken.delete()
This is inside a function and is executed when a request is made.
There are two main ways to get this done:
Make it a custom management command, and trigger it through crontab.
Use Celery: make it a Celery task, and schedule it to run 2 weeks later with apply_async (using a countdown or eta; Celery beat is meant for recurring schedules rather than one-off delays).
I would recommend Celery, as it provides better control over your task queues and jobs.
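A hedged sketch of the Celery route, scheduling a one-off deletion with apply_async and a countdown; the task name and the import path are assumptions:

from datetime import timedelta

from celery import shared_task

from myapp import models  # assumed import path


@shared_task
def delete_api_token(token):
    # filter().delete() is a no-op if the token is already gone,
    # unlike get(), which would raise DoesNotExist.
    models.UserApiToken.objects.filter(api_token=token).delete()


# In the request handler: schedule the deletion two weeks out.
# delete_api_token.apply_async(
#     args=[token],
#     countdown=int(timedelta(weeks=2).total_seconds()),
# )

One caveat: with some brokers, a message carrying a two-week countdown sits in the queue for the whole period, which is one reason the cron-based sweep is often preferred for very long delays.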
I need to make a bot that can automatically add a cron job for itself, but I don't think I can access the cron.yaml file on the GAE server. What can I do about this?
You could have the bot add the new schedule to your datastore instead.
Then create a single "master" cron job with a 1-minute schedule that checks the schedules you have stored in the datastore. The cron job then determines whether, at the current time, the handler for an associated schedule needs to be invoked.
If it does, the master cron job invokes the stored job using the Task Queue API.
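A rough sketch of that master cron handler on the legacy Python GAE standard environment; the BotSchedule model, the URL routes, and the delete-after-run policy are all hypothetical:

# cron.yaml would point a 1-minute schedule at /cron/master
from datetime import datetime

import webapp2
from google.appengine.api import taskqueue
from google.appengine.ext import ndb


class BotSchedule(ndb.Model):
    # Hypothetical schedule record written by the bot.
    handler_url = ndb.StringProperty()
    run_at = ndb.DateTimeProperty()


class MasterCron(webapp2.RequestHandler):
    def get(self):
        now = datetime.utcnow()
        due = BotSchedule.query(BotSchedule.run_at <= now).fetch()
        for sched in due:
            # Hand the actual work to the Task Queue API.
            taskqueue.add(url=sched.handler_url, method="POST")
            sched.key.delete()  # or compute and store the next run_at


app = webapp2.WSGIApplication([("/cron/master", MasterCron)])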
It's true that a lot of dev environments don't give you access to the cron.yaml file. However, you can run a Python script locally that communicates with your deployed program, edits your local copy of cron.yaml, and pushes up the changes.
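A hedged sketch of that local workflow, assuming PyYAML for editing the file and the gcloud CLI for deployment; the handler URL and schedule are placeholders:

import subprocess

import yaml  # PyYAML

CRON_FILE = "cron.yaml"

# Load the local copy (an empty or missing 'cron' list is fine).
with open(CRON_FILE) as f:
    config = yaml.safe_load(f) or {}

config.setdefault("cron", []).append({
    "description": "job added by the bot",
    "url": "/tasks/bot",             # hypothetical handler
    "schedule": "every 10 minutes",  # GAE cron schedule syntax
})

with open(CRON_FILE, "w") as f:
    yaml.safe_dump(config, f, default_flow_style=False)

# Push only the cron config to the deployed app.
subprocess.check_call(["gcloud", "app", "deploy", CRON_FILE, "--quiet"])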