I have a Django app with a Postgres database containing a DateField that holds the date on which a message (SMS or email) should be sent. I would like to schedule the delivery somehow (basically, run a function with parameters on that date). Everything runs on AWS Lambda.
I read Django - Set Up A Scheduled Job?, but I am wondering whether there is a strictly AWS solution, or perhaps something better than https://aws.amazon.com/solutions/instance-scheduler/.
Thanks!
In the end I used AWS CloudWatch, which runs a Lambda that pings my Django endpoint every [n] minutes.
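For reference, a minimal sketch of what such an endpoint could look like. The Message model, its fields, and the deliver() method are assumptions for illustration, not from my actual project:

    # views.py - the endpoint the CloudWatch-triggered Lambda pings.
    # Assumes a hypothetical Message model with send_date (DateField),
    # sent (BooleanField) and a deliver() method holding the SMS/email logic.
    from django.http import JsonResponse
    from django.utils import timezone

    from .models import Message

    def deliver_due_messages(request):
        due = Message.objects.filter(send_date__lte=timezone.now().date(), sent=False)
        delivered = 0
        for message in due:
            message.deliver()
            message.sent = True
            message.save(update_fields=["sent"])
            delivered += 1
        return JsonResponse({"delivered": delivered})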
I have a Django application running on Cloud Run at Google Cloud Platform.
I need to schedule a task to run every day. The task should go through a table in the database and remove rows that exceed today's date.
I have looked into Cloud Functions; however, when I try to create one, it appears to only support Flask, not Django.
Any ideas on how to proceed to make this scheduled function and access the database?
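One possible direction, sketched below: wrap the cleanup in a Django management command and trigger it on a schedule, e.g. from Cloud Scheduler. The Entry model and its date field are placeholder names, not from the question:

    # myapp/management/commands/purge_expired.py - hypothetical cleanup command.
    # Entry and its date field are placeholder names; adjust the filter
    # direction to whatever "exceeds today's date" means for your table.
    from django.core.management.base import BaseCommand
    from django.utils import timezone

    from myapp.models import Entry

    class Command(BaseCommand):
        help = "Remove rows whose date has passed"

        def handle(self, *args, **options):
            deleted, _ = Entry.objects.filter(date__lt=timezone.now().date()).delete()
            self.stdout.write(f"Deleted {deleted} expired rows")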
I have a Django project where I am using Celery with RabbitMQ to perform a set of asynchronous tasks. The setup I have planned goes like this:
Django app running on one server.
Celery workers and RabbitMQ running on another server.
My initial issue: how do I access Django models from the Celery tasks sitting on another server?
And assuming I am not able to access the Django models, is there a way, once a task completes, to send a callback to the Django application passing values, so that I can update Django's database based on the values passed?
Concerning your first question, accessing Django models from the workers' server:
Your Django app must be available on both Server A (serving users) and Server B (hosting the Celery workers).
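In practice that means the same Celery app definition ships with the Django project to both servers; a rough sketch, where the project name proj and the broker URL are assumptions:

    # proj/celery.py - the standard Celery/Django integration, deployed unchanged
    # on Server A and Server B. The broker URL below is an assumption.
    import os

    from celery import Celery

    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj.settings")

    app = Celery("proj", broker="amqp://user:password@rabbitmq-host//")
    app.config_from_object("django.conf:settings", namespace="CELERY")
    app.autodiscover_tasks()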
Concerning your second question, updating the database based on the values: do you mean the result of the async task? If so, you have two options:
You can just save whatever you need to save from within the task itself, assuming you have access to the database (see the sketch after this list).
You could use a results backend (one of which works through the Django ORM), as mentioned in the official Celery documentation about Keeping Results.
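A minimal sketch of the first option (the Report model and its field names are assumptions); for the second option, the django-celery-results package provides the ORM backend via CELERY_RESULT_BACKEND = "django-db":

    # tasks.py - the task writes whatever it needs straight to the database.
    # Report and its result field are placeholder names for illustration.
    from celery import shared_task

    from myapp.models import Report

    @shared_task
    def summarize(report_id):
        report = Report.objects.get(pk=report_id)
        report.result = f"processed report {report.pk}"  # stand-in for real work
        report.save(update_fields=["result"])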
I've used the following setup in my application:
Task is initiated from Django - information is extracted from the model instance and passed to the task as a dictionary. NB - this is more future-proof, as Celery 4 will default to JSON encoding.
Remote server runs the task and creates a dictionary of results.
Remote server then calls an update task that is only listened for by a worker on the Django server.
Django worker reads the results dictionary and updates the model.
The Django worker listens to a separate queue, though this isn't strictly necessary. A results backend isn't used - the data needed is just passed to the task. A sketch of this pattern follows below.
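Roughly, the two tasks could look like this (all names are illustrative); the Django-side worker is started with celery -A proj worker -Q django_updates so it alone consumes that queue:

    # tasks.py - shared by both servers; which worker runs what is decided
    # by the queues each worker listens to.
    from celery import shared_task

    @shared_task
    def heavy_task(data):
        # Runs on the remote server; works only on the plain dict it was given.
        results = {"item_id": data["item_id"], "status": "done"}
        # Hand the results to the queue only the Django-side worker consumes.
        update_model.apply_async(args=[results], queue="django_updates")

    @shared_task
    def update_model(results):
        # Runs on the Django server, where the database is reachable.
        from myapp.models import Item  # placeholder model
        Item.objects.filter(pk=results["item_id"]).update(status=results["status"])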
In my Django project, I am using Celery to run a periodic task that checks a URL that responds with JSON and updates my database with some elements from that JSON.
Since requests to the URL are rate-limited, the total process of updating the whole database with my task takes about 40 minutes, and I will run the task every 2 hours.
If I open a view of my Django project that also requests information from the database while the task is running asynchronously in the background, will I run into any problems?
While requesting information from your database you are reading it, and your Celery task is writing to it. Reads don't block other reads, and databases such as PostgreSQL use MVCC and row-level locks, so readers and writers don't block each other either; your view will simply see the most recently committed data.
The only time you are going to run into issues using the database with Celery is if you use the database as the broker, because the workers will continuously poll it for tasks. If you use a normal broker (such as your RabbitMQ) you should not have issues.
I've been configuring and troubleshooting some Django auth issues with a custom backend.
One thing I have noticed is that once a session's expiry date has passed (confirmed via Session.objects.all()), the session remains in the table.
At the point that I have to re-authenticate, it creates another entry, creating a situation where a single user can have tons of sessions in the table rather than just one.
Is there a simple way of getting Django to clear these out at the point of them expiring?
Thanks,
From the official documentation:
Django does not provide automatic purging of expired sessions. Therefore, it’s your job to purge expired sessions on a regular basis. Django provides a clean-up management command for this purpose: clearsessions. It’s recommended to call this command on a regular basis, for example as a daily cron job.
Use something like this:
python manage.py clearsessions
...and schedule it to run regularly.
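For example, a daily crontab entry (the interpreter and project paths are placeholders; adjust them to your environment):

    # run every day at 04:00
    0 4 * * * /path/to/venv/bin/python /path/to/project/manage.py clearsessions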
I work on a page in Django where users can set custom reminders for different dates (max. 3 per date). The reminders should be sent via e-mail. It's similar to Google Calendar, where you can set multiple reminders for each event, x minutes, x hours, or x days before the event starts.
I wonder how I can solve this with Django. There will be a lot of users and dates, so it should of course also perform well.
Should I do this with a cron job? Is there a Python way?
Another traditional way is to use django-celery: http://pypi.python.org/pypi/django-celery/
You can use the celerybeat command to run periodic tasks. You can also start pending tasks from a Django view.
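With current Celery versions the periodic schedule can be declared on the app itself; a sketch, where the task name and interval are assumptions:

    # proj/celery.py - beat schedule sketch; run it with: celery -A proj beat
    from celery import Celery
    from celery.schedules import crontab

    app = Celery("proj")  # assumes the usual Django integration

    app.conf.beat_schedule = {
        "send-due-reminders": {
            "task": "reminders.tasks.send_due_reminders",  # hypothetical task
            "schedule": crontab(minute="*/5"),  # look for due reminders every 5 min
        },
    }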
You can use a cron job. To create a management command, refer to the documentation here.
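A sketch of such a management command for the reminder use case (the Reminder model, its fields, and send_email() are assumptions):

    # reminders/management/commands/send_reminders.py - called from cron.
    # Reminder, remind_at, sent and send_email() are placeholder names.
    from django.core.management.base import BaseCommand
    from django.utils import timezone

    from reminders.models import Reminder

    class Command(BaseCommand):
        help = "Send reminder e-mails that are due"

        def handle(self, *args, **options):
            due = Reminder.objects.filter(remind_at__lte=timezone.now(), sent=False)
            sent_count = 0
            for reminder in due:
                reminder.send_email()
                reminder.sent = True
                reminder.save(update_fields=["sent"])
                sent_count += 1
            self.stdout.write(f"Sent {sent_count} reminders")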
Also, you can make the email generation a queue-based, distributed implementation for better performance. You can use the django-mailer app for this.