I am doing some tasks with tens of thousands of Active Directory objects that can take several minutes to load.
To speed things up I'd like to refresh this data into the SQLite database in the middle of the night (since there's no need for it to be current).
Is there a typical way to approach this type of problem? Perhaps have Django periodically run a function somehow?
You can write a Django management command and use cron or at to execute it (a minimal sketch is shown below),
or just use a Django cron library:
django-cron
django-crontab
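A minimal sketch of such a management command, assuming you already have code that pulls the AD objects and writes them to your models (the command name, file path, and helper are placeholders):

# yourapp/management/commands/refresh_ad_cache.py  -- placeholder path
from django.core.management.base import BaseCommand

# from yourapp.ad_sync import refresh_ad_objects  # hypothetical module holding your existing AD-loading code


class Command(BaseCommand):
    help = "Refresh the Active Directory data cached in the local SQLite database"

    def handle(self, *args, **options):
        self.stdout.write("Refreshing AD cache...")
        # Replace this with the code you already use to load the AD objects
        # and save them into your models:
        # refresh_ad_objects()
        self.stdout.write(self.style.SUCCESS("Done"))

You would then schedule python manage.py refresh_ad_cache to run nightly from cron (or via one of the libraries above).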
I'm working on a platform that uses Django and Django REST Framework. The platform is used to monitor the state of some sensors which send data to the computer.
Now that I have the data on the computer, I want to create a Python file that reads the data and stores it in a MySQL database, which Django will then read and show to the user.
I have two questions:
Can I make Django read from the file, save the data in the database, and show it in the user interface?
If not, how could I make a script run in the background when I start the project?
Can I make Django read from the file and save the data in the database?
Yes, you could create a management command for this which could be run from crontab, or create a sort of 'daemonized' version which keeps running but sleeps for X seconds between runs. This command reads the data and puts it into the database (a rough sketch is shown below).
show it in the user interface?
Yes, but I would advise against doing this sequentially (as in, don't put your 'data reading' in your view!). Your management command updates the database with the latest data, and your view only shows the latest data.
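A minimal sketch of such a management command, assuming a CSV data file and a SensorReading model (the app name, file path, and field names are assumptions to adapt to your project):

# yourapp/management/commands/import_sensor_data.py  -- placeholder path
import csv

from django.core.management.base import BaseCommand

from yourapp.models import SensorReading  # assumed model with timestamp/value fields


class Command(BaseCommand):
    help = "Read the sensor data file and store new readings in the database"

    def handle(self, *args, **options):
        # Assumed file location and CSV layout; adjust to your data format.
        with open("/path/to/sensor_data.csv") as f:
            for row in csv.DictReader(f):
                SensorReading.objects.update_or_create(
                    timestamp=row["timestamp"],
                    defaults={"value": row["value"]},
                )
        self.stdout.write(self.style.SUCCESS("Sensor data imported"))

Your DRF views then just query SensorReading as usual, so the UI always shows whatever the command last imported.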
You could also use this:
https://github.com/kraiz/django-crontab
It makes scheduling the command even simpler.
My problem is the following: I have a Ruby on Rails application that I would like to update in real time, second by second. However, I would not like to overload the database unnecessarily (because there are many users and a small server). I would like the Rails application to be notified in some way by the MySQL database when an update occurs in some table. Is that possible?
I have a Python script that can populate the MySQL tables with new data in real time.
I'm not sure what problem you're trying to solve. Won't the Rails app get the updated data from the DB on each request anyway? Are you caching the data? If that's the case, just have the Python script invalidate the cache.
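For example, if the Rails app happens to cache that data in Redis, the Python script could drop the cached key right after it writes the new rows; the key name and connection details here are pure assumptions:

import redis

# Assumes the Rails app caches the data under a known Redis key.
r = redis.Redis(host="localhost", port=6379, db=0)

def write_rows_and_invalidate(rows):
    # ... insert `rows` into MySQL here with your existing code ...
    # Then drop the cached copy so the next request re-reads the database.
    r.delete("latest_sensor_data")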
I'm building an API server with Python Flask.
In my case it is at a real production level, so I have to be careful when developing the server.
After some Googling, I found that Celery & Redis are suitable for task queueing.
So I installed them via pip3 install 'celery[redis]', defined a task, and ran it.
Everything was fine, but I have some questions about it.
Assume that there is a user model. The CRUD for the user model might look like this:
Register user(with photo)
Delete user
Get a single user
In my opinion, only Register user needs Celery & Redis,
because uploading a photo can take a long time, so it has to be treated as asynchronous work.
Delete user and Get a single user just query the DB and retrieve the result,
so they don't take long (meaning they do not need to go through Celery).
Is that right? Or is there anything I'm missing?
To summarize my question: is there any standard for when to use Celery?
Thanks!
You have it about right. You can put whatever processing you want in celery, but the rule that you just used--use celery for things that take a long time--is the one we use most in our production environment. You can also use celery when you want to scale out an operation across servers more easily. For example, when scraping a large number of pages, you might want to execute that in parallel to speed up what would otherwise be a long-running task.
I do think there is a great tutorial about this topic.
using-celery-with-flask
And you can also check out this repo.
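As a rough illustration of the rule above (use Celery only for the slow part), the photo processing could be a Celery task while the view just enqueues it; the broker URL, task body, and names are all assumptions:

from celery import Celery

# Assumed Redis broker on localhost; adjust to your setup.
celery_app = Celery("tasks", broker="redis://localhost:6379/0")

@celery_app.task
def process_user_photo(user_id, photo_path):
    # Hypothetical slow work: resize the photo, push it to storage,
    # then mark the user record as having a photo.
    pass

# In the Flask register view you would save the upload to disk, create the
# user row, and enqueue the slow part instead of doing it inline:
#     process_user_photo.delay(user.id, saved_path)
# Delete user and Get a single user just hit the database directly, no Celery.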
In my Django project, I am using Celery to run a periodic task that checks a URL which responds with JSON, and updates my database with some elements from that JSON.
Since requests to the URL are limited, the whole process of updating the database with my task takes about 40 minutes, and I will run the task every 2 hours.
If I open a view of my Django project, which also requests information from the database while the task is running asynchronously in the background, will I run into any problems?
While requesting information from your database you are reading it, and in your Celery task you are writing to it. Only one write can happen at a time, but you can read as many times as you want, since reading does not lock the database.
The only time you are going to run into issues using the database with Celery is when you use the database as the backend for Celery, because it will continuously poll the database for tasks. If you use a normal broker (such as Redis or RabbitMQ) you should not have issues.
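For reference, a periodic task like that is usually wired up with Celery beat; here is a minimal sketch, where the module path, task name, and broker URL are assumptions to adapt to your project:

from celery import Celery
from celery.schedules import crontab

app = Celery("proj", broker="redis://localhost:6379/0")  # a real broker, not the database

app.conf.beat_schedule = {
    "update-db-from-api": {
        # Hypothetical task path; point it at your own task that fetches
        # the JSON and writes the elements into the database.
        "task": "proj.tasks.update_database",
        "schedule": crontab(minute=0, hour="*/2"),  # every 2 hours
    },
}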
I've been configuring and troubleshooting some Django auth issues with a custom backend.
One thing I have noticed is that once a session's expiry date has passed (confirmed via Session.objects.all()), the session remains in the table.
At the point where I have to reauthenticate, it creates another entry, so a single user can end up with tons of sessions in the table rather than just one.
Is there a simple way of getting Django to clear these out when they expire?
Thanks,
From the official documentation:
Django does not provide automatic purging of expired sessions. Therefore, it’s your job to purge expired sessions on a regular basis. Django provides a clean-up management command for this purpose: clearsessions. It’s recommended to call this command on a regular basis, for example as a daily cron job.
Use something like this:
python manage.py clearsessions
...and schedule it to run regularly.
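For example, a daily crontab entry along these lines (the paths are placeholders for your own virtualenv and project):

0 4 * * * /path/to/venv/bin/python /path/to/project/manage.py clearsessions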