I wrote a small Django website that reads data from a database and shows it in a table. (The database is filled by making requests to an external API.)
Now my problem is that I need to make a request to the API every 5 minutes, get the last 5 minutes of data, store it in the database, and at the same time update my table to show the last 5 minutes of data.
I have read about job schedulers, but I did not understand how to set one up. First of all, is a scheduler such as Celery a good solution for this problem? And it would be helpful if you could guide me on how to approach it.
A simple solution I have used in the past is to write a custom Django management command and then have a cron job run that command at whatever interval you like.
Django commands: https://docs.djangoproject.com/en/1.11/howto/custom-management-commands/
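A minimal sketch of such a command; the app name myapp, the Record model, and the API URL are all hypothetical stand-ins:

```python
# myapp/management/commands/fetch_data.py
import requests
from django.core.management.base import BaseCommand

from myapp.models import Record  # hypothetical model holding the table rows


class Command(BaseCommand):
    help = "Fetch the last 5 minutes of data from the external API"

    def handle(self, *args, **options):
        # hypothetical endpoint; assumes the API returns a JSON list
        response = requests.get("https://api.example.com/data?minutes=5")
        response.raise_for_status()
        for item in response.json():
            # upsert so a re-run of the job does not duplicate rows
            Record.objects.update_or_create(
                external_id=item["id"],
                defaults={"value": item["value"]},
            )
        self.stdout.write(self.style.SUCCESS("Fetched latest data"))
```

With that in place, a crontab entry runs it every 5 minutes, and your table view simply reads whatever the last run stored:

```
*/5 * * * * /path/to/venv/bin/python /path/to/project/manage.py fetch_data
```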
I'm trying to add real-time features to my Django web app. Basically, I want to show real-time data on a webpage.
I have an external Python script that generates some JSON data, not big data but around 10 records per second. On the other side, I have a Django app, and I would like it to receive that data and show it on an HTML page in real time. I've already considered writing the data to a database and then retrieving it from Django, but that would mean too many queries: Django would query the DB at least once per second for every user, and my external script would be writing a lot of data every second.
What I'm missing is a "central" system, a way to make these two pieces communicate. I know the question is probably not specific enough, but is there some way to do this? I know something about Django Channels, but I don't know whether it can do what I want; I've also considered pushing the data onto a RabbitMQ queue and then retrieving it from Django, but that is not the best use of RabbitMQ.
So is there a way to do this with Django Channels? Any kind of advice is appreciated.
I would suggest using Django Channels. You can also use Redis instead of RabbitMQ; in your case, Redis might be the better choice.
Here is an approach: http://www.maxburstein.com/blog/realtime-django-using-nodejs-and-socketio/
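If you go the Channels route, here is a rough sketch, assuming Channels 2+ with a Redis channel layer; the group name live_data and the consumer/script names are illustrative:

```python
# consumers.py -- each connected browser joins the "live_data" group
import json

from channels.generic.websocket import AsyncWebsocketConsumer


class LiveDataConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        await self.channel_layer.group_add("live_data", self.channel_name)
        await self.accept()

    async def disconnect(self, close_code):
        await self.channel_layer.group_discard("live_data", self.channel_name)

    async def new_record(self, event):
        # forward a record pushed by the external script to the browser
        await self.send(text_data=json.dumps(event["record"]))
```

The external script then pushes each record straight into the group instead of writing it to the database (it must run with the project's Django settings configured so the channel layer is available):

```python
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer

layer = get_channel_layer()
async_to_sync(layer.group_send)(
    "live_data", {"type": "new_record", "record": {"value": 42}}
)
```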
After researching for three days and playing with Redis and Celery, I'm no longer sure what the right solution to my problem is.
It's a simple problem: I have a simple Flask app that returns the data of a MySQL query. But I don't want to query the database for every request, as there might be 100 requests a second. I want to set up a daemon that independently queries my database every five seconds; if someone makes a request, it should return the data from the previous query, and once those 5 seconds pass it should return the data from the latest query. All users receive the same data. Is Celery the solution?
The easiest way is to use Flask-Caching.
Just set a cache timeout of 5 seconds on your view, and it will return a cached response containing the result of the query made by the first request, for every other request in the next 5 seconds. When the time is up, the next request will regenerate the cache by running the query and the rest of your view's logic.
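A minimal sketch, assuming query_db() stands in for the real MySQL query:

```python
from flask import Flask, jsonify
from flask_caching import Cache

app = Flask(__name__)
cache = Cache(app, config={"CACHE_TYPE": "SimpleCache"})


def query_db():
    # hypothetical stand-in for the real MySQL query
    return [{"id": 1, "value": "example"}]


@app.route("/data")
@cache.cached(timeout=5)  # all users share one cached result for 5 seconds
def data():
    return jsonify(query_db())
```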
If your view function takes arguments, use memoization instead of the cache decorator so that caching uses your arguments to generate the cache key. For example, if you return a page detail view and don't use memoization, you will return the same page detail to all your users, regardless of the id / slug argument.
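Continuing the sketch above, a hypothetical detail view keyed by its slug argument might look like this:

```python
@app.route("/page/<slug>")
@cache.memoize(timeout=5)  # the cache key now includes the slug argument
def page_detail(slug):
    return jsonify(query_page(slug))  # hypothetical per-page query
```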
The Flask-Caching documentation explains all of this better than I can.
I am building a small web app based on Flask. I want to make a web page that shows some data taken from a database. This database is filled with data from some internet resources. The data in this database should be updated every 30 seconds, which means I am sending update requests every 30 seconds.
The way I see it working:
1). script_db.py gets data from the web, puts it in the database, and updates the database every 30 seconds.
2). script_main.py gets the updated data from the database and renders the HTML template.
Question:
I want script_db.py to be executed automatically in the background, so that script_main.py always has access to the most recent data.
1). Is this the best way to work with a database that needs to be updated automatically?
2). How do I run script_db.py in the background and then start script_main.py whenever I want?
Thank You
Disclaimer: I am totally new to what I am trying to do. Any suggestion would help.
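One minimal sketch of script_db.py, assuming SQLite and a hypothetical readings table; the two scripts share nothing but the database file:

```python
# script_db.py -- standalone updater, run in the background
import sqlite3
import time

import requests

DB = "app.db"


def update_db():
    # hypothetical source URL; replace with the real internet resources
    data = requests.get("https://example.com/resource.json").json()
    with sqlite3.connect(DB) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS readings (id INTEGER PRIMARY KEY, value TEXT)"
        )
        conn.execute(
            "INSERT OR REPLACE INTO readings (id, value) VALUES (?, ?)",
            (data["id"], data["value"]),
        )


if __name__ == "__main__":
    while True:
        update_db()
        time.sleep(30)  # refresh every 30 seconds
```

script_main.py is then an ordinary Flask app that reads from app.db. Starting the updater with python script_db.py & keeps it running in the background (a process manager such as systemd or supervisor is more robust), and script_main.py can be started and stopped independently whenever you want.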
background
I am working on a project built with Django and Celery.
I wrote two tasks that spider two different websites and save some data to the database (MySQL).
Before, I had just one task, and using update_or_create proved enough.
But now I want to run more tasks in different workers, all of which may save data to the database, and what's more, they may try to save the same data at the same time.
question
How can I ensure that tasks running in different workers create no duplicate data when all of them try to save the same data?
I know Django has select_for_update, which takes a lock in the database. But as I read the documentation, it effectively means: select, and if the row exists, update. What I want is: select; if the row exists, update; otherwise create. That is more like update_or_create, but does that API take a lock?
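For reference, in recent Django versions update_or_create does take a lock: internally it runs select_for_update().get_or_create() inside a transaction. What actually prevents two workers from inserting the same row twice, though, is a database-level unique constraint, so a common alternative to an external lock is a sketch like this (the Article model is hypothetical):

```python
from django.db import models


class Article(models.Model):  # hypothetical model for the spidered data
    source_url = models.URLField(unique=True)  # the DB enforces uniqueness
    title = models.CharField(max_length=200)


def save_article(url, title):
    # safe across workers once the unique constraint exists
    Article.objects.update_or_create(
        source_url=url,
        defaults={"title": title},
    )
```

If two workers race on the insert, the loser hits an IntegrityError, which get_or_create resolves by re-fetching the row the winner created.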
apology
The users who answered my question before may have given me the right answer, but I did not understand what they meant.
what I chose
In the end I used a Redis lock to ensure no duplicate data.
The logic is as follows:
When I get the data, I try set(key, value, nx=True, ex=60) to acquire a lock from Redis.
If that returns True, I call the Django queryset API update_or_create().
If not, I do nothing and return True.
This makes the burst of concurrent writes behave like a single process.
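A sketch of that pattern with redis-py; make_key() and MyModel are hypothetical stand-ins:

```python
import redis

r = redis.Redis()


def save_once(data):
    key = "lock:" + make_key(data)  # hypothetical: a stable key per record
    # nx=True: set only if the key does not exist; ex=60: expire after 60s
    if r.set(key, "1", nx=True, ex=60):
        MyModel.objects.update_or_create(  # hypothetical model
            external_id=data["id"],
            defaults={"payload": data},
        )
    # another worker holds the lock: do nothing and report success
    return True
```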
I have a Django app that is basically a web front end on a database.
Every now and then, users upload files containing perhaps 1000s of records. These records will need to be parsed out of the file, processed, and used to create new records or update existing records in the database. I'm just wondering what is the better approach for processing the uploaded file:
in the view (while the user waits - I guess this could be up to 5 minutes)?
save the uploaded file and have some background cron job call a custom admin command to process it? This seems the most sensible to me.
or perhaps another method I haven't thought of?
Celery seems to be pretty hot these days too; you should definitely look into it:
https://github.com/ask/django-celery
http://celeryproject.org/
Send an email when done, or have the front end poll for results every X seconds after submission. "Are we there yet?" "Are we there yet?"
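A sketch of the Celery route, using the modern shared_task API rather than the old django-celery package; process_row() is a hypothetical helper and the upload is assumed to be CSV:

```python
# tasks.py
import csv

from celery import shared_task


@shared_task
def process_upload(path):
    """Parse the saved upload and create/update records row by row."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            process_row(row)  # hypothetical: upsert one record
```

The view saves the file to disk, calls process_upload.delay(saved_path), and returns immediately; the "are we there yet" polling can then check a status flag the task sets when it finishes.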
I'd also like to know a simple, safe way to start a thread that writes to the db.