Web Development - Flask and Database connection - python

I am building a small web app based on Flask. I want to make a web page that shows some data taken from a database. The database is filled with data from some internet resources, and it should be updated every 30 seconds, which means I am sending update requests every 30 seconds.
The way I see it working:
1) script_db.py gets data from the web, puts it in the database, and updates the database every 30 seconds.
2) script_main.py gets the updated data from the database and renders the HTML template.
Questions:
I want script_db.py to run automatically in the background, so that script_main.py always has access to the most recent data.
1) Is this the best way to work with a database that needs to be updated automatically?
2) How do I run script_db.py in the background and then start script_main.py whenever I want?
Thank You
Disclaimer: I am totally new to what I am trying to do. Any suggestion would help.
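For illustration, here is a minimal sketch of how the two scripts could look. Everything concrete in it is an assumption: SQLite as the database, a placeholder JSON endpoint, and a table called readings with name/value pairs; the refresh loop simply sleeps for 30 seconds between updates.

    # script_db.py - updater loop (a sketch; the URL, table and schema are placeholders).
    import json
    import sqlite3
    import time
    import urllib.request

    SOURCE_URL = "https://example.com/data.json"   # hypothetical data source
    DB_PATH = "data.db"

    def update_once(conn):
        # Assumes the endpoint returns a JSON list of {"name": ..., "value": ...} objects.
        with urllib.request.urlopen(SOURCE_URL) as resp:
            records = json.load(resp)
        with conn:  # one transaction per refresh
            conn.execute(
                "CREATE TABLE IF NOT EXISTS readings (name TEXT PRIMARY KEY, value REAL)"
            )
            conn.executemany(
                "INSERT OR REPLACE INTO readings (name, value) VALUES (:name, :value)",
                records,
            )

    if __name__ == "__main__":
        conn = sqlite3.connect(DB_PATH)
        while True:
            update_once(conn)
            time.sleep(30)

    # script_main.py - the Flask side just reads whatever is currently in the database.
    import sqlite3
    from flask import Flask, render_template

    app = Flask(__name__)

    @app.route("/")
    def index():
        conn = sqlite3.connect("data.db")
        rows = conn.execute("SELECT name, value FROM readings").fetchall()
        conn.close()
        return render_template("index.html", rows=rows)

    if __name__ == "__main__":
        app.run()

Because the two scripts only share the database file, script_db.py can simply be started separately and left running in the background (on Linux, for example, with nohup python script_db.py & or a systemd unit; on Windows, with Task Scheduler), and script_main.py can be started whenever you want. For anything beyond a toy setup, a scheduler such as cron, APScheduler, or Celery beat is a more robust way to drive the 30-second refresh.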

Related

SQLAlchemy session with Celery (Multipart batch writes)

Suppose I have a mobile app which sends filled-in form data (which also contains images) to a piece of commercial software using its API, and this data should be committed all at once.
Since the mobile device does not have enough memory to send the whole dataset at once, I need to send it as a multipart batch.
I use transactions in cases where I want to perform a bunch of operations on the database, but I kind of need them to be performed all at once, meaning that I don't want the database to change out from under me while I'm in the middle of making my changes. And if I'm making a bunch of changes, I don't want users to be able to read my set of documents in that partially changed state. And I certainly don't want a set of operations failing halfway through, leaving me in a weird and inconsistent state forever. It's got to be all or nothing.
I know that Firebase provides a batch write operation which does exactly what I need. However, I need to do this against a local database (like Redis or Postgres).
The first approach I considered is using POST requests identified by a main session_ID.
- POST /session -> returns new SESSION_ID
- POST [image1] /session/<session_id> -> returns new IMG_ID
- POST [image2] /session/<session_id> -> returns new IMG_ID
- PUT /session/<session_id> -> validate/update metadata
However, it does not seem very robust for handling errors.
The second approach I was considering is combining an SQLAlchemy session with a Celery task, using Flask or FastAPI. I am not sure whether it is common to do this to solve this kind of issue; I just found this question. What would you recommend for this second approach (sending all the data parts first, then committing all at once)?
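For what it is worth, here is a minimal sketch of that second approach under assumptions that are not in the question: the POST endpoints stage each part in an upload_parts table keyed by session_id, and the final PUT enqueues a Celery task that moves everything into the real tables inside a single SQLAlchemy transaction. All model, table, and task names below are hypothetical.

    # A sketch of "stage everything, commit once" with Celery + SQLAlchemy.
    # UploadPart, FormRecord, FormImage, finalize_upload and the URLs are all made up.
    from celery import Celery
    from sqlalchemy import create_engine, Column, ForeignKey, Integer, LargeBinary, String
    from sqlalchemy.orm import declarative_base, sessionmaker

    Base = declarative_base()

    class UploadPart(Base):              # staging row written by the POST endpoints
        __tablename__ = "upload_parts"
        id = Column(Integer, primary_key=True)
        session_id = Column(String, index=True)
        payload = Column(LargeBinary)

    class FormRecord(Base):              # final record, created only when everything arrived
        __tablename__ = "form_records"
        id = Column(Integer, primary_key=True)
        session_id = Column(String, unique=True)

    class FormImage(Base):
        __tablename__ = "form_images"
        id = Column(Integer, primary_key=True)
        record_id = Column(Integer, ForeignKey("form_records.id"))
        payload = Column(LargeBinary)

    celery_app = Celery("uploads", broker="redis://localhost:6379/0")
    engine = create_engine("postgresql://user:pass@localhost/uploads")
    SessionLocal = sessionmaker(bind=engine)

    @celery_app.task
    def finalize_upload(upload_session_id):
        """Commit every staged part of one upload session in a single transaction."""
        db = SessionLocal()
        try:
            parts = db.query(UploadPart).filter_by(session_id=upload_session_id).all()
            record = FormRecord(session_id=upload_session_id)
            db.add(record)
            db.flush()                   # obtain record.id without committing yet
            for part in parts:
                db.add(FormImage(record_id=record.id, payload=part.payload))
            db.commit()                  # all or nothing
        except Exception:
            db.rollback()                # any failure leaves the final tables untouched
            raise
        finally:
            db.close()

The PUT /session/<session_id> handler would then only validate the metadata and call finalize_upload.delay(session_id); because the task commits once at the end and rolls back on any error, readers never see a half-written form.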

How can I send real-time data to a Django application and show it on a webpage?

I'm trying to add real-time features to my Django web app. Basically, I want to show real-time data on a webpage.
I have an external Python script which generates some JSON data; not big data, but around 10 records per second. On the other side I have a Django app, and I would like it to receive that data and show it on an HTML page in real time. I've already considered writing the data to a database and then retrieving it from Django, but that would mean too many queries, since Django would query the DB at least once per second for every user, and my external script would be writing a lot of data every second.
What I'm missing is a "central" system, a way to make these two pieces communicate. I know the question is probably not specific enough, but is there some way to do this? I know something about Django Channels, but I don't know whether it can do what I want; I've also considered pushing the data onto a RabbitMQ queue and then retrieving it from Django, but that is not the best use of RabbitMQ.
So is there a way to do this with Django Channels? Any kind of advice is appreciated.
I would suggest using Django Channels. You can also use Redis instead of RabbitMQ. In your case, Redis might be a better choice.
Here is an approach: http://www.maxburstein.com/blog/realtime-django-using-nodejs-and-socketio/
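As a rough illustration of what the Channels side could look like (the group name live_data, the consumer class, and the assumption that the Redis channel layer is configured are all mine, not part of the original answer):

    # consumers.py - a minimal Channels consumer that forwards pushed records to the browser.
    import json
    from channels.generic.websocket import AsyncWebsocketConsumer

    class LiveDataConsumer(AsyncWebsocketConsumer):
        async def connect(self):
            await self.channel_layer.group_add("live_data", self.channel_name)
            await self.accept()

        async def disconnect(self, code):
            await self.channel_layer.group_discard("live_data", self.channel_name)

        async def new_record(self, event):
            # Called for every group_send with {"type": "new_record", ...}.
            await self.send(text_data=json.dumps(event["payload"]))

    # Producer side - e.g. a small management command wrapping the external script.
    from asgiref.sync import async_to_sync
    from channels.layers import get_channel_layer

    def push_record(record):
        layer = get_channel_layer()
        async_to_sync(layer.group_send)("live_data", {"type": "new_record", "payload": record})

The external script calls push_record for each generated record, and every connected browser receives it over the WebSocket without a single database query.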

Is there any way MSSQL can notify my Python application when any table or row has been updated?

I don't have much knowledge of databases, but I wanted to know whether there is any technique by which, when I update or insert a specific entry in a table, my Python application gets notified, so that I can then see what was updated and refresh that particular row in the data I keep in the session or some temporary storage.
I need to handle data filter and sort calls again and again, so I don't want to fetch the whole dataset from SQL every time; I decided to keep it locally and process it from there. But I was worried that the DB might be updated in the meantime, and I could end up serving the same old data to filter requests.
Any suggestions?
A relational database is normally only changed by your own program's methods or functions, so the simplest option is to print to the console or log inside those write paths.
If you want to track updates, modifications, and deletes made outside your application, you have to build a separate program that watches the database's logs or change-tracking data for you.
Thanks.
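If polling is acceptable, one common pattern on SQL Server is to add a rowversion column to the table and repeatedly ask for rows whose version is newer than the last one seen. A minimal sketch with pyodbc; the table name, column names, and connection string are placeholders, not from the question:

    # Poll a SQL Server table for changed rows using a rowversion column (here: row_ver).
    import time
    import pyodbc

    CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
                "SERVER=localhost;DATABASE=mydb;Trusted_Connection=yes")

    def watch_changes(poll_seconds=5):
        conn = pyodbc.connect(CONN_STR)
        cur = conn.cursor()
        last_seen = b"\x00" * 8                      # rowversion is an 8-byte counter
        while True:
            cur.execute(
                "SELECT id, row_ver FROM my_table WHERE row_ver > ? ORDER BY row_ver",
                last_seen,
            )
            for row_id, row_ver in cur.fetchall():
                print("row changed:", row_id)        # refresh the locally cached copy here
                last_seen = row_ver
            time.sleep(poll_seconds)

    if __name__ == "__main__":
        watch_changes()

SQL Server also offers built-in Change Tracking and Query Notifications, but polling a rowversion column is usually the simplest way to keep a locally cached copy from going stale.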

Monitor MySQLdb in Python for new entries and Flask

I'm looking for a way to constantly check my database (MySQL) for new entries. Once a new entry is committed, I want to output it on a webpage using Flask.
Since the process takes time to finish, I would like to give users the impression that it only took a few seconds to retrieve the data.
For now I wait for the whole process to finish before giving the user the complete result, but I would prefer to update the results page every time a new entry is added to the DB. So, for example, the first entry is added to the DB and the user can immediately see it on the webpage; then a second entry is added and the user can now see both the first and second entries, and so on. I don't know whether this has to come from Flask or some other way.
Any idea?
You can set MySQL to log all queries to the General Query Log and monitor that file for changes (for example via Watchdog or PyNotify). Once the file changes, you can parse the new log entries and get your signal. That way you avoid polling the database for changes.
The better way would of course be to send the signal from the code that stores the data in the database.
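A minimal sketch of the file-watching part with Watchdog; the log path is an assumption, and what you do with each new line (and how you push it to the Flask page) is left as a stub:

    # Watch the MySQL general query log and react to newly appended lines.
    import time
    from watchdog.events import FileSystemEventHandler
    from watchdog.observers import Observer

    LOG_PATH = "/var/log/mysql/general.log"      # set general_log_file to this in MySQL

    class LogChangeHandler(FileSystemEventHandler):
        def __init__(self):
            super().__init__()
            self._pos = 0                        # how far into the file we have read

        def on_modified(self, event):
            if event.src_path != LOG_PATH:
                return
            with open(LOG_PATH) as f:
                f.seek(self._pos)
                for line in f:
                    if "INSERT" in line.upper():
                        print("new insert detected:", line.strip())   # signal the app here
                self._pos = f.tell()

    if __name__ == "__main__":
        observer = Observer()
        observer.schedule(LogChangeHandler(), path="/var/log/mysql", recursive=False)
        observer.start()
        try:
            while True:
                time.sleep(1)
        finally:
            observer.stop()
            observer.join()

On the Flask side, the simplest way to surface the new entries is to have the page poll a small JSON endpoint (or use Server-Sent Events or WebSockets); and as noted above, emitting the signal directly from the code that writes to the database avoids the log parsing entirely.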

Django uploading and importing a file

I have a Django app that is basically a web front end on a database.
Every now and then, users upload files containing perhaps thousands of records. These records need to be parsed out of the file, processed, and used to create new records or update existing records in the database. I'm just wondering which is the better approach for processing the uploaded file:
in the view (while the user waits - I guess this could be up to 5 minutes)?
save the uploaded file and have some background cron job call a custom admin command to process it? This seems the most sensible to me.
or perhaps another method I haven't thought of?
Celery seems to be pretty hot these days too; you should definitely look into it:
https://github.com/ask/django-celery
http://celeryproject.org/
Send an email when done, or have the front end poll for results every X seconds after submission. "Are we there yet?" "Are we there yet?"
I'd also like to know a simple, safe way to start a thread that writes to the DB.
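For completeness, here is a minimal sketch of the Celery route, assuming a CSV upload and a hypothetical Record model; the view only stores the file and queues the task, so the user never waits for the parsing:

    # tasks.py - the slow work happens in a worker, outside the request/response cycle.
    import csv
    from celery import shared_task
    from myapp.models import Record              # hypothetical model

    @shared_task
    def process_upload(path):
        processed = 0
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                # update_or_create covers both "create new" and "update existing".
                Record.objects.update_or_create(
                    external_id=row["id"],
                    defaults={"name": row["name"], "value": row["value"]},
                )
                processed += 1
        return processed

    # views.py - store the file, enqueue the task, return immediately.
    from django.core.files.storage import default_storage
    from django.http import JsonResponse
    from .tasks import process_upload

    def upload_view(request):
        path = default_storage.save("uploads/data.csv", request.FILES["file"])
        process_upload.delay(default_storage.path(path))
        return JsonResponse({"status": "queued"})

As for the last comment: a Celery worker (or a cron-driven management command) is generally safer than starting an ad-hoc thread from a Django view for database writes, since the worker runs in its own process with its own database connections and can retry failed jobs.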
