Django 3rd party API event loop? - python

I'm quite new to back-end development so please forgive my ignorance.
I need to connect my Django backend to a 3rd party API to fetch live location data from GPS trackers. This needs to be done server side as it triggers other events, so it can't just be run in the browser.
What's the best way to go about doing this, please? So far I have thought of something like an event loop which calls the API every 60 seconds, but can I run this on a separate thread, for example? Is that even the best thing to do?
Is it possible to do something like a websocket from my backend to the 3rd party? What's the best way of keeping this data updated?
Finally, how does a solution like this scale, what if I had 5,000 vehicle trackers which all needed updating?
Any advice would be greatly appreciated.
Thank you.

You might create an empty webpage that triggers your backend logic,
and configure a cron job which sends a GET request to this page every 60 seconds;
it will also be started on a separate thread by default.
You can scale it too: say you have 5,000 vehicles; just split the backend logic into two functions (each handling 2,500 vehicles), triggered by two different GET requests, and they will work independently on separate threads.
Or you can use asyncio to split your logic inside one function and still run it asynchronously and separately.
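If you go the asyncio route, a minimal sketch of such a polling loop might look like this (assuming the aiohttp client library; the endpoint URL and handle_update are placeholders for your own API and event logic):

import asyncio
import aiohttp

API_URL = "https://api.example.com/trackers"  # hypothetical 3rd-party endpoint

def handle_update(data):
    # placeholder: update your models, trigger your other events, etc.
    pass

async def poll_trackers():
    async with aiohttp.ClientSession() as session:
        while True:
            # fetch the latest tracker positions
            async with session.get(API_URL) as resp:
                data = await resp.json()
            handle_update(data)
            await asyncio.sleep(60)  # wait a minute before the next poll

if __name__ == "__main__":
    asyncio.run(poll_trackers())

You could run this as its own process, or wrap it in a Django management command so it has access to your models.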

Related

Python websockets: how to create dynamically restartable connections

I've been using websocket.WebSocketApp().run_forever to subscribe to a websocket to obtain a stream of data. However, now I need to dynamically restart the connection at certain times of the day to change the channel parameters.
I am thinking of basically running two concurrent tasks:
The original task that connects to the websocket
A task that runs forever and checks the time of day. If a certain time is reached, it will restart task 1 with new parameters.
There are quite a lot of technologies out there, but after trying a few options there always seems to be some caveat where a library doesn't play well with being asynchronous. If anyone has tackled a problem like this before, any pointers would be greatly appreciated.
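One pattern that works with the blocking websocket-client library mentioned above is a supervisor loop: run run_forever in a worker thread, and close and recreate the connection when the restart time arrives. A rough sketch, where the endpoint URL, the channel parameters, and the should_restart check are all made up:

import threading
import time
import websocket  # the websocket-client package

def make_app(params):
    # build a fresh connection with the current channel parameters
    return websocket.WebSocketApp(
        "wss://stream.example.com/" + params["channel"],  # hypothetical URL
        on_message=lambda ws, msg: print(msg),
    )

def should_restart():
    # placeholder: return True when the time of day calls for new parameters
    return False

params = {"channel": "a"}  # hypothetical starting parameters
while True:
    app = make_app(params)
    worker = threading.Thread(target=app.run_forever, daemon=True)
    worker.start()
    while not should_restart():
        time.sleep(1)
    params = {"channel": "b"}  # swap in the new parameters here
    app.close()    # unblocks run_forever in the worker thread
    worker.join()  # wait for the old connection to shut down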

How do I run a time consuming calculation in the background using Python and FastAPI?

I've been trying to wrap my head around Python multiprocessing but I'm getting stuck. I thought I could solve my problem with a Pool, but I'm not really sure how to keep it running between calls to the backend.
Basically I have this setup.
A web frontend
Python backend (fastapi)
What I need (want) to do is:
At one endpoint, start a calculation (POST).
At another, open a WebSocket and get updates about the calculation at regular intervals.
The calculation just needs to keep running. It stores information about its progress in a db which can be used to update the client.
One of the issues (I think) is that I use a context manager when starting the calculation, so when the initial request finishes, the pool is stopped and deleted.
Some help with this would be really appreciated. I'm probably missing something obvious...
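One way around the context-manager problem, sketched below under the assumption that a concurrent.futures.ProcessPoolExecutor fits the calculation: create the pool once at startup and keep it at module level, so it outlives individual requests (do_calculation and the route path are placeholders):

from concurrent.futures import ProcessPoolExecutor

from fastapi import FastAPI

app = FastAPI()
pool = None  # created once at startup, shared across requests

def do_calculation(job_id: int):
    # placeholder: the long-running work; writes its progress to the db
    pass

@app.on_event("startup")
def create_pool():
    global pool
    pool = ProcessPoolExecutor(max_workers=2)

@app.on_event("shutdown")
def close_pool():
    pool.shutdown()

@app.post("/calculate/{job_id}")
def start_calculation(job_id: int):
    pool.submit(do_calculation, job_id)  # returns immediately
    return {"status": "started", "job_id": job_id}

The WebSocket endpoint can then simply read the progress rows from the db on an interval, independently of the pool.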

Using Django as GUI for long running python process

This is a question about architecture. Say I have a long-running process on a server, such as a machine learning model in the middle of training. As this runs on an external machine, I would like a tool that lets me quickly check the results from time to time. So I thought the best way would be a website that connects to the process, for example using RPC, to display the results, as this allows me to always check in. Now the question is how the Django view should gather the information from the server process:
1) Using RPC calls such as rpyc directly in the views?
2) Using some kind of messaging queue such as celery ?
3) Or in a completely different way I am not seeing ?
There are at least two possible ways to do this.
Implement your data-refreshing function as a view and call it via AJAX plus a JavaScript timer. While a page containing that JS is open, it will fetch your data silently and update the page. However, this solution does not work well when you need to record all the data at a given frequency; the AJAX call and view only execute while the web page is open.
Use a messaging queue like selcuk suggests. Alongside Celery, APScheduler is also a good choice because it's easier to install and use. You can implement a task queue (as a model) with a status field (queued/done/stopped/whatever), check the tasks at whatever frequency you want, save the data you retrieve, and do all the other stuff.
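For the APScheduler option, a minimal sketch of a background job that records results at a fixed frequency (fetch_results and the 60-second interval are placeholders for your own task and schedule):

from apscheduler.schedulers.background import BackgroundScheduler

def fetch_results():
    # placeholder: pull the latest numbers from the training process
    # and save them to a model, e.g. Result.objects.create(...)
    pass

scheduler = BackgroundScheduler()
scheduler.add_job(fetch_results, "interval", seconds=60)  # run every minute
scheduler.start()

In a Django project you might start the scheduler exactly once, e.g. in an AppConfig.ready() hook, so it isn't launched multiple times.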

Can I have Python code to continue executing after I call Flask app.run?

I have just started with Python, although I have been programming in other languages over the past 30 years. I wanted to keep my first application simple, so I started out with a little home automation project hosted on a Raspberry Pi.
I got my code to work fine (controlling a valve, reading a flow sensor and showing some data on a display), but when I wanted to add some web interactivity it came to a sudden halt.
Most articles I have found on the subject suggest using the Flask framework to compose dynamic web pages. I have tried, and understood, the basics of Flask, but I just can't get around the issue that Flask blocks once I call the app.run function. The rest of my Python code waits for Flask to return, which never happens, i.e. no more water flow measurement, valve motor steering or display updating.
So, my basic question would be: what tool should I use to serve a simple dynamic web page (with very low load, like 1 request/week), in parallel with my application's main tasks (GPIO/pulse counting)? All this in the resource-constrained environment of a Raspberry Pi (3).
If you still suggest Flask (because it seems very close to target), how should I arrange my code to keep handling the real-world events, such as mentioned above?
(This last part might be tough answering without seeing the actual code, but maybe it's possible answering it in a "generic" way? Or pointing to existing examples that I might have missed while searching.)
You're on the right track with multithreading. If your monitoring code runs in a loop, you could define a function like
def monitoring_loop():
    while True:
        # do the monitoring: read the flow sensor, steer the valve, update the display
        pass
Then, before you call app.run(), start a thread that runs that function:
import threading
from wherever import monitoring_loop

monitoring_thread = threading.Thread(target=monitoring_loop)
monitoring_thread.start()
# app.run() and whatever else you want to do
Don't join the thread - you want it to keep running in parallel to your Flask app. If you joined it, it would block the main execution thread until it finished, which would be never, since it's running a while True loop.
To communicate between the monitoring thread and the rest of the program, you could use a queue to pass messages in a thread-safe way between them.
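For example (a minimal sketch; the sensor reading and the view wiring are placeholders):

import queue
import time

readings = queue.Queue()  # thread-safe channel between the two threads

def monitoring_loop():
    while True:
        readings.put({"flow": 0.0})  # placeholder: a real sensor reading
        time.sleep(1)

def latest_readings():
    # call this from a Flask view to drain whatever has accumulated
    items = []
    while not readings.empty():
        items.append(readings.get())
    return items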
The way I would probably handle this is to split your program into two distinct separately running programs.
One program handles the GPIO monitoring and communication, and the other program is your small Flask server. Since they run as separate processes, they won't block each other.
You can have the two processes communicate through a small database. The GPIO interface can periodically record flow measurements or other relevant data to a table in the database. It can also monitor another table in the database that might serve as a queue for requests.
Your Flask instance can query that same database to get the current statistics to return to the user, and can submit entries to the requests queue based on user input. (If the GPIO process updates that requests queue with the current status, the Flask process can report that back out.)
And as far as what kind of database to use on a little Raspberry Pi, consider sqlite3 which is a very small, lightweight file-based database well supported as a standard library in Python. (It doesn't require running a full "database server" process.)
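A minimal sketch of that sqlite3 setup, with a made-up measurements table that the GPIO process writes and the Flask process reads (each process opens its own connection to the same file):

import sqlite3
import time

conn = sqlite3.connect("home.db")
conn.execute("CREATE TABLE IF NOT EXISTS measurements (ts REAL, flow REAL)")

# in the GPIO process: record a reading
conn.execute("INSERT INTO measurements VALUES (?, ?)", (time.time(), 2.5))
conn.commit()

# in the Flask process: fetch the latest reading for display
row = conn.execute(
    "SELECT ts, flow FROM measurements ORDER BY ts DESC LIMIT 1"
).fetchone()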
Good luck with your project, it sounds like fun!
Hi, I was trying the connection with dronekit_sitl and I got the same issue: after 30 seconds the connection was closed. To get rid of that, there are two solutions:
Use the before_request decorator: you define a method that will handle the connection before each request.
Use the before_first_request decorator: the connection will be made once, when the first request is called, and you can then handle the object in the other routes using a global variable.
For more information: https://pythonise.com/series/learning-flask/python-before-after-request
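A rough sketch of the before_request variant, with the actual dronekit_sitl connect call left as a placeholder (the lazy check means the connection is only opened once):

from flask import Flask

app = Flask(__name__)
vehicle = None  # global handle to the long-lived connection

def connect_to_sitl():
    # placeholder: open and return the real dronekit_sitl connection
    return object()

@app.before_request
def ensure_connection():
    global vehicle
    if vehicle is None:  # only connect on the first request
        vehicle = connect_to_sitl()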

Best practice to make partial search results appear (one by one as they come in from a secondary server)

I'd like to do the following:
the queries on a Django site (the first server) are sent to a second server (for performance and security reasons)
the query is processed on the second server using sqlite
the Python search function has to keep a lot of data in memory; a simple CGI would always have to reread the data from disk, which would further slow down the search, so I guess I need some daemon running on the second server
the search process is slow and I'd like to send partial results back and show them as they arrive
This looks like a common task, but somehow I don't get it.
I tried Pyro first, which exposes the search class (and then I needed a workaround to avoid sqlite threading issues). I managed to get the complete search results onto the first server, but only as a whole. I don't know how to "yield" the results one by one (as generators cannot be pickled), and I wouldn't know how to write them one by one onto the search results page anyway.
I may need some "push technology", says this thread: https://stackoverflow.com/a/5346075/1389074, which talks about some different frameworks. But which one?
I don't seem to be searching with the right terms. Maybe someone can point me to some discussions or frameworks that address this task?
Thanks a lot in advance!
You can use Python Tornado websockets. This will allow you to establish a two-way connection from the client to the server and return data as it comes. Tornado is an async framework built in Python.
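A minimal Tornado sketch of that idea, pushing each partial result over the websocket as it is produced (the search generator stands in for the real, slow search):

import tornado.ioloop
import tornado.web
import tornado.websocket

def search(query):
    # placeholder: the real search, yielding results one by one
    for i in range(5):
        yield "result %d for %s" % (i, query)

class SearchSocket(tornado.websocket.WebSocketHandler):
    def on_message(self, message):
        # send each partial result to the browser as soon as it exists
        for result in search(message):
            self.write_message(result)

app = tornado.web.Application([(r"/search", SearchSocket)])
app.listen(8888)
tornado.ioloop.IOLoop.current().start()

For a genuinely slow search you would offload the loop to a thread or executor so it doesn't block the IOLoop, but the shape stays the same.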
