How to invoke a python script after successfully running a Django view - python

Let's say I have a view page(request) which loads page.html.
Now, after page.html has successfully loaded, I want to automatically run a Python script behind the scenes 10-15 seconds later. How is this possible?
Also, is it possible to show the status of the script dynamically (running / stopped / syntax error, etc.)?

Running a script from JavaScript is not a clean way to do it, because the user can close the browser, disable JS, etc. Instead you can use django-celery, which lets you run background scripts and check the status of the script dynamically from a middleware. Good luck!

You could add a client-side timeout that calls AJAX back to the server 10-15 seconds later. Point it at a different view and execute your script within that view. For example:

function runServerScript() {
    $.get("/yourviewurlhere", function(data) {
        // Do something with the returned data
    });
}
// Pass the function itself instead of a string so it isn't eval'd.
setTimeout(runServerScript, 10000);

If you want status to be displayed, the client would have to make multiple requests back to the server.
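On the Django side, that second view could be a thin wrapper around the script. A minimal sketch, assuming the script lives at a hypothetical path scripts/job.py:

import subprocess

from django.http import JsonResponse

def run_script_view(request):
    # Run the script synchronously; this request blocks until it exits.
    result = subprocess.run(
        ["python", "scripts/job.py"],  # hypothetical script path
        capture_output=True, text=True,
    )
    # Report success/failure so the client can display a status.
    status = "ok" if result.returncode == 0 else "error"
    return JsonResponse({"status": status, "output": result.stdout})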

Celery might come in handy for such use cases. You can start a task (or script, as you call it) from a view, even with a delay, as you want. Sending status reports back to the browser will be harder unless you opt for something like WebSockets, but that's highly experimental right now.
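For illustration, a rough sketch of the delayed start with Celery; it assumes a working Celery setup, and the task name is invented:

# tasks.py
from celery import shared_task

@shared_task
def run_background_script():
    # Put the actual script logic (or a subprocess call) here.
    pass

# views.py
from django.shortcuts import render
from .tasks import run_background_script

def page(request):
    # Schedule the task to run ~10 seconds after this view returns.
    run_background_script.apply_async(countdown=10)
    return render(request, "page.html")

The task's state (PENDING/STARTED/SUCCESS/FAILURE) can later be read back via its AsyncResult, which answers the status part of the question.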

Related

Sharing memory between Python shells run by Electron

I'm currently working on a desktop application that uses React for the frontend and runs in an Electron window. Electron needs to be able to communicate with my Python backend. I've found an example online that works fine for running a simple Python script from Electron and returning the result to React.
Electron code that waits for signal from React:
ipcMain.on("REACT_TEST_PYTHON", (event, args) => {
    // Spawn a Python shell running test.py with the argument sent from React.
    let pyshell = new PythonShell(
        path.join(__dirname, "../background/python/test.py"),
        {
            args: [args.test],
        }
    );
    // Forward each line the script prints back to the renderer.
    pyshell.on("message", function (results) {
        mainWindow.webContents.send("ELECTRON_TESTED_PYTHON", { tasks: results });
    });
});
test.py that is being run by Electron:

import sys

data = sys.argv[1]

def factorial(x):
    if x == 1:
        return 1
    else:
        return x * factorial(x - 1)

print(factorial(int(data)))
I completely understand how this works, but for my application the Python scripts will not be as simple. Basically, I want Electron to create a Python shell that starts a named task. This Python shell should continue running in the background while the frontend works normally. The part I'm stuck on is figuring out how to access data from this initial Python shell from a different Python shell created by a subsequent signal in Electron. If that doesn't make sense, this is my intended pipeline:
User clicks React button to start "Task1"
Electron gets signal from React and starts a Python shell to start "Task1". This python shell is running in the background (Electron should not wait for a result to continue processing).
Later on, user decides to click React button to cancel "Task1"
Electron gets signal from React and creates a new Python shell to cancel "Task1". In order to do this the new Python shell needs to access data from the original Python shell.
This new Python shell should also close the original Python shell so that it doesn't continue to try and run "Task1"
What would be the best way to do this?
Some thoughts I've had on how to do this:
Creating a file where the necessary data for "Task1" could be written. I think I would need the mmap module in order to use some kind of shared memory between the shells (but I could be wrong). I would also need to figure out how to close the original Python shell. (A rough sketch of this idea appears after this list.)
Somehow saving a reference to the original Python shell, which could allow Electron to cancel "Task1" through the original shell. I'm not sure this is possible, though, since the original Python shell will still be running; I doubt I could access the shell while it's in the middle of processing.
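For concreteness, one way the file-based idea could look, using a plain PID file and a POSIX signal rather than mmap; the file names, path, and placeholder work are all invented:

# task1.py -- run by the first Python shell
import os
import time

PID_FILE = "/tmp/task1.pid"  # hypothetical shared location

# Record this process's PID so a later shell can find it.
with open(PID_FILE, "w") as f:
    f.write(str(os.getpid()))
try:
    time.sleep(3600)  # placeholder for the real "Task1" work
finally:
    os.remove(PID_FILE)

# cancel_task1.py -- run by the second Python shell
import os
import signal

with open("/tmp/task1.pid") as f:
    pid = int(f.read())
os.kill(pid, signal.SIGTERM)  # terminates the original shell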
Thank you for any help or insight you can provide! I apologize that this may be a confusing question; please let me know if I can clear anything up.

Express closes the request when spawned Python script sleeps

Original problem
I am creating an API using Express that queries an SQLite DB and outputs the result as a PDF using the html-pdf module.
The problem is that certain queries might take a long time to process, which would slow the API down if several clients ran heavy queries, so I would like to decouple the actual query call from the Node server where Express is running.
My idea was to decouple the execution of the SQLite query by running it in a Python script. This script can then be called from the API, avoiding the use of Node to query the DB.
Current problem
After quickly creating a Python script that runs an SQLite query and calling it from my API using child_process.spawn(), I found that Express seems to get an exit-code signal as soon as the Python script starts to execute the query.
To confirm this, I created a simple Python script that just sleeps between printing two messages, and the problem was isolated.
To reproduce this behavior you can create a Python script like this:
print("test 1")
sleep(1)
print("test 2")
Then call it from Express like this:

router.get('/async', function(req, res, next) {
    // The script path here stands in for the test script shown above.
    var python = child_process.spawn('python3', ['test.py']);
    var output = "";
    python.stdout.on('data', function(data){
        output += data;
        console.log(output);
    });
    python.on('close', function(code){
        if (code !== 0) {
            return res.status(200).send(code);
        }
        return res.status(200).send(output);
    });
});
If you then run the Express server and do a GET /async, you will get a "1" as the exit code.
However, if you comment out the sleep(1) line, the server successfully returns
test 1
test 2
as the response.
You can even trigger this using sleep(0).
I have tried flushing stdout before the sleep; I have also tried piping the result instead of using .on('close'), and I have also tried the -u option when calling python (to use unbuffered streams).
None of this has worked, so I'm guessing there's some mechanism baked into Express that closes the request as soon as the spawned process sleeps OR finishes (instead of only when it finishes).
I also found this answer related to using child_process.fork(), but I'm not sure whether that would behave differently; this other one is very similar to my issue but has no answer.
Main question
So my question is: why does the Python script send an exit signal when calling sleep() (or, in the case of my query script, when running cursor.execute(query))?
If my supposition is correct that Express closes the request when a spawned process sleeps, is this avoidable?
One potential solution I found suggested the use of ZeroRPC, but I don't see how that would make Express keep the connection open.
The only other option I can think of is using something like Kue, so that my Express API only needs to respond with some sort of job ID; Kue would then actually spawn the Python script and wait for its response, and I could query the result via some other API endpoint.
Is there something I'm missing?
Edit:
AllTheTime's comment is correct regarding the sleep issue. After I added from time import sleep, it worked. However, my SQLite script was still not working.
As it turns out, AllTheTime was indeed correct.
The problem was that my Python script was loading a config.json file, which loaded correctly when the script was called from the console, because the working directory was the script's directory and the relative path resolved.
When calling it from Node, the relative path was no longer correct.
After fixing the path it worked exactly as expected.
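For reference, the usual way to make such a load immune to the caller's working directory is to resolve the path against the script file itself; a small sketch (only the config.json name comes from the question):

import json
import os

# Resolve config.json relative to this script file, not the
# process's current working directory.
SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))

with open(os.path.join(SCRIPT_DIR, "config.json")) as f:
    config = json.load(f)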

Kill Django process from browser

Within a Django view I call a function for uploading and importing an Excel file.
def import_log(request):
    report = ""
    if request.method == "POST":
        file_object = request.FILES
        sheet_name = request.POST["sheet_name"]
        if len(file_object):
            file_object = file_object["file_object"]
            if len(file_object):
                process_import()
        context = {
            "report": report
        }
        return render(request, "import_log.html", context)
    else:
        return import_upload_view(request, error="No file uploaded")
When I try to stop the page by clicking "Stop loading this page" or by closing the browser, the import process does not stop.
These import files are pretty big so I would like to be able to kill the process from the browser when needed.
How would I be able to do this?
Put simply, you can't.
The web works by sending requests to a server and then waiting for a response; the browser doesn't maintain an open connection to a process. It's the server's job to handle its own processes.
The browser is essentially nothing more than your computer's monitor, displaying the information sent to it; you could turn your monitor off or pull the plug as much as you like, but it's not going to stop your computer from running.
The only time Django (or the server) would know about the aborted connection is when it tries to send the response back. To see this, write a dummy view that sleeps for, say, 10 seconds; call it from the browser and stop the page load as fast as you can, then watch what Django does. If you are running the Django dev server, you will see that Django behaves normally and sends the response, but a 500 error occurs because of the aborted connection, and the attempt to send that 500 error to the client obviously fails too.
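That experiment, as a small sketch:

import time

from django.http import HttpResponse

def dummy_view(request):
    # Abort the page load in the browser during this sleep and watch
    # the dev-server console: the view still runs to completion.
    time.sleep(10)
    return HttpResponse("done")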
So, it is not possible to stop the view process in the middle.
But you can change the way you approach this problem: first send the request to your view, then spin off a new process to do the "pretty big" import; register the process using some unique ID and the current timestamp in a persistent data store (probably a database); and return HTTP status code 202 (Accepted) with the registered ID to end the view.
In the spun-off process, use multiple threads. One thread continuously polls the database to check the delta between the current time and the timestamp stored there. If the difference exceeds a threshold you decide on (say 10 seconds), the whole process should kill itself.
From the browser, keep hitting an API (another Django view) with AJAX to update the timestamp in the database for the particular record whose ID you got in the 202 response (a sketch of this heartbeat follows below).
The idea is to let the server know that the client is still available; if for some reason you stop seeing pings from the client, you treat it as the browser having been closed or navigated away from the page, and stop working on the spun-off process.
This approach may get tricky if you are building a single-page application.
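A rough sketch of the heartbeat pieces, assuming a hypothetical ImportJob model with job_id and last_ping fields (all names invented for illustration):

import threading
import time

from django.http import JsonResponse
from django.utils import timezone

# ImportJob is the hypothetical model holding job_id and last_ping.
HEARTBEAT_TIMEOUT = 10  # seconds without a ping before the job is abandoned

def ping(request, job_id):
    # Hit by the browser's AJAX loop every few seconds.
    ImportJob.objects.filter(job_id=job_id).update(last_ping=timezone.now())
    return JsonResponse({"status": "ok"})

def watchdog(job_id, stop_event):
    # Runs next to the import work; sets stop_event when pings stop,
    # so the import loop can check the flag and abort.
    while not stop_event.is_set():
        job = ImportJob.objects.get(job_id=job_id)
        if (timezone.now() - job.last_ping).total_seconds() > HEARTBEAT_TIMEOUT:
            stop_event.set()
        time.sleep(1)

def start_watchdog(job_id):
    stop_event = threading.Event()
    threading.Thread(target=watchdog, args=(job_id, stop_event), daemon=True).start()
    return stop_event  # the import loop polls stop_event.is_set()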

Django: Detecting when a script finishes in the background while other functionality keeps working in the meantime

I need to execute a command on a simple button-press event in my Django project (for which I'm using subprocess.Popen() in my views.py).
After I execute this script it may take anywhere from 2 minutes to 5 minutes to complete. While the script executes I need to disable the HTML button, but I want users to be able to continue using other web pages while the script finishes in the background. The real problem is that I want to re-enable the HTML button when the process finishes!
I've been stuck on this for many days. Any help or suggestion is really, really appreciated.
I think you have to use one of the "realtime" libraries for Django. I personally know django-realtime (a simple one) and SwampDragon (less simple, but more functional). With both of these libraries you can create a WebSocket connection and send messages from the server to clients that way. The message could be a command to enable the HTML button, a JavaScript alert, or whatever you want.
In your case I advise the first option, because you can send a message to the client directly from any view. SwampDragon needs a model to track changes, as far as I know.
Like valentjedi suggested, you should be using SwampDragon for real time with Django.
You should take the first tutorial here: http://swampdragon.net/tutorial/part-1-here-be-dragons-and-thats-a-good-thing/
Then read this as it holds knowledge required to accomplish what you want:
http://swampdragon.net/tutorial/building-a-real-time-server-monitor-app-with-swampdragon-and-django/
However, there is a difference between your situation and the example given above. In your situation:
Use Celery or any other task queue, since the action you wait for takes a long time to finish and needs to be passed to the background. (You can also make these tasks run one after another if you don't want to freeze your system with enormous memory usage.)
Move the part of the code that runs the script into your Celery task; in this case, Popen should be called in your Celery task and not in your view (router in SwampDragon).
Then create a channel with the user's unique identifier, and add the relevant SwampDragon JavaScript code to your HTML file so the button subscribes to that user's channel (also consider guarding the feature in your view (router), since front-end code can be tampered with).
The channel's role will be to poll the Celery task state; you then disable or enable the button according to the state of the task.
Overview:
Create a Celery task for your script.
Create a user-unique channel that polls the task state.
Disable or enable the button on the front end according to the state of the task; consider displaying a failure message if the script fails so that the user can restart it.
Hope this helps!
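As a rough illustration of the task/state-check split (the task body and script path are invented):

# tasks.py
import subprocess

from celery import shared_task

@shared_task
def run_script():
    # Popen/run lives in the task, not the view, as suggested above.
    subprocess.run(["/path/to/script.sh"], check=True)  # hypothetical path

# views.py
from celery.result import AsyncResult
from django.http import JsonResponse

def task_state(request, task_id):
    # The channel (or an AJAX poll) reads this to decide whether
    # the button should be re-enabled.
    state = AsyncResult(task_id).state  # PENDING / STARTED / SUCCESS / FAILURE
    return JsonResponse({"state": state})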

How can I print 'process status' in Django?

In one of my views I have several steps that take 5 to 7 minutes to finish in total, so I was wondering if there is a way to print the status of the view in the browser, like:
"Calculating models..."
"Post processing models..."
"Making DB..."
"Cleaning old tables..."
Is there a way to do that?
Thanks!
Such heavy-duty work should probably not be part of a Django view. You might want to look into django-celery for asynchronous task management.
However, you can do something like that just fine by polling your server. The easy setup is short polling: basically a JavaScript loop that triggers an AJAX request to the server every i seconds, retrieving a status response* which you can use to show your user anything.
*You'll have to set up a URL and a function that calculates the status somehow, or, if you're using Celery, you can use its asynchronous result.
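A minimal sketch of such a status endpoint, assuming the long-running code writes its current step into Django's cache under an invented key:

from django.core.cache import cache
from django.http import JsonResponse

# The long-running code would set the key between steps, e.g.
#     cache.set("job_status", "Calculating models...")
#     cache.set("job_status", "Post processing models...")

def job_status(request):
    # Polled by the JavaScript loop described above.
    return JsonResponse({"status": cache.get("job_status", "idle")})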
