The code below is almost what I need, but when a second user connects, a new subprocess.Popen is created. I need to run only one subprocess.Popen and send the same data to multiple users.
Example: the first user connects and subprocess.Popen starts; he begins to receive the output starting from the number 0. When a second user connects 30 seconds later, he begins to receive the output starting from the number 30.
#!/usr/bin/env python
import os
from functools import partial
from subprocess import Popen, PIPE
from flask import Flask, Response  # $ pip install flask

file = 'test.file'
app = Flask(__name__)

@app.route('/' + file)
def stream():
    # spawn a shell loop that prints a number every second
    process = Popen(["bash", "-c", "for ((i=0;i<100;i=i+1)); do echo $i; sleep 1; done"],
                    stdout=PIPE, bufsize=-1)
    read_chunk = partial(os.read, process.stdout.fileno(), 1024)
    return Response(iter(read_chunk, b''), mimetype='audio/mp3')

if __name__ == "__main__":
    app.run(host='0.0.0.0', threaded=True)
To be honest, I am not sure this will work. I do not use the subprocess module much, so I am not sure this is an appropriate use case. The question in general reminds me of Flask extensions.
I was trying to suggest you use a pattern similar to the one Flask extensions use:
check if resource exists
if it does not, create it
return it
Store it on a Flask global, which is the API recommendation for extension development.
edit: realized your route was by file name so changed keys to reflect that
#!/usr/bin/env python
import os
from functools import partial
from subprocess import Popen, PIPE
from flask import Flask, Response, _app_ctx_stack as stack

file = 'test.file'
app = Flask(__name__)

def get_chunk(file):
    ctx = stack.top
    key = "read_chunk_%s" % file
    if ctx is not None:
        if not hasattr(ctx, key):
            # create the resource only if it does not exist yet
            process = Popen(["bash", "-c", "for ((i=0;i<100;i=i+1)); do echo $i; sleep 1; done"],
                            stdout=PIPE, bufsize=-1)
            setattr(ctx, key, partial(os.read, process.stdout.fileno(), 1024))
        return getattr(ctx, key)

@app.route('/' + file)
def stream():
    read_chunk = get_chunk(file)
    return Response(iter(read_chunk, b''), mimetype='audio/mp3')

if __name__ == "__main__":
    app.run(host='0.0.0.0', threaded=True)
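One caveat: the Flask app context is created per request, so stashing the process on it may still give each user their own process. A minimal sketch of truly sharing one process, using a background reader thread that fans each line out to per-client queues (the names `reader`, `subscribe`, and `client_stream` are my own, not from any library):

```python
import queue
import threading

subscribers = []                     # one Queue per connected client
subscribers_lock = threading.Lock()

def reader(process):
    """Read the single shared process and fan each line out to every client."""
    for line in process.stdout:
        with subscribers_lock:
            for q in subscribers:
                q.put(line)
    # signal end-of-stream to everyone
    with subscribers_lock:
        for q in subscribers:
            q.put(None)

def subscribe():
    """Register a new client; it receives output from the moment it joins."""
    q = queue.Queue()
    with subscribers_lock:
        subscribers.append(q)
    return q

def client_stream(q):
    """Generator suitable for a streaming Response body."""
    while True:
        line = q.get()
        if line is None:
            break
        yield line
```

A view would start the Popen and the reader thread once (for example at module import), call subscribe() per request, and return Response(client_stream(q), ...), so a client joining later simply starts receiving from wherever the stream currently is.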
Currently, I am building an application capable of executing shell commands and sending the results to a web browser.
My problem is that the results display fine in the browser, but not in real time. The command I'm having trouble with is the Linux top command, which prints the list of running processes along with CPU and RAM usage in real time. Is it possible to do this with Flask?
I've seen similar posts before, but they didn't help me. Frankly, I'm still new to Flask, and your help would be very useful to me. Thanks in advance.
Following is the code I've tried (the command is in the test.sh file):
import flask
from shelljob import proc

app = flask.Flask(__name__)

@app.route('/stream')
def stream():
    g = proc.Group()
    p = g.run(["./test.sh"])

    def read_process():
        while g.is_pending():
            lines = g.readlines()
            # loop variable renamed so it does not shadow the imported proc module
            for process, line in lines:
                yield line

    return flask.Response(read_process(), mimetype='text/json')

@app.route('/exec')
def streaming():
    return flask.render_template('index.html')

if __name__ == "__main__":
    app.run(debug=True)
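For reference, the same streaming route can be written with only the standard library, without shelljob. A sketch, assuming the command produces line-buffered output:

```python
import subprocess

def stream_command(args):
    """Yield a command's output line by line as it is produced."""
    process = subprocess.Popen(
        args,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        bufsize=1,                 # line-buffered
        universal_newlines=True,   # text mode
    )
    for line in iter(process.stdout.readline, ''):
        yield line
    process.stdout.close()
    process.wait()
```

In the route above, flask.Response(stream_command(["./test.sh"]), mimetype='text/plain') would replace the shelljob group. Note also that top only emits plain line output in batch mode (top -b); its normal interactive screen-drawing output does not stream well over HTTP.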
I wrote an API in Django. It runs a shell script on the server and prints some logs. When the client calls the API in the browser, I want the browser to print the logs line by line. So far, the snippet is:
from django.http import StreamingHttpResponse
import subprocess
import shlex
def test(request):
    cmd = 'bash test.sh'
    args = shlex.split(cmd)
    proc = subprocess.Popen(
        args,
        cwd='/foo/bar',
        shell=False,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
    )
    return StreamingHttpResponse(
        (line for line in iter(proc.stdout.readline, '')),
        content_type="text/plain",
    )
And test.sh is:
#!/bin/bash
echo hello
sleep 1
echo world
sleep 1
echo love
sleep 1
echo and
sleep 1
echo peace
sleep 1
The logs are not printed until bash test.sh finishes. How can I make them print as they are produced, as if test.sh were being run locally?
I'm using Python 2.7 and Django 1.11.29. I know they are old, but I still have to use them for now. Thanks for the help.
You provide the content to StreamingHttpResponse as:
(line for line in iter(proc.stdout.readline, ''))
Here you build an iterable, iterate over it, and return a generator. That by itself is fine, and you should be getting real-time output. The problem may simply be that the output is too fast to see a real-time difference, or your server configuration is incorrect (see Streaming HTTP response, flushing to the browser). Another possibility is that you are not consuming the response as a stream with JavaScript, and are instead downloading it fully before using it. You can add a time.sleep() to tell whether it is just too fast or a configuration problem:
import time

def iter_subproc_with_sleep(proc):
    for line in iter(proc.stdout.readline, ''):
        time.sleep(1)
        yield line

def test(request):
    cmd = 'bash test.sh'
    args = shlex.split(cmd)
    proc = subprocess.Popen(
        args,
        cwd='/foo/bar',
        shell=False,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
    )
    return StreamingHttpResponse(
        iter_subproc_with_sleep(proc),
        content_type="text/plain",
    )
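One pitfall to note for when this code is eventually ported off Python 2: on Python 3, proc.stdout yields bytes by default, so the '' sentinel never matches and iter() never stops. A sketch of a Python 3-safe generator that keeps the '' sentinel by opening the pipe in text mode:

```python
import shlex
import subprocess

def stream_lines(cmd):
    """Yield decoded lines from a command as they appear (Python 3)."""
    proc = subprocess.Popen(
        shlex.split(cmd),
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        universal_newlines=True,   # text mode, so the '' sentinel works
    )
    # with a bytes pipe, the sentinel would have to be b'' instead of ''
    for line in iter(proc.stdout.readline, ''):
        yield line
    proc.stdout.close()
    proc.wait()
```

The generator can be passed to StreamingHttpResponse exactly as before.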
How can I execute a Linux command inside a Python function? I will run the Python file on a Linux-based server, and in some functions I want something like:
def function():
    # execute some commands on the linux system, e.g. python /path1/path2/file.py
    # or execute a shell script, e.g. /path1/path2/file.sh
Which Python module do I need to achieve this?
Thanks in advance.
This code will create a Flask server and allow you to run commands. You can also capture the output.
import subprocess
from flask import Flask

app = Flask(__name__)

def run_command(command):
    # warning: shell=True with user-supplied input is a command-injection risk
    return subprocess.Popen(command, shell=True, stdout=subprocess.PIPE).stdout.read()

@app.route('/<command>')
def command_server(command):
    return run_command(command)
You can run it by saving the above text in server.py:
$ export FLASK_APP=server.py
$ flask run
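Be aware that the route above passes raw URL input to a shell, so anyone who can reach the server can run arbitrary commands. A safer sketch restricts requests to a fixed allowlist (the command names here are my own examples):

```python
import subprocess

# map a safe public name to the exact argument list to execute
ALLOWED_COMMANDS = {
    "date": ["date"],
    "uptime": ["uptime"],
}

def run_allowed(name):
    """Run a command only if its name is on the allowlist; return None otherwise."""
    args = ALLOWED_COMMANDS.get(name)
    if args is None:
        return None
    # no shell is involved, so request input cannot inject extra commands
    return subprocess.check_output(args)
```

A route like @app.route('/<name>') would call run_allowed(name) and return a 404 when it gives back None.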
Try the following:
import os, subprocess

# if you do not need to parse the result
def function():
    os.system('ls')

# collect the result
def function(command):
    out = subprocess.run(
        command.split(" "),
        stdout=subprocess.PIPE)
    return out.stdout
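On Python 3.7+, subprocess.run can also decode the output and check the exit status for you. A small sketch of the same helper returning text instead of bytes:

```python
import subprocess

def run_text(command):
    """Run a command and return its decoded stdout as a string."""
    result = subprocess.run(
        command.split(" "),
        stdout=subprocess.PIPE,
        text=True,    # decode bytes to str
        check=True,   # raise CalledProcessError on a non-zero exit
    )
    return result.stdout
```

For example, run_text("echo hello") returns the string "hello\n" rather than bytes.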
The code below is executed on a certain URL (/new...) and assigns variables to the session cookie, which is used to build the display. This example calls a command using subprocess.Popen.
The problem is that the Popen command called below typically takes 3 minutes, and subprocess.communicate waits for the output, during which time all other Flask calls (e.g. another user connecting) are halted. I have left in some commented lines related to other things I've tried without success: one using the threading module and another using subprocess.poll.
from app import app
from flask import render_template, redirect, session
from subprocess import Popen, PIPE
import threading

@app.route('/new/<number>')
def new_session(number):
    get_list(number)
    #t = threading.Thread(target=get_list, args=(number,))
    #t.start()
    #t.join()
    return redirect('/')

def get_list(number):
    #1 Call JAR Get String
    command = 'java -jar fetch.jar ' + str(number)
    print "Executing " + command
    stream = Popen(command, shell=False, stdout=PIPE)
    #while stream.poll() is None:
    #    print "stream.poll = " + str(stream.poll())
    #    time.sleep(1)
    stdout, stderr = stream.communicate()
    #do some item splits and some processing, left out for brevity
    session['data'] = stdout.split("\r\n")
    return
What's the "better practice" for handling this situation correctly?
For reference, this code runs under Python 2.7.8 on win32, with Flask 0.10.1.
First, you should use a task queue like Celery, with a broker such as RabbitMQ or Redis (here is a helpful hint).
Then the get_list function becomes:
@celery.task
def get_list(number):
    command = 'java -jar fetch.jar {}'.format(number)
    print "Executing " + command
    stream = Popen(command, shell=False, stdout=PIPE)
    stdout, stderr = stream.communicate()
    return stdout.split('\r\n')
And in your view, you wait for the result:
@app.route('/new/<number>')
def new_session(number):
    result = get_list.delay(number)
    session['data'] = result.wait()
    return redirect('/')
Now, it doesn't block your view! :)
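If a full task queue is more than the project needs, a minimal in-process variant is possible with the standard library. A sketch that trades Celery's durability for simplicity (the job store here is my own construction, not a Flask or Celery API):

```python
import subprocess
import uuid
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=2)
jobs = {}  # job id -> Future

def start_job(args):
    """Run a command off the request thread; return a job id immediately."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = executor.submit(subprocess.check_output, args)
    return job_id

def job_result(job_id):
    """Return the command's output if finished, else None."""
    future = jobs.get(job_id)
    if future is None or not future.done():
        return None
    return future.result()
```

A /new/<number> view would call start_job and redirect straight away, and a second route would poll job_result until it returns output, so no request ever blocks for the full 3 minutes.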
Let me take an example: I want to run a mail function from the scheduler. I made mail.py under modules:
from gluon.tools import Mail

mail = Mail()
mail.settings.server = 'smtp.gmail.com:587'
mail.settings.login = 'ass.aa@gmail.com:aaaaaa'
mail.settings.sender = 'aaa.aa@aa.com'
mail.send(to=['aaa@aaa.a'],
          subject='aaaaaa',
          message='<html>'
                  '<body>'
                  'test mail'
                  '</body>'
                  '</html>')
My crontab file is:
0-59/1 * * * * root *applications/comv1/modules/mail1.py
The scheduler.py file, which is under home > application name on PythonAnywhere:
#!/usr/bin/env python
import os
import socket
import sys
import subprocess

filename = os.path.abspath(__file__)  # we use this to generate a unique socket name
try:
    # binding an abstract-namespace socket acts as a lock: only one copy can run
    socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM).bind('\0' + filename)
except socket.error:
    print("Failed to acquire lock, task must already be running")
    sys.exit()

subprocess.call(["python", "web2py/web2py.py", "-K", "comv3"])
Even after this, I am not able to run the cron job. Can anyone help me?
PythonAnywhere does not read crontab files. If you want to run a task periodically, we provide Scheduled Tasks; there are docs on how to use them on our help pages.