I have created a streamlit app which runs fine on my local machine. However, I cannot get it running on the streamlit-cloud. In a nutshell my app does the following:
take some user input
create a markdown file from the input ("deck.md")
convert the markdown file to an HTML file ("deck.html") using an npx command via subprocess
open a new tab showing the HTML file via subprocess
On my local machine I use the following command to do steps 3 and 4:
import subprocess

def markdown2marp(file):
    # Create HTML ('@' is the npm scope/version separator)
    marp_it = f"npx @marp-team/marp-cli@latest --theme custom.css --html {file}"
    file = file.rsplit(".", 1)[0]  # remove the .md extension
    proc = subprocess.run(marp_it, shell=True, stdout=subprocess.PIPE)
    # Open HTML in browser (macOS 'open' command)
    subprocess.Popen(['open', f'{file}.html'])
    return proc
Now, this obviously does not work on Streamlit Cloud.
Question: is there a workaround to achieve the described functionality on Streamlit Cloud?
Any help is much appreciated!
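One possible direction (a sketch, not a tested Streamlit Cloud deployment): the `open` call can never work in the cloud because the browser tab would have to open on the user's machine, not the server. Instead, the generated HTML can be rendered inline with `components.html` or offered via `st.download_button` (both are real Streamlit APIs). This assumes Node/npx is available in the deployment environment; the helper names below are illustrative.

```python
import subprocess
from pathlib import Path

def marp_command(md_file, theme="custom.css"):
    # '@' (not '#') is the npm scope/version separator; a list argv avoids shell=True
    return ["npx", "@marp-team/marp-cli@latest", "--theme", theme, "--html", md_file]

def markdown2html(md_file):
    # Convert the deck and return the HTML source instead of opening a tab
    subprocess.run(marp_command(md_file), check=True)
    return Path(md_file).with_suffix(".html").read_text()

# In the Streamlit app, render the deck inline instead of opening a new tab:
#   import streamlit as st
#   import streamlit.components.v1 as components
#   html = markdown2html("deck.md")
#   components.html(html, height=600, scrolling=True)
#   st.download_button("Download deck.html", html, file_name="deck.html")
```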
Related
I'm trying to write a python script that can launch DaVinci Resolve in headless mode, then send it some commands via its API, then close it.
What I'm looking for would look something like
Open resolve.exe with argument --nogui
Do stuff with the API here
Terminate this instance of Resolve
I've managed to launch an instance of Resolve in headless. But it always ends up being a subprocess of something else. While it's running as a subprocess, I can't get the API to communicate with it.
Here's the code I've tried:
import subprocess
args = [r"C:\Program Files\Blackmagic Design\DaVinci Resolve\Resolve.exe", '--nogui']
resolve_headless = subprocess.Popen(args)
from python_get_resolve import GetResolve
resolve = GetResolve()
This should return a Resolve object, but it always fails.
I believe this is because it's running as a subprocess of my IDE.
I've also tried this
from subprocess import call
dir = r"C:\Program Files\Blackmagic Design\DaVinci Resolve"
cmdline = "Resolve.exe --nogui"
rc = call("start cmd /K " + cmdline, cwd=dir, shell=True)
This just has the same problem of Resolve running as a subprocess of Windows Command Processor.
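One thing to try (a sketch, untested with Resolve itself): Windows lets you break the parent/child console tie with the `DETACHED_PROCESS` and `CREATE_NEW_PROCESS_GROUP` creation flags, which are real Windows-only `subprocess` constants in Python 3.7+. Whether that is enough for the Resolve API to see the instance is an assumption; this only shows how to launch detached.

```python
import subprocess
import sys

def launch_detached(args):
    """Start a process detached from the launching console/IDE."""
    flags = 0
    if sys.platform == "win32":
        # Windows-only constants: detach from the parent console
        flags = subprocess.DETACHED_PROCESS | subprocess.CREATE_NEW_PROCESS_GROUP
    return subprocess.Popen(args, creationflags=flags)

# proc = launch_detached(
#     [r"C:\Program Files\Blackmagic Design\DaVinci Resolve\Resolve.exe", "--nogui"])
# ... give Resolve time to start, then try GetResolve() ...
# proc.terminate()
```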
How can I execute a Linux command inside a Python function? I will run the Python file on a Linux-based server, and in some functions I want to have something like:
def function():
    # execute some commands on the Linux system, e.g. python /path1/path2/file.py
    # or execute a shell script, e.g. /path1/path2/file.sh
What Python module do I need to achieve this?
Thanks in advance.
This code will create a Flask server and allow you to run commands. You can also capture the output.
import subprocess
from flask import Flask

app = Flask(__name__)

def run_command(command):
    return subprocess.Popen(command, shell=True, stdout=subprocess.PIPE).stdout.read()

@app.route('/<command>')
def command_server(command):
    return run_command(command)
You can run it by saving the above text in server.py and running:
$ export FLASK_APP=server.py
$ flask run
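A word of caution on the above: the endpoint executes an arbitrary URL path through a shell, so any visitor can inject commands with `;` or `&&`. A minimal hardening sketch (same `run_command` role, assumption: you still trust the commands themselves) is to tokenize the string and avoid `shell=True`; ideally you would also check the command against an allowlist.

```python
import shlex
import subprocess

def run_command(command):
    # shell=False plus a tokenized argv prevents shell metacharacters
    # (';', '&&', '|') in the URL from being interpreted by a shell
    return subprocess.run(shlex.split(command), stdout=subprocess.PIPE).stdout
```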
Try the following:
import os, subprocess

# if you do not need to parse the result
def function():
    os.system('ls')

# collect result
def function(command):
    out = subprocess.run(
        command.split(" "),
        stdout=subprocess.PIPE)
    return out.stdout
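If you also want failures surfaced and text instead of bytes, `subprocess.run` supports `check=True` (raises `CalledProcessError` on a non-zero exit code) and `capture_output`/`text` (Python 3.7+). A small variant of the function above (the name `run_checked` is just illustrative):

```python
import subprocess

def run_checked(command):
    # check=True raises on a non-zero exit code;
    # capture_output + text return stdout as str instead of bytes
    result = subprocess.run(
        command.split(),
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

# e.g. run_checked("ls -l")
```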
I want to copy a folder from one server to another by executing the following shell command
sshpass -p 'XXXX' scp -r root@X.X.X.X(ip): Sourcedirectory(sdr) DestinationDirectory(ddr)
using a Python script and a web GUI.
I have an HTML form page with 3 inputs (ip, sdr, ddr). When I pass these inputs through the web page to the Python script, they are copied to their respective locations, but the command is not executed, while the rest of the lines in the Python file do get executed.
If the same script is executed from the command line, it works.
Can anyone help me figure out where I'm going wrong?
The following is the code I used.
Python script:
#!/usr/bin/python
import gzip,glob,os.path
import os
import subprocess
# Import modules for CGI handling
import cgi, cgitb
import time
# Create instance of FieldStorage
# Get data from fields
form = cgi.FieldStorage()
ip = form.getvalue('ip')
sdr = form.getvalue('sdr')
ddr = form.getvalue('ddr')
import sys
class RunCmd(object):
    def cmd_run(self, cmd):
        self.cmd = cmd
        subprocess.call(self.cmd, shell=True)

id = ("sshpass -p 'XXXX' scp -r root@%s:%s %s" % (ip, sdr, ddr))
b = RunCmd()
b.cmd_run(id)
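A likely culprit is that the CGI process runs as the web-server user with a restricted PATH and no stderr visible, so `sshpass`/`scp` fail silently. A diagnostic sketch (function names are illustrative, password handling unchanged from the question): build the argv as a list to sidestep quoting, and capture stderr so the actual error message can be written into the response.

```python
import subprocess

def scp_command(ip, sdr, ddr, password="XXXX"):
    # '@' (not '#') separates user and host; a list argv sidesteps shell quoting
    return ["sshpass", "-p", password, "scp", "-r",
            "root@%s:%s" % (ip, sdr), ddr]

def run_scp(ip, sdr, ddr):
    proc = subprocess.run(scp_command(ip, sdr, ddr),
                          stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    # scp's error message lands on stderr; surface it instead of discarding it
    return proc.returncode, proc.stderr.decode()
```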
I have a script test.py which is used for a server automation task, and another script server.py which lists all server names.
server.py lists all server names in a text.log file, and this log file is used by test.py as input.
I want a single script test.py that executes server.py from inside it and also redirects the server.py output to the text.log file.
So far I have tried execfile("server.py") > text.log in test.py, which didn't work.
Use subprocess.call with stdout argument:
import subprocess
import sys
with open('text.log', 'w') as f:
    subprocess.call([sys.executable, 'server.py'], stdout=f)
    # add stderr=subprocess.STDOUT if you also want to capture standard error output
Thanks in advance for any help. I am fairly new to python and even newer to html.
I have been trying the last few days to create a web page with buttons to perform tasks on a home server.
At the moment I have a python script that generates a page with buttons:
(See the simplified example below. removed code to clean up post)
Then a python script which runs said command and outputs to an iframe on the page:
(See the simplified example below. removed code to clean up post)
This does output the entire finished output after the command is finished. I have also tried adding the -u option to the python script to run it unbuffered. I have also tried using the Python subprocess as well. If it helps the types of commands I am running are apt-get update, and other Python scripts for moving files and fixing folder permissions.
When run from a normal Ubuntu server terminal, it runs fine and outputs in real time, and from my research it should be outputting as the command runs here as well.
Can anyone tell me where I am going wrong? Should I be using a different language to perform this function?
EDIT Simplified example:
initial page:
#runcmd.html
<head>
<title>Admin Tasks</title>
</head>
<center>
<iframe src="/scripts/python/test/createbutton.py" width="650" height="800" frameborder="0" ALLOWTRANSPARENCY="true"></iframe>
<iframe width="650" height="800" frameborder="0" ALLOWTRANSPARENCY="true" name="display"></iframe>
</center>
script that creates button:
cmd_page = '<form action="/scripts/python/test/runcmd.py" method="post" target="display" >' + '<label for="run_update">run updates</label><br>' + '<input align="Left" type="submit" value="runupdate" name="update" title="run_update">' + "</form><br>" + "\n"
print ("Content-type: text/html")
print ''
print cmd_page
script that should run command:
# runcmd.py:
import os
import pexpect
import cgi
import cgitb
import sys
cgitb.enable()
fs = cgi.FieldStorage()
sc_command = fs.getvalue("update")
if sc_command == "runupdate":
    cmd = "/usr/bin/sudo apt-get update"
    pd = pexpect.spawn(cmd, timeout=None, logfile=sys.stdout)
    print ("Content-type: text/html")
    print ''
    print "<pre>"
    line = pd.readline()
    while line:
        print line
        line = pd.readline()
I haven't tested the above simplified example, so I'm unsure if it's functional.
EDIT:
Simplified example should work now.
Edit:
With Imran's code below, if I open a browser to ip:8000 it displays the output just like it was running in a terminal, which is exactly what I want. However, I am using an Apache server for my website and an iframe to display the output. How do I do that with Apache?
edit:
I now have the output going to the iframe using Imran's example below, but it still seems to buffer. For example:
If I have the script (through the web server, using curl ip:8000) run apt-get update, it runs fine in the terminal, but when outputting to the web page it seems to buffer a couple of lines => output => buffer => output until the command is done.
But running other Python scripts the same way buffers and then outputs everything at once, even with the -u flag, while running curl ip:8000 in the terminal outputs like normal.
Is that just how it is supposed to work?
EDIT 19-03-2014:
Any bash/shell command I run using Imran's method seems to output to the iframe in near real time. But if I run any kind of Python script through it, the output is buffered and then sent to the iframe.
Do I possibly need to PIPE the output of the python script that is run by the script that runs the web server?
You need to use HTTP chunked transfer encoding to stream unbuffered command-line output. CherryPy's wsgiserver module has built-in support for chunked transfer encoding. WSGI applications can be either functions that return lists of strings or generators that produce strings. If you use a generator as a WSGI application, CherryPy will use chunked transfer automatically.
Let's assume this is the program, of which the output will be streamed.
# slowprint.py
import sys
import time
for i in xrange(5):
    print i
    sys.stdout.flush()
    time.sleep(1)
This is our web server.
2014 Version (older CherryPy versions)
# webserver.py
import subprocess
from cherrypy import wsgiserver
def application(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    proc = subprocess.Popen(['python', 'slowprint.py'], stdout=subprocess.PIPE)
    line = proc.stdout.readline()
    while line:
        yield line
        line = proc.stdout.readline()

server = wsgiserver.CherryPyWSGIServer(('0.0.0.0', 8000), application)
server.start()
2018 Version
#!/usr/bin/env python2
# webserver.py
import subprocess
import cherrypy
class Root(object):
    def index(self):
        def content():
            proc = subprocess.Popen(['python', 'slowprint.py'], stdout=subprocess.PIPE)
            line = proc.stdout.readline()
            while line:
                yield line
                line = proc.stdout.readline()
        return content()
    index.exposed = True
    index._cp_config = {'response.stream': True}

cherrypy.quickstart(Root())
Start the server with python webserver.py, then in another terminal make a request with curl and watch the output being printed line by line:
curl 'http://localhost:8000'