Outputting with Willie bot doesn't work, but print does - python

I am trying to return the output of a command to the IRC channel using Willie bot.
My code works when I print the variable line by line, but for some reason once I use Willie bot's say command to output to IRC, nothing is output.
Here is my code:
from willie import module
import subprocess
import urllib2
import os
@module.commands('splint')
def splint(bot, trigger):
    bot.reply('I will process your request now!')
    page = urllib2.urlopen(trigger.group(2))
    page_content = page.read()
    with open('codeToCheck.c', 'w') as code:
        code.write(page_content)
    command = 'splint "C:\Users\Justin\Desktop\codeToCheck.c"'
    output = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()[0]
    bot.say('I have saved the file successfully. Outputting:')
    for i in output.splitlines():
        bot.say(i)
    bot.say(output)
Using my little test code here I have determined it works with print:
import subprocess, os

output = subprocess.Popen(["splint", "C:\cygwin64\home\Justin\codeToCheck.c"], stdout=subprocess.PIPE).communicate()[0]
command = 'splint "C:\Users\Justin\Desktop\codeToCheck.c"'
output = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()[0]
for i in output.splitlines():
    print i
print 'I have saved the file successfully. Outputting:'
This is what the irc output looks like for my code:
<Fogest> .splint http://pastebin.com/raw.php?i=8cB7DdnQ
<Fogbot> Fogest: I will process your request now!
<Fogbot> I have saved the file successfully. Outputting:
There should be output, but there is nothing. Am I doing something wrong here? Running my test file (the test code shown in this post) from the command line, I get the following output, as I should:
$ python test.py
Splint 3.1.2 --- 25 Aug 2010
Finished checking --- no warnings
I have saved the file successfully. Outputting:

I switched the code to use the following instead:
process = subprocess.Popen(command,shell=True, stderr=subprocess.PIPE)
output = process.stderr.read()
The problem is now resolved.
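For anyone hitting the same issue: in this setup splint evidently writes its report to stderr, which is why reading stdout returned nothing. An alternative sketch (untested, and only an illustration) merges stderr into stdout so a single read captures the output whichever stream splint uses; inside the bot command you would call bot.say(line) instead of print.

import subprocess

command = 'splint "C:\Users\Justin\Desktop\codeToCheck.c"'
# Merge stderr into stdout so one read captures everything splint prints,
# regardless of which stream it actually uses.
process = subprocess.Popen(command, shell=True,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.STDOUT)
output = process.communicate()[0]
for line in output.splitlines():
    print line  # in the bot handler: bot.say(line)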

Related

Unable to run powershell script with parameters from within python

I'm trying to run a PowerShell script (with parameters) from within Python (version 3.8.3), and after reading many Stack Overflow posts I arrived at the following code:
import subprocess
path = r"'C:\Program Files\Company\Some space\Some-Script.ps1'"
args = r"-UserId abcd1234 -PageUri https://example.com/testapp -Privileges #('ReadOnly', 'ReadWrite')"
full_command = f'"& {path} {args}"';
result = subprocess.run(['powershell.exe', full_command ], capture_output=True)
print('printing out ...')
print(result.stdout)
print('printing error ...')
print(result.stderr)
However, when I run the above Python script, the following output is generated, which by the looks of it just dumps the PowerShell command back out without actually running it:
printing out ...
b"& 'C:\\Program Files\\Company\\Some space\\Some-Script.ps1' -UserId abcd1234 -PageUri https://example.com/testapp -Privileges #('ReadOnly', 'ReadWrite')\r\n"
printing error ...
b''
The Python code above was derived from the following Windows shell command, which works fine:
powershell.exe "& 'C:\Program Files\Company\Some space\Some-Script.ps1' -UserId abcd1234 -PageUri https://example.com/testapp -Privileges #('ReadOnly', 'ReadWrite')"
Can someone kindly tell me what the issue is with my Python code?
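One likely cause (my own guess, not confirmed in the thread): the extra pair of double quotes baked into full_command. On the cmd.exe line those quotes are consumed by the shell, but subprocess already passes the argument to powershell.exe as a single token, so PowerShell receives a quoted string literal and simply echoes it back. A sketch without the extra quotes, passing the command explicitly via -Command:

import subprocess

path = r"'C:\Program Files\Company\Some space\Some-Script.ps1'"
args = r"-UserId abcd1234 -PageUri https://example.com/testapp -Privileges @('ReadOnly', 'ReadWrite')"

# No surrounding double quotes: PowerShell should now treat "&" as the call
# operator rather than receiving one big string literal to echo.
full_command = f"& {path} {args}"
result = subprocess.run(['powershell.exe', '-Command', full_command], capture_output=True)
print(result.stdout.decode())
print(result.stderr.decode())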

Subprocess to open python file and return data

I am trying to use Python to run another Python file. This file will start up a socket, create threads that listen for additional connections, and threads that send/receive data. The main thread will not return.
However, if the socket setup fails, I want to return an error code to the other Python script that executed the subprocess.
main.py
py3output = subprocess.check_output(['python3', 'py3.py'])
print('py3 said:' + str(py3output))
py3.py
def returnme():
    return 10

returnme()
When I run this, it prints:
py3 said:b''
I am just trying to figure out how to get the return value back to the main calling program.
To return an exit code n back to the OS, you need sys.exit(n). But it seems you do not want to check the exit code but the stdout output, so your program might need to be rewritten to:
def returnme():
    return 10

print(returnme())
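If, on the other hand, you really do want an exit code rather than stdout, a minimal sketch of that route (my own illustration; setup_sockets is a hypothetical stand-in for the question's socket setup):

py3.py

import sys

def setup_sockets():
    # hypothetical placeholder for the real socket setup; return True on success
    return False

if not setup_sockets():
    sys.exit(1)  # non-zero exit code tells the caller that setup failed
sys.exit(0)

main.py

import subprocess

result = subprocess.run(['python3', 'py3.py'])
if result.returncode != 0:
    print('py3.py reported a socket setup failure')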
You can only hand data back through standard output as a string, using code like the following:
sample.py
import sys
def returnme():
    sys.stdout.write(str(10))
    sys.stdout.flush()

returnme()
main.py
from subprocess import check_output
output = check_output(['python','sample.py'])
print('Sample.py says :' + output.decode())

How can python wait for a batch SGE script finish execution?

I have a problem I'd like you to help me to solve.
I am working in Python and I want to do the following:
call an SGE batch script on a server
see if it works correctly
do something
What I do now is approximately the following:
import subprocess
try:
    tmp = subprocess.call(qsub ....)
    if tmp != 0:
        error_handler_1()
    else:
        correct_routine()
except:
    error_handler_2()
My problem is that once the script is submitted to SGE, my Python script treats the submission as a success and carries on as if the job had already finished.
Do you have any suggestion about how could I make the python code wait for the actual processing result of the SGE script ?
Ah, btw I tried using qrsh but I don't have permission to use it on the SGE
Thanks!
From your code, it looks like you want the program to wait for the job to finish and check its return code, right? If so, qsub's -sync option is likely what you want:
http://gridscheduler.sourceforge.net/htmlman/htmlman1/qsub.html
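A minimal sketch of that approach (my own illustration, assuming a job script named job.sh; with -sync y, qsub blocks until the job completes and its exit status reflects the job's outcome):

import subprocess

# -sync y: qsub waits for the job to finish instead of returning at submission
ret = subprocess.call(['qsub', '-sync', 'y', 'job.sh'])
if ret == 0:
    correct_routine()      # handlers from the question's own snippet
else:
    error_handler_1()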
An additional answer, for easier processing: use the Python drmaa module (link), which allows more complete interaction with SGE.
A working example from the documentation is given below (provided you put a sleeper.sh script in the same directory).
Please note that the -b n option is needed to execute a .sh script; otherwise qsub expects a binary by default, as explained here.
import drmaa
import os
def main():
    """Submit a job.

    Note, need file called sleeper.sh in current directory.
    """
    s = drmaa.Session()
    s.initialize()
    print 'Creating job template'
    jt = s.createJobTemplate()
    jt.remoteCommand = os.getcwd() + '/sleeper.sh'
    jt.args = ['42', 'Simon says:']
    jt.joinFiles = False
    jt.nativeSpecification = "-m abe -M mymail -q so-el6 -b n"
    jobid = s.runJob(jt)
    print 'Your job has been submitted with id ' + jobid
    retval = s.wait(jobid, drmaa.Session.TIMEOUT_WAIT_FOREVER)
    print('Job: {0} finished with status {1}'.format(retval.jobId, retval.hasExited))
    print 'Cleaning up'
    s.deleteJobTemplate(jt)
    s.exit()

if __name__ == '__main__':
    main()

Is it possible to stream output from a python subprocess to a webpage in real time?

Thanks in advance for any help. I am fairly new to python and even newer to html.
I have been trying the last few days to create a web page with buttons to perform tasks on a home server.
At the moment I have a python script that generates a page with buttons:
(See the simplified example below. removed code to clean up post)
Then a python script which runs said command and outputs to an iframe on the page:
(See the simplified example below. removed code to clean up post)
This does work, but it outputs the entire result only after the command has finished. I have also tried adding the -u option so the Python script runs unbuffered, and I have also tried using the Python subprocess module. If it helps, the kinds of commands I am running are apt-get update and other Python scripts for moving files and fixing folder permissions.
When run from a normal Ubuntu server terminal it runs fine and outputs in real time, and from my research it should be outputting as the command runs.
Can anyone tell me where I am going wrong? Should I be using a different language to perform this function?
EDIT Simplified example:
initial page:
#runcmd.html
<head>
<title>Admin Tasks</title>
</head>
<center>
<iframe src="/scripts/python/test/createbutton.py" width="650" height="800" frameborder="0" ALLOWTRANSPARENCY="true"></iframe>
<iframe width="650" height="800" frameborder="0" ALLOWTRANSPARENCY="true" name="display"></iframe>
</center>
script that creates button:
cmd_page = '<form action="/scripts/python/test/runcmd.py" method="post" target="display" >' + '<label for="run_update">run updates</label><br>' + '<input align="Left" type="submit" value="runupdate" name="update" title="run_update">' + "</form><br>" + "\n"
print ("Content-type: text/html")
print ''
print cmd_page
script that should run command:
# runcmd.py:
import os
import pexpect
import cgi
import cgitb
import sys

cgitb.enable()
fs = cgi.FieldStorage()
sc_command = fs.getvalue("update")
if sc_command == "runupdate":
    cmd = "/usr/bin/sudo apt-get update"
    pd = pexpect.spawn(cmd, timeout=None, logfile=sys.stdout)
    print ("Content-type: text/html")
    print ''
    print "<pre>"
    line = pd.readline()
    while line:
        line = pd.readline()
I haven't tested the above simplified example, so I'm unsure whether it's functional.
EDIT:
Simplified example should work now.
Edit:
With Imran's code below, if I open a browser to ip:8000 it displays the output just as if it were running in a terminal, which is exactly what I want. However, I am using an Apache server for my website and an iframe to display the output. How do I do that with Apache?
edit:
I now have the output going to the iframe using Imran's example below, but it still seems to buffer. For example:
If I have it run apt-get update, it works fine when fetched with curl ip:8000 in a terminal, but when outputting to the web page it seems to buffer a couple of lines => output => buffer => output until the command is done.
But running other Python scripts the same way buffers and then outputs everything at once, even with the -u flag, while fetching curl ip:8000 in a terminal outputs as normal.
Is that just how it is supposed to work?
EDIT 19-03-2014:
Any bash/shell command I run using Imran's approach outputs to the iframe in near real time, but if I run any kind of Python script through it, the output is buffered and then sent to the iframe all at once.
Do I possibly need to PIPE the output of the python script that is run by the script that runs the web server?
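(A possible remedy, offered as an aside rather than something from the original thread: the buffering usually comes from the child Python process itself, so forcing the child to run unbuffered when it is spawned may help. A sketch, reusing the slowprint.py example from the answer below:)

import os
import subprocess

# Force the child Python interpreter to run unbuffered so each print reaches
# the pipe (and hence the iframe) immediately instead of when a buffer fills.
env = dict(os.environ, PYTHONUNBUFFERED='1')
proc = subprocess.Popen(['python', 'slowprint.py'],
                        stdout=subprocess.PIPE, env=env)
for line in iter(proc.stdout.readline, ''):
    print line,  # or yield the line from the WSGI app, as in the answer below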
You need to use HTTP chunked transfer encoding to stream unbuffered command line output. CherryPy's wsgiserver module has built-in support for chunked transfer encoding. WSGI applications can be either functions that return a list of strings, or generators that produce strings. If you use a generator as the WSGI application, CherryPy will use chunked transfer encoding automatically.
Let's assume this is the program, of which the output will be streamed.
# slowprint.py
import sys
import time
for i in xrange(5):
    print i
    sys.stdout.flush()
    time.sleep(1)
This is our web server.
2014 Version (Older cherrpy Version)
# webserver.py
import subprocess
from cherrypy import wsgiserver
def application(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    proc = subprocess.Popen(['python', 'slowprint.py'], stdout=subprocess.PIPE)
    line = proc.stdout.readline()
    while line:
        yield line
        line = proc.stdout.readline()

server = wsgiserver.CherryPyWSGIServer(('0.0.0.0', 8000), application)
server.start()
2018 Version
#!/usr/bin/env python2
# webserver.py
import subprocess
import cherrypy
class Root(object):
    def index(self):
        def content():
            proc = subprocess.Popen(['python', 'slowprint.py'], stdout=subprocess.PIPE)
            line = proc.stdout.readline()
            while line:
                yield line
                line = proc.stdout.readline()
        return content()
    index.exposed = True
    index._cp_config = {'response.stream': True}

cherrypy.quickstart(Root())
Start the server with python webserver.py, then in another terminal make a request with curl and watch the output being printed line by line:
curl 'http://localhost:8000'

Python telnetlib telnet.write() dropping messages

Given the following code:
import telnetlib
import sys
def func1(IP, user, passw):
    t = telnetlib.Telnet(IP)
    t.write(user.encode('ascii') + b'\n')
    t.write(passw.encode('ascii') + b'\n')
    return t

def func2(t, command):
    t.write(command.encode('ascii') + b'\n')
    print(command)

user = sys.argv[1]
passw = sys.argv[2]
IP = sys.argv[3]

t = func1(IP, user, passw)
for i in range(6):
    func2(t, "message " + str(i))
By looking at the server and also Wiresharking it, only messages 1 and 2 get through.
Now, if I change the func2(t,command) function as follows:
def func2(t, command):
    t.write(command.encode('ascii') + b'\n')
    t.read_eager()  # This is the new line.
    print(command)
It all works fine and all messages are transmitted.
Any idea why?
Python 3.3, Windows XP.
You need to read the text coming back to prevent the socket blocking. This is why your other method works.
In a real world session you would always be reading back the results of logging in and of commands.
