How to capture the output of a command run in Python 3 from Python 2?

I have a file which connects to a database and fetches the result. The file must be run using Python 3, but my project uses Python 2.7, so I run the file from the command line using the subprocess module. Here is how I call the file:
import subprocess
import ast
def execute_python3(param):
    param = param.replace("\\", "")
    param = "\"" + param + "\""
    cmd = "python3 " + "get_db_result.py" + " " + param
    result = subprocess.check_output(cmd, shell=True)
    return ast.literal_eval(result)

execute_python3(sql_query)
Here, in the command, I am passing the SQL query to the get_db_result.py file.
The get_db_result.py file looks something like this:
import sys
def get_result():
    param = sys.argv[1]
    '''
    Logic to get result from db
    '''
    result = db_output
    print(result)

if __name__ == "__main__":
    get_result()
Now the issue is that when I fetch the output from the db, I have to print it for it to be captured by the subprocess module. This makes it difficult to parse the output for further use in the program. For example, when I receive an output like this:
"[(u'Delhi', 20199330), (u'Mumbai', 134869470), (u'Kolkata', 6678446)]"
This is a string representation of a list of tuples, which can be converted to an actual list of tuples by doing something like ast.literal_eval(result).
But sometimes I get output like this
"[(datetime.date(2019, 5, 27), 228.168093587), (datetime.date(2019, 5, 28), 228.834493641)]"
Here ast doesn't understand datetime. Even json.loads() doesn't work on this.
How can I capture the output from the file without having to use print, and simply return it back to subprocess as it is? Is that even possible?

You need to serialize and deserialize the data on both ends. The simplest solution would be to use Python's pickle module and hope that the types serialized on the Python 3 end are similar enough to those on the deserializing Python 2 end. You need to set the protocol used on the sending end to a version understood by the receiving end:
Receiver with safer call of subprocess (no shell process in between):
#!/usr/bin/env python
import pickle
import subprocess
def execute_python3(param):
    result = subprocess.check_output(['python3', 'get_db_result.py', param])
    return pickle.loads(result)

def main():
    execute_python3(sql_query)

if __name__ == '__main__':
    main()
Sender, explicitly choosing a pickle protocol still understood by Python 2:
#!/usr/bin/env python3
import sys
import pickle
def get_result():
    param = sys.argv[1]
    '''
    Logic to get result from db
    '''
    result = db_output
    pickle.dump(result, sys.stdout.buffer, protocol=2)

if __name__ == '__main__':
    get_result()
If this doesn't work because of differences in the (de)serialized objects between Python 2 and 3, you have to fall back to explicitly (de)serializing the data, for example as JSON, as suggested in a comment by Jay.
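As a rough sketch of that JSON fallback, assuming the only non-JSON types coming back from the database are datetime.date/datetime.datetime (json_default is a hypothetical helper name, not part of the original code), the sender could serialize with a custom default:
#!/usr/bin/env python3
import datetime
import json
import sys

def json_default(obj):
    # JSON has no native date type, so emit ISO-format strings instead.
    if isinstance(obj, (datetime.date, datetime.datetime)):
        return obj.isoformat()
    raise TypeError("Not JSON serializable: %r" % (obj,))

def get_result():
    param = sys.argv[1]
    '''
    Logic to get result from db
    '''
    result = db_output
    json.dump(result, sys.stdout, default=json_default)

if __name__ == '__main__':
    get_result()
On the Python 2 side, json.loads(result) then yields lists instead of tuples (JSON has no tuple type), and the dates come back as ISO strings that you would have to parse with datetime.datetime.strptime() if you need date objects again.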

Related

How to obtain a python command line argument only if it's a string

I'm making my own Python CLI and I want to pass only String arguments.
import sys
import urllib, json
# from .classmodule import MyClass
# from .funcmodule import my_function
def main():
    args = sys.argv[1:]
    # print('count of args :: {}'.format(len(args)))
    # for arg in args:
    #     print('passed argument :: {}'.format(arg))

    # always returns true even if i don't pass the argument as a "String"
    if(isinstance(args[0], str)):
        print('JSON Body:')
        url = args[0]
        response = urllib.urlopen(url)
        data = json.loads(response.read())
        print(data)
    # my_function('hello world')
    # my_object = MyClass('Thomas')
    # my_object.say_name()

if __name__ == '__main__':
    main()
I execute it with api "url" and I get the correct output.
However, when I try to execute api url without passing it as a String, my output is a little odd.
How can I accept only String arguments?
What I've tried so far:
I found this solution here, but it didn't work for me (it couldn't recognize the join() function).
The problem isn't a Python issue. It's just that your URL contains a &, and on a Linux/Unix shell this asks to run your command in the background (and the data after the & is dropped). That explains the [1]+ done output, with your truncated command line.
So you have to quote your argument to keep it from being interpreted (or escape it as \&). There's no way around this from a Un*x shell (it would work unquoted from a Windows shell, for instance).
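If the caller is itself a Python script, a minimal sketch of the same idea (api.py is a hypothetical name standing in for the CLI above): passing the URL as one element of an argument list means no shell ever interprets the &:
import subprocess

# The URL, '&' included, arrives in sys.argv[1] intact,
# because no shell is involved in splitting the command.
url = "http://example.com/api?a=1&b=2"
out = subprocess.check_output(["python", "api.py", url])
print(out)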

Subprocess to open python file and return data

I am trying to use Python to open another file. This file is going to start up a socket and create threads for listening for additional connections, and threads for sending/receiving data. The main thread will not return.
However, if the setup of the sockets fails, I want to return an error code to the other Python script that executed the subprocess.
main.py
py3output = subprocess.check_output(['python3', 'py3.py'])
print('py3 said:' + str(py3output))
py3.py
def returnme():
    return 10

returnme()
When I run this, it prints:
py3 said:b''
I am just trying to figure out how to get the return value back to the main calling program.
To return an exit code n back to the OS, you need sys.exit(n). But it seems like you don't want to check the exit code but the stdout output. So your program might need to be rewritten to:
def returnme():
    return 10

print(returnme())
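For completeness, if you did want the exit-code route mentioned above, a minimal sketch could look like this (the setup() helper is hypothetical; only sys.exit() and the return-code handling are the actual mechanism):
# py3.py: signal failure to the caller through the process exit code.
import sys

def setup():
    ok = False  # placeholder for the real socket-setup logic
    return ok

if __name__ == '__main__':
    sys.exit(0 if setup() else 1)  # 0 = success, non-zero = failure
and in the caller:
# main.py: inspect the exit code instead of stdout.
import subprocess

rc = subprocess.call(['python3', 'py3.py'])
if rc != 0:
    print('socket setup failed with exit code %d' % rc)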
Alternatively, you should write the value to standard output as a string, using the following code:
sample.py
import sys
def returnme():
    sys.stdout.write(str(10))
    sys.stdout.flush()

returnme()
main.py
from subprocess import check_output
output = check_output(['python','sample.py'])
print('Sample.py says: ' + output)

Passing arguments to execfile in python 2.7

I need to call one Python script from another script. I'm trying to do it with the help of the execfile function. I need to pass a dictionary as an argument to the called script. Is there any possibility to do that?
import subprocess
from subprocess import Popen
# ------- To read the data from xls -------
ret_lst = T_read("LDW_App05")
for each in ret_lst:
    lst.append(each.replace(' ', '-'))
    lst.append(' ')

result = Popen(['python', 'LDW_App05.py'] + lst, stdin=subprocess.PIPE, stdout=subprocess.PIPE).communicate()
print result
Here, in the above code, I'm reading the input data from the Excel sheet in the form of a list. I need to pass that list as an argument to the LDW_App05.py file.
Instead of passing complex data as CL arguments, I propose piping your data via STDIN/STDOUT - then you don't need to worry about escaping special, shell-significant chars or exceeding the maximum command line length.
Typically, a CL argument-based script might look something like this app.py:
import sys
if __name__ == "__main__":  # ensure the script is run directly
    if len(sys.argv) > 1:  # if at least one CL argument was provided
        print("ARG_DATA: {}".format(sys.argv[1]))  # print it out...
    else:
        print("usage: python {} ARG_DATA".format(__file__))
It clearly expects an argument to be passed, and it will print it out when passed from another script, say caller.py:
import subprocess
out = subprocess.check_output(["python", "app.py", "foo bar"]) # pass foo bar to the app
print(out.rstrip()) # print out the response
# ARG_DATA: foo bar
But what if you want to pass something more complex, let's say a dict? Since a dict is a hierarchical structure we'll need a way to present it in a single line. There are a lot of formats that would fit the bill, but let's stick to the basic JSON, so you might have your caller.py set to something like this:
import json
import subprocess
data = {  # our complex data
    "user": {
        "first_name": "foo",
        "last_name": "bar",
    }
}
serialized = json.dumps(data) # serialize it to JSON
out = subprocess.check_output(["python", "app.py", serialized]) # pass the serialized data
print(out.rstrip()) # print out the response
# ARG_DATA: {"user": {"first_name": "foo", "last_name": "bar"}}
Now if you modify your app.py to recognize the fact that it's receiving JSON as an argument, you can deserialize it back to a Python dict to access its structure:
import json
import sys
if __name__ == "__main__":  # ensure the script is run directly
    if len(sys.argv) > 1:
        data = json.loads(sys.argv[1])  # parse the JSON from the first argument
        print("First name: {}".format(data["user"]["first_name"]))
        print("Last name: {}".format(data["user"]["last_name"]))
    else:
        print("usage: python {} JSON".format(__file__))
Then if you run your caller.py again you'll get:
First name: foo
Last name: bar
But this is very tedious, and JSON is not very friendly to the CL (behind the scenes Python does a ton of escaping to make it work), not to mention there is a limit (OS- and shell-dependent) on how big your JSON can be when passed this way. It's much better to use the STDIN/STDOUT buffer to pass your complex data between processes. To do so, you'll have to modify app.py to wait for input on its STDIN, and caller.py to send serialized data to it. So, app.py can be as simple as:
import json
if __name__ == "__main__":  # ensure the script is run directly
    try:
        arg = raw_input()  # get input from STDIN (Python 2.x)
    except NameError:
        arg = input()  # get input from STDIN (Python 3.x)
    data = json.loads(arg)  # parse the JSON read from STDIN
    print("First name: {}".format(data["user"]["first_name"]))  # print to STDOUT
    print("Last name: {}".format(data["user"]["last_name"]))  # print to STDOUT
and caller.py:
import json
import subprocess
data = {  # our complex data
    "user": {
        "first_name": "foo",
        "last_name": "bar",
    }
}
# start the process and pipe its STDIN and STDOUT to this process handle:
proc = subprocess.Popen(["python", "app.py"], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
serialized = json.dumps(data) # serialize data to JSON
out, err = proc.communicate(serialized) # send the serialized data to proc's STDIN
print(out.rstrip()) # print what was returned on STDOUT
and if you invoke caller.py you again get:
First name: foo
Last name: bar
But this time there is no limit on the size of the data you're passing over to app.py, and you don't have to worry about a certain format being mangled by shell escaping etc. You can also keep the 'channel' open and have both processes communicate with each other in a bi-directional fashion - check this answer for an example.
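As a rough illustration of that bi-directional idea (this is my own sketch, not the linked answer, and it assumes Python 2 text pipes; on Python 3 you would pass universal_newlines=True or encode to bytes), the child reads one JSON request per line and answers on STDOUT:
# child.py: a line-oriented worker.
import json
import sys

while True:
    line = sys.stdin.readline()
    if not line:  # EOF: the parent closed the pipe
        break
    request = json.loads(line)
    response = {"echo": request}  # placeholder for real work
    sys.stdout.write(json.dumps(response) + "\n")
    sys.stdout.flush()  # flush, or the parent blocks waiting for a reply
while the caller keeps the pipe open and exchanges several messages:
import json
import subprocess

proc = subprocess.Popen(["python", "child.py"],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE)
for i in range(3):
    proc.stdin.write(json.dumps({"n": i}) + "\n")
    proc.stdin.flush()
    print(proc.stdout.readline().rstrip())
proc.stdin.close()
proc.wait()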

python or bash script that does something when there is no response (output)

There is an external program A.
I want to write a script that does some action if the called external program A does not produce any output (stdout).
How is this possible in bash or python?
You can use the subprocess module, which allows you to execute system calls and store their output in variables which can be used later on.
#!/usr/bin/python
import subprocess as sub
ur_call = '<your system call here>'
p = sub.Popen(ur_call, stdout=sub.PIPE, stderr=sub.PIPE)
output, errors = p.communicate()
if len(output) == 0 and len(errors) == 0:
    pass  # Do something
In a Bash-script, you could redirect the output to a file, and if the length of the file is zero then there was no output.
If the script that sometimes gives output is no.sh then you can do this in Python:
import os
x = os.popen("./no.sh")
y = x.read()
if y:
    print "Got output"

How to use subprocess to execute programs with Python

Hello, I am using the subprocess.Popen() class and I successfully execute commands on the terminal, but when I try to execute programs, for example a script written in Python, and I try to pass arguments, the system fails.
This is the code:
argPath = "test1"
args = open(argPath, 'w')
if self.extract.getByAttr(self.block, 'name', 'args') != None:
    args.write("<request>"+self.extract.getByAttr(self.block, 'name', 'args')[0].toxml()+"</request>")
else:
    args.write('')

car = Popen(shlex.split('python3.1 /home/hidura/webapps/karinapp/Suite/ForeingCode/saveCSS.py', stdin=args, stdout=subprocess.PIPE, stderr=subprocess.PIPE))
args.close()
dataOut = car.stdout.read().decode()
log = car.stderr.read().decode()

if dataOut != '':
    return dataOut.split('\n')
elif log != '':
    return log.split('\n')[0]
else:
    return None
And the code from saveCSS.py:
from xml.dom.minidom import parseString
import os
import sys
class savCSS:
    """This class has to save
    the changes on the css file.
    """
    def __init__(self, args):
        document = parseString(args)
        request = document.firstChild
        address = request.getElementsByTagName('element')[0]
        newdata = request.getElementsByTagName('element')[1]
        cssfl = open("/webapps/karinapp/Suite/"+address.getAttribute('value'), 'r')
        cssData = cssfl.read()
        cssfl.close()
        dataCSS = ''
        for child in newdata.childNodes:
            if child.nodeType == 3:
                dataCSS += child.nodeValue
        nwcssDict = {}
        for piece in dataCSS.split('}'):
            nwcssDict[piece.split('{')[0]] = piece.split('{')[1]
        cssDict = {}
        for piece in cssData.split('}'):
            cssDict[piece.split('{')[0]] = piece.split('{')[1]
        for key in nwcssDict:
            if key in cssDict == True:
                del cssDict[key]
            cssDict[key] = nwcssDict[key]
        result = ''
        for key in cssDict:
            result += key+"{"+cssDict[key]+"}"
        cssfl = open(cssfl.name, 'a')
        cssfl.write(result)
        cssfl.close()

if __name__ == "__main__":
    savCSS(sys.stdin)
BTW: There's no output...
Thanks in advance.
OK, I'm ignoring that your code doesn't run (neither the script you try to execute nor the main script actually works), and looking at what you are doing:
It does execute the script, or you would get an error, like "bin/sh: foo: not found".
Also, you seem to be using an open file as stdin after you have written to it. That doesn't work:
>>> thefile = open('/tmp/foo.txt', 'w')
>>> thefile.write("Hej!")
4
>>> thefile.read()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
IOError: not readable
You need to close the file and reopen it in read mode. Although, it would be better in this case to use StringIO, I think.
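For comparison, a minimal StringIO sketch in the same REPL style (Python 3 import shown; on Python 2 it's StringIO.StringIO):
>>> from io import StringIO
>>> buf = StringIO()
>>> buf.write("Hej!")
4
>>> buf.seek(0)  # rewind before reading
0
>>> buf.read()
'Hej!'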
To talk to the subprocess, you use communicate(), not read() on the pipes.
I'm not sure why you are using shell=True here; it doesn't seem necessary, and I would remove it if I were you. It only complicates things unless you actually need the shell to do something.
Specifically, you should not split the command into a list when using shell=True. What your code is actually doing is starting an interactive Python prompt.
You should rather use communicate() instead of .stdout.read().
And the code you posted isn't even correct:
Popen(shlex.split('python3.1 /home/hidura/webapps/karinapp/Suite/ForeingCode/saveCSS.py', stdin=args, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
There's a missing parenthesis, and from the stdout/stderr parameters it's clear that you get no output to the console, but rather into pipes (if that's what you meant by "There's no output...").
Your code will actually work on Windows, but on Linux you must remove the shell=True parameter. You should always omit that parameter if you provide the full command line yourself (as a sequence).
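Putting those points together, a corrected sketch of the caller might look like the following (paths are taken from the question; sending the XML through communicate() instead of a half-written file is my assumption, not the original design):
import subprocess

# Build the request XML in memory instead of writing it to the 'test1' file.
request = "<request>...</request>"  # whatever was previously written to args

# Full command line as a sequence, so no shell=True (and no shlex.split) is needed.
car = subprocess.Popen(
    ['python3.1', '/home/hidura/webapps/karinapp/Suite/ForeingCode/saveCSS.py'],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)

# communicate() writes to stdin, waits for the process, and drains both pipes.
dataOut, log = car.communicate(request.encode())
dataOut = dataOut.decode()
log = log.decode()
Correspondingly, on the saveCSS.py side, savCSS(sys.stdin) would need to become savCSS(sys.stdin.read()), since parseString() expects a string, not a file object.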
