I am executing an R script from Python and I want its output to be available in a Python variable. How can I do that?
Python script:
import subprocess
def runR(id, lat, long):
    value = subprocess.Popen("C:/R/R-3.2.0/bin/Rscript E:/Personal/python/script.R --args " + id + " " + lat + " " + long, shell=True)
    print value
R script:
a = "Hello";
I want Hello to be available in the Python variable value.
You could use rpy2:
import rpy2.robjects as robjects
robjects.r('''
a = "Hello";
''')
a = robjects.r['a']
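Note that robjects.r['a'] gives you an R character vector rather than a plain Python string; index it to get the value out (continuing the example above):

# a is an R character vector of length 1; a[0] is the Python string
print a[0]  # prints: Hello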
As an alternative, you could rewrite your R script so that it dumps its result to stdout in some well-known format such as JSON, then run it using the subprocess module and parse the result:
#!/usr/bin/env python
import json
import subprocess
id, lat, long = 1, 40, 74
out = subprocess.check_output(r"C:\R\R-3.2.0\bin\Rscript.exe "
                              r"E:\path\to\script.R --args "
                              "{id} {lat} {long}".format(**vars()))
data = json.loads(out.decode('utf-8'))
Note: no need to use shell=True on Windows if you use the full path to the executable here.
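The R side might then look something like this (a minimal sketch using only base R; adapt it to emit your real result):

# script.R -- write the result to stdout as JSON
args <- commandArgs(trailingOnly = TRUE)  # the id, lat and long values
a <- "Hello"
cat(sprintf('{"a": "%s"}', a))

With that, data in the Python snippet above becomes {u'a': u'Hello'}.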
You can modify the following example to your needs.
a.py:
print 'hello'
b.py:
import subprocess
result = subprocess.check_output(["python", "a.py"])
print result.strip() + 'world'
Output of b.py:
helloworld
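Note: under Python 3, check_output() returns bytes, so you would need result.decode().strip() + 'world' instead.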
Related
I have a file which connects to a database and fetches the result. The file must be run using Python 3, but my project uses Python 2.7, so I run the file from the command line using the subprocess module. Here is how I call it:
import subprocess
import ast
def execute_python3(param):
    param = param.replace("\\", "")
    param = "\"" + param + "\""
    cmd = "python3 " + "get_db_result.py" + " " + param
    result = subprocess.check_output(cmd, shell=True)
    return ast.literal_eval(result)

execute_python3(sql_query)
Here I am passing the SQL query to the get_db_result.py file as a command-line argument.
The get_db_result.py file looks something like this:
import sys
def get_result():
    param = sys.argv[1]
    '''
    Logic to get result from db
    '''
    result = db_output
    print(result)

if __name__ == "__main__":
    get_result()
Now the issue is that when I fetch the output from the db, I have to print it for it to be captured by the subprocess module, which makes it difficult for the calling program to parse the output for further work. For example, I may receive an output like this:
"[(u'Delhi', 20199330), (u'Mumbai', 134869470), (u'Kolkata', 6678446)]"
This is a string representation of a list of tuples, which can be converted to an actual list of tuples with something like ast.literal_eval(result).
But sometimes I get output like this:
"[(datetime.date(2019, 5, 27), 228.168093587), (datetime.date(2019, 5, 28), 228.834493641)]"
Here ast.literal_eval() doesn't understand datetime, and even json.loads() doesn't work on this.
How can I capture the output from a file without having to use print, and simply return it back through subprocess as it is? Is that even possible?
You need to serialize and deserialize the data on both ends. The simplest solution is to use Python's pickle module and hope that the types serialized on the Python 3 end are similar enough to those on the deserializing Python 2 end. You need to set the protocol on the sending end to a version understood by the receiving end:
Receiver with safer call of subprocess (no shell process in between):
#!/usr/bin/env python
import pickle
import subprocess
def execute_python3(param):
    result = subprocess.check_output(['python3', 'get_db_result.py', param])
    return pickle.loads(result)

def main():
    execute_python3(sql_query)

if __name__ == '__main__':
    main()
Sender, explicitly choosing a pickle protocol still understood by Python 2:
#!/usr/bin/env python3
import sys
import pickle
def get_result():
    param = sys.argv[1]
    '''
    Logic to get result from db
    '''
    result = db_output
    pickle.dump(result, sys.stdout.buffer, protocol=2)

if __name__ == '__main__':
    get_result()
If this doesn't work because of differences in the (de)serialized objects between Python 2 and 3, you have to fall back to explicitly (de)serializing the data, for example as JSON, as suggested in a comment by Jay.
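For the datetime rows from the question, a minimal sketch of that explicit JSON fallback could look like this on the Python 3 side (the encode_default converter below is an illustration, not part of the original answer; you must also decide how to turn the ISO strings back into dates on the Python 2 side):

#!/usr/bin/env python3
import datetime
import json
import sys

def encode_default(obj):
    # dates are not JSON-serializable; send them as ISO strings instead
    if isinstance(obj, (datetime.date, datetime.datetime)):
        return obj.isoformat()
    raise TypeError('%r is not JSON serializable' % obj)

result = [(datetime.date(2019, 5, 27), 228.168093587),
          (datetime.date(2019, 5, 28), 228.834493641)]
json.dump(result, sys.stdout, default=encode_default)

On the receiving end, json.loads() then yields lists such as [u'2019-05-27', 228.168093587] instead of tuples.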
I have this code:
import sys
from subprocess import Popen

link = "abc"
theproc = Popen([sys.executable, "p1.py", link])
I want to send the variable "link" to p1.py,
and p1.py will print it.
Something like this; here is p1.py:
print "in p1.py link is " + link
How can I do that?
I'm assuming python refers to Python 2.x on your system.
Retrieve the command line argument in p1.py using sys.argv:
import sys
if not len(sys.argv) > 1:
    print "Expecting link argument."
else:
    print "in p1.py link is " + sys.argv[1]
There's a function subprocess.check_output that is easier to use if you only want to call a program and retrieve its output:
from subprocess import check_output
output = check_output(["python", "p1.py", "SOME_URL"])
print "p1.py returned the following output:\n'{}'".format(output)
Example output:
$ python call_p1.py
p1.py returned the following output:
'in p1.py link is SOME_URL
'
You have to parse the command line arguments in your p1.py to get the link into a variable:
import sys
try:
    link = sys.argv[1]
except IndexError:
    print 'argument missing'
    sys.exit(1)
I am trying to pass JSON parameters through command line in Python:
automation.py {"cmd":"sel_media","value":"5X7_photo_paper.p}
How can I extract the values sel_media and 5X7_photo_paper.p?
I used the following code, but it is not working:
cmdargs = str(sys.argv[1])
print cmdargs
Provided you pass actual valid JSON to the command line and quote it correctly, you can parse the value with the json module.
You need to quote the value properly, otherwise your shell or console will interpret the value instead:
automation.py '{"cmd":"sel_media","value":"5X7_photo_paper.p"}'
should be enough for a bash shell.
In Python, decode with json.loads():
import sys
import json
cmdargs = json.loads(sys.argv[1])
print cmdargs['cmd'], cmdargs['value']
Demo:
$ cat demo.py
import sys
import json
cmdargs = json.loads(sys.argv[1])
print cmdargs['cmd'], cmdargs['value']
$ bin/python demo.py '{"cmd":"sel_media","value":"5X7_photo_paper.p"}'
sel_media 5X7_photo_paper.p
The above is generally correct, but I ran into issues with it when running my own Python script:
python myscript.py '{"a":"1"}'
does not work directly in my terminal, so I did:
python myscript.py '{\"a\":\"1\"}'
I have a script a.py:
#!/usr/bin/env python
def foo(arg1, arg2):
    return int(arg1) + int(arg2)

if __name__ == "__main__":
    import sys
    print foo(sys.argv[1], sys.argv[2])
I now want to make a script that can run the first script with some arguments and write the output of a.py to a file. I want automate_output(src, arglist) to generate output that I can write to the outfile:
import sys

def automate_output(src, arglist):
    return ""

def print_to_file(src, outfile, arglist):
    print "printing to file %s" % (outfile)
    out = open(outfile, 'w')
    s = open(src, 'r')
    for line in s:
        out.write(line)
    s.close()
    out.write(" \"\"\"\n Run time example: \n")
    out.write(automate_output(src, arglist))
    out.write(" \"\"\"\n")
    out.close()

try:
    src = sys.argv[1]
    outfile = sys.argv[2]
    arglist = sys.argv[3:]
    automate_output(src, arglist)
    print_to_file(src, outfile, arglist)
except:
    print "error"
    #print "usage : python automate_runtime.py scriptname outfile args"
I have tried searching around, but so far I do not understand how to pass arguments using os.system. I have also tried doing:
import a
a.main()
There I get a NameError: name 'main' is not defined
Update:
I researched some more and found subprocess; it seems I'm quite close to cracking it now. The following code does work, but I would like to pass args instead of manually passing '2' and '3':
import subprocess

src = 'bar.py'
args = ('2', '3')
proc = subprocess.Popen(['python', src, '2', '3'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
print proc.communicate()[0]
This is not a function, it's an if statement:
if __name__ == "__main__":
    ...
If you want a main function, define one:
import sys

def main():
    print foo(sys.argv[1], sys.argv[2])
Then just call it if you need to:
if __name__ == "__main__":
    main()
a.main() has nothing to do with the if __name__ == "__main__" block. The former calls a function named main() from a module; the latter executes its block if the current module's name is __main__, i.e., when the module is run as a script.
#!/usr/bin/env python
# a.py
def func():
    print repr(__name__)

if __name__ == "__main__":
    print "as a script",
    func()
Compare a module executed as a script and a function called from the imported module:
$ python a.py
as a script '__main__'
$ python -c "import a; print 'via import',; a.func()"
via import 'a'
See section Modules in the Python tutorial.
To get output from the subprocess you could use the subprocess.check_output() function:
import sys
from subprocess import check_output as qx
args = ['2', '3']
output = qx([sys.executable, 'bar.py'] + args)
print output
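Assuming bar.py is the a.py script from the question, this should print 5.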
I'm writing a script that needs to take advantage of a Java daemon via the local D-Bus of the Linux machines it will run on. This daemon returns an array of tuples which I want so that I can parse through and use the information later in my code. I want this code to take this value from multiple machines at once, but the only way I see to get return/exit values from a machine I am ssh'ed into is by parsing its stdout. I don't want to do this; I'd much prefer to get the actual variable. Right now I have this:
import os
message = "import dbus, sys\nbus=dbus.SystemBus()\nremote_object=bus.get_object('daemon.location', '/daemon')\ncontroller=dbus.Interface(remote_object, 'daemon.path')\nsys.exit(controller.getValue())"
x = os.system('echo -e "%s" | ssh %s python' % (message, ip))
In this example, when I run controller.getValue() it returns an array of tuples. I'm trying to figure out a way to get that array. When using something like popen, it pipes the stdout output into a file and returns it to you, so you get a string equivalent of the array. What I'm trying to figure out is how to get the actual array, as if the variable returned on exiting the ssh tty were passed into my code. Any ideas?
You can't avoid serialization if there is no shared memory. There are only bytes on the wire.
You could use a library that hides it from you e.g., with execnet module:
#!/usr/bin/env python
import execnet
gw = execnet.makegateway("ssh=user@host")
channel = gw.remote_exec("""
import dbus, sys
bus = dbus.SystemBus()
remote_object = bus.get_object('daemon.location', '/daemon')
controller = dbus.Interface(remote_object, 'daemon.path')
channel.send(controller.getValue())
""")
tuple_ = channel.receive()
print tuple_
print tuple_[0]
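execnet handles the serialization for you here: channel.send() accepts nested builtin types, so there is nothing to parse by hand on the receiving end.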
But it is easy to parse simple tuple values yourself using ast.literal_eval() from the stdlib:
# fabfile.py
import ast
from fabric.api import run

def getcontroller():
    """Return controller value."""
    cmd = """
import dbus, sys
bus = dbus.SystemBus()
remote_object = bus.get_object('daemon.location', '/daemon')
controller = dbus.Interface(remote_object, 'daemon.path')
print repr(controller.getValue())
"""  # NOTE: you must escape all quotation marks
    output = run('python -c "%s"' % cmd)
    tuple_ = ast.literal_eval(output)
    print tuple_[0]
Example: $ fab getcontroller -H user@host
Here I've used fabric to run the command on remote host.
You could use JSON as a serialization format if the other end doesn't produce Python literals:
>>> import json
>>> t = (1, "a")
>>> json.dumps(t)
'[1, "a"]'
>>> json.loads(_)
[1, u'a']
>>>
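Note that JSON has no tuple type, so the tuple comes back as a list; wrap it in tuple(...) if that distinction matters.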
Why not use popen?
import os

lines = os.popen("your command here").readlines()
If you just want a shell variable then you could do this
$ FOO="myFOO"
$ export FOO
$ cat x.py
#!/usr/bin/python
import os
print os.environ['FOO']
$ ./x.py
myFOO
$
If you want the return code of a program:
import sys
from subprocess import call

try:
    retcode = call("mycmd" + " myarg", shell=True)
    if retcode < 0:
        print >>sys.stderr, "Child was terminated by signal", -retcode
    else:
        print >>sys.stderr, "Child returned", retcode
except OSError, e:
    print >>sys.stderr, "Execution failed:", e
If you could explain your requirement a little better, you might get better help.