I'm trying to execute a Perl script from within a Python script. My code is as below:
command = "/path/to/perl/script/" + "script.pl"
input = "< " + "/path/to/file1/" + sys.argv[1] + " >"
output = "/path/to/file2/" + sys.argv[1]
subprocess.Popen(["perl", command, "/path/to/file1/", input, output])
When I execute the Python script, it returns:
No info key.
All paths leading to the Perl script, as well as to the files, are correct.
My perl script is executed with command:
perl script.pl /path/to/file1/ < input > output
Any advice on this is much appreciated.
The analog of the shell command:
#!/usr/bin/env python
from subprocess import check_call
check_call("perl script.pl /path/to/file1/ < input > output", shell=True)
is:
#!/usr/bin/env python
from subprocess import check_call
with open('input', 'rb', 0) as input_file, \
     open('output', 'wb', 0) as output_file:
    check_call(["perl", "script.pl", "/path/to/file1/"],
               stdin=input_file, stdout=output_file)
To avoid such verbose code, you could use plumbum to emulate a shell pipeline:
#!/usr/bin/env python
from plumbum.cmd import perl  # $ pip install plumbum
((perl["script.pl", "/path/to/file1"] < "input") > "output")()
Note: only the first code example (with shell=True) runs a shell; the second and third examples do not use a shell.
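Applied to the original question, the same fix would look something like this (a sketch, reusing the question's paths and its sys.argv[1] file-naming convention):
#!/usr/bin/env python
import sys
from subprocess import check_call

command = "/path/to/perl/script/" + "script.pl"
# "<" and ">" are shell syntax; open real file objects instead
with open("/path/to/file1/" + sys.argv[1], "rb") as input_file, \
     open("/path/to/file2/" + sys.argv[1], "wb") as output_file:
    check_call(["perl", command, "/path/to/file1/"],
               stdin=input_file, stdout=output_file)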
How to capture bash command output using a Python script.
For example, running the below in Linux:
[root@xxxxxx oracle]# echo sumit
sumit
[root@xxxxxx oracle]#
How can I re-print only the above output using a Python script? I.e., running python test.py should give 'sumit' as output. I tried the below in test.py:
import sys
sys.stdout.flush()
out = sys.stdin.readline()
print(out)
The above prints only the input I type, but not the already-displayed output.
With subprocess, you can run commands and check their return code, stdout and stderr outputs. Would that help?
For example:
import subprocess as proc
byte_output = proc.check_output(["ls", "-1", "."])
str_output = str(byte_output, "utf-8")
print(str_output)
# prints my local folders dev\ngit
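Applied to the question's echo example, a minimal sketch (assuming Python 3.7+ for capture_output):
import subprocess

# capture the stdout of `echo sumit`, like running it in the shell
result = subprocess.run(["echo", "sumit"], capture_output=True, text=True)
print(result.stdout.strip())  # prints: sumit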
I am on Python 3.8.10 under Ubuntu 20.04, trying to execute a multiline bash command and get its output. For this I am trying to combine this and this. My bash command is this:
/home/foo/.drsosc/drs-5.0.6/drscl << ASD
info
exit
ASD
and it works as I want. Now in Python I have this:
from pathlib import Path
import subprocess
PATH_TO_drscl = Path.home()/Path('.drsosc/drs-5.0.6/drscl')
def send_command(cmd: str):
    execute_this = f'''{PATH_TO_drscl} << ASD
{cmd}
exit
ASD'''
    return subprocess.run([execute_this], stdout=subprocess.PIPE)
print(send_command('info'))
but I get
FileNotFoundError: [Errno 2] No such file or directory: '/home/foo/.drsosc/drs-5.0.6/drscl << ASD\ninfo\nexit\nASD'
It seems that the problem is with the '\n' not being properly interpreted?
I found that this works as I want:
result = subprocess.run(
    str(PATH_TO_drscl),
    input = f'{cmd}\nexit',
    text = True,
    stdout = subprocess.PIPE
)
No, the problem is that you're trying to run a small shell script, but you're calling an executable whose name is composed of all the commands in the script. Try with shell=True:
return subprocess.run(execute_this, stdout=subprocess.PIPE, shell=True)
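Folding that fix back into the question's send_command gives a sketch like the following (it reuses PATH_TO_drscl from the question; text=True is my addition so the result comes back as str):
def send_command(cmd: str) -> str:
    execute_this = f'''{PATH_TO_drscl} << ASD
{cmd}
exit
ASD'''
    # a single string plus shell=True lets the shell parse the heredoc
    result = subprocess.run(execute_this, shell=True,
                            stdout=subprocess.PIPE, text=True)
    return result.stdout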
I have written a C code where I have converted one file format to another file format. To run my C code, I have taken one command line argument : filestem.
I executed that code using : ./executable_file filestem > outputfile
Where I have got my desired output inside outputfile
Now I want to take that executable and run within a python code.
I am trying like :
import subprocess
import sys
filestem = sys.argv[1];
subprocess.run(['/home/dev/executable_file', filestem , 'outputfile'])
But it is unable to create the outputfile. I think something should be added to handle the > redirection, but I am unable to figure out what. Please help.
subprocess.run has an optional stdout argument; you can give it a file handle, so in your case something like
import subprocess
import sys
filestem = sys.argv[1]
with open('outputfile', 'wb') as f:
    subprocess.run(['/home/dev/executable_file', filestem], stdout=f)
should work. I do not have the ability to test it, so please run it and report whether it works as intended.
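If you also want to notice when the executable fails, one possible extension (my addition, not part of the answer above) is to capture stderr and check the return code:
import subprocess
import sys

filestem = sys.argv[1]
with open('outputfile', 'wb') as f:
    proc = subprocess.run(['/home/dev/executable_file', filestem],
                          stdout=f, stderr=subprocess.PIPE)
if proc.returncode != 0:
    # surface the tool's error output instead of failing silently
    sys.stderr.write(proc.stderr.decode())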
You have several options:
NOTE - Tested in CentOS 7, using Python 2.7
1. Try pexpect:
"""Usage: executable_file argument ("ex. stack.py -lh")"""
import pexpect
filestem = sys.argv[1]
# Using ls -lh >> outputfile as an example
cmd = "ls {0} >> outputfile".format(filestem)
command_output, exitstatus = pexpect.run("/usr/bin/bash -c '{0}'".format(cmd), withexitstatus=True)
if exitstatus == 0:
print(command_output)
else:
print("Houston, we've had a problem.")
2. Run subprocess with shell=True (not recommended):
"""Usage: executable_file argument ("ex. stack.py -lh")"""
import sys
import subprocess
filestem = sys.argv[1]
# Using ls -lh >> outputfile as an example
cmd = "ls {0} >> outputfile".format(filestem)
result = subprocess.check_output(cmd, shell=True)  # or subprocess.call(cmd, shell=True)
print(result)
It works, but python.org frowns upon this, due to the chance of a shell injection: see "Security Considerations" in the subprocess documentation.
3. If you must use subprocess, run each command separately, take the STDOUT of the previous command, and pipe it into the STDIN of the next command:
p1 = subprocess.Popen(cmd1, stdout=PIPE)
stdout_data, stderr_data = p1.communicate()
p2 = subprocess.Popen(cmd2, stdin=PIPE, stdout=PIPE)
stdout_data, stderr_data = p2.communicate(input=stdout_data)
etc...
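For instance, a concrete sketch of that pattern, using ls and grep as stand-in commands (the question's real commands aren't specified):
import subprocess
from subprocess import PIPE

# equivalent of the shell pipeline: ls -lh | grep py
p1 = subprocess.Popen(["ls", "-lh"], stdout=PIPE)
ls_output, _ = p1.communicate()
p2 = subprocess.Popen(["grep", "py"], stdin=PIPE, stdout=PIPE)
grep_output, _ = p2.communicate(input=ls_output)
print(grep_output)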
Good luck with your code!
I have thoroughly confused myself with Python subprocess syntax!
I would like to decrypt a string using openssl from within a Python script.
Here is the bash script snippet that works:
readable_code=$(echo "$encrypted_code"| openssl enc -aes-128-cbc -a -d -salt -pass pass:$key)
So in a python script - I understand that to run this same bash command I should use subprocess.
I need to pipe the echo to the openssl command, and also pass in the encrypted_code and key variables dynamically (it's in a loop).
Does anyone out there know the correct syntax for this?
The snippet below should give the background to what I'm trying to do.
Thank you.
import subprocess

key = "my-secret-key"
file = list_of_ips  # format ip:long-encrypted-code
with open(file) as f:
    # read in all connection requests
    content = f.readlines()

# create list that will hold all ips whose decrypted codes have passed test
elements = []
for ip_code in content:
    # grab the ip address before the colon
    ip = ip_code.split(':', 1)[0]
    # grab the encrypted code after the colon
    code = ip_code.split(':', 1)[1]
    # here is where I want to run the bash command and assign to a python variable
    decrypted_code = subprocess....using code and key variables
    ...on it goes....
To emulate the shell command:
$ readable_code=$(echo "$encrypted_code"| openssl enc -aes-128-cbc -a -d -salt -pass "pass:$key")
using subprocess module in Python:
from subprocess import Popen, PIPE
cmd = 'openssl enc -aes-128-cbc -a -d -salt -pass'.split()
p = Popen(cmd + ['pass:' + key], stdin=PIPE, stdout=PIPE)
readable_code = p.communicate(encrypted_code)[0]
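Note that on Python 3, communicate expects bytes unless text mode is enabled; a sketch of the same call wrapped for the question's loop (the function name is mine):
from subprocess import Popen, PIPE

def decrypt(encrypted_code, key):
    # echo "$encrypted_code" | openssl enc -aes-128-cbc -a -d -salt -pass pass:$key
    cmd = 'openssl enc -aes-128-cbc -a -d -salt -pass'.split()
    p = Popen(cmd + ['pass:' + key], stdin=PIPE, stdout=PIPE)
    return p.communicate(encrypted_code.encode())[0].decode()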
I highly recommend the Plumbum Python library for writing shell scripts.
In particular, it has a convenient way to do piping and redirection.
I don't really understand what exact task you're trying to solve, but your code could look approximately like this:
from plumbum.cmd import openssl

with open('file') as f:
    for ip_code in f:
        (openssl['whatever', 'params'] << ip_code)()
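For the question's actual openssl pipeline, the plumbum version might look like this (a sketch; << feeds the string to the command's stdin, emulating the echo):
from plumbum.cmd import openssl

decrypted_code = (openssl['enc', '-aes-128-cbc', '-a', '-d', '-salt',
                          '-pass', 'pass:' + key] << code)()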
I'm using subprocess.Popen with shlex to call a remote bash script using ssh. The command works fine in bash itself, but as soon as I try to translate it to Python with shlex and subprocess.Popen, it errs out.
Remote bash script:
#!/bin/bash
tmp="";
while read -r line;
do
tmp="$tmp $line\n";
done;
echo $tmp;
BASH CMD RESULT (invoking the remote bash script on the command line):
$> ssh x.x.x.x cat < /tmp/bef69a1d-e580-5780-8963-6a9b950e529f.txt " | /path/to/bash/script.sh;"
Bar\n
$>
Python code
import shlex
import subprocess
fn = '/tmp/bef69a1d-e580-5780-8963-6a9b950e529f.txt'
s = """
ssh x.x.x.x cat < {localfile} '| /path/to/bash/script.sh;'
""".format(localfile=fn)
print s
lexer = shlex.shlex(s)
lexer.quotes = "'"
lexer.whitespace_split = True
sbash = list(lexer)
print sbash
# print buildCmd
proc=subprocess.Popen(sbash,stdout=subprocess.PIPE,stderr=subprocess.PIPE)
out,err=proc.communicate()
print "Out: " + out
print "Err: " + err
PYTHON SCRIPT RESULT:
$> python rt.py
ssh x.x.x.x cat < /tmp/bef69a1d-e580-5780-8963-6a9b950e529f.txt '| /path/to/bash/script.sh'
['ssh', 'x.x.x.x', 'cat', '<', '/tmp/bef69a1d-e580-5780-8963-6a9b950e529f.txt', "'| /path/to/bash/script.sh'"]
Out:
Err: bash: /tmp/bef69a1d-e580-5780-8963-6a9b950e529f.txt: No such file or directory
$>
What am I missing?
The problem is that you're using shell redirection in the command, but there's no shell spawned when using subprocess.
Consider the following (very simple) program:
import sys
print sys.argv
Now if we run it like you're running ssh (assuming foofile.txt exists), we get:
python argcheck.py ssh cat < foofile.txt " | /path/to/bash/script.sh;"
['argcheck.py', 'ssh', 'cat', ' | /path/to/bash/script.sh;']
Notice that < foofile.txt never makes it to Python's command-line arguments. That's because the bash parser intercepts the < and the file that comes after it, and redirects the contents of that file to the program's stdin. In other words, ssh is reading the file from stdin. You want your file to be passed to ssh's stdin from Python as well.
s = """
ssh x.x.x.x cat '| /path/to/bash/script.sh;'
"""
#<snip>
proc = subprocess.Popen(sbash, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        stdin=subprocess.PIPE)
out,err=proc.communicate(open(fn).read())
will presumably work.
The following works for me:
import subprocess
from subprocess import PIPE
with open('foo.h') as f:
    p = subprocess.Popen(['ssh', 'mgilson@XXXXX', 'cat', '| cat'], stdin=f, stdout=PIPE, stderr=PIPE)
    out, err = p.communicate()

print out
print '#'*80
print err
And the equivalent command in bash:
ssh mgilson@XXXXX cat < foo.h '| cat'
where foo.h is a file on my local machine.