I have thoroughly confused myself with Python subprocess syntax!
I would like to decrypt a string using openssl from within a Python script.
Here is the bash snippet that works:
readable_code=$(echo "$encrypted_code" | openssl enc -aes-128-cbc -a -d -salt -pass pass:$key)
I understand that to run this same bash command from a Python script I should use subprocess.
I need to pipe the echo into the openssl command, and also pass in the encrypted_code and key variables dynamically (it's in a loop).
Does anyone out there know the correct syntax for this?
The snippet below should give the background to what I'm trying to do.
Thank you.
import subprocess

key = "my-secret-key"
file_read = "list_of_ips"  # format: ip:long-encrypted-code

with open(file_read) as f:
    # read in all connection requests
    content = f.readlines()

# create list that will hold all ips whose decrypted codes have passed the test
elements = []

for ip_code in content:
    # grab the ip address before the colon
    ip = ip_code.split(':', 1)[0]
    # grab the encrypted code after the colon
    code = ip_code.split(':', 1)[1]
    # here is where I want to run the bash command and assign to a python variable
    decrypted_code = subprocess....  # using the code and key variables
...on it goes....
To emulate the shell command:
$ readable_code=$(echo "$encrypted_code"| openssl enc -aes-128-cbc -a -d -salt -pass "pass:$key")
using the subprocess module in Python:
from subprocess import Popen, PIPE

cmd = 'openssl enc -aes-128-cbc -a -d -salt -pass'.split()
p = Popen(cmd + ['pass:' + key], stdin=PIPE, stdout=PIPE)
# Note: on Python 3, communicate() expects bytes here unless you also
# pass universal_newlines=True (or text=True) to Popen.
readable_code = p.communicate(encrypted_code)[0]
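Put together as a runnable sketch (assuming openssl is on your PATH; universal_newlines=True keeps the input and output as str on Python 3):

```python
from subprocess import Popen, PIPE

def decrypt(encrypted_code, key):
    """Emulate: echo "$encrypted_code" | openssl enc -aes-128-cbc -a -d -salt -pass pass:$key"""
    cmd = ['openssl', 'enc', '-aes-128-cbc', '-a', '-d', '-salt',
           '-pass', 'pass:' + key]
    # Pipe the encrypted string into openssl's stdin and read its stdout.
    p = Popen(cmd, stdin=PIPE, stdout=PIPE, universal_newlines=True)
    out, _ = p.communicate(encrypted_code)
    return out
```

You can then call `decrypt(code, key)` inside the loop for each line of the file.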
I highly recommend using the Plumbum Python library for writing shell scripts.
In particular, it has a convenient way to do piping and redirection.
I don't fully understand what exact task you're trying to solve, but your code could look approximately like this:
from plumbum.cmd import openssl

with open('file') as f:
    for ip_code in f:
        (openssl['whatever', 'params'] << ip_code)()
Related
I have written a C program that converts one file format to another. It takes one command-line argument: filestem.
I executed it using: ./executable_file filestem > outputfile
and got my desired output inside outputfile.
Now I want to take that executable and run it from Python code.
I am trying:
import subprocess
import sys
filestem = sys.argv[1];
subprocess.run(['/home/dev/executable_file', filestem , 'outputfile'])
But it is unable to create the outputfile. I think something should be added to handle the > redirection, but I can't figure out what. Please help.
subprocess.run has an optional stdout argument; you can give it a file handle, so in your case something like
import subprocess
import sys

filestem = sys.argv[1]
with open('outputfile', 'wb') as f:
    subprocess.run(['/home/dev/executable_file', filestem], stdout=f)
should work. I don't have the ability to test it, so please run it and report whether it works as intended.
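If you want the output in a Python variable instead of (or as well as) a file, subprocess.run can capture it directly. A sketch, using echo as a stand-in for the real executable:

```python
import subprocess

# 'echo' stands in for /home/dev/executable_file here; check=True raises
# if the command exits with a non-zero status.
result = subprocess.run(['echo', 'hello'], stdout=subprocess.PIPE,
                        universal_newlines=True, check=True)
output = result.stdout
```

`result.stdout` then holds what the shell's `>` would have written to the file.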
You have several options:
NOTE - Tested in CentOS 7, using Python 2.7
1. Try pexpect:
"""Usage: executable_file argument ("ex. stack.py -lh")"""
import pexpect
filestem = sys.argv[1]
# Using ls -lh >> outputfile as an example
cmd = "ls {0} >> outputfile".format(filestem)
command_output, exitstatus = pexpect.run("/usr/bin/bash -c '{0}'".format(cmd), withexitstatus=True)
if exitstatus == 0:
print(command_output)
else:
print("Houston, we've had a problem.")
2. Run subprocess with shell=true (Not recommended):
"""Usage: executable_file argument ("ex. stack.py -lh")"""
import sys
import subprocess
filestem = sys.argv[1]
# Using ls -lh >> outputfile as an example
cmd = "ls {0} >> outputfile".format(filestem)
result = subprocess.check_output(shlex.split(cmd), shell=True) # or subprocess.call(cmd, shell=True)
print(result)
It works, but python.org frowns upon this due to the chance of shell injection: see "Security Considerations" in the subprocess documentation.
3. If you must use subprocess, run each command separately, connecting the STDOUT of the previous command to the STDIN of the next:
p1 = subprocess.Popen(cmd1, stdout=subprocess.PIPE)
p2 = subprocess.Popen(cmd2, stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()
stdout_data, stderr_data = p2.communicate()
etc...
Good luck with your code!
I am appending text to a file that requires sudo permissions. When I run this python script below:
import subprocess

ssid = "testing"
psk = "testing1234"
p1 = subprocess.Popen(["wpa_passphrase", ssid, psk], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["sudo", "tee", "-a", "/etc/wpa_supplicant/wpa_supplicant.conf", ">", "/dev/null"], stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()
output, err = p2.communicate()
It will append to the file as expected, but will append this:
network={
ssid="testing"
#psk="testing1234"
psk=9891dab18debe8308a5d3bf596f5277e4a5c158bff016145830b12673ef63360
}
When I want this:
network={
ssid="testing"
psk="testing1234"
key_mgmt=WPA-PSK
}
This subprocess syntax is complicated to me, so I am open to an alternative method! I tried to use f = open("appendedtext >> /etc/wpa_supplicant/wpa_supplicant.conf"), but I need to run as sudo and I can't seem to find a way to do this via open(); I get a permission error.
Any help is appreciated!!
It's not a Python or subprocess issue; you're getting the expected output from wpa_passphrase. See the man page:
NAME
wpa_passphrase - Generate a WPA PSK from an ASCII passphrase for a SSID
SYNOPSIS
wpa_passphrase [ ssid ] [ passphrase ]
OVERVIEW
wpa_passphrase pre-computes PSK entries for network configuration blocks of a wpa_supplicant.conf file. An ASCII passphrase and SSID are
used to generate a 256-bit PSK.
If you need the plain-text password, just write it to the file without calling wpa_passphrase:
with open('/etc/wpa_supplicant/wpa_supplicant.conf', 'a') as conf:
    conf.writelines(['network={\n',
                     '\tssid="{0}"\n'.format(ssid),
                     '\tpsk="{0}"\n'.format(psk),
                     '\tkey_mgmt=WPA-PSK\n',
                     '}\n'])
And don't forget to run it with sudo: sudo python script.py.
I'm trying to execute a perl script within another python script. My code is as below:
command = "/path/to/perl/script/" + "script.pl"
input = "< " + "/path/to/file1/" + sys.argv[1] + " >"
output = "/path/to/file2/" + sys.argv[1]
subprocess.Popen(["perl", command, "/path/to/file1/", input, output])
When I execute the Python script, it returns:
No info key.
All paths leading to the perl script as well as the files are correct.
My perl script is normally executed with the command:
perl script.pl /path/to/file1/ < input > output
Any advice on this is much appreciated.
The analog of the shell command:
#!/usr/bin/env python
from subprocess import check_call
check_call("perl script.pl /path/to/file1/ < input > output", shell=True)
is:
#!/usr/bin/env python
from subprocess import check_call
with open('input', 'rb', 0) as input_file, \
     open('output', 'wb', 0) as output_file:
    check_call(["perl", "script.pl", "/path/to/file1/"],
               stdin=input_file, stdout=output_file)
To avoid the verbose code, you could use plumbum to emulate a shell pipeline:
#!/usr/bin/env python
from plumbum.cmd import perl  # pip install plumbum

((perl["script.pl", "/path/to/file1"] < "input") > "output")()
Note: only the first code example (the one with shell=True) runs a shell; the 2nd and 3rd examples do not use the shell.
I need to execute a shell command in Python and store the result in a variable. How can I do this?
I need to execute openssl rsautl -encrypt -inkey key and get the result into a variable.
---edit---
How can I execute
perl -e 'print "hello world"' | openssl rsautl -encrypt -inkey key
in Python and get the output.
You can use subprocess.check_output
from subprocess import check_output
out = check_output(["openssl", "rsautl", "-encrypt", "-inkey", "key"])
The output will be stored in out.
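check_output alone does not cover the perl | openssl pipe from the edit. One way to emulate a two-stage pipeline with subprocess (the key file name comes from the question and is not tested here) is to connect two Popen objects:

```python
from subprocess import Popen, PIPE

def pipeline(cmd1, cmd2):
    """Emulate `cmd1 | cmd2` and return cmd2's stdout as bytes."""
    p1 = Popen(cmd1, stdout=PIPE)
    p2 = Popen(cmd2, stdin=p1.stdout, stdout=PIPE)
    p1.stdout.close()  # let p1 receive SIGPIPE if p2 exits early
    return p2.communicate()[0]

# For the question's command (requires an RSA key file named 'key'):
# out = pipeline(['perl', '-e', 'print "hello world"'],
#                ['openssl', 'rsautl', '-encrypt', '-inkey', 'key'])
```

The p1.stdout.close() call is the standard pattern from the subprocess docs for replacing shell pipelines.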
A simple way to execute a shell command is os.popen:
import os
cmdOutput1 = os.popen("openssl rsautl -encrypt -inkey key").readlines()
cmdOutput2 = os.popen("perl -e 'print \"hello world\"' | openssl rsautl -encrypt -inkey key").readlines()
All it takes is the command you want to run, as a single string. It returns an open file object; calling .readlines() on it converts the output to a list, where each item corresponds to a single line of output from your command.
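For reference, the subprocess equivalent of os.popen(cmd).readlines(), without going through a shell (using the Python interpreter itself as a portable example command):

```python
import subprocess
import sys

# Run a command and split its output into lines, keeping the newlines,
# matching what os.popen(cmd).readlines() returns.
out = subprocess.check_output([sys.executable, '-c', 'print("a"); print("b")'],
                              universal_newlines=True)
lines = out.splitlines(True)
```

Passing the command as a list avoids the shell-quoting headaches that a single command string brings with it.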
I was trying to invoke curl from subprocess to download images, but kept getting curl error code 2 (which the docs say is CURLE_FAILED_INIT). I am not using urllib because I will eventually be executing a script using subprocess. Here is the code snippet:
import sys
import subprocess
import multiprocessing

def worker(fname, k):
    f = open(fname, 'r')
    i = 0
    for imgurl in f:
        try:
            op = subprocess.call(['curl', '-O', imgurl], shell=False)
        except:
            print 'problem downloading image - ', imgurl

def main():
    flist = []
    flist.append(sys.argv[1])
    flist.append(sys.argv[2])
    ...
    for k in range(1):
        p = multiprocessing.Process(target=worker, args=(flist[k], k))
        p.start()
O/P:
curl: try 'curl --help' or 'curl --manual' for more information
2
curl: try 'curl --help' or 'curl --manual' for more information
2
....
If you want to run a shell command, subprocess is the way to go. Since subprocess starts the command in its own process, the use of multiprocessing here is at best redundant. Multiprocessing comes in handy when you want to run a function of your Python program in a distinct process; you appear to intend to run a shell command, not a Python function.
I am not familiar with curl. If your intent is to get the standard output from curl, use subprocess.Popen(); subprocess.call() returns the program's return code, not its stdout.
See http://docs.python.org/release/3.2/library/subprocess.html
Something like:
subp = subprocess.Popen(['curl', '-O', imgurl], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
curlstdout, curlstderr = subp.communicate()
op = str(curlstdout)
might be closer. As I said, I'm not familiar with curl, so your program may vary.
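One detail worth checking (an educated guess, since the input file itself is not shown): iterating over a file leaves a trailing newline on each imgurl, and that embedded newline is a plausible cause of curl rejecting the URL. Stripping it before the call:

```python
import subprocess

def clean_url(imgurl):
    # File iteration leaves a trailing '\n' on each line; strip it
    # (and any surrounding whitespace) before handing the URL to curl.
    return imgurl.strip()

def fetch(imgurl):
    # shell=False (the default) passes the URL straight to curl as one argument.
    return subprocess.call(['curl', '-O', clean_url(imgurl)])
```

If the stripped URLs still fail, printing repr(imgurl) for a few lines of the file should show any other stray characters.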