How can I respond to a prompt in os.system? - python

I'm trying to do something like this:
from subprocess import Popen
p = Popen(["vagrant", "ssh", "vmname", "-c", '"pvcreate -ff /dev/sdb"'])
But it requires user input. Also, that didn't work anyway: it gives the error bash: pvcreate -ff /dev/sdb: command not found, because it's looking for a program named pvcreate -ff /dev/sdb instead of pvcreate with arguments. I had also tried this first:
p = Popen(["vagrant", "ssh", "vmname", "-c", "pvcreate", "-ff", "/dev/sdb"])
And have resorted to using this:
os.system("vagrant ssh vmname -c 'pvcreate -ff /dev/sdb'")
However, I need to answer yes when it prompts me. I've already tried these options as well:
os.system("yes | vagrant ssh vmname -c 'pvcreate -ff /dev/sdb'")
os.system("echo y | vagrant ssh vmname -c 'pvcreate -ff /dev/sdb'")
Is it possible to respond to a prompt using os.system?

I'd suggest using the list form of invocation.
import subprocess
command = ["vagrant", "ssh", "vmname", "-c", "pvcreate -ff /db/sdb"]
output,error = subprocess.Popen(
command, universal_newlines=True,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE).communicate()
This represents the set of parameters that are going to be passed and eliminates the need to mess around with shell quoting.
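If the remote pvcreate still prompts for confirmation, the list form also lets you answer it from Python by writing to the process's standard input, which os.system cannot do. A minimal sketch, assuming vagrant ssh forwards stdin to the remote command and the prompt accepts a single y:
import subprocess
command = ["vagrant", "ssh", "vmname", "-c", "pvcreate -ff /dev/sdb"]
output, error = subprocess.Popen(
    command, universal_newlines=True,
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE).communicate(input="y\n")  # answer the y/n prompt
If your LVM version supports it, passing -y (or --yes) to pvcreate avoids the prompt altogether.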

Related

Passing variables to a script over ssh using gcloud command -- all variables treated as a single string?

I'm trying to set up a system to run some commands on VMs in Google Cloud; in my case we want to run a tcpdump at a certain time using the 'at' command. Right now I'm just trying to execute any command successfully when I have to pass arguments along with it, and I'm getting confusing behaviour: the command and its arguments appear to be executed as one long command instead of as separate arguments.
I first tried in bash, and thinking my issue was one of quoting, I moved to using python to hopefully make things easier to understand, but I appear to be hitting the same issue and figure I must be doing something wrong.
I have the following functions defined in Python, and call them as follows:
def execute(cmd):
    popen = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True)
    for stdout_line in iter(popen.stdout.readline, ""):
        yield stdout_line
    popen.stdout.close()
    return_code = popen.wait()
    if return_code:
        raise subprocess.CalledProcessError(return_code, cmd)

def runCapture(project, instance, zone, time, duration):
    ## Run capture against server
    print ("Running capture against Project: " + project + ", Instance: " + instance + ", Zone: " + zone, "at: " + time, "for " + str(duration) + " minutes")
    ## First connect, schedule capture
    ## Connect again, schedule upload of capture at capture time + duration time + some overrun.
    ## gcloud compute ssh --project=${PROJECT} ${INSTANCE} --zone="${ZONE}" --command="...do stuff..." --tunnel-through-iap
    ## CMD=\${1:-"/usr/sbin/tcpdump -nn -i ens4 -G \$(( ${DURATION}*60 )) -W 1 -w ./\$(uname -n)-%Y-%m-%d_%H.%M.%S.pcap"}
    total_time=str(duration*60)
    command="/bin/bash -c 'echo \"hello world\"'"
    for path in execute(["/usr/bin/gcloud", "compute", "ssh", instance, "--project="+project, "--zone="+zone, "--tunnel-through-iap", "--command=\""+command+"\"", ]):
        print(path, end="")
The resulting errors are as follows:
bash: /bin/bash -c 'echo hello: No such file or directory
Traceback (most recent call last):
File "./ingressCapture.py", line 79, in <module>
results = runCapture(project, instance, zone, time, duration)
File "./ingressCapture.py", line 33, in runCapture
for path in execute(["/usr/bin/gcloud", "compute", "ssh", instance, "--project="+project, "--zone="+zone, "--tunnel-through-iap", "--command=\""+command+"\"", ]):
File "./ingressCapture.py", line 17, in execute
raise subprocess.CalledProcessError(return_code, cmd)
subprocess.CalledProcessError: Command '['/usr/bin/gcloud', 'compute', 'ssh', 'tbtst-test3-app-egress-nztw', '--project=devops-tb-sandbox-250222', '--zone=europe-west1-b', '--tunnel-through-iap', '--command="/bin/bash -c \'echo "hello world"\'"']' returned non-zero exit status 127.
It appears to me that instead of invoking the bash shell and running the echo command, it is invoking a single command whose name includes the bash shell and all the arguments too. I have a bash shell when I log in normally via SSH, and can run the commands manually (and they work). Why are the arguments from --command="....." being handled like this, and how do I prevent it?
I'm pretty sure your problem is that you have too many quotes.
When you write --command="bash -c 'echo \"Hello World\"'" on the command line, the shell internally marks all the stuff inside the quotes as being in a quoted state and then removes the quotes. The actual argument that ends up going to the program is --command=bash -c 'echo "Hello World"' as a single string in argv (or your language's equivalent).
Try putting import sys ; print(sys.argv[1]) inside a small python script and calling it with ./test.py --command="bash -c 'echo \"Hello World\"'" to see for yourself.
However, in your arglist to subprocess, you're forming this string: --command="/bin/bash -c 'echo "hello world"'", presumably because you thought you needed to match what you'd normally type on the command line. You can see this in the stacktrace (the escaped single quotes there are just how Python's repr displays the string). Since Python does not perform quote removal, those quotes go through to the other side of your ssh connection, where the login shell attempts to reparse them as a shell command. The first "word" on the other end of the connection is /bin/bash -c 'echo hello because of those extra quotes, so the shell attempts to find a command with that name on the PATH, and it clearly doesn't exist.
What you need to put into your arglist for subprocess is simply "--command="+command.
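With the execute() helper and variables from the question, the call then becomes (a sketch, identical except for dropping the extra layer of quotes):
command = "/bin/bash -c 'echo \"hello world\"'"
for path in execute(["/usr/bin/gcloud", "compute", "ssh", instance,
                     "--project=" + project, "--zone=" + zone,
                     "--tunnel-through-iap",
                     "--command=" + command]):  # no literal quotes around the command
    print(path, end="")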

Can't assign bash variable in python subprocess

I am trying to assign the fingerprint of a PGP key to a variable in a bash subprocess of a Python script.
Here's a snippet:
import subprocess
subprocess.run(
    '''
    export KEYFINGERPRINT="$(gpg --with-colons --fingerprint --list-secret-keys | sed -n 's/^fpr:::::::::\([[:alnum:]]\+\):/\1/p')"
    echo "KEY FINGERPRINT IS: ${KEYFINGERPRINT}"
    ''',
    shell=True, check=True,
    executable='/bin/bash')
The code runs but echo shows an empty variable:
KEY FINGERPRINT IS:
and if I try to use that variable for other commands I get the following error:
gpg: key "" not found: Not found
HOWEVER, if I run the same exact two lines of bash code in a bash script, everything works perfectly, and the variable is correctly assigned.
What is my python script missing?
Thank you all in advance.
The problem is the backslashes in your sed command. In a regular Python string, backslash sequences such as \1 are interpreted as escape characters (here \1 becomes the byte 0x01), so bash never sees the command you intended. To fix this, simply add an r in front of your string to make it a raw string:
import subprocess
subprocess.run(
    r'''
    export KEYFINGERPRINT="$(gpg --with-colons --fingerprint --list-secret-keys | sed -n 's/^fpr:::::::::\([[:alnum:]]\+\):/\1/p')"
    echo "KEY FINGERPRINT IS: ${KEYFINGERPRINT}"
    ''',
    shell=True, check=True,
    executable='/bin/bash')
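Alternatively, if the end goal is to have the fingerprint available in Python rather than only in a shell variable inside the child bash, you can let Python capture the pipeline's output directly. A minimal sketch reusing the same gpg/sed pipeline from the question (assuming the listed secret keys produce the fingerprint lines you want):
import subprocess
cmd = r"""gpg --with-colons --fingerprint --list-secret-keys | sed -n 's/^fpr:::::::::\([[:alnum:]]\+\):/\1/p'"""
result = subprocess.run(cmd, shell=True, executable='/bin/bash',
                        capture_output=True, text=True, check=True)
fingerprint = result.stdout.strip()  # one fingerprint per line; a single value if only one secret key exists
print("KEY FINGERPRINT IS:", fingerprint)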
In order to run two commands in subprocess you need to run them one after the other, or join them with ;:
import subprocess
# A raw triple-quoted string avoids both the quote clash and the backslash escaping:
ret = subprocess.run(r'''export KEYFINGERPRINT="$(gpg --with-colons --fingerprint --list-secret-keys | sed -n 's/^fpr:::::::::\([[:alnum:]]\+\):/\1/p')"; echo "KEY FINGERPRINT IS: ${KEYFINGERPRINT}"''', capture_output=True, shell=True)
print(ret.stdout.decode())
You can also use Popen:
import subprocess

commands = r'''
export KEYFINGERPRINT="$(gpg --with-colons --fingerprint --list-secret-keys | sed -n 's/^fpr:::::::::\([[:alnum:]]\+\):/\1/p')"
echo "KEY FINGERPRINT IS: ${KEYFINGERPRINT}"
'''
# universal_newlines=True lets communicate() accept a str and return str output
process = subprocess.Popen('/bin/bash', stdin=subprocess.PIPE, stdout=subprocess.PIPE, universal_newlines=True)
out, err = process.communicate(commands)
print(out)

How can I use run instead of communicate when providing text on stdin?

I'm trying to figure out how to do this:
command = f"adb -s {i} shell"
proc = Popen(command, stdin=PIPE, stdout=PIPE)
out, err = proc.communicate(f'dumpsys package {app_name} | grep version'.encode('utf-8'))
but in this:
command = f"adb -s {i} shell"
proc = run(command, stdin=PIPE, stdout=PIPE, shell=True)
out, err = run(f'dumpsys package {app_name} | grep version', shell=True, text=True, stdin=proc.stdout )
The idea is to run a command that requires some kind of further input (for example, entering a shell) and afterwards send another command to that shell.
I've found a way online with communicate, but I wonder how to do it with the run() function.
Thanks!
You only need to call run once -- pass the remote command in the input argument (and don't use shell=True in places where you don't need it).
import subprocess, shlex
proc = subprocess.run(['adb', '-s', i, 'shell'],
                      capture_output=True,
                      text=True,  # input is a str, so the streams must be in text mode
                      input=f'dumpsys package {shlex.quote(app_name)} | grep version')
shlex.quote prevents an app name that contains $(...), ;, etc from running unwanted commands on your device.
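With capture_output=True the output is collected rather than printed; since the call above runs in text mode, it can be read back as a plain string afterwards, for example:
print(proc.returncode)  # 0 if adb exited cleanly
print(proc.stdout)      # the grep-filtered dumpsys output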

Can't use grep in subprocess command

I'm having a problem with my subprocess command: I'd like to grep out the lines that match "Online".
def run_command(command):
    p = subprocess.Popen(command, shell=False,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT)
    return iter(p.stdout.readline, b'')

command = 'mosquitto_sub -u example -P example -t ITT/# -v | grep "Online" '.split()
for line in run_command(command):
    print(line)
But I get this error:
Error: Unknown option '|'.
Use 'mosquitto_sub --help' to see usage.
But when running it in the Linux shell:
user#server64:~/Pythoniscriptid$ mosquitto_sub -u example -P example -t ITT/# -v | grep "Online"
ITT/C5/link Online
ITT/IoT/tester55/link Online
ITT/ESP32/TEST/link Online
I also tried shell=True, but with no success, because I get another error saying it doesn't recognize the topic ITT/#:
Error: You must specify a topic to subscribe to.
Use 'mosquitto_sub --help' to see usage.
The "possible dublicate" didn't help me at all, So I think I'm having a different problem. I tried to change code to this, put in not getting any return
def run_command(command, command2):
    p1 = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    p2 = subprocess.Popen(command2, stdin=p1.stdout, stdout=subprocess.PIPE)
    return iter(p2.stdout.readline, '')

command = 'mosquitto_sub -u example -P example -t ITT/# -v'.split()
command2 = 'grep Online'.split()
#subprocess.getoutput(command)
for line in run_command(command, command2):
    print(line)
When you split the text, the list will look like
['mosquitto_sub', ..., 'ITT/#', '-v', '|', 'grep', '"Online"']
When you pass this list to subprocess.Popen, a literal '|' will be one of the arguments to mosquitto_sub.
If you use shell=True, you must escape any special characters like # in the command, for instance with double quotes:
import subprocess
command = 'echo -e "ITT/#\\ni am Online\\nbar Online\\nbaz" | grep "Online" '
p = subprocess.Popen(
    command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
for line in iter(p.stdout.readline, b''):
    print(line)
Alternatively, connect the pipes as you wrote, but make sure to iterate until b'', not u'':
import subprocess

def run_command(command, command2):
    p1 = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    p2 = subprocess.Popen(command2, stdin=p1.stdout, stdout=subprocess.PIPE)
    return iter(p2.stdout.readline, b'')

command = ['echo', '-e', 'ITT/#\\ni am Online\\nbar Online\\nbaz']
command2 = 'grep Online'.split()
for line in run_command(command, command2):
    print(line)
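With the b'' sentinel in place, the same helper should also work for the original mosquitto_sub pipeline; no quoting of the # in the topic is needed because no shell is involved. A sketch using the credentials and topic from the question (note that GNU grep normally block-buffers when its output is piped, so --line-buffered keeps the lines flowing promptly):
command = 'mosquitto_sub -u example -P example -t ITT/# -v'.split()
command2 = ['grep', '--line-buffered', 'Online']
for line in run_command(command, command2):
    print(line.decode().rstrip())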

Why does this valid shell command throw an error in python via subprocess?

The line awk -F'[][]' '/dB/ { print $2 }' <(amixer sget Master) in bash returns my system's current volume (e.g. "97%").
I tried to incorporate this into Python 3:
#!/usr/bin/env python3
import subprocess
command = "awk -F'[][]' '/dB/ { print $2 }' <(amixer sget Master)"
output = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE).stdout.read()
print(output)
However, the output from the shell is:
/bin/sh: 1: Syntax error: "(" unexpected
b''
Why does this fail and how do I fix my code?
As already pointed out, the syntax you are using is bash syntax (a so-called bashism). The default shell used by subprocess.Popen is /bin/sh, and it does not support process substitution.
You can specify the shell to be used via executable argument.
Try this:
output = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, executable="/bin/bash").stdout.read()
Because you are using a bashism in the form of process substitution, and your /bin/sh doesn't support it:
<(...)
Changing this to a pipe should solve your problem:
command = "amixer sget Master | awk -F'[][]' '/dB/ { print $2 }'"
Alternatively, you can start bash from within sh:
command = "bash -c 'amixer sget Master | awk -F'\\''[][]'\\'' '\\''/dB/ { print $2 }'\\'''"
But as you will soon realize, quoting and escaping quickly become a nightmare.
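For completeness, the pipe variant dropped into the original snippet might look like this (a sketch; plain /bin/sh handles the pipe, so no executable override is needed here):
import subprocess
command = "amixer sget Master | awk -F'[][]' '/dB/ { print $2 }'"
output = subprocess.check_output(command, shell=True, text=True).strip()
print(output)  # e.g. 97%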
