Python subprocess output read error

I have a command which works great at the terminal:
sudo tshark -V -l -i "any" -f 'udp port 4729'
I am trying to read the output from my Python script:
import subprocess
command = ['tshark', '-V', '-l', '-i', '"any"', '-f', '"udp port 4729"'] # the shell command
process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=None)
output, error = process.communicate()
print output
It does not work. Maybe there is some problem with how the command is written in the list.
I receive the error:
gooman@ubuntu:~/workspace/glade_tests/src$ sudo ./main.py
tshark: Lua: Error during loading:
[string "/usr/share/wireshark/init.lua"]:45: dofile has been disabled
Running as user "root" and group "root". This could be dangerous.
Capturing on "any"
tshark: The capture session could not be initiated (No such device exists).
Please check to make sure you have sufficient permissions, and that you have the proper interface or pipe specified.
0 packets captured

Try this:
import subprocess
command = "sudo tshark -V -l -i "any" -f 'udp port 4729'"
try:
output = subprocess.check_output(command, shell=True)
except subprocess.CalledProcessError as e:
print "An error has been occured", e
raise
print "The subprocess output:", output
Note that check_output already captures stdout for you, so no extra stdout argument is needed; add stderr=subprocess.STDOUT if you also want the error output in the result.
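Alternatively, the original list form should also work once the literal quotes are dropped: with shell=False each list element is passed to tshark verbatim, so '"any"' (quotes included) is treated as the interface name, which would explain the "No such device exists" error. A minimal sketch, assuming the script is still launched with sudo as in the question:
import subprocess

# With a list and no shell, arguments are passed verbatim, so drop the
# shell-style quotes around the interface and the capture filter.
command = ['tshark', '-V', '-l', '-i', 'any', '-f', 'udp port 4729']
process = subprocess.Popen(command, stdout=subprocess.PIPE)
output, error = process.communicate()
print output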

Related

subprocess command execution

What is the best way to execute the below command in Python in a single line?
echo $(readlink /sys/dev/block/$(mountpoint -d /))
I tried using os.system(cmd) with the commands separated: running "mountpoint -d /" first, taking its output, substituting it into "readlink /sys/dev/block/{0}".format(out.strip()), and echoing the result works. I also tried subprocess, subprocess.Popen and subprocess.check_output, but they raise CalledProcessError.
cmd = "echo $(readlink /sys/dev/block/$(mountpoint -d /))"
You have to call the subcommands separately, and you can use Python's own functions to read the link:
import subprocess
import os

path = "/"
# Ask mountpoint for the device number of the filesystem mounted at path.
device = subprocess.run(["mountpoint", "-d", path],
                        stdout=subprocess.PIPE, encoding="utf8").stdout.strip()
# Resolve the /sys/dev/block symlink in Python instead of shelling out to readlink.
link = os.readlink("/sys/dev/block/" + device)
print(link)
You probably want to use something like the following:
cmd = "bash -c 'echo $(readlink /sys/dev/block/$(mountpoint -d /))'"
echo doesn't expand $() blocks; that's what your shell does, so you have to invoke a shell. os.system(cmd) should work then.
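If you would rather stay with subprocess, the same idea works by letting the shell perform the substitution and capturing the result instead of echoing it; a minimal sketch:
import subprocess

# $(...) is expanded by the shell, so run the whole command through a shell.
cmd = "echo $(readlink /sys/dev/block/$(mountpoint -d /))"
out = subprocess.check_output(cmd, shell=True, universal_newlines=True)
print(out.strip())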

wget missing url with shlex and subprocess

I'm struggling to understand why this fails with a wget: missing URL error:
import shlex
import subprocess
copy_command = "wget -O - 'http://example.com/somepath/somefile.txt?someparam=test' | sshpass -p pass ssh user#localhost -p 2222 \"cat - > /upload/somefile.txt\""
cmd = shlex.split(copy_command, posix=False)
with subprocess.Popen(
        cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE, shell=True
) as proc:
    output, error = proc.communicate()
What am I missing here? If I just give subprocess the copy_command string directly, then it works without issues.
Setting up a pipeline requires the parent process to spawn all the programs involved and then connect (pipe) the stdout of one to the stdin of the next.
The Python documentation for subprocess explains how to do this.
It works with a string argument and shell=True because then Python just hands the whole command line off to a subshell, and that shell handles all of those details. (Passing a list together with shell=True is also problematic: on POSIX only the first element is used as the command string and the remaining items become arguments to the shell itself, which is why wget sees no URL.)
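A sketch of that approach for this particular pipeline, following the pattern from the subprocess documentation (host, port, password and paths are taken from the question):
import subprocess

# Start wget writing to a pipe, then feed that pipe into ssh's stdin.
wget = subprocess.Popen(
    ["wget", "-O", "-",
     "http://example.com/somepath/somefile.txt?someparam=test"],
    stdout=subprocess.PIPE)
ssh = subprocess.Popen(
    ["sshpass", "-p", "pass", "ssh", "user@localhost", "-p", "2222",
     "cat - > /upload/somefile.txt"],
    stdin=wget.stdout)
wget.stdout.close()  # allow wget to receive SIGPIPE if ssh exits early
ssh.communicate()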

Python subprocess sudo returns error: ERROR: ['sudo: sorry, you must have a tty to run sudo\n']

Here is my code:
import subprocess
HOST = 'host_name'
PORT = '111'
USER = 'user_name'
CMD = 'sudo su - ec2-user; ls'
process = subprocess.Popen(['ssh', '{}@{}'.format(USER, HOST),
                            '-p', PORT, CMD],
                           shell=False,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)
result = process.stdout.readlines()
if not result:
    print "Im an error"
    err = process.stderr.readlines()
    print('ERROR: {}'.format(err))
else:
    print "I'm a success"
    print(result)
When I run this I receive the following output in my terminal:
dredbounds-computer: documents$ python terminal_test.py
Im an error
ERROR: ['sudo: sorry, you must have a tty to run sudo\n']
I've tried multiple things but I keep getting that error "sudo: sorry, you must have a tty to run sudo". It works fine if I just do it through the terminal manually, but I need to automate this. I read that a workaround might be to use '-t' or '-tt' in my ssh call, but I haven't been able to implement this successfully in subprocess yet (terminal just hangs for me). Anyone know how I can fix my code, or work around this issue? Ideally I'd like to ssh, then switch to the sudo user, and then run a file from there (I just put ls for testing purposes).
sudo is prompting you for a password, but it needs a terminal to do that. Passing -t or -tt provides a terminal for the remote command to run in, but now it is waiting for you to enter a password.
process = subprocess.Popen(['ssh', '-tt', '{}@{}'.format(USER, HOST),
                            '-p', PORT, CMD],
                           shell=False,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE,
                           stdin=subprocess.PIPE)
process.stdin.write("password\r\n")
Keep in mind, though, that the ls doesn't run until after the shell started by su exits. You should either log into the machine as ec2-user directly (if possible), or just use sudo to run whatever command you want without going through su first.
You can tell sudo to work without requiring a password. Just add this to /etc/sudoers on the remote server host_name.
user ALL = (ec2-user) NOPASSWD: ls
This allows the user named user to execute the command ls as ec2-user without entering a password.
This assumes you change your command to look like this, which seems more reasonable to me:
CMD = 'sudo -u ec2-user ls'
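Putting the two pieces together, a minimal sketch assuming the NOPASSWD rule above is in place, so no tty or password handling is needed (host, port and user are the placeholders from the question):
import subprocess

HOST = 'host_name'
PORT = '111'
USER = 'user_name'
CMD = 'sudo -u ec2-user ls'

# No -tt is needed because sudo will not prompt for a password.
process = subprocess.Popen(['ssh', '{}@{}'.format(USER, HOST), '-p', PORT, CMD],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)
out, err = process.communicate()
print(out)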

Running rsync from python subprocess in windows

I need to run rsync from a Python 2.7 app on Windows 7 x64 (using cwRsync 5.5.0).
Everything works fine from the command line:
set CWRSYNCHOME in the environment to the cwRsync binaries directory and run the following command
rsync.exe "/cygdrive/e/test" test1@192.168.1.14:
But when trying to run the same command as a Python subprocess:
process = subprocess.Popen(['rsync.exe', '/cygdrive/e/test', 'test1@192.168.1.14:'],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE,
                           shell=True,
                           env={'CWRSYNCHOME': './bin'})
stdout, stderr = process.communicate()
print 'STDOUT:{}\nSTDERR:{}'.format(stdout, stderr)
I get the following error in stderr:
rsync: pipe: Operation not permitted (1)
rsync error: error in IPC code (code 14) at pipe.c(59) [sender=3.1.2]
Here is the verbose rsync stdout:
FILE_STRUCT_LEN=16, EXTRA_LEN=4
cmd=<NULL> machine=192.168.1.14 user=test1 path=.
cmd[0]=ssh cmd[1]=-l cmd[2]=test1 cmd[3]=192.168.1.14 cmd[4]=rsync cmd[5]=--server cmd[6]=-vvvvve.LsfxC cmd[7]=. cmd[8]=.
opening connection using: ssh -l test1 192.168.1.14 rsync --server -vvvvve.LsfxC . . (9 args)
[sender] _exit_cleanup(code=14, file=pipe.c, line=59): entered
[sender] _exit_cleanup(code=14, file=pipe.c, line=59): about to call exit(14)
I tried setting shell=False and passing the command as a single string (not cmd and args), but the error still repeats.
What am I doing wrong?
To get it to work, rsync needs to be run under Cygwin's shell:
process = subprocess.Popen(['sh.exe', '-c',
                            'rsync /cygdrive/e/test test1@192.168.1.14:'],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE,
                           stdin=subprocess.PIPE,
                           # env replaces the whole environment, so PATH must
                           # also point at the Cygwin binaries.
                           env={'CWRSYNCHOME': '/bin/',
                                'PATH': '/bin/'})
It works (ssh authorization is not shown in the example above).

read -a unknown option

I'm executing shell commands using python script. This is the command:
ntpservlist=( $OMC_NTPSERV ) && IFS=',' read -ra ntplist <<< "$ntpservlist" && for i in "${ntplist[@]}" ; do echo "server $i" >> /etc/inet/ntp.conf ; done
When I execute the command using a script, I get the following error:
/bin/sh[1]: read: -a: unknown option
Usage: read [-ACprsv] [-d delim] [-u fd] [-t timeout] [-n count] [-N count]
[var?prompt] [var ...]
But if I execute the same command using the command line, it executes correctly without any errors.
I'm using:
proc = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
(out, err) = proc.communicate()
to execute the command.
Your interactive shell is bash, but your system shell, used by Popen, is some flavor of ksh. To use bash instead, use the executable option:
proc = subprocess.Popen(command,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE,
                        shell=True,
                        executable="/bin/bash")  # or whatever the right path is
(out, err) = proc.communicate()
Most of your command appears to be valid ksh, but one difference is that read -A, not read -a, is used to populate an array.
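If switching shells is not an option, another route (a sketch of an alternative, not part of the original answer) is to build the "server ..." lines in Python and use only a trivial command to append them, which sidesteps the bash/ksh array differences entirely:
import os
import subprocess

# $OMC_NTPSERV as in the original command; read it from the environment here.
ntp_servers = os.environ.get("OMC_NTPSERV", "")
lines = "".join("server {}\n".format(s) for s in ntp_servers.split(",") if s)

# Append via tee -a, keeping the file write outside the Python process as before.
proc = subprocess.Popen(["tee", "-a", "/etc/inet/ntp.conf"],
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        universal_newlines=True)
proc.communicate(lines)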
