subprocess.Popen() not executing the command? [duplicate] - python

This question already has answers here:
File not found error when launching a subprocess containing piped commands
(6 answers)
How to use `subprocess` command with pipes
(7 answers)
Actual meaning of 'shell=True' in subprocess
(7 answers)
Closed 3 years ago.
I want to execute a command from Python and get its output back.
The command is: ls /sys/class/net/ | sed -n '/e.*/p'
It returns the names of the attached interfaces that start with e.
In the terminal I get the expected output: eth0 eth1.
But in Python I am executing it like this:
out = subprocess.Popen(["ls", "/sys/class/net/", " | ", "sed -n '/e.*/p'"],
                       stdout=subprocess.PIPE,
                       stderr=subprocess.STDOUT)
stdout, stderr = out.communicate()
print(stdout)
The expected output is eth0 eth1, but instead I get:
ls: cannot access ' | ': No such file or directory\nls: cannot access 'sed -n '\''/e.*/p'\''': No such file or directory /sys/class/net/:\neth0\neth1\nlo\nwifi0\nwifi1\nwifi2\n
What is the problem here?
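The error message shows what went wrong: without a shell, every list element (including " | " and the entire sed command) is passed to ls as a literal filename argument. A pipe has to be handled either by a shell or by wiring two processes together yourself. A minimal sketch of both approaches, assuming a Linux system where /sys/class/net exists:

```python
import subprocess

# Option 1: hand the whole pipeline to a shell as a single string
out = subprocess.check_output("ls /sys/class/net/ | sed -n '/e.*/p'",
                              shell=True, text=True)

# Option 2: connect the two processes explicitly, no shell involved
ls = subprocess.Popen(["ls", "/sys/class/net/"], stdout=subprocess.PIPE)
sed = subprocess.Popen(["sed", "-n", "/e.*/p"],
                       stdin=ls.stdout, stdout=subprocess.PIPE, text=True)
ls.stdout.close()  # allow ls to receive SIGPIPE if sed exits early
filtered, _ = sed.communicate()
print(filtered)
```

Both variants print the same interface names; option 2 avoids shell parsing entirely, which matters as soon as any part of the command comes from user input.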


How can I find the default branch from a list of remote repos in Python? [duplicate]

This question already has answers here:
How do I execute a program or call a system command?
(65 answers)
How to get output from subprocess.Popen(). proc.stdout.readline() blocks, no data prints out
(4 answers)
Running shell command and capturing the output
(21 answers)
Closed 11 days ago.
I need to pass a list of repos from the command line and detect their default branches. So far I have only found this shell command, which returns the default HEAD: git remote show origin | grep 'HEAD' | cut -d':' -f2 | sed -e 's/^ *//g' -e 's/ *$//g'
However, I'm not sure how I should execute it in my code.
The invocation is python3 app.py testrepo.
And below is the code:
#app.route('/test')
def get_default_branch():
    repos = sys.argv[1:]
    origin = repos[0]
    return subprocess.Popen([f"'git', 'remote', 'show', '{origin}''" + "| grep 'HEAD' | cut -d':' -f2 | sed -e 's/^ *//g' -e 's/ *$//g''"])
If you want to capture the output, using the check_output API is probably easier:
#app.route('/test')
def get_default_branch():
    repos = sys.argv[1:]
    origin = repos[0]
    return subprocess.check_output(
        f"git remote show {origin} | grep 'HEAD' | cut -d':' -f2 | sed -e 's/^ *//g' -e 's/ *$//g'",
        shell=True,
    )
https://docs.python.org/3/library/subprocess.html#subprocess.check_output
With shell=True it is also recommended to pass the command as a single string instead of a list.
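As an alternative sketch, the grep/cut/sed stages are simple enough to do in Python itself, which avoids shell=True (and with it, any risk of shell injection through origin). parse_head_branch is a hypothetical helper name, and the git call assumes origin is a configured remote:

```python
import subprocess

def parse_head_branch(text):
    # replicates grep 'HEAD' | cut -d':' -f2 plus the sed whitespace trimming
    for line in text.splitlines():
        if "HEAD branch" in line:
            return line.split(":", 1)[1].strip()
    return None

def get_default_branch(origin):
    out = subprocess.check_output(["git", "remote", "show", origin], text=True)
    return parse_head_branch(out)
```

Doing the text processing in Python also makes the parsing testable without a network round-trip to the remote.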

Bash commands in python [duplicate]

This question already has answers here:
How to use `subprocess` command with pipes
(7 answers)
Closed 4 years ago.
I am running code in Python which calculates the count of the files present in a directory:
hadoop fs -count /user/a909983/sample_data/ | awk '{print $2}'
This successfully returns 0 on the Linux command line, as the dir is empty. However, when I run it in a Python script it returns 1. The line of code in Python is:
directoryEmptyStatusCommand = subprocess.call(
    ["hadoop", "fs", "-count", "/user/a909983/sample_data/", "|", "awk '{print $2}'"])
How can I correct this? What am I missing? I have also tried using Popen, but the result is the same.
Use subprocess.Popen and avoid the pipe |, because the pipe requires shell=True, which is a security risk. Instead, use subprocess.PIPE to feed the first process's stdout into subprocess.check_output; that is the correct method.
So, you can try something like:
command = subprocess.Popen(["hadoop", "fs", "-count", "/user/a909983/sample_data/"],
                           stdout=subprocess.PIPE)
output = subprocess.check_output(["awk", "{print $2}"], stdin=command.stdout)
command.stdout.close()
In case you want to run the shell pipeline as-is by enabling shell=True:
cmd = "hadoop fs -count /user/a909983/sample_data/ | awk '{print $2}'"
command = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output = command.communicate()[0]
print(output)
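Since the awk stage only picks out the second whitespace-separated column, it can also be replaced with a one-line split in Python, removing the second process entirely. second_column is a hypothetical helper, shown here against a sample hadoop fs -count output line:

```python
def second_column(line):
    # hadoop fs -count prints: DIR_COUNT FILE_COUNT CONTENT_SIZE PATHNAME;
    # awk '{print $2}' corresponds to index 1 after splitting on whitespace
    return line.split()[1]

sample = "           1            0                  0 /user/a909983/sample_data"
print(second_column(sample))  # -> 0
```

In the real script you would call second_column on the string returned by check_output for the plain hadoop command.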

how to call python variables inside bash commands in a python script or can I do that? [duplicate]

This question already has answers here:
How to use python variable in os.system? [duplicate]
(2 answers)
Closed 5 years ago.
I am trying to list the cronjobs of all users:
user_file = open('/root/pys/userdomains', 'r')
for line in user_file:
    print line
    splitted_line = line.split(': ')
    print splitted_line
    user_cron = splitted_line[1].split()[0]
    print user_cron
    print "crons for", user_cron, "is", CronTab(user=user_cron)
On the last line, can I use
os.system('crontab -l -u user_cron')
to get a similar result? I know there is the CronTab option, but in similar cases, can I use Python variables (e.g. user_cron) inside bash commands?
Not quite: you need to construct the actual command string you want to run by appending the string value of user_cron to the literal part of the command:
os.system('crontab -l -u ' + user_cron)
Yes, you can use the os.system function (after importing os), but the variable must be interpolated into the string; writing user_cron inside the quotes would pass the literal text "user_cron" rather than its value:
import os
os.system('crontab -l -u ' + user_cron)
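A safer pattern than concatenating the variable into a shell string is to pass it as its own argument in a list, so no shell ever parses it. For crontab that would be subprocess.run(["crontab", "-l", "-u", user_cron], ...); echo stands in here so the sketch runs anywhere:

```python
import subprocess

user_cron = "alice"  # in the real script this comes from the userdomains file

# each argv element is passed to the program literally, so a value containing
# shell metacharacters cannot inject extra commands
result = subprocess.run(["echo", user_cron], capture_output=True, text=True)
print(result.stdout.strip())  # -> alice
```

With string concatenation, a username like "alice; rm -rf /" would be executed by the shell; with the list form it is just an odd literal argument.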

Getting the output of os.system in python and processing it after [duplicate]

This question already has answers here:
Python: How to get stdout after running os.system? [duplicate]
(6 answers)
Closed 8 years ago.
I am trying to do something like:
f = subprocess.check_output("./script.sh ls -l test1/test2/test.log", shell=True)
When I print f, I get the value 0. I tried using subprocess and then read(), and even then I don't get the details of the file. I need to verify the size of the file.
Not sure how it can be done.
Any help?
When I used
f = os.system("./script.sh ls -l test1/test2/test.log"), I see the output printed, but it does not get saved in f; I need the captured stdout or something like it.
UPDATED:
I used
f = os.popen("./script.sh ls -l test1/test2/test.log 2>&1")
If I run the same quoted command directly on the CLI myself, it works fine, but if I use the above in a script or call s = f.readline(), the script stops and I need to hit Return before it can proceed.
Why is that? I need s because I need to process it.
You can use subprocess.check_output:
f = subprocess.check_output("./script.sh ls -l test1/test2/test.log",shell=True)
print(f)
You can split into a list of individual args without using shell=True:
f = subprocess.check_output(['./script.sh', 'ls', '-l', 'test1/test2/test.log'])
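If the underlying goal is only to verify the file's size, os.path.getsize skips the subprocess and ls parsing entirely. A self-contained sketch, with a temporary file standing in for test1/test2/test.log:

```python
import os
import tempfile

# create a stand-in file with known contents
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"hello")
    path = tmp.name

size = os.path.getsize(path)  # size in bytes, the same number ls -l reports
print(size)  # -> 5
os.unlink(path)
```

This returns an int directly, so there is no string output to split and convert afterwards.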

How to get output of command line to python [duplicate]

This question already has answers here:
Retrieving the output of subprocess.call() [duplicate]
(7 answers)
Closed 8 years ago.
I run a Windows command-line program from Python, and the program returns strings. For example, I run this line:
subprocess.call("RPiHubCMDTool.exe dev", shell=True)
and I see in the cmd window the output dev0 FT2232H RPi HUB Module A 136241 A,
dev1 FT2232H RPi HUB Module B 136242 B. I want to work with that output in Python. How do I bring it from the cmd window into Python? Could you provide an example?
To get the output you can use:
output=subprocess.check_output(["echo", "Hello World!"])
print output
# Hello World!
How about writing the result to a file and reading that file back in Python?
subprocess.call("RPiHubCMDTool.exe dev > result.txt", shell=True)
with open('result.txt') as f:
    data = f.read()  # do something with the output
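Either way, the captured output arrives as one block of text, so splitting it into per-device lines is usually the next step. echo stands in for RPiHubCMDTool.exe here so the sketch is runnable anywhere:

```python
import subprocess

# text=True decodes the bytes to str; each device then sits on its own line
output = subprocess.check_output(["echo", "dev0 FT2232H\ndev1 FT2232H"], text=True)
devices = output.splitlines()
print(devices)  # -> ['dev0 FT2232H', 'dev1 FT2232H']
```

On Python 3, check_output without text=True returns bytes, which is why the earlier example prints a b'...' value; text=True (or .decode()) gives ordinary strings.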
