Saving the output of a process run by python [duplicate]

This question already has answers here:
How to use subprocess popen Python [duplicate]
(5 answers)
Closed 11 months ago.
I have the following code:
import os

commands = [
    r"cd c:\bugs\*722857\*722857 && chrome.exe --no-sandbox --single-process ..\..\cr-1195650-cve-2021-30588-stepped.html",
    r"cd c:\bugs\*722857\*722857 && chrome.exe --no-sandbox --single-process --js-flags=-expose-gc ..\..\cr-1057593-cve-2020-6428-stepped.html",
    r"cd c:\bugs\*722857\*722857 && chrome.exe --no-sandbox --single-process ..\..\cr-1032000-stepped.html",
]
for i in commands:
    os.system(i)
I would like to save the output of each command, either as a string or list of strings. Any advice?

Take a look at the subprocess module. Either subprocess.getstatusoutput or subprocess.getoutput matches your requirement; if you only want to save the output, the return value of subprocess.getoutput is enough.
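A minimal sketch of that approach, using harmless echo commands as stand-ins for the chrome.exe invocations above:

```python
import subprocess

# Stand-in commands; substitute the real command strings from the question.
commands = [
    "echo first",
    "echo second",
]

# subprocess.getoutput runs each command through the shell and returns
# its combined stdout/stderr as a string.
outputs = [subprocess.getoutput(cmd) for cmd in commands]
print(outputs)
```

Each entry of outputs is the full text the corresponding command printed, so you can save or parse it afterwards.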

Replace os.system(i) with output = os.popen(i).read(): os.system only passes the command to the system (and returns its exit status), while os.popen creates a pipe so you can read whatever the command writes.
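A sketch of the os.popen variant, again with a harmless stand-in command:

```python
import os

# "echo hello" stands in for the real commands from the question; os.popen
# returns a file-like object whose read() gives the command's stdout.
output = os.popen("echo hello").read()
print(output)
```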

how to call python variables inside bash commands in a python script or can I do that? [duplicate]

This question already has answers here:
How to use python variable in os.system? [duplicate]
(2 answers)
Closed 5 years ago.
I am trying to list the cronjobs of all users:
user_file = open('/root/pys/userdomains', 'r')
for line in user_file:
    print line
    splitted_line = line.split(': ')
    print splitted_line
    user_cron = splitted_line[1].split()[0]
    print user_cron
    print "crons for", user_cron, "is", CronTab(user=user_cron)
On the last line, can I use
os.system('crontab -l -u user_cron')
to get a similar result? I know there is the CronTab option, but in similar cases, can I use Python variables (e.g. user_cron) inside bash commands?
Not quite: you need to construct the actual command string you want to use by appending the string value of user_cron to the literal part of the command:
os.system('crontab -l -u ' + user_cron)
Yes, you can use the os.system function, but you must import os and put the variable's value into the command string (writing user_cron inside the quotes would pass the literal text, not the variable):
import os
os.system('crontab -l -u ' + user_cron)
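Safer than string concatenation is passing the arguments as a list via subprocess, which sidesteps shell quoting entirely; the user_cron value below is a hypothetical stand-in for one read from userdomains:

```python
import shutil
import subprocess

user_cron = "root"  # hypothetical value parsed from the userdomains file

# A list of args avoids the shell, so the username can't be misinterpreted
# by shell quoting or injection.
cmd = ["crontab", "-l", "-u", user_cron]
if shutil.which(cmd[0]):
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
else:
    print("crontab not available on this machine")
```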

Capture dynamic command prompt output in text file [duplicate]

This question already has answers here:
how to direct output into a txt file in python in windows
(6 answers)
Closed 6 years ago.
I am running a Python script which checks for modifications of files in a folder, and I want that output printed to a file. The problem is that the output is DYNAMIC: the cmd window stays open, and whenever a file is modified, information about it appears right away in the cmd window. All the solutions I found cover the case where I just run a command once and it finishes.
I tried:
python script.py > d:\output.txt, but the output.txt file is empty.
For example, after I run python script.py and touch the two files, the command prompt window will look like this. I want to capture that output.
Solution: in the Python script I use, add one more argument to the logging.basicConfig function: filename=r'd:\test.log' (a raw string, so the backslash isn't treated as an escape).
The issue is output buffering. If you wait long enough, you'll eventually see data show up in the file in "blocks". There are a few ways around it, for example:
- Run python with the -u (unbuffered) flag
- Add a sys.stdout.flush() after all print statements (which can be simplified by replacing stdout with a custom class to do it for you; see the linked question for more)
- Add the flush=True option to print statements if your version of Python supports it
- If appropriate, use the logging module instead of print statements.
python test.py > test.txt
works for me in the Windows cmd prompt.
As I see it, the simplest approach is to handle the writing to output.txt inside your script: each time there is a line to report (in your example, touching two files prints two lines), open the file, write that line, and close it again, so output.txt is updated as you go.
Get the file path for output.txt as a command-line argument, for example:
python script.py --o 'd:\output.txt'
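A sketch of parsing that argument with argparse (the "--o" flag follows the example above; parse_args is given an empty list here only so the sketch runs standalone, in your script call parse_args() with no arguments):

```python
import argparse

parser = argparse.ArgumentParser()
# "--o" names the log file path; the default is a hypothetical choice.
parser.add_argument("--o", default="output.txt", help="path of the log file")
args = parser.parse_args([])  # use parser.parse_args() in the real script

print(args.o)
# In the watcher loop you would append each event line:
with open(args.o, "a") as f:
    f.write("file modified: example.txt\n")
```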

How to Accept Command Line Arguments With Python Using < [duplicate]

This question already has answers here:
Python command line 'file input stream'
(3 answers)
Closed 8 years ago.
Is it possible to run a Python script and feed in a file as an argument using <? For example, my script works as intended with the command python scriptname.py input.txt and the code stuffFile = open(sys.argv[1], 'r').
However, what I'm looking to do, if possible, is use this command-line syntax: python scriptname.py < input.txt. Right now, running that command gives me only one argument, so I likely have to adjust the code in my script, but I'm not sure exactly how.
I have an automated system processing this command, so it needs to be exact. If that's possible with a Python script, I'd greatly appreciate some help!
< file is handled by the shell: the file doesn't get passed as an argument. Instead it becomes the standard input of your program, i.e., sys.stdin.
When you use the < operator in a shell, you are actually opening the file and feeding its contents to your script's stdin.
However, there is a Python module that can handle both styles: fileinput.
https://docs.python.org/2/library/fileinput.html
It was shown in this post:
How do you read from stdin in Python?
You can use the sys module's stdin attribute as a file-like object.

Getting the output of os.system in python and processing it after [duplicate]

This question already has answers here:
Python: How to get stdout after running os.system? [duplicate]
(6 answers)
Closed 8 years ago.
I am trying to do something like:
f = subprocess.check_output("./script.sh ls -l test1/test2/test.log", shell=True)
When I print f, I get the value 0. I tried using subprocess and then read(), and even then I don't get the details of the file. I need to verify the size of the file.
Not sure how it can be done.
Any help?
When I used
f = os.system("./script.sh ls -l test1/test2/test.log"), I see the output on screen, but it does not get saved in f; I need something like the standard output.
UPDATED:
I used
f = os.popen("./script.sh ls -l test1/test2/test.log 2>&1")
If I run the same quoted command directly on the CLI, it works fine, but when I use the above in a script or call s = f.readline(), the script stops and I need to hit "return" before it proceeds.
Why is that? I need 's' because I need to process it.
You can use subprocess.check_output:
f = subprocess.check_output("./script.sh ls -l test1/test2/test.log",shell=True)
print(f)
You can split the command into a list of individual args without using shell=True:
f = subprocess.check_output(['./script.sh', 'ls', '-l', 'test1/test2/test.log'])
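Since the underlying goal is checking the file's size, a shell-free sketch with os.path.getsize may be simpler than parsing ls output; the path below is a stand-in file created just for the demo:

```python
import os

# Stand-in file in place of test1/test2/test.log.
path = "demo.log"
with open(path, "w") as f:
    f.write("hello\n")

size = os.path.getsize(path)  # size in bytes; no ls output to parse
print(size)
```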

Passing a variable into bash command using python [duplicate]

This question already has answers here:
How to insert a variable value in a string in python
(3 answers)
Closed 2 years ago.
I'm creating a small script and I'd like to pass a variable into a bash command using Python, so for example:
number = raw_input("digit: ")
Then I'd like to take the number variable and put it in a bash command, for example:
ssh 'foo%s.bar' % number  <- where the %s is located, I'd like it to be replaced with the input
Finally I'd like to take that and run it in a bash command still within the python script:
ssh foo45.bar
How can I make this work?
import subprocess
number = raw_input("digit: ")
subprocess.call(('ssh', 'foo{}.bar'.format(number)))
For some good reading, try the python docs for string formatting and for subprocess.
If you want to integrate the above with sshpass, replace the subprocess call with:
subprocess.call(('sshpass', '-p', 'YOUR_PASSWORD', 'ssh', '-o', 'StrictHostKeyChecking=no', 'foo{}.bar'.format(number)))
