This question already has answers here:
How to give input streams in python?
(2 answers)
Closed 9 years ago.
To run shell commands from a Python script, I generally use the subprocess module or os.system.
Using that, I am running a shell command from my Python script that launches another application, and that application has its own command-line interface.
How would I pass commands to that application's CLI from my Python script?
How can I capture the output of the application's CLI from my Python script?
It would be highly appreciated if someone could suggest material or example code.
The application you're initiating might behave differently when running as a subprocess. Specifically, when connected to a process pipe, some applications buffer their output by default instead of flushing it line by line. If the application you're running flushes its output, you can get it in real time; otherwise, you'll only get output when the buffer is full.
That said, here's an example to run some application:
import subprocess

p = subprocess.Popen(['someapp', 'param1', 'param2'],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     text=True)
# sends the command "some_command" to the app:
p.stdin.write('some_command\n')
p.stdin.flush()
# waits for a single line from the output
result = p.stdout.readline()
If it hangs on p.stdout.readline(), that means the output is being buffered.
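If that happens, one common workaround (a sketch only; it assumes a Linux system with GNU coreutils and a child program whose stdio buffering can be influenced from the outside, with 'someapp' and its parameters being the same placeholders as above) is to launch the application under stdbuf so its stdout is line-buffered:
import subprocess

# ask the child (via GNU coreutils' stdbuf) to line-buffer its stdout even
# though it is writing to a pipe instead of a terminal
p = subprocess.Popen(['stdbuf', '-oL', 'someapp', 'param1', 'param2'],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     text=True)
This only helps for programs that use C stdio buffering; applications that buffer internally still need to flush on their own.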
I am running a Python script in a Linux terminal, and it requires some bash commands to run while the test is running. So I am using the subprocess module to run my test commands (a bash script). These bash commands might print something on the CLI, and I need to detect when they do while my Python code runs in parallel.
For example:
# running my python TCP server
subprocess.call(['./run_some_shell_commands.sh'], shell=True)
while True:
    # I am doing some other python stuff
    if CLI_HAS_SOME_OUTPUT_DETECTED:
        # record the output to some variable
    # doing some more python stuff
If I knew for sure that run_some_shell_commands.sh returned some output, I could simply use A = subprocess.check_output(['./run_some_shell_commands.sh'], shell=True), which would save its output in variable A.
Is there any way to grab the last n lines of the terminal output, so that I can check whether that event has occurred and assign it to CLI_HAS_SOME_OUTPUT_DETECTED?
Any suggestions are highly appreciated.
Saira
import subprocess
import time as t
cmd = [' ']  # placeholder: put the actual shell command here
P = subprocess.check_output(cmd, shell=True)
while True:
    print(P)
    t.sleep(0.1)
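A different sketch of what the question seems to be after: keep the script running in the background and collect its output lines as they appear, while the main loop does other work. The reader thread and queue are illustrative additions, not part of the original answers; run_some_shell_commands.sh is the script from the question.
import queue
import subprocess
import threading
import time

lines = queue.Queue()

def pump(pipe):
    # push every line the script prints onto a queue as soon as it appears
    for line in pipe:
        lines.put(line.rstrip('\n'))
    pipe.close()

p = subprocess.Popen(['./run_some_shell_commands.sh'],
                     stdout=subprocess.PIPE, text=True)
threading.Thread(target=pump, args=(p.stdout,), daemon=True).start()

while True:
    # ... doing some other python stuff ...
    try:
        output = lines.get_nowait()   # CLI_HAS_SOME_OUTPUT_DETECTED
        # record the output to some variable
        print(output)
    except queue.Empty:
        pass
    time.sleep(0.1)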
This is answered in Running shell command and capturing the output. Note that there are two classes of shell commands, external executables and shell built-ins; in some programming languages this can make a difference, see How do I listen for a response from shell command in android studio?
This question already has answers here:
How do I execute a program or call a system command?
(65 answers)
Closed 5 years ago.
Is there a way to have the bash command inside a Python print statement actually run in the terminal, directly from the Python script?
In the example below, the awk command is printed on the terminal.
#!/usr/bin/python
import sys
print "awk 'END{print NF}' file"
I can of course imagine writing the print output to a separate file and running that file as a bash script, but is there a way to run the awk command directly from the Python script rather than just printing it?
is there a way to run the awk command directly from the python script rather than just printing it?
Yes, you can use the subprocess module.
import subprocess
subprocess.call(["ls", "-l"])
You can pipe your Python output into a Bash process, for example,
python -c "print 'echo 5'" | bash
will output
5
You could even use the subprocess module to do that from inside Python, if you wanted to.
But I am sure this is pretty bad design, and not a good idea. If you get your coding wrong, there's a risk you could allow hostile users to execute arbitrary commands on the machine running your code.
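For completeness, here is a small sketch of doing the same thing from inside Python with subprocess (the 'echo 5' string is just the example from above):
import subprocess

# feed the text that would have been printed into a bash process,
# mirroring the "python -c ... | bash" pipeline shown earlier
subprocess.run(['bash'], input='echo 5\n', text=True)
The same caution applies: building shell input from untrusted data is an easy way to open yourself up to command injection.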
One solution is to use subprocess to run a shell command and capture its output, for example:
import subprocess
command = "awk 'END{print NF}' file"
p = subprocess.Popen(command, shell=True, bufsize=2000,
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     close_fds=True, text=True)
(child_stdout, child_stdin) = (p.stdout, p.stdin)
print(''.join(line for line in child_stdout))
child_stdout.close()
Adjust bufsize accordingly based on the size of your file.
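If you don't need the pipe objects themselves, a shorter sketch of the same idea uses check_output on the same awk command:
import subprocess

# run the awk command and capture everything it prints in one call
output = subprocess.check_output("awk 'END{print NF}' file", shell=True, text=True)
print(output)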
This question already has an answer here:
How to get the output from os.system()? [duplicate]
(1 answer)
Closed 1 year ago.
I am using a script where I issue a shell command to a remote server over ssh using os.system(). I need to collect the output of the command I execute on the remote server. The problem is the double redirection: os.system() runs an ssh command, which in turn runs the intended command on the remote server, and it is that command's output I want to use. I just need some pointers on how this can be achieved.
Use the subprocess module:
subprocess.check_output returns the output of a command as a string.
>>> import subprocess
>>> print subprocess.check_output.__doc__
Run command with arguments and return its output as a byte string.
If the exit code was non-zero it raises a CalledProcessError. The
CalledProcessError object will have the return code in the returncode
attribute and output in the output attribute.
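For example, a minimal sketch for the ssh case in the question (remote-host and uptime are placeholder values, not from the original post):
import subprocess

# run a command on the remote machine via ssh and capture what it prints
output = subprocess.check_output(['ssh', 'remote-host', 'uptime'])
print(output.decode())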
This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
Calling an external command in Python
I want to run commands in another directory using Python.
What are the various ways to do this, and which is the most efficient one?
What I want to do is as follows:
cd dir1
execute some commands
return
cd dir2
execute some commands
Naturally, if you only want to run a (simple) command on the shell via Python, you do it via the system function of the os module. For instance:
import os
os.system('touch myfile')
If you want something more sophisticated that allows for even greater control over the execution of the command, go ahead and use the subprocess module that others here have suggested.
For further information, follow these links:
Python official documentation on os.system()
Python official documentation on the subprocess module
If you want more control over the called shell command (i.e. access to the stdin and/or stdout pipes or starting it asynchronously), you can use the subprocess module:
import subprocess
p = subprocess.Popen('ls -al', shell=True, stdout=subprocess.PIPE)
stdout, stderr = p.communicate()
See also subprocess module documentation.
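If the point is to run each command from a different directory, as in the original question, one option is the cwd parameter of subprocess (dir1 and dir2 are the directories from the question; 'ls -al' is just the example command from above):
import subprocess

# run one command inside dir1 and another inside dir2 without
# changing the Python process's own working directory
subprocess.run('ls -al', shell=True, cwd='dir1')
subprocess.run('ls -al', shell=True, cwd='dir2')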
os.system("/dir/to/executeble/COMMAND")
for example
os.system("/usr/bin/ping www.google.com")
if ping program is located in "/usr/bin"
Naturally you need to import the os module.
os.system does not capture the command's output (it only returns the exit status); if you want the output, you should use
subprocess.check_output or something like that
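A small sketch of the difference (the -c 1 flag is added here only so the ping terminates; it is not in the original example):
import os
import subprocess

status = os.system("/usr/bin/ping -c 1 www.google.com")                  # exit status only
output = subprocess.check_output(["ping", "-c", "1", "www.google.com"])  # printed output, as bytes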
You can use the Python subprocess module, which offers many functions to execute commands, check output, receive error messages, etc.
I want to run and control PSFTP from a Python script in order to get log files from a UNIX box onto my Windows machine.
I can start up PSFTP and log in, but when I try to run a command remotely, such as 'cd', it isn't recognised by PSFTP and is just run in the terminal when I close PSFTP.
The code which I am trying to run is as follows:
import os
os.system("<directory> -l <username> -pw <password>" )
os.system("cd <anotherDirectory>")
I was just wondering if this is actually possible, or if there is a better way to do this in Python.
Thanks.
You'll need to run PSFTP as a subprocess and speak directly with the process. os.system spawns a separate subshell each time it's invoked so it doesn't work like typing commands sequentially into a command prompt window. Take a look at the documentation for the standard Python subprocess module. You should be able to accomplish your goal from there. Alternatively, there are a few Python SSH packages available such as paramiko and Twisted. If you're already happy with PSFTP, I'd definitely stick with trying to make it work first though.
Subprocess module hint:
import subprocess

# The following line spawns the psftp process and binds its standard input
# to p.stdin and its standard output to p.stdout
p = subprocess.Popen('psftp -l testuser -pw testpass'.split(),
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     text=True)
# Send the 'cd some_directory' command to the process as if a user were
# typing it at the command line
p.stdin.write('cd some_directory\n')
p.stdin.flush()
This has sort of been answered in: SFTP in Python? (platform independent)
http://www.lag.net/paramiko/
The advantage to the pure Python approach is that you don't always need psftp installed.
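A rough sketch of that pure-Python route with paramiko (the host name, credentials, and paths are placeholders, not values from the original question):
import paramiko

# connect over SSH, open an SFTP session, and copy a remote log file locally
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('unix-box.example.com', username='testuser', password='testpass')
sftp = client.open_sftp()
sftp.get('/var/log/some.log', 'some.log')
sftp.close()
client.close()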