How to run commands on shell through Python [duplicate]

Possible Duplicate:
Calling an external command in Python
I want to run commands in another directory using Python.
What are the various ways to do this, and which is the most efficient?
What I want to do is as follows:
cd dir1
execute some commands
return
cd dir2
execute some commands

Naturally, if you only want to run a (simple) command on the shell via Python, you do it via the system function of the os module. For instance:
import os
os.system('touch myfile')
If you want something more sophisticated that allows for greater control over the execution of the command, use the subprocess module that others here have suggested.
For further information, follow these links:
Python official documentation on os.system()
Python official documentation on the subprocess module
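Since the question asks about running commands in different directories, here is a minimal sketch of how that can be done with subprocess.run and its cwd argument (dir1 and dir2 are the hypothetical directories from the question); cwd sets the working directory for the child process only, so no cd/return dance is needed:
import subprocess

# Each command runs with the given directory as its working directory;
# the Python process's own cwd is never changed.
subprocess.run(["ls", "-al"], cwd="dir1")
subprocess.run(["ls", "-al"], cwd="dir2")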

If you want more control over the called shell command (i.e. access to its stdin and/or stdout pipes, or starting it asynchronously), you can use the subprocess module:
import subprocess
p = subprocess.Popen('ls -al', shell=True, stdout=subprocess.PIPE)
stdout, stderr = p.communicate()
See also subprocess module documentation.
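Since this answer mentions starting a command asynchronously, a minimal sketch of that pattern: Popen returns as soon as the child is spawned, so the script can keep working and collect the exit code later with wait().
import subprocess

# Popen does not block; the child runs in the background.
p = subprocess.Popen(["sleep", "5"])
print("command is running in the background...")
# ... do other work here ...
exit_code = p.wait()  # block until the child finishes
print("command exited with code", exit_code)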

os.system("/dir/to/executeble/COMMAND")
for example
os.system("/usr/bin/ping www.google.com")
if ping program is located in "/usr/bin"
Naturally you need to import the os module.
Note that os.system waits for the command to finish, but it only returns the exit status and does not capture the command's output. If you want the output, use subprocess.check_output or something like that.
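A minimal sketch of capturing output that way (assuming ls is available on the system):
import subprocess

# check_output runs the command, waits for it, and returns its stdout;
# it raises CalledProcessError on a non-zero exit status.
output = subprocess.check_output(["ls", "-al"])
print(output.decode())  # output is bytes by default; decode for a str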

You can use the Python subprocess module, which offers many functions to execute commands, check their output, receive error messages, and so on.
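As a minimal sketch of that (assuming Python 3.7+ for capture_output), subprocess.run exposes the output, the error stream, and the exit code in one call:
import subprocess

# capture_output=True collects both stdout and stderr;
# text=True returns them as str instead of bytes.
result = subprocess.run(["ls", "no_such_file"], capture_output=True, text=True)
print("exit code:", result.returncode)
print("stdout:", result.stdout)
print("stderr:", result.stderr)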

Related

How to run bash commands inside of a Python script [duplicate]

Possible Duplicate:
Running Bash commands in Python
I am trying to run both Python and bash commands in a bash script.
In the bash script, I want to execute some bash commands enclosed by a Python loop:
#!/bin/bash
python << END
for i in range(1000):
# execute some bash command such as echoing i
END
How can I do this?
Use subprocess, e.g.:
import subprocess
# ...
subprocess.call(["echo", str(i)])  # the arguments must be strings, so convert i
There is another function like subprocess.call: subprocess.check_call. It is exactly like call, except that it throws an exception if the executed command returns a non-zero exit code. This is often the desirable behaviour in scripts and utilities.
subprocess.check_output behaves the same as check_call, but returns the standard output of the program.
If you do not need shell features (such as variable expansion or wildcards), never use shell=True (shell=False is the default). If you use shell=True, then shell escaping is your job, and these functions become a security hole if passed unvalidated user input.
The same is true of os.system(): it is a frequent source of security issues. Don't use it.
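A minimal sketch of the check_call behaviour described above; the Unix command false always exits with a non-zero status, so the call raises:
import subprocess

try:
    # check_call raises CalledProcessError on a non-zero exit status
    subprocess.check_call(["false"])
except subprocess.CalledProcessError as e:
    print("command failed with exit code", e.returncode)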
Look into the subprocess module. There is the Popen constructor and some wrapper functions like call.
If you need to check the output (retrieve the result string):
output = subprocess.check_output(args ....)
If you want to wait for execution to end before proceeding:
exitcode = subprocess.call(args ....)
If you need more functionality like setting environment variables, use the underlying Popen constructor:
subprocess.Popen(args ...)
Remember that subprocess is the higher-level module; it is intended to replace the legacy functions from the os module.
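For the environment-variable case mentioned above, a minimal sketch of passing a custom environment to Popen; MY_SETTING is a hypothetical variable for illustration, and the current environment is copied rather than replaced:
import os
import subprocess

env = os.environ.copy()          # start from the current environment
env["MY_SETTING"] = "42"         # add/override a variable for the child only
subprocess.Popen(["env"], env=env).wait()  # 'env' prints the child's environment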
I used this when running from my IDE (PyCharm).
import subprocess
subprocess.check_call('mybashcommand', shell=True)

Which is the best Python module for command execution [duplicate]

Possible Duplicate:
How do I execute a program or call a system command?
I want to perform various Linux commands/operations using a Python script. I will be using the output, verifying/processing it, and continuing with more command execution in my script, maybe including remote execution sometimes.
I have tried both the os and subprocess modules. The caveat here is that I am not able to combine the two, i.e. system calls or commands executed through one module do not affect the "program/Python" environment; they are only seen by that particular module.
For example:
os.chdir(dirname)
os.system(cmd)
# p = subprocess.Popen(cmd)
Now, here the changes from os.chdir are not useful for the subprocess call. We have to stick with one of them. If I use subprocess, I have to pass/create shell commands for it.
Added: cwd= is a solution for subprocess.Popen, but then I would have to pass the cwd option as an argument to every future command if they should all be run from that directory.
Is there a better way where we can use both of these modules together?
Or is there any other, better module available for command execution?
Also, I would like to know the pros, cons, and caveats of both these modules.
os.system always runs /bin/sh, which parses the command string. This can be a security risk if there is whitespace, $, etc. in the command arguments, or if the user has a shell config file. To avoid all such risks, use subprocess with a list or tuple of strings as the command (and shell=False, the default) instead.
To emulate os.chdir in the command, use the cwd= argument in subprocess.
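A minimal sketch of the cwd= approach, wrapped in a small hypothetical helper so the directory only has to be given once:
import subprocess

def run_in(dirname, *commands):
    """Run each command (a list of strings) with dirname as its working directory."""
    for cmd in commands:
        subprocess.check_call(cmd, cwd=dirname)

# Both commands run from dir1 without changing the Python process's own cwd.
run_in("dir1", ["ls", "-al"], ["pwd"])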

Shell expansion in subprocess? [duplicate]

Possible Duplicate:
Python subprocess wildcard usage
Using the Python 2.6 subprocess module, I need to run a command on a src.rpm file that I am building with a previous subprocess call.
Unfortunately, I am working with spec files that are not consistent, so I only have a vague idea of what the filename of the src.rpm should look like (for instance, I know the name of the package and the extension in something named "{package}-{version}.src.rpm" but not the version).
I do know, however, that I will only have one src.rpm file in the directory I am looking in, so I can call mock with a command like
mock {options} *.src.rpm
and have it work in the shell, but subprocess doesn't seem to want to accept the expansion. I've tried using shell=True as an argument to subprocess.call(), but even if it works I would rather avoid it.
How do I get something like
subprocess.call("mock *.src.rpm".split())
to run?
Use the glob module:
import subprocess
from glob import glob
subprocess.call(["mock"] + glob("*.src.rpm"))
The wildcard * has to be interpreted by the shell. When you run subprocess.call, by default it doesn't spawn a shell, but you can pass shell=True as an argument. Note that with shell=True the command should be a single string, not a split list:
subprocess.call("mock *.src.rpm", shell=True)

How do I send a command with parameters to an external cmd application in Python?

I need to execute and send a command to an external app from Python:
.\Ext\PrintfPC /p "C:\Leica\DBX" /l ".\joblist.log"
It is a cmd application. Is it possible to hide its console and terminate it afterwards, using only Python?
You are probably looking for the subprocess module. Example for executing the ls -l command on a Unix system:
import subprocess
subprocess.call(['ls', '-l'])
So, in your case it should probably look something like this (note the raw strings, so the backslashes are not treated as escape sequences):
subprocess.call([r'.\Ext\PrintfPC', '/p', r'C:\Leica\DBX', '/l', r'.\joblist.log'])
Have a look at the linked documentation though, because you can also get the output back from the command line execution by using pipes / Popen objects.
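For the hide-the-console part of the question, a minimal sketch under the assumption of Python 3.7+ on Windows: subprocess accepts a creationflags argument, and the CREATE_NO_WINDOW flag suppresses the child's console window; terminate() can then kill the process from Python.
import subprocess

p = subprocess.Popen(
    [r'.\Ext\PrintfPC', '/p', r'C:\Leica\DBX', '/l', r'.\joblist.log'],
    creationflags=subprocess.CREATE_NO_WINDOW,  # Windows-only, Python 3.7+
)
# ... later, if the process has to be stopped from Python:
p.terminate()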

Can I control PSFTP from a Python script?

I want to run and control PSFTP from a Python script in order to get log files from a UNIX box onto my Windows machine.
I can start up PSFTP and log in, but when I try to run a command remotely, such as 'cd', it isn't recognised by PSFTP and is just run in the terminal when I close PSFTP.
The code which i am trying to run is as follows:
import os
os.system("<directory> -l <username> -pw <password>" )
os.system("cd <anotherDirectory>")
I was just wondering if this is actually possible, or if there is a better way to do this in Python.
Thanks.
You'll need to run PSFTP as a subprocess and speak directly with the process. os.system spawns a separate subshell each time it's invoked so it doesn't work like typing commands sequentially into a command prompt window. Take a look at the documentation for the standard Python subprocess module. You should be able to accomplish your goal from there. Alternatively, there are a few Python SSH packages available such as paramiko and Twisted. If you're already happy with PSFTP, I'd definitely stick with trying to make it work first though.
Subprocess module hint:
import subprocess

# The following line spawns the psftp process and binds its standard input
# to p.stdin and its standard output to p.stdout
p = subprocess.Popen('psftp -l testuser -pw testpass'.split(),
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     text=True)  # text=True so str (not bytes) can be written (Python 3)
# Send the 'cd some_directory' command to the process as if a user were
# typing it at the command line
p.stdin.write('cd some_directory\n')
p.stdin.flush()  # flush so the command actually reaches the process
This has sort of been answered in: SFTP in Python? (platform independent)
http://www.lag.net/paramiko/
The advantage of the pure-Python approach is that you don't always need psftp installed.
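A minimal sketch of the paramiko route mentioned above (the host name, credentials, and paths are hypothetical placeholders):
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("unixbox.example.com", username="testuser", password="testpass")

sftp = client.open_sftp()
sftp.get("/var/log/app.log", r"C:\logs\app.log")  # remote path, local path
sftp.close()
client.close()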
