How to run bash commands inside of a Python script [duplicate]

This question already has answers here:
Running Bash commands in Python
(11 answers)
Closed 2 years ago.
I am trying to run both Python and bash commands in a bash script.
In the bash script, I want to execute some bash commands enclosed by a Python loop:
#!/bin/bash
python << END
for i in range(1000):
    # execute some bash command, such as echoing i
END
How can I do this?

Use subprocess, e.g.:
import subprocess
# ...
subprocess.call(["echo", i])
There is another function like subprocess.call: subprocess.check_call. It behaves exactly like call, except that it raises an exception if the executed command returns a non-zero exit code. This is often the desirable behaviour in scripts and utilities.
subprocess.check_output behaves the same as check_call, but returns the standard output of the program.
Unless you need shell features (such as variable expansion or wildcards), never use shell=True (shell=False is the default). With shell=True, shell escaping is your job with these functions, and they become a security hole if passed unvalidated user input.
The same is true of os.system() -- it is a frequent source of security issues. Don't use it.
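Putting the pieces together with the original heredoc, a minimal sketch might look like this (echo stands in for whatever bash command you need per iteration):
#!/bin/bash
python << 'END'
import subprocess
for i in range(1000):
    # run one bash command per iteration; a list of args avoids shell=True
    subprocess.call(["echo", str(i)])
END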

Look into the subprocess module. There is the Popen constructor and some wrapper functions like call.
If you need to check the output (retrieve the result string):
output = subprocess.check_output(args ....)
If you want to wait for execution to end before proceeding:
exitcode = subprocess.call(args ....)
If you need more functionality like setting environment variables, use the underlying Popen constructor:
subprocess.Popen(args ...)
Remember that subprocess is the higher-level module. It should replace legacy functions from the os module.
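For instance, a minimal sketch of using Popen to set an environment variable and capture output (printenv and the MY_FLAG variable are only illustrative):
import os
import subprocess

# extend the current environment rather than replacing it entirely
env = dict(os.environ, MY_FLAG="1")
p = subprocess.Popen(["printenv", "MY_FLAG"], env=env, stdout=subprocess.PIPE)
out, _ = p.communicate()
print(out.decode())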

I used this when running from my IDE (PyCharm).
import subprocess
subprocess.check_call('mybashcommand', shell=True)

Related

Executing python module through popen from a python script/shell

I have a Python module that is executed by the following command:
python3 -m moduleName args
I am trying to execute it from a script using subprocess.Popen.
subprocess.Popen(command, shell=True, text=True, stdout=subprocess.PIPE)
Based on the subprocess documentation, it is recommended to pass a sequence rather than a string. But when I pass the command below as the argument,
command = ['python3','-m','moduleName','args']
I end up getting a Python shell instead of the module being executed. If I pass the command as a string, things work as expected. I have not been able to find documentation or references for this behavior.
Can someone please shed some light on this behavior?
What would be the best way to make this work?
Thanks!
This behavior is caused by the shell=True option. When Popen runs in shell mode (under POSIX), the command is appended to the shell command after a "-c" option (subprocess.py, Python 3.9):
args = [unix_shell, "-c"] + args
When the argument list is expanded, only the first element after '-c' (in your case, 'python3') is used as the command string, which is why a bare interactive Python shell starts. The remaining elements are interpreted as further arguments to the unix_shell command itself. The -m, for example, activates job control in bash, as outlined in the bash manual.
The solution is to either
pass the command as a single string, as you did, or
do not set the shell option for Popen, which is a good idea anyway, as it is lighter on resources and avoids pitfalls like the one you encountered.
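A minimal sketch of the second option, keeping the argument list from the question:
import subprocess

# shell=False (the default): the list is passed directly to exec, no shell involved
p = subprocess.Popen(
    ["python3", "-m", "moduleName", "args"],
    text=True,
    stdout=subprocess.PIPE,
)
output, _ = p.communicate()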

How to achieve Perl's exec function in Python?

Assume using Linux:
In Perl, the exec function executes an external program and immediately exits itself, leaving the external program in the same shell session.
A very close answer using Python is https://stackoverflow.com/a/13256908
However, the Python solution using start_new_session=True starts the external program via setsid, which means that solution is suitable for making a daemon, not an interactive program.
Here is a simple example using Perl:
perl -e '$para=qq(-X --cmd ":vsp");exec "vim $para"'
After vim is started, the original Perl program has exited, and vim is still in the same shell session (vim is not moved to a new session group).
How can I achieve the same with Python?
Perl is just wrapping the exec* system call functions here. Python has the same wrappers, in the os module, see the os.exec* documentation:
These functions all execute a new program, replacing the current process; they do not return. On Unix, the new executable is loaded into the current process, and will have the same process id as the caller.
To do the same in Python:
python -c 'import os; para="-X --cmd :vsp".split(); os.execlp("vim", "vim", *para)'
os.execlp looks up the binary in $PATH using its first argument; the remaining arguments become the new process's argv, starting with argv[0] (which is why "vim" appears twice above).
The subprocess module is only ever suitable for running processes next to the Python process, not for replacing the Python process. On POSIX systems, the subprocess module uses these same low-level exec* functions to implement its functionality: a fork of the Python process is replaced with the command you wanted to run with subprocess.
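As a script rather than a one-liner, a minimal sketch of the same idea:
import os

# Replace the current Python process with vim; argv[0] must be included.
args = ["vim", "-X", "--cmd", ":vsp"]
os.execvp("vim", args)
# Nothing after os.execvp runs; the process image has been replaced.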

Invoking C compiler using Python subprocess command

I am trying to compile a C program using Python and want to give input using "<" operator but it's not working as expected.
If I compile the C program and run it by giving input through a file, it works; for example
./a.out <inp.txt works
But if I try to do the same using a Python script, it does not work as expected.
For example:
import subprocess
subprocess.call(["gcc","a.c","-o","x"])
subprocess.call(["./x"])
and
import subprocess
subprocess.call(["gcc","a.c","-o","x"])
subprocess.call(["./x","<inp.txt"])
Both scripts ask for input through the terminal, but I think the second script should read from the file. Why do both programs behave the same?
To complement @Jonathan Leffler's and @alastair's helpful answers:
Assuming you control the string you're passing to the shell for execution, I see nothing wrong with using the shell for convenience. [1]
subprocess.call() has an optional Boolean shell parameter, which causes the command to be passed to the shell, enabling I/O redirection, referencing environment variables, ...:
subprocess.call("./x <inp.txt", shell = True)
Note how the entire command line is passed as a single string rather than an array of arguments.
[1]
Avoid use of the shell in the following cases:
If your Python code must run on platforms other than Unix-like ones, such as Windows.
If performance is paramount.
If you find yourself "outsourcing" tasks better handled on the Python side.
If you're concerned about lack of predictability of the shell environment (as #alastair is):
subprocess.call with shell = True always creates non-interactive non-login instances of /bin/sh - note that it is NOT the user's default shell that is used.
sh does NOT read initialization files for non-interactive non-login shells (neither system-wide nor user-specific ones).
Note that even on platforms where sh is bash in disguise, bash will act this way when invoked as sh.
Every shell instance created with subprocess.call with shell = True is its own world, and its environment is neither influenced by previous shell instances nor does it influence later ones.
However, the shell instances created do inherit the environment of the python process itself:
If you started your Python program from an interactive shell, then that shell's environment is inherited. Note that this only pertains to the current working directory and environment variables, and NOT to aliases, shell functions, and shell variables.
Generally, that's a feature, given that Python (CPython) itself is designed to be controllable via environment variables (for 2.x, see https://docs.python.org/2/using/cmdline.html#environment-variables; for 3.x, see https://docs.python.org/3/using/cmdline.html#environment-variables).
If needed, you can supply your own environment to the shell via the env parameter; note, however, that you'll have to supply the entire environment in that event, potentially including variables such as USER and HOME, if needed; simple example, defining $PATH explicitly:
subprocess.call('echo $PATH', shell=True,
                env={'PATH': '/sbin:/bin:/usr/bin'})
The shell does I/O redirection for a process. Based on what you're saying, the subprocess module does not do I/O redirection like that. To demonstrate, run:
subprocess.call(["sh","-c", "./x <inp.txt"])
That runs the shell and should redirect the I/O. With your code, your program ./x is being given an argument <inp.txt which it is ignoring.
NB: the alternative call to subprocess.call is purely for diagnostic purposes, not a recommended solution. The recommended solution involves reading the (Python 2) subprocess module documentation (or the Python 3 documentation for it) to find out how to do the redirection using the module.
import subprocess
i_file = open("inp.txt")
subprocess.call("./x", stdin=i_file)
i_file.close()
If your script is about to exit anyway, so you don't have to worry about leaked file descriptors, you can compress that to:
import subprocess
subprocess.call("./x", stdin=open("inp.txt"))
By default, the subprocess module does not pass the arguments to the shell. Why? Because running commands via the shell is dangerous; unless they're correctly quoted and escaped (which is complicated), it is often possible to convince programs that do this kind of thing to run unwanted and unexpected shell commands.
Using the shell for this would be wrong anyway. If you want to take input from a particular file, you can use subprocess.Popen, setting the stdin argument to a file descriptor for the file inp.txt (you can get the file descriptor by calling fileno() on a Python file object).
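A minimal sketch of that Popen approach (passing the open file object, whose descriptor subprocess uses as stdin):
import subprocess

with open("inp.txt") as f:
    # the child's standard input is redirected to the file, no shell needed
    p = subprocess.Popen(["./x"], stdin=f)
    p.wait()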

Which is the best python module for command execution [duplicate]

This question already has answers here:
How do I execute a program or call a system command?
(65 answers)
Closed 9 years ago.
I want to perform various Linux commands/operations using a Python script. I will use the output, verify/process it, and continue executing more commands in my script, possibly including remote execution at times.
I have tried both the os and subprocess modules. The caveat is that I am not able to combine them: state changes made via one module (such as the working directory) do not seem to carry over to commands executed via the other; each only applies to that particular module.
For example:
os.chdir(dirname)
os.system(cmd)
# p = subprocess.Popen(cmd)
Now the change made by os.chdir does not carry over to the subprocess call; we have to stick with one of them. If I use subprocess, I have to pass or build shell commands for it.
Added: cwd= is a solution for subprocess.Popen, but then I would have to pass cwd as an argument to every future command if they should all run from that directory.
Is there a better way where we can use both of these modules together?
Or
Is there any other, better module available for command execution?
Also, I would like to know the pros/cons and caveats of both of these modules.
os.system always runs /bin/sh, which parses the command string. This can be a security risk if the command arguments contain whitespace, $, etc., or if the user has a shell config file. To avoid all such risks, use subprocess with a list or tuple of strings as the command (shell=False) instead.
To emulate os.chdir in the command, use the cwd= argument in subprocess.
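A minimal sketch combining both points (/tmp is just an example directory):
import subprocess

# run the command from a specific directory via cwd=, no os.chdir required
out = subprocess.check_output(["ls", "-l"], cwd="/tmp")
print(out.decode())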

How to run commands on shell through python [duplicate]

This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
Calling an external command in Python
I want to run commands in another directory using Python.
What are the various ways to do this, and which is the most efficient?
What I want to do is as follows,
cd dir1
execute some commands
return
cd dir2
execute some commands
Naturally, if you only want to run a (simple) command on the shell via Python, you do it via the system function of the os module. For instance:
import os
os.system('touch myfile')
If you want something more sophisticated that allows greater control over the execution of the command, go ahead and use the subprocess module that others here have suggested.
For further information, follow these links:
Python official documentation on os.system()
Python official documentation on the subprocess module
If you want more control over the called shell command (i.e. access to the stdin and/or stdout pipes, or starting it asynchronously), you can use the subprocess module:
import subprocess
p = subprocess.Popen('ls -al', shell=True, stdout=subprocess.PIPE)
stdout, stderr = p.communicate()
See also subprocess module documentation.
os.system("/dir/to/executeble/COMMAND")
for example
os.system("/usr/bin/ping www.google.com")
if the ping program is located in /usr/bin.
Naturally you need to import the os module.
os.system does not capture the command's output; if you want the output, use subprocess.check_output or something similar.
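For example, a minimal sketch capturing the output of the ping above (one packet, so it terminates):
import subprocess

# check_output runs the command, waits, and returns its stdout as bytes
out = subprocess.check_output(["ping", "-c", "1", "www.google.com"])
print(out.decode())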
You can use the Python subprocess module, which offers many functions to execute commands, check outputs, receive error messages, etc.
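Applied to the original question, a sketch of running commands from different directories (dir1/dir2 come from the question; some-command and other-command are placeholders):
import subprocess

# cwd= runs each command from the given directory; no cd/return dance needed
subprocess.call("some-command", shell=True, cwd="dir1")
subprocess.call("other-command", shell=True, cwd="dir2")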
