I am using python3 under a git-bash environment, and sometimes it does not run shell commands well.
#!/usr/bin/env python3
import subprocess as sp
print("hello")
print(sp.getoutput("ls -l")) # This works.
print(sp.getoutput("date")) # This hangs and cannot terminate with ctrl-c.
This does not happen in a normal Linux/bash environment.
Then I came across this one: Python not working in the command line of git bash.
I can run it using "winpty python ...", but it still cannot be terminated even with Ctrl-C.
I take that back: getoutput("date") hangs, but check_output works.
You are needlessly running those commands with a shell.
Prefer this form:
print(sp.check_output(["ls", " -l"]))
print(sp.check_output(["date"]))
I have a Python script that repeatedly plays a sound using the cvlc command. The Python script works fine when run from the command line, or from a bash shell script run from the command line, but when the bash shell script is run automatically at startup, it runs without any sound.
I am using the os.system call to run the cvlc command; however, I noticed that it is adding "sh -c" to the beginning of the command. I believe this is causing an issue, as vlc won't play any sound with this command. Does anyone know how to prevent the "sh -c" from being added to the beginning of the command? Below is my code for the Python script and the bash shell script that runs automatically on startup.
Python script:
import os
import time
import multiprocessing

def make_sound():
    os.system('cvlc air_horn.wav')

while True:
    process = multiprocessing.Process(target=make_sound, name="Make Sound")
    process.start()
    time.sleep(3)
    process.terminate()
    process.join()
Bash script:
#!/bin/bash
cd path/to/Python/script
sleep 10
python3 make_sound.py
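For what it's worth, os.system always runs its argument through sh -c; the usual way to drop the shell entirely (a sketch, not from the original post) is to give subprocess an argument list:

import subprocess

def make_sound():
    # An argument list is executed directly, with no implicit "sh -c".
    subprocess.call(["cvlc", "air_horn.wav"])

Whether this fixes the missing sound at startup depends on the audio environment, but it does remove the sh -c wrapper.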
I have a shell script (test.sh, example shown below) which has an infinite while loop and prints some data to the screen.
I am calling all my .sh scripts from Python, and I need to stop test.sh before calling my other commands.
I am using Python 2.7, and the Linux system is on proprietary hardware where I cannot install any Python modules.
Here is my test.sh
#!/bin/sh
while :
do
echo "this code is in infinite while loop"
sleep 1
done
Here is my Python script:
import subprocess as SP
SP.call(['./test.sh']) # I need to stop the test.sh in order for python to
# go and execute more commands and call
# another_script.sh
# some code statements
SP.call(['./another_script.sh'])
Well, a quick Google search made me look into subprocess.call and Popen, and Popen has a terminate() method, but it doesn't work for me (or I'm doing something wrong here):
cmd=['test.sh']
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, shell=True)
p.terminate()
Any other suggestions on how I can stop the test.sh from python are highly appreciated
PS: I don't mind running test.sh for, say, T seconds and then stopping it.
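For reference, a minimal sketch of that time-limited approach using only the standard library (so it also works on Python 2.7). Note that with shell=True, terminate() signals the intermediate shell rather than test.sh itself, which is one likely reason the attempt above fails:

import subprocess
import time

T = 5  # run test.sh for T seconds
p = subprocess.Popen(['./test.sh'])  # no shell=True: p is test.sh itself
time.sleep(T)
p.terminate()  # sends SIGTERM to test.sh
p.wait()       # reap the process before moving on
subprocess.call(['./another_script.sh'])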
I use tmux for these types of processes; Python has a good package, libtmux, which should solve your problem.
Basically you create a tmux session:
import libtmux
server = libtmux.Server()
session = server.new_session(session_name='my_session_name')
Then you create a window to run the command in:
window = session.new_window(attach=False, window_name='my_window_name')
command = './my_bash_file.sh'
window.select_pane('0').send_keys(command, enter=True)
You'll be able to run subsequent commands right after this one. To access the tmux session from your bash terminal, use tmux attach -t my_session_name; you'll then be in the tmux window that ran your bash script.
To kill the tmux window, use window.kill_window(). There are a lot of options; look at the libtmux docs.
The project aileen has some useful tmux commands if you want to see some more implementations.
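Putting those pieces together for this question (a sketch; './test.sh' stands in for my_bash_file.sh):

import libtmux

server = libtmux.Server()
session = server.new_session(session_name='my_session_name')
window = session.new_window(attach=False, window_name='my_window_name')
window.select_pane('0').send_keys('./test.sh', enter=True)  # runs inside tmux, does not block
# ... run other commands from Python here ...
window.kill_window()  # stop test.sh when you're done with it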
I have ten Python scripts in the same directory. How do I run all of these from the command line so that they work in the background?
I use an SSH terminal to connect to a CentOS server and run a Python script as:
python index.py
But when I close the SSH client terminal, the process dies.
You can use the & command to make things run in the background, and nohup so it continues on logout, such as
nohup python index.py &
If you want to run multiple things this way, it's probably easiest to just make a script to start them all (with a shell of your choice):
#!/bin/bash
nohup python index1.py &
nohup python index2.py &
...
As long as you don't need to interact with the scripts once they are started (and don't need any stdout printing), this could be pretty easily automated with another Python script using the subprocess module:

import subprocess

for script in listofscripts:
    # use subprocess.run() for Python 3.x (this blocks until each script terminates)
    subprocess.call(["python", script], *args)  # use Popen if you want non-blocking
*args is a link (its coloring got overwritten by the code highlighting).
Also of note: stdout/stderr printing is possible, just more work.
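For instance, a sketch of capturing each script's output in its own log file (the script names are illustrative):

import subprocess

procs = []
for script in ["index1.py", "index2.py"]:  # illustrative names
    log = open(script + ".log", "w")
    # Popen does not block; each script's output goes to its own log file.
    procs.append(subprocess.Popen(["python", script],
                                  stdout=log, stderr=subprocess.STDOUT))
for p in procs:
    p.wait()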
I have always been able to use Python's subprocess.Popen to run bash scripts without any issues.
However, I am now trying to run a bash script with Popen, and then that bash script is trying to run another script. I did not redirect the output that Popen was getting at all, so all of the output would appear on the terminal. I'm using Ubuntu Linux.
However, when the script that is being called by the bash script finishes, the output on the terminal freezes while the rest of the bash script and the Python script continue in the background.
I understand that it might not be the smoothest practice to have a Python script run a bash script which also runs a bash script, but I'm hoping to fix this issue. I sense that it is an issue with how I'm running the original bash script inside my Python script. Here is the code I'm using:
p = subprocess.Popen(["bash", "myScript.sh", param1, param2, param3])
p.wait()
I was originally using shell=True for the Popen, but that resulted in the same issue. I also tried removing p.wait(), but that did not resolve the issue either.
Any ideas? Should I use a different python method to run the bash script?
Let's say I issue a command from the Linux command line. This will cause Linux to create a new process, and let's say that the process expects to receive commands from the user.
For Example: I will run a python script test.py which will accept a command from the user.
$python test.py
TEST>addController(192.168.56.101)
Controller added
TEST>
The question I have is: can I write a script which will go into the command line (TEST>) and issue a command? As far as I know, if I write a script to run multiple commands, it will wait for the first process to exit before running the next command.
You should look into expect. It's a tool that is designed to automate user interaction with commands that need it. The man page explains how to use it.
It seems there is also pexpect, a Python version of similar functionality.
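A minimal sketch with pexpect, assuming it can be installed (pip install pexpect) and that test.py prints a TEST> prompt as in the session above:

import pexpect

child = pexpect.spawn('python test.py')
child.expect('TEST>')  # wait for the prompt
child.sendline('addController(192.168.56.101)')
child.expect('Controller added')  # wait for the confirmation
child.expect('TEST>')  # wait for the next prompt
child.close()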
Assuming the Python script is reading its commands from stdin, you can pass them in with a pipe or a redirection:
$ python test.py <<< 'addController(192.168.56.101)'
$ echo $'addController(192.168.56.101)\nfoo()\nbar()\nbaz()' | python test.py
$ python test.py <<EOF
addController(192.168.56.101)
foo()
bar()
baz()
EOF
If you don't mind waiting for the calls to finish (one at a time) before returning control to your program, you can use the subprocess library. If you want to start something running and not wait for it to finish, you can use the multiprocessing library.
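For example, a minimal sketch with subprocess, assuming test.py reads its commands from stdin as above (capture_output requires Python 3.7+):

import subprocess

commands = "addController(192.168.56.101)\nfoo()\nbar()\n"
result = subprocess.run(["python", "test.py"], input=commands,
                        capture_output=True, text=True)
print(result.stdout)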