I have a task where I need to SSH to several Unix servers, run a command on each, and collect the output of every server in one single file. The script is in Python 2, as follows:
import subprocess
server_list = ['server1', 'server2', 'server3']
for srv in server_list:
    subprocess.call(['ssh', srv])
    # DO PROCESS
The problem is that when I run the script from the command line on server myserver, it SSHes to server1 and stays there until I manually type exit on the command line. Then it moves on to server2, where again it stays until I type exit. This goes on until I finally return to myserver.
Until I run that final exit, the output file is not completed. So I tried to issue "exit" as a command through the subprocess module (to automate the whole process), as follows:
import subprocess
server_list = ['server1', 'server2', 'server3']
for srv in server_list:
    subprocess.call(['ssh', srv])
    # DO PROCESS
    subprocess.call(['exit'])
When I run this, it SSHes to server1 and again does not return to myserver. When I type exit, I get this error:
Traceback (most recent call last):
File "test_return_ssh.py", line 9, in <module>
subprocess.call(['exit'])
File "/usr/lib64/python2.7/subprocess.py", line 524, in call
return Popen(*popenargs, **kwargs).wait()
File "/usr/lib64/python2.7/subprocess.py", line 711, in __init__
errread, errwrite)
File "/usr/lib64/python2.7/subprocess.py", line 1327, in _execute_child
raise child_exception
OSError: [Errno 2] No such file or directory
And the execution stops.
What should I do to make this script work? I want to automate the whole process: SSH into each server, run the command, and once the whole script has finished, exit and return to myserver.
I have also searched for this, but none of the links I found seems to provide the answer.
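For reference, ssh accepts the remote command as an additional argument and returns as soon as that command finishes, so no interactive exit is needed. A minimal sketch of that idea in Python 2 (the remote command uptime and the file name servers_output.txt are placeholders, not taken from the original script):
import subprocess

server_list = ['server1', 'server2', 'server3']

with open('servers_output.txt', 'w') as outfile:  # one file for all servers
    for srv in server_list:
        # 'uptime' stands in for the real remote command; ssh runs it
        # non-interactively and exits when it is done
        output = subprocess.check_output(['ssh', srv, 'uptime'])
        outfile.write('===== %s =====\n' % srv)
        outfile.write(output)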
I tried to set up "airflow worker" to run after system start via rc.local (CentOS 7).
I have installed Python and Airflow as root. The paths are /root/airflow and /root/anaconda3.
I added this to rc.local:
#!/bin/bash
# THIS FILE IS ADDED FOR COMPATIBILITY PURPOSES
#
# It is highly advisable to create own systemd services or udev rules
# to run scripts during boot instead of using this file.
#
# In contrast to previous versions due to parallel execution during boot
# this script will NOT be run after all other services.
#
# Please note that you must run 'chmod +x /etc/rc.d/rc.local' to ensure
# that this script will be executed during boot.
touch /var/lock/subsys/local
exec 2> /home/centos/rc.local.log # send stderr from rc.local to a log file
exec 1>&2 # send stdout to the same log file
set -x # tell sh to display commands before execution
export C_FORCE_ROOT="true"
/root/anaconda3/bin/python /root/anaconda3/bin/airflow worker
exit 0
When I run it manually it works (sh /etc/rc.local).
But when it runs after boot, it crashes with the error below in the log file.
It seems like it can't find the path to airflow, even though I have written it out in full.
+ export C_FORCE_ROOT=true
+ C_FORCE_ROOT=true
+ /root/anaconda3/bin/python /root/anaconda3/bin/airflow worker
Traceback (most recent call last):
File "/root/anaconda3/bin/airflow", line 37, in <module>
args.func(args)
File "/root/anaconda3/lib/python3.7/site-packages/airflow/utils/cli.py", line 75, in wrapper
return f(*args, **kwargs)
File "/root/anaconda3/lib/python3.7/site-packages/airflow/bin/cli.py", line 1129, in worker
sp = _serve_logs(env, skip_serve_logs)
File "/root/anaconda3/lib/python3.7/site-packages/airflow/bin/cli.py", line 1065, in _serve_logs
sub_proc = subprocess.Popen(['airflow', 'serve_logs'], env=env, close_fds=True)
File "/root/anaconda3/lib/python3.7/subprocess.py", line 800, in __init__
restore_signals, start_new_session)
File "/root/anaconda3/lib/python3.7/subprocess.py", line 1551, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'airflow': 'airflow'
A place to start is to change this line...
/root/anaconda3/bin/python /root/anaconda3/bin/airflow worker
to
/root/anaconda3/bin/airflow worker
You only need to invoke the airflow binary and pass it the single service you want; you can pass more arguments if needed, but calling the Python interpreter explicitly shouldn't be necessary.
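As an extra observation not in the answer above: the traceback shows that the worker itself runs subprocess.Popen(['airflow', 'serve_logs'], env=env, close_fds=True), so 'airflow' must be resolvable on the PATH of that environment, and at boot rc.local's PATH usually does not include /root/anaconda3/bin. Exporting PATH=/root/anaconda3/bin:$PATH in rc.local before starting the worker is one way to make that lookup succeed (this is an assumption about the boot environment, not something the answer states). A small Python 3 sketch of the lookup behaviour, using the paths from the question:
import shutil

# With a minimal boot-time PATH the lookup fails, which is what the
# FileNotFoundError for 'airflow' indicates:
print(shutil.which('airflow', path='/usr/bin:/bin'))                      # None
# With the anaconda bin directory on PATH the same lookup succeeds:
print(shutil.which('airflow', path='/root/anaconda3/bin:/usr/bin:/bin'))  # /root/anaconda3/bin/airflow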
Currently I am testing a very simple piece of code. I simply want to write a Python script that sets a variable and then passes that variable into a bash script for use.
Python Script:
from subprocess import check_call
a = str(3)
check_call(["/home/desktop/bash2pyTest/test.sh", a], shell=False)
Bash Script:
#!/bin/bash
echo "This number was sent from the py script: " $1
I have read other Q&As related to this topic; however, I have not found a solution that I conceptually understand, so the syntax above might be incorrect. I have tried a few other methods as well, but I keep receiving the following error:
Traceback (most recent call last):
File "/home/cassandra/desktop/bash2pyTest/callVar.py", line 3, in <module>
check_call(["/home/cassandra/desktop/bash2pyTest/test.sh", a], shell=False)
File "/usr/lib64/python2.7/subprocess.py", line 537, in check_call
retcode = call(*popenargs, **kwargs)
File "/usr/lib64/python2.7/subprocess.py", line 524, in call
return Popen(*popenargs, **kwargs).wait()
File "/usr/lib64/python2.7/subprocess.py", line 711, in __init__
errread, errwrite)
File "/usr/lib64/python2.7/subprocess.py", line 1327, in _execute_child
raise child_exception
OSError: [Errno 13] Permission denied
Process finished with exit code 1
Any help would be greatly appreciated. I'm stumped.
Try
chmod +x /home/desktop/bash2pyTest/test.sh
in the shell. The file you are trying to execute is not executable.
Or, as another option, in the Python script:
check_call(["sh","/home/desktop/bash2pyTest/test.sh", a], shell=False)
The error says that permission is denied. I tested your code, and it works fine on my machine. However, you will need to be sure that the user running the command has sufficient privileges for the test.sh script. Most importantly, be sure that test.sh has execute permissions set. That's often the most easily missed permission.
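If it helps, here is a small sketch (using the path from the question) that checks the execute bit from Python before making the call, so the permission problem surfaces with a clearer message:
import os
from subprocess import check_call

script = "/home/desktop/bash2pyTest/test.sh"  # path from the question

# os.X_OK asks whether the current user may execute the file
if not os.access(script, os.X_OK):
    raise SystemExit("%s is not executable for this user; run chmod +x on it" % script)

a = str(3)
check_call([script, a], shell=False)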
I'm using subprocess to execute a Python script called trace.py that is located in a different folder. The script trace.py in turn uses subprocess to run a traceroute command and prints the output. When I go to the folder where trace.py is located and type this in the terminal:
python trace.py
or
./trace.py
or from any location:
python /home/.../cgi-bin/trace.py
it works fine and the traceroute is printed to the terminal. However, when I try to execute trace.py from main.py by using subprocess, it doesn't seem to work. I've tested this by creating test.py and using subprocess to execute it from main.py and this works. I do this with the following:
output = subprocess.check_output([sys.executable, script_path])
Where script_path is the absolute path to trace.py.
The full error I get is this (paths are shortened):
Traceback (most recent call last):
File "/home/.../cgi-bin/trace.py", line 11, in <module>
traceroute = subprocess.check_output(["traceroute", "www.google.com"])
File "/usr/lib/python2.7/subprocess.py", line 566, in check_output
process = Popen(stdout=PIPE, *popenargs, **kwargs)
File "/usr/lib/python2.7/subprocess.py", line 710, in __init__
errread, errwrite)
File "/usr/lib/python2.7/subprocess.py", line 1327, in _execute_child
raise child_exception
OSError: [Errno 2] No such file or directory
Traceback (most recent call last):
File "main.py", line 97, in <module>
serve(args.port, public_html, cgibin)
File "main.py", line 55, in serve
process = subprocess.check_output(["/usr/bin/python", script_path])
File "/usr/lib/python2.7/subprocess.py", line 573, in check_output
raise CalledProcessError(retcode, cmd, output=output)
subprocess.CalledProcessError: Command '['/usr/bin/python', '/home/.../cgi-bin/trace.py']' returned non-zero exit status 1
Why does this not work, but executing it from the terminal does?
The child process can't find the traceroute executable.
Compare os.environ['PATH'] in your shell with its value inside the running trace.py.
Check file permissions, i.e., whether traceroute is readable and executable by the user that runs trace.py.
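A small sketch of both checks inside trace.py (the /usr/sbin location for traceroute is an assumption; adjust it to whatever `which traceroute` reports on your machine):
import os
import subprocess

# Print the PATH the child process actually sees and compare it with
# `echo $PATH` in the interactive shell where the command works.
print(os.environ.get('PATH'))

# If the directory holding traceroute is missing from that PATH, either extend
# PATH or call the binary by absolute path (assumed here to be /usr/sbin/traceroute).
traceroute = subprocess.check_output(["/usr/sbin/traceroute", "www.google.com"])
print(traceroute)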
I have a bash script that starts a local SimpleHTTPServer:
python -m SimpleHTTPServer 8080
I have put this inside my project folder. When I run the program using:
subprocess.call('./setup.sh')
an error message comes out:
Traceback (most recent call last):
File "test.py", line 2, in <module>
subprocess.call('./setup.sh')
File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 522, in call
return Popen(*popenargs, **kwargs).wait()
File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 709, in __init__
errread, errwrite)
File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 1326, in _execute_child
raise child_exception
OSError: [Errno 13] Permission denied
I retried this in the terminal:
localhost:Desktop XXXX$ sh setup.sh
Serving HTTP on 0.0.0.0 port 8080 ...
It is working fine.
I remember there were a few times when the terminal popped up a window asking me about permission for Python, something related to the firewall, and I allowed it. Can you help me?
Run it exactly as you would in the shell, i.e., as sh ./setup.sh:
subprocess.call('sh ./setup.sh', shell=True)
That should do the trick. Most likely, your setup.sh is not set to executable or is missing the first #! line that marks its interpreter.
EDIT:
Make sure to set shell=True if you pass the command as a single string, so it is executed via the shell; otherwise separate the parameters into a list, as you would with execve:
subprocess.call(['sh', './setup.sh'])
Give subprocess.Popen() a try, with the cwd parameter:
subprocess.Popen(['sh', './setup.sh'], cwd='/dir/contains/setup.sh/')
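As a usage sketch of that approach (the project directory and the 60-second run time are placeholders): Popen is convenient here because the server keeps running until it is told to stop, whereas subprocess.call would block until the script exits.
import subprocess
import time

# '/path/to/project' stands in for the folder that contains setup.sh
proc = subprocess.Popen(['sh', './setup.sh'], cwd='/path/to/project')

time.sleep(60)      # let the SimpleHTTPServer serve for a while (placeholder duration)
proc.terminate()    # then shut it down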
I am on OS X and am running the Python script from the Unix shell.
I'm running Python code that should open an application. I've been testing with Firefox.app and have been getting:
Traceback (most recent call last):
File "/Users/brennan/Desktop/Coding/Wilix/wilix.py", line 453, in <module>
subprocess.call(["open -a "+cp2])
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 493, in call
return Popen(*popenargs, **kwargs).wait()
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 679, in __init__
errread, errwrite)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 1228, in _execute_child
raise child_exception
OSError: [Errno 2] No such file or directory
My code is:
subprocess.call(["open -a "+cp2])
where cp2 is user input (Firefox in this case).
If I cd into the program's directory and then run
open -a Firefox
Firefox opens fine.
If I change my code to
subprocess.call(["open -a Firefox"])
I still get the error message.
You're passing open -a Firefox as one argument, as if you ran this in the shell:
$ "open -a Firefox"
You need to split up the items:
subprocess.call(['open', '-a', 'Firefox'])
Try giving the full path of the Firefox app.
It's wrong to use subprocess.call with a single string unless you set shell=True; otherwise provide the command as a list. Please take a look at the first examples in the docs:
http://docs.python.org/2/library/subprocess.html
The full path to Firefox may or may not be needed.
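To make the two valid forms concrete, a small sketch using the cp2 variable from the question:
import subprocess

cp2 = "Firefox"  # user input, as in the question

# Either pass the arguments as a list (no shell involved)...
subprocess.call(["open", "-a", cp2])

# ...or pass a single string and let the shell split it by setting shell=True.
# Note: shell=True with raw user input is risky; prefer the list form.
subprocess.call("open -a " + cp2, shell=True)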