Hi, I am trying to create a Python script that logs on to a server and checks the status of the cluster by running the clustat command. When I do this I get the following error:
/bin/sh: clustat: command not found
As I understand it, the command cannot be run because it is not a standard bash command. I was hoping someone would have some ideas for getting around this and making it work.
Below is the method used to run the command (I have another method to SSH onto the system, and it works fine):
import subprocess

def run_cmd(command):
    """Run a command on the system and return its output."""
    # Pass the command as a string when shell=True; a one-element list
    # happens to work on POSIX but is misleading.
    proc = subprocess.Popen(command, stdout=subprocess.PIPE, shell=True)
    (out, err) = proc.communicate()
    return out
This is where it seems to go wrong. I know the run_cmd method works, as I am able to use it with other commands:
run_cmd("clustat >> out.txt")
return ""
subprocess runs the commands locally.
You will have to use paramiko.SSHClient to run commands on the remote machine.
import paramiko

ssh_client = paramiko.SSHClient()
ssh_client.load_system_host_keys()  # or set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(hostname='some_host', username='username', password='password')
ssh_client.exec_command('clustat >> out.txt')
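If you want clustat's output back in your Python script rather than in a file on the remote host, exec_command returns file-like stdout/stderr handles you can read directly. A minimal sketch, reusing the placeholder connection above:

# read the remote command's output over the same SSH session
stdin, stdout, stderr = ssh_client.exec_command('clustat')
print(stdout.read())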
Related
I would like to connect to a remote machine and run a background script on that machine from Python.
I tried:
os.system("ssh root#10.0.0.1 \' nohup script.sh & \')
But it does not seem to work. And if I put nohup inside script.sh and simply run:
os.system("ssh root#10.0.0.1 \' script.sh \'")
the nohup command does not work in either case.
I'm confused about why, and does anybody know how to run a background job from Python, or is it just impossible this way?
What kind of errors are you getting? What version of Python are you using?
You should take a look at this: Python subprocess - run multiple shell commands over SSH
import subprocess

sshProcess = subprocess.Popen(["ssh", "root@10.0.0.1"],
                              stdin=subprocess.PIPE,
                              stdout=subprocess.PIPE,
                              universal_newlines=True,
                              bufsize=0)
# The trailing newline is needed so the remote shell actually executes the line.
sshProcess.stdin.write("nohup script.sh &\n")
# Closing stdin lets the remote shell exit once the job has been started.
sshProcess.stdin.close()
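One caveat (an assumption about how ssh treats background jobs, not something from the original answer): if the backgrounded script keeps stdout/stderr attached to the SSH session, ssh may hang around even after nohup. Redirecting the streams before backgrounding usually lets the session close cleanly:

# redirect output so the ssh session is free to close once the job starts
sshProcess.stdin.write("nohup script.sh > /dev/null 2>&1 &\n")
sshProcess.stdin.write("exit\n")
sshProcess.wait()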
For example, say you have a local script (Python, Bash, etc.; here I'll demonstrate with a Python script).
First, create a Python file locally, say hello.py:
# 'hello.py'
import os
print os.system('hostname')
Second, here is a Python script that executes the above hello.py on a remote machine:
import pathos

# note: 'exec' is a reserved word in Python 2, so it cannot be used as a variable name
copy = pathos.core.copy('hello.py', destination='abc.remote.com:~/hello.py')
job = pathos.core.execute('python hello.py', host='abc.remote.com')
print job.response()
I have the script below (test.py on 1.1.1.1) to run another remote script on another server (script.py on 2.2.2.2). I have set up SSH keys so I don't get prompted for a password.
import subprocess

USER = "user"
SERVER_IP = "2.2.2.2"
SCRIPT_PATH = "/home/abc/script.py"

print ("ssh {0}@{1} '/usr/bin/python {2} aaa bbb'".format(USER, SERVER_IP, SCRIPT_PATH))
rc = subprocess.check_output("ssh {0}@{1} '/usr/bin/python {2} aaa bbb'".format(USER, SERVER_IP, SCRIPT_PATH))
script.py itself is on 2.2.2.2 and takes two arguments.
If I copy the command that the script prints out, I can execute script.py successfully from 1.1.1.1. But running test.py on 1.1.1.1 gives me an error:
OSError: [Errno 2] No such file or directory
I don't understand why the script doesn't work when the exact same command works on its own.
Use the additional argument:
shell=True
Your command will be:
rc = subprocess.check_output("ssh {0}@{1} '/usr/bin/python {2} aaa bbb'".format(USER, SERVER_IP, SCRIPT_PATH), shell=True)
Without shell=True, subprocess treats the entire string as the name of a single executable file, which is why you get the "No such file or directory" error; you need a shell to parse the command line.
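Alternatively (a sketch, not part of the original answer): pass the arguments as a list, which avoids the shell entirely and sidesteps the OSError in the same way:

# ssh receives the remote command as one argument; no local shell is involved
rc = subprocess.check_output(
    ["ssh", "{0}@{1}".format(USER, SERVER_IP),
     "/usr/bin/python {0} aaa bbb".format(SCRIPT_PATH)])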
If your question is about executing a remote command in general, rather than just making this particular script work, then may I introduce Paramiko:
import paramiko

# address, port, and login are placeholders for your connection details
ssh_handle = paramiko.SSHClient()
ssh_handle.load_system_host_keys()
ssh_handle.connect(hostname=address,
                   port=int(port),
                   username=login)
stdin, stdout, stderr = ssh_handle.exec_command("whoami")
IMO it's currently the most "usable" SSH library and works just fine in my projects.
The subprocess.Popen() function has an env parameter, but it doesn't seem to have the desired effect with sudo. This is what I get when I do this in the interactive Python shell:
import subprocess

env = {"CVS_RSH": "ssh"}
command = "sudo -u user cvs -d user@1.8.7.2:/usr/local/ncvs co file.py"
p = subprocess.Popen(command, stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE, env=env, shell=True)
(command_output, error_output) = p.communicate()

>>> p.wait()
1
>>> error_output
b'cvs [checkout aborted]: cannot exec rsh: No such file or directory\ncvs [checkout aborted]: end of file from server (consult above messages if any)\n'
The message is distracting, so let me explain. I'm forced to use ancient CVS, and the environment variable tells it to use ssh to connect to the server rather than the default, which sadly is rsh. It also needs an environment variable called CVSROOT, but fortunately there's a -d option for that; there is none for CVS_RSH that I know of.
Interestingly enough, if I do:
command = "sudo -u user echo $CVS_RSH"
env={"CVS_RSH":"something_else"}
p = subprocess.Popen(command, stdout=subprocess.PIPE,
stderr=subprocess.PIPE,env=env,shell=True)
(command_output, error_output) = p.communicate()
p.wait()
0
>>> command_output
b'something_else\n'
Maybe this worked because echo wasn't actually started as a child process? Is it possible to pass an environment to a process executed as another user with sudo?
This doesn't seem possible using the env parameter, because sudo scrubs the environment before running the command. (The echo test appeared to work because $CVS_RSH was expanded by the calling shell, which does see env, before sudo was ever invoked.) The solution seems to be to pass the variables on the command line, just as you would in the shell, for example:

command = ("sudo -u user CVS_RSH=ssh "
           "CVSROOT=:ext:user@2.8.7.2:/usr/local/ncvs cvs co dir/file.py")
p = subprocess.Popen(command, stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE, shell=True)
The weird thing is, if I run this from a Python CGI script, I see:

cvs [checkout aborted]: cannot exec ssh: Permission denied
cvs [checkout aborted]: end of file from server (consult above messages if any)

But if I try it in the interactive Python shell, it gets past this point, so that must be some other oddity (the user does have permission to run ssh), unrelated to this question.
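If you control the sudo policy, another option (an assumption about your setup, not something from the answer above) is sudo's -E flag, which asks sudo to preserve the caller's environment; it only works if the sudoers policy permits it (e.g. via a SETENV tag or an env_keep entry). A sketch:

import os
import subprocess

# assumption: sudoers allows preserving the environment; otherwise sudo -E
# is refused and the command-line form shown above remains the fix
env = dict(os.environ, CVS_RSH="ssh")  # inherit the environment instead of replacing it, so PATH survives
command = "sudo -E -u user cvs -d user@1.8.7.2:/usr/local/ncvs co file.py"
p = subprocess.Popen(command, stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE, env=env, shell=True)
(command_output, error_output) = p.communicate()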
Here is my code:
import subprocess

HOST = 'host_name'
PORT = '111'
USER = 'user_name'
CMD = 'sudo su - ec2-user; ls'

process = subprocess.Popen(['ssh', '{}@{}'.format(USER, HOST),
                            '-p', PORT, CMD],
                           shell=False,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)
result = process.stdout.readlines()
if not result:
    print "Im an error"
    err = process.stderr.readlines()
    print('ERROR: {}'.format(err))
else:
    print "I'm a success"
    print(result)
When I run this I receive the following output in my terminal:
dredbounds-computer: documents$ python terminal_test.py
Im an error
ERROR: ['sudo: sorry, you must have a tty to run sudo\n']
I've tried multiple things, but I keep getting the error "sudo: sorry, you must have a tty to run sudo". It works fine if I do it manually in the terminal, but I need to automate this. I read that a workaround might be to pass -t or -tt in the ssh call, but I haven't been able to implement this successfully with subprocess (the terminal just hangs for me). Does anyone know how I can fix my code or work around this issue? Ideally I'd like to ssh, switch to the sudo user, and then run a file from there (I just used ls for testing purposes).
sudo is prompting you for a password, but it needs a terminal to do that. Passing -t or -tt provides a terminal for the remote command to run in, but now it is waiting for you to enter a password:
process = subprocess.Popen(['ssh', '-tt', '{}@{}'.format(USER, HOST),
                            '-p', PORT, CMD],
                           shell=False,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE,
                           stdin=subprocess.PIPE)
process.stdin.write("password\r\n")
Keep in mind, though, that the ls doesn't run until after the shell started by su exits. You should either log into the machine as ec2-user directly (if possible), or just use sudo to run whatever command you want without going through su first.
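A sketch of the first suggestion (assuming your key is authorized for the ec2-user account directly), which removes the need for sudo and a tty altogether:

# log in straight as ec2-user and run the command in one shot
process = subprocess.Popen(['ssh', 'ec2-user@{}'.format(HOST), '-p', PORT, 'ls'],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)
print(process.stdout.read())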
You can tell sudo to work without requiring a password. Just add this to /etc/sudoers on the remote server host_name.
user ALL = (ec2-user) NOPASSWD: /bin/ls
This allows the user named user to execute /bin/ls as ec2-user without entering a password (sudoers entries need the command's absolute path).
This assumes you change your command to look like this, which seems more reasonable to me:
CMD = 'sudo -u ec2-user ls'
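With that sudoers rule in place, the original Popen call should work unchanged with the new CMD, and without -tt, assuming the requiretty option is not set on the server:

CMD = 'sudo -u ec2-user ls'
process = subprocess.Popen(['ssh', '{}@{}'.format(USER, HOST), '-p', PORT, CMD],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)
print(process.stdout.readlines())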
I have a Python script in my cgi-bin that calls a Perl script, which in turn calls another Python script to perform some tests. It works perfectly when I run it from the command line, but it throws ImportError: No module named paramiko when run under Apache. I have even added the following
import site
import sys

sys.path.append('/usr/local/lib/python2.7/site-packages/')
import paramiko
from paramiko import SSHClient
to my Python script, but I still get the error when I run it from the browser.
My Python code that calls the Perl script is below:
import subprocess

def execute_ACL():
    print "Going to execute"
    for line in run_command("perl systemtcpcheck.pl acl/config_files/G2AXStageg2ax-stage.hosts acl/config_files/G2AXStageg2ax-live.tests"):
        print(line)

def run_command(command):
    p = subprocess.Popen(command, shell=True,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT)
    return p.communicate()
Can anyone please suggest a solution?
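One diagnostic worth trying (a sketch, assuming a standard mod_cgi setup): Apache frequently runs a different Python interpreter, with a different module search path, than your login shell, so printing both from the CGI script usually pinpoints why paramiko cannot be found:

#!/usr/bin/python
import sys

print "Content-Type: text/plain\n"
print sys.executable   # which interpreter Apache is actually using
print sys.path         # where that interpreter looks for paramiko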