I am executing a script that resides on the remote server.
This bash script makes use of a variable.
This variable is defined in ~/.profile.
For this purpose, let's say it is defined as:
export MYVAR=/a/b/c
So on the remote server, or after SSHing into it interactively, executing
echo $MYVAR returns /a/b/c as you would expect.
But if I execute the remote script locally using Python's subprocess, the script fails: it uses $MYVAR, which now expands to something incorrect.
Because I am executing it via SSH, ~/.profile must not be getting loaded; some other profile is being used instead.
See here: https://superuser.com/questions/207200/how-can-i-set-environment-variables-for-a-remote-rsync-process/207262#207262
Here is the command executed from a python script
ssh = subprocess.Popen(['ssh', '%s' % env.host, 'cd /script/dir | ./myscript arg1 arg2'],shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
My question is: how can I run a script locally that will SSH to the remote host, load the user's ~/.profile, and then execute a bash script?
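One way to approach this (a minimal sketch; the host is a placeholder and bash is assumed on the remote side) is to source ~/.profile explicitly as part of the remote command, so the non-interactive SSH session still sees $MYVAR:
import subprocess

# Source ~/.profile explicitly, because a non-interactive SSH command does not load it on its own.
remote_cmd = ". ~/.profile && cd /script/dir && ./myscript arg1 arg2"
proc = subprocess.Popen(["ssh", "user@remote-host", remote_cmd],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT)
output, _ = proc.communicate()
print(output.decode())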
You can use the paramiko module to run scripts on the remote server from your local machine.
http://www.paramiko.org/
Once it is installed, you can run the command as shown in the example below:
import paramiko
# paramiko makes the SSH connection itself, so do not wrap the command in another ssh call
command = "cd /script/dir && ./myscript arg1 arg2"
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('192.18.1.26', port=22, username='root', password='defassult')  # This will connect to the remote server
stdin, stdout, stderr = ssh.exec_command(command)  # This will execute the command on the remote server
output = stdout.readlines()
print(''.join(output))
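One caveat (an assumption about the remote setup, not something verified here): exec_command also runs in a non-login shell, so ~/.profile is still not read automatically. If that is the problem from the question, the command can source it first, for example:
command = ". ~/.profile && cd /script/dir && ./myscript arg1 arg2"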
The easiest solution for this was to create a sort of wrapper script, like so:
#!/bin/bash
. /users/me/.profile
cd /dir/where/script/exists/
exec ./script LIVE "$@"
Then create a method in the Python script to scp the wrapper script to the remote /tmp directory:
scp wrapper user@remote:/tmp
Now the ssh command in the question becomes:
subprocess.Popen(['ssh', '%s' % env.host, env.installPatch],shell=False, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
env.installPatch translates to:
cd /tmp; ./wrapper 'patch_name'
Now .profile is loaded and the patch script sees the correct variable values.
By using exec, I get all of the output back from the patch run and can write it to a file.
This was the cleanest solution for my case.
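For reference, the Python side of this can look roughly like the following sketch (assuming env.host holds the user@host string as in the question; the file names are illustrative):
import subprocess

# Copy the wrapper to the remote /tmp directory.
subprocess.check_call(["scp", "wrapper", "%s:/tmp" % env.host])

# Run the wrapper over ssh; it sources ~/.profile before calling the real script.
proc = subprocess.Popen(["ssh", env.host, "cd /tmp; ./wrapper 'patch_name'"],
                        shell=False,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT)
output, _ = proc.communicate()
print(output.decode())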
Related
I have a shell script stored on my local machine. The script needs arguments as below:
#!/bin/bash
echo $1
echo $2
I need to run this script on a remote machine (without copying the script to the remote machine). I am using Python's Paramiko module and can invoke the script on the remote server without any issue.
The problem is I am not able to pass the two arguments to the remote server. Here is the snippet from my python code to execute the local script on the remote server:
import paramiko

with open("test.sh", "r") as f:
    mymodule = f.read()

c = paramiko.SSHClient()
k = paramiko.RSAKey.from_private_key(private_key_str)
c.set_missing_host_key_policy(paramiko.AutoAddPolicy())
c.connect(hostname="hostname", username="user", pkey=k)
stdin, stdout, stderr = c.exec_command("/bin/bash - <<EOF\n{s}\nEOF".format(s=mymodule))
With bash I can simply use the below command:
ssh -i key user@IP bash -s < test.sh "$var1" "$var2"
Can someone help me with how to pass the two arguments to the remote server using Python?
Do the same thing you are doing in bash:
command = "/bin/bash -s {v1} {v2}".format(v1=var1, v2=var2)
stdin, stdout, stderr = c.exec_command(command)
stdin.write(mymodule)
stdin.close()
If you prefer the here-document syntax, you need to quote the delimiter ('EOF') so that $1 and $2 inside the script are not expanded by the outer shell and instead reach the inner bash -s as positional parameters:
command = "/bin/bash -s {v1} {v2} <<'EOF'\n{s}\nEOF".format(v1=var1, v2=var2, s=mymodule)
stdin, stdout, stderr = c.exec_command(command)
The same way you would have to quote it in bash:
ssh -i key user@IP bash -s "$var1" "$var2" <<'EOF'
echo $1
echo $2
EOF
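If the argument values can contain spaces or shell metacharacters, it is safer to quote them when building the command string. A small sketch (shlex.quote is Python 3; on Python 2, pipes.quote is the rough equivalent):
import shlex

command = "/bin/bash -s {v1} {v2}".format(v1=shlex.quote(var1), v2=shlex.quote(var2))
stdin, stdout, stderr = c.exec_command(command)
stdin.write(mymodule)
stdin.close()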
Though as you have the script in a variable in your Python code, why don't you just modify the script itself? That would be way more straightforward, imo.
Obligatory warning: Do not use AutoAddPolicy; you lose protection against MITM attacks by doing so. For a correct solution, see Paramiko "Unknown Server".
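A minimal alternative, assuming the remote host's key is already present in ~/.ssh/known_hosts, is to load the system host keys and reject anything unknown:
c = paramiko.SSHClient()
c.load_system_host_keys()  # trust keys already in ~/.ssh/known_hosts
c.set_missing_host_key_policy(paramiko.RejectPolicy())  # refuse hosts that are not known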
I would like to connect a remote machine and run background script in that machine from python.
I tried:
os.system("ssh root#10.0.0.1 \' nohup script.sh & \')
But it seems not working. And if I put nohup in script.sh, and simply run
os.system("ssh root#10.0.0.1 \' script.sh \'")
The nohup command would not work in either cases.
I'm confused why so, and is there anybody knows how to do background job from python or it's just impossible doing it this way?
What kind of errors are you getting? What version of Python are you using?
You should take a look at this: Python subprocess - run multiple shell commands over SSH
import subprocess

sshProcess = subprocess.Popen(["ssh", "root@10.0.0.1"],
                              stdin=subprocess.PIPE,
                              stdout=subprocess.PIPE,
                              universal_newlines=True,
                              bufsize=0)
sshProcess.stdin.write("nohup script.sh &\n")
sshProcess.stdin.close()  # send EOF so the remote shell runs the command and exits
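If the goal is simply to fire off the script and disconnect, note that ssh normally waits until the remote command's stdout and stderr are closed; redirecting them (a sketch, assuming the host and script from the question) usually lets the session end while the script keeps running:
import subprocess

subprocess.call(["ssh", "root@10.0.0.1", "nohup script.sh > /dev/null 2>&1 < /dev/null &"])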
For example, say you have a local script (Python, bash, etc.; here I am demonstrating with a Python script).
First, create a Python file locally. Let's say hello.py:
# 'hello.py'
import os
print(os.system('hostname'))
Second, here is a Python script that copies hello.py to a remote machine and executes it there:
import pathos

copy = pathos.core.copy('hello.py', destination='abc.remote.com:~/hello.py')
# 'exec' is a reserved word in Python 2, so use a different name for the result
job = pathos.core.execute('python hello.py', host='abc.remote.com')
print(job.response())
I have the script below (test.py on 1.1.1.1) to run another remote script on another server (script.py on 2.2.2.2). I have set up the ssh keys so I don't get prompted for password.
import subprocess
USER="user"
SERVER_IP="2.2.2.2"
SCRIPT_PATH="/home/abc/script.py"
print ("ssh {0}#{1} '/usr/bin/python {2} aaa bbb'".format(USER, SERVER_IP, SCRIPT_PATH))
rc = subprocess.check_output("ssh {0}#{1} '/usr/bin/python {2} aaa bbb'".format(USER, SERVER_IP, SCRIPT_PATH))
script.py itself lives on 2.2.2.2 and takes two arguments.
If I copy the command that is printed out by the script and run it from a shell on 1.1.1.1, script.py executes successfully. But running test.py on 1.1.1.1 gives me an error:
OSError: [Errno 2] No such file or directory
I don't understand why the script doesn't work when the exact same command works on its own.
Use the additional argument:
shell=True
Your command will be:
rc = subprocess.check_output("ssh {0}@{1} '/usr/bin/python {2} aaa bbb'".format(USER, SERVER_IP, SCRIPT_PATH), shell=True)
Without shell=True, the whole command string is treated as the name of a single executable, which is why you get the 'No such file or directory' error; passing shell=True lets a shell parse and run it.
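Alternatively, you can keep shell=False by passing the command as a list, so that each argument is handed to ssh directly (a sketch using the same variables as the question):
rc = subprocess.check_output(
    ["ssh", "{0}@{1}".format(USER, SERVER_IP), "/usr/bin/python", SCRIPT_PATH, "aaa", "bbb"])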
If your question is about executing a remote command in general, rather than just making this particular script work, then may I introduce Paramiko:
import paramiko
ssh_handle = paramiko.SSHClient()
ssh_handle.load_system_host_keys()
ssh_handle.connect(
    hostname=address,
    port=int(port),
    username=login)
stdin, stdout, stderr = ssh_handle.exec_command("whoami")
IMO it's currently the most "usable" SSH library and works just fine in my projects.
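To actually see the result, read the returned streams (and, if you need it, the exit status), for example:
print(stdout.read().decode())
exit_code = stdout.channel.recv_exit_status()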
I have a very strange issue that I can't seem to figure out.
When I execute a Python script containing the following lines from inside an SSH terminal (PuTTY), it works fine. But the moment I run the script via crontab, or even with nohup python myscript >/dev/null 2>&1 &, it doesn't seem to execute these commands.
subprocess.call('rsync -avr /path/to/folder/. --include "delta.*" --exclude "*" -e "ssh -o StrictHostKeyChecking=no -i /path/to/key.pem" ec2-user@'+server+':/path/to/folder/', shell=True)
local('ssh -t -o StrictHostKeyChecking=no -i /path/to/key.pem ec2-user@'+server+' "sudo /usr/bin/indexer -c /path/to/sphinx.conf --merge main delta --rotate"')
Basically, all the above does is sync a folder of new Sphinx search engine updates to a remote server; the second line then runs a remote SSH command to force the search engine to rotate the updates into production.
I do have fabric installed (hence the local command) but to avoid having to fab a second file I was hoping a single line of code could allow me to execute sudo commands on a remote server.
Can someone help me out?
I found the answer: for ssh commands in a script run in the background, you need to pass -t -t to force a pseudo-terminal.
Reference:
Pseudo-terminal will not be allocated because stdin is not a terminal
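Applied to the second command above, that looks roughly like this (same host, key path and indexer arguments as in the question):
local('ssh -t -t -o StrictHostKeyChecking=no -i /path/to/key.pem ec2-user@'+server+' "sudo /usr/bin/indexer -c /path/to/sphinx.conf --merge main delta --rotate"')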
I have a mysqldump command that I would like to run from the Windows shell or command prompt. When I run it directly in the shell, it does work.
d = 'BkSql_' + datetime.datetime.now().strftime("%Y-%m-%d") + ".sql"
fn = dn + d  # dn (the destination directory) is defined elsewhere in the script
cmd = """mysqldump -u hapopdy -p > %s""" % fn
print(cmd)
Edit: the -p password needs to be read as raw input.
Using the subprocess module
import subprocess
subprocess.call(cmd)
If you're running a shell command, add shell=True:
subprocess.call(cmd, shell=True)
You should save the password in MySQL's local configuration file for the user (on Unix it's ~/.my.cnf), or you can give it on the command line with --password=MYPASSWORD.
Either way, the password will be visible to a fairly large audience. In the .my.cnf case, it will be visible to anyone with read access to the file. In the command-line case, it will be visible to anyone who can get a process listing on the system, in addition to those who have read access to your script.
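As a rough sketch of how this can fit together (an illustration only; getpass prompts for the password at run time, though it will still show up in the process listing because it ends up on the command line):
import datetime
import getpass
import subprocess

d = 'BkSql_' + datetime.datetime.now().strftime("%Y-%m-%d") + ".sql"
password = getpass.getpass("MySQL password: ")
cmd = 'mysqldump -u hapopdy --password=%s > %s' % (password, d)
subprocess.call(cmd, shell=True)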