paramiko - invoke script.py on remote server and continue - python

I'm using Paramiko to execute a Python script on a remote machine.
I want Paramiko to execute the script on the remote machine and for my local code to continue, without the remote script being killed.
This is my code now:
import paramiko

command = "python3 script.py"
client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(ip, port, user, password)
stdin, stdout, stderr = client.exec_command(command)
while True:
    "do something"
Right now the Python script starts and finishes really fast (although it is supposed to run for an hour), and then "do something" happens.
I want script.py to keep running on the remote machine while "do something" happens.
"Invoke the script and forget about it"

That's what your code does. But if the command prints (lots of) output, then once the output buffers fill, the script will hang and never finish.
If you are not interested in the output, an easy solution is to discard it:
command = "python3 script.py > /dev/null 2>&1"

Related

Send Multiple Terminal Commands in Gnome Terminals With Subprocess

So I am currently trying to run two different gnome-terminal windows in Ubuntu that I can send individual commands to after they are initially open.
def ssh_command(cmd):
    ssh_terminal_1 = subprocess.Popen(['gnome-terminal', '--', 'bash', '-c', cmd],
                                      stderr=subprocess.STDOUT, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
    ssh_terminal_2 = subprocess.Popen(['gnome-terminal', '--', 'bash', '-c', cmd],
                                      stderr=subprocess.STDOUT, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
    # Activate the conda environment for our multilateration server
    spyder_activate('conda activate flyhound')
    time.sleep(10)
    ssh_terminal_1.stdin.flush()
    ssh_terminal_2.stdin.flush()
    ssh_terminal_1.stdin.write(b'cd srsRAN22.04/build\n')
    ssh_terminal_1.stdin.flush()
    ssh_terminal_2.stdin.write(b'cd srsRAN22.04/build\n')
    ssh_terminal_2.stdin.flush()
    ssh_terminal_1.stdin.write(b'sudo ./srsepc/src/srsepc ../srsepc/epc.conf.example --hss.db_file=../srsepc/user_db_unknown.csv.example\n')
    ssh_terminal_1.stdin.flush()
    ssh_terminal_2.stdin.write(b'bladeRF-cli -l /home/administrator/Downloads/hostedxA5-latest.rbf\n')
    ssh_terminal_2.stdin.flush()
    ssh_terminal_2.stdin.write(b'bladeRF-cli -f /home/administrator/Downloads/bladeRF_fw_v2.4.0.img\n')
    ssh_terminal_2.stdin.flush()
    ssh_terminal_2.stdin.write(b'sudo ./srsenb/src/srsenb ../srsenb/enb.conf.example --enb_files.sib_config=../srsenb/sib.conf.example --enb.n_prb=50 --enb_files.rr_config=../srsenb/rr.conf.example\n')
However, when I start the original subprocess command, the terminals open up fine with the command given during the function call, but none of the following commands work and I get a broken pipe error (errno 32). While I run these commands I also need to keep a previous terminal open, which looks like this:
def access_command(cmd):
    while True:
        process = subprocess.Popen(shlex.split(cmd), stdout=subprocess.PIPE)
        while True:
            output = process.stdout.readline()
            if output == b'' and process.poll() is not None:
                break
            if output:
                print(output.strip())
                if b"f0:9e:4a:5f:a4:5b" in output and b"handshake" in output:
                    ssh_command("sshpass -p earth ssh -o StrictHostKeyChecking=no administrator@ipaddress; clear; screen")
I am really not sure how to send multiple commands to the ssh terminals after they ssh into that IP address. I am very new to subprocess and sending commands to terminals via Python, so any help would be amazing!
As I explained in the comments, your pipe goes to gnome-terminal, not to ssh or bash. gnome-terminal is not listening to stdin; it only listens to the user at the console. Here is what you do:
1. Make a FIFO (named pipe) with os.mkfifo for each terminal; give it a name that won't collide with any other file (for example, include your process ID in it).
2. Issue the command gnome-terminal -- bash -c 'ssh <options> < <fifo name>' for each terminal. Do not make this a Popen call; use os.system or something like that.
3. Do your spydy magic (anaconda).
4. Open the FIFO as a file.
5. Write your commands to the open file; they will be executed by the bash process in the ssh connection. You will probably have to flush, unless there is a way to open the file in line-buffered mode.
6. When you want to close the terminal, close the file.
What this accomplishes is that we move the pipe from gnome-terminal to ssh and hence across the connection to bash. We feed it on one end and it comes out and gets digested by the shell.
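A minimal sketch of those steps (the host, user, and password are the placeholders from the question, and the single FIFO stands in for one per terminal):

import os

fifo = "/tmp/ssh_fifo_{}".format(os.getpid())  # PID keeps the name collision-free
os.mkfifo(fifo)

# The redirection lives inside the bash -c string, so it is ssh (and through
# it the remote bash) that reads the FIFO, not gnome-terminal itself.
os.system(
    "gnome-terminal -- bash -c "
    "'sshpass -p earth ssh -o StrictHostKeyChecking=no "
    "administrator@ipaddress < {}'".format(fifo)
)

# Opening the FIFO for writing blocks until ssh has opened it for reading.
with open(fifo, "w", buffering=1) as pipe:  # line-buffered, so no manual flush
    pipe.write("cd srsRAN22.04/build\n")
    pipe.write("ls\n")
    # ... further commands; closing the file ends the remote shell.

os.remove(fifo)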

Paramiko exec_command does not execute the command

What is the way to run a .bat (or Python) script on a remote Windows Server 2016 machine? An SSH server is installed and works correctly.
I tried using Paramiko, but it didn't produce any result:
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('ip', username='root', password='pass')
ssh_stdin, ssh_stdout, ssh_stderr = ssh.exec_command('C:/Users/Administrator/Desktop/main/videos/uniq.bat')
What am I doing wrong? The script does not run. However, if you run it manually, it works well.
The batch file is:
FOR /F "tokens=*" %%G IN ('dir /b *.mp4') DO ffmpeg -i "%%G" -vf noise=alls=1:allf=t "%%~nG_1.mp4"
SSHClient.exec_command only starts the execution of the command. If you do not wait for it to complete and immediately kill the session, the command is killed along with it.
The most trivial way to wait for command to complete is reading its output to the end:
stdin, stdout, stderr = ssh.exec_command(command)
stdout.channel.set_combine_stderr(True)
output = stdout.readlines()
Even if this doesn't fix the problem on its own, it will at least collect any error output to help you identify the (other) problem.
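Alternatively (a sketch, with the caveat that it deadlocks if the command produces more output than the channel buffers can hold), you can block on the command's exit status instead:

stdin, stdout, stderr = ssh.exec_command('C:/Users/Administrator/Desktop/main/videos/uniq.bat')
# Blocks until the remote command exits and returns its exit code.
exit_status = stdout.channel.recv_exit_status()
print('exit status:', exit_status)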

Pass arguments to a bash script that is stored locally and needs to be executed on a remote machine using Python Paramiko

I have a shell script stored on my local machine. The script needs arguments as below:
#!/bin/bash
echo $1
echo $2
I need to run this script on a remote machine (without copying it to the remote machine). I am using Python's Paramiko module and can invoke the script on the remote server without any issue.
The problem is that I am not able to pass the two arguments to the remote server. Here is the snippet from my Python code that executes the local script on the remote server:
with open("test.sh", "r") as f:
    mymodule = f.read()

c = paramiko.SSHClient()
k = paramiko.RSAKey.from_private_key(private_key_str)
c.set_missing_host_key_policy(paramiko.AutoAddPolicy())
c.connect(hostname="hostname", username="user", pkey=k)
stdin, stdout, stderr = c.exec_command("/bin/bash - <<EOF\n{s}\nEOF".format(s=mymodule))
With bash I can simply use the below command:
ssh -i key user@IP bash -s < test.sh "$var1" "$var2"
Can someone help me with how to pass the two arguments to the remote server using Python?
Do the same as what you are doing in bash:
command = "/bin/bash -s {v1} {v2}".format(v1=var1, v2=var2)
stdin, stdout, stderr = c.exec_command(command)
stdin.write(mymodule)
stdin.close()
If you prefer the heredoc syntax, you need to use single quotes around the delimiter, so that $1 and $2 are expanded by the inner bash (which receives the arguments), not by the outer shell (where they are empty):
command = "/bin/bash -s {v1} {v2} <<'EOF'\n{s}\nEOF".format(v1=var1, v2=var2, s=mymodule)
stdin, stdout, stderr = c.exec_command(command)
The same way you would have to use the quotes in bash:
ssh -i key user@IP bash -s "$var1" "$var2" <<'EOF'
echo $1
echo $2
EOF
Though as you have the script in a variable in your Python code, why don't you just modify the script itself? That would be way more straightforward, imo.
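A sketch of that alternative (not from the answer above; it assumes a simple script where textually substituting $1 and $2 is safe):

# Patch the argument references in the script text before sending it.
script = mymodule.replace("$1", var1).replace("$2", var2)
stdin, stdout, stderr = c.exec_command("/bin/bash -s")
stdin.write(script)
stdin.close()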
Obligatory warning: Do not use AutoAddPolicy – you lose protection against MITM attacks by doing so. For a correct solution, see Paramiko "Unknown Server".

Start a background shell script from python

I would like to connect to a remote machine and run a background script on that machine from Python.
I tried:
os.system("ssh root#10.0.0.1 \' nohup script.sh & \')
But it doesn't seem to work. And if I put nohup in script.sh and simply run
os.system("ssh root@10.0.0.1 'script.sh'")
the nohup command does not work in either case.
I'm confused why that is. Does anybody know how to run a background job from Python, or is it just impossible this way?
What kind of errors are you getting? What version of Python are you using?
You should take a look at this: Python subprocess - run multiple shell commands over SSH
import subprocess

sshProcess = subprocess.Popen(["ssh", "root@10.0.0.1"],
                              stdin=subprocess.PIPE,
                              stdout=subprocess.PIPE,
                              universal_newlines=True,
                              bufsize=0)
sshProcess.stdin.write("nohup script.sh &\n")
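As a side note on the original os.system attempt (a sketch, not from either answer): nohup alone does not help here, because without a terminal it does not redirect anything, so the background process keeps the ssh session's output pipe open. Redirecting explicitly lets ssh return at once:

import os

# The background process no longer holds ssh's stdout/stderr open.
os.system("ssh root@10.0.0.1 'nohup script.sh > /dev/null 2>&1 &'")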
For example, say you have a local script (Python, bash, etc.); here I demonstrate with a Python script.
First, create a Python file locally, say hello.py:
# hello.py
import os
print(os.system('hostname'))
Secondly, a Python script that executes the above hello.py on a remote machine:
import pathos.core

copy = pathos.core.copy('hello.py', destination='abc.remote.com:~/hello.py')
run = pathos.core.execute('python hello.py', host='abc.remote.com')
print(run.response())

SSH session held open when running a forked script

I have a parent script (start.py) whose primary purpose is to start background processes and exit. When I ssh directly to remote_host and run the script, it works as expected:
[user@local_host ~]# ssh remote_host
user@remote_host's password: ****
[user@remote_host ~]# time python start.py --config_file /data/workload.pg

real 0m0.037s
user 0m0.025s
sys 0m0.012s
The exit code of this script:
[root@perf72 ~]# echo $?
0
To simplify, instead of establishing the ssh session first and then running the command, I want to just execute the command remotely from local_host:
[user@local_host ~]# ssh -o StrictHostKeyChecking=no -i /tmp/tmpqcz5l5il user@remote_host -p 22 "python start.py --config_file /data/workload.pg"
real 12m6.594s
user 0m0.027s
sys 0m0.016s
The problem here is that the ssh session remains open for the life of the background processes, not for the life of the start.py script, which is less than one second. It should disconnect when the start.py script exits, but it doesn't.
Do I need a specific sys.exit() signal in the start.py script which will prompt the ssh session to disconnect?
ssh is awaiting output on the called process's stdout, so it can print it if there is any. That file handle is inherited by the subprocesses you're spawning, so it's still open even though the python script has exited, and as long as it's open, ssh will keep waiting.
If you change your ssh command line to run the remote script as "python start.py --config_file /data/workload.pg > /dev/null" instead, the ssh connection will close as soon as the python script does.
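The same fix, driven from Paramiko instead of the ssh binary (a sketch under that assumption; 2>&1 also detaches stderr, which the forked children hold open just like stdout):

import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect("remote_host", username="user", password="****")
# The background children write to /dev/null, so the channel's output
# closes as soon as start.py itself exits.
stdin, stdout, stderr = client.exec_command(
    "python start.py --config_file /data/workload.pg > /dev/null 2>&1"
)
print("exit status:", stdout.channel.recv_exit_status())  # returns promptly
client.close()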
