Running rsync from Python subprocess in Windows

I need to run rsync from a Python 2.7 app on Windows 7 x64 (using cwRsync 5.5.0).
Everything works fine from the command line:
set CWRSYNCHOME in the environment to the cwRsync binaries directory and run the following command:
rsync.exe "/cygdrive/e/test" test1@192.168.1.14:
But when trying to run the same command as a Python subprocess:
process = subprocess.Popen(['rsync.exe', '/cygdrive/e/test', 'test1@192.168.1.14:'],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE,
                           shell=True,
                           env={'CWRSYNCHOME': './bin'})
stdout, stderr = process.communicate()
print 'STDOUT:{}\nSTDERR:{}'.format(stdout, stderr)
I get the following error in stderr:
rsync: pipe: Operation not permitted (1)
rsync error: error in IPC code (code 14) at pipe.c(59) [sender=3.1.2]
Here is verbose rsync stdout:
FILE_STRUCT_LEN=16, EXTRA_LEN=4
cmd=<NULL> machine=192.168.1.14 user=test1 path=.
cmd[0]=ssh cmd[1]=-l cmd[2]=test1 cmd[3]=192.168.1.14 cmd[4]=rsync cmd[5]=--server cmd[6]=-vvvvve.LsfxC cmd[7]=. cmd[8]=.
opening connection using: ssh -l test1 192.168.1.14 rsync --server -vvvvve.LsfxC . . (9 args)
[sender] _exit_cleanup(code=14, file=pipe.c, line=59): entered
[sender] _exit_cleanup(code=14, file=pipe.c, line=59): about to call exit(14)
I tried setting shell=False and passing the command as a single string (not cmd and args), but the error still repeats.
What am I doing wrong?

To get it to work, rsync needs to be run under Cygwin's shell:
process = subprocess.Popen(['sh.exe', '-c',
                            'rsync /cygdrive/e/test test1@192.168.1.14:'],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE,
                           stdin=subprocess.PIPE,
                           env={'CWRSYNCHOME': '/bin/',
                                'PATH': '/bin/'})
It works (there is no SSH authorization handling in the example above).
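If the exit status matters, a small follow-up sketch of checking it (continuing from the process object in the example above; this is illustrative, not part of the original answer):

stdout, stderr = process.communicate()
# rsync exits with 0 on success; any other code signals a transfer or IPC error
if process.returncode != 0:
    print 'rsync failed with code {}: {}'.format(process.returncode, stderr)
else:
    print stdout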

Related

Send Multiple Terminal Commands in Gnome Terminals With Subprocess

So I am currently trying to run two different gnome-terminal windows in Ubuntu that I can send individual commands to after they are initially open.
def ssh_command(cmd):
    ssh_terminal_1 = subprocess.Popen(['gnome-terminal', '--', 'bash', '-c', cmd], stderr=subprocess.STDOUT, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
    ssh_terminal_2 = subprocess.Popen(['gnome-terminal', '--', 'bash', '-c', cmd], stderr=subprocess.STDOUT, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
    # Activate the conda environment for our multilateration server
    spyder_activate('conda activate flyhound')
    time.sleep(10)
    ssh_terminal_1.stdin.flush()
    ssh_terminal_2.stdin.flush()
    ssh_terminal_1.stdin.write(b'cd srsRAN22.04/build')
    ssh_terminal_1.stdin.flush()
    ssh_terminal_2.stdin.write(b'cd srsRAN22.04/build')
    ssh_terminal_2.stdin.flush()
    ssh_terminal_1.stdin.write(b'sudo ./srsepc/src/srsepc ../srsepc/epc.conf.example --hss.db_file=../srsepc/user_db_unknown.csv.example\n')
    ssh_terminal_1.stdin.flush()
    ssh_terminal_2.stdin.write(b'bladeRF-cli -l /home/administrator/Downloads/hostedxA5-latest.rbf\n')
    ssh_terminal_2.stdin.flush()
    ssh_terminal_2.stdin.write(b'bladeRF-cli -f /home/administrator/Downloads/bladeRF_fw_v2.4.0.img\n')
    ssh_terminal_2.stdin.flush()
    ssh_terminal_2.stdin.write(b'sudo ./srsenb/src/srsenb ../srsenb/enb.conf.example --enb_files.sib_config=../srsenb/sib.conf.example --enb.n_prb=50 --enb_files.rr_config=../srsenb/rr.conf.example\n')
However, when I start the original subprocess command, the terminals open up fine with the command given during the function call, but none of the following commands work and I get a broken pipe error (errno 32). While I try to run these commands I also need to keep the previous terminal open, which looks like this:
def access_command(cmd):
    while True:
        process = subprocess.Popen(shlex.split(cmd), stdout=subprocess.PIPE)
        while True:
            output = process.stdout.readline()
            if output == '' and process.poll() is not None:
                break
            if output:
                print(output.strip())
            if b"f0:9e:4a:5f:a4:5b" and b"handshake" in output:
                ssh_command("sshpass -p earth ssh -o StrictHostKeyChecking=no administrator@ipaddress; clear; screen")
I am really not sure how I can send multiple commands to the SSH terminals after they SSH into that IP address. I am very new to subprocess and sending commands to terminals via Python, so any help would be amazing!
As I explained in the comments, your pipe goes to gnome-terminal, not to ssh or to bash. gnome-terminal does not listen to stdin; it only listens to the user at the console. Here is what to do:
1. Make a FIFO (named pipe) -- os.mkfifo -- for each terminal; give it a name that won't collide with any other file (such as putting your process ID in it).
2. Issue the command gnome-terminal -- bash -c ssh <options> < <fifo name> for each terminal. Do not make this a Popen call; use os.system or something like that.
3. Do your spydy magic (anaconda).
4. Open the FIFO as a file.
5. Write your commands to the open file; they will be executed by the bash process in the ssh connection. You will probably have to flush, unless there is a way to open it in line-buffered mode.
6. When you want to close the terminal, close the file.
What this accomplishes is that we move the pipe from gnome-terminal to ssh and hence across the connection to bash. We feed it on one end and it comes out and gets digested by the shell.
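A minimal sketch of those steps for one terminal, assuming a hypothetical user@host and placeholder commands (the exact quoting of the gnome-terminal line is my assumption, not taken from the answer):

import os
import time

# A FIFO whose name includes the PID, so it won't collide with other files.
fifo_path = '/tmp/ssh_terminal_{}.fifo'.format(os.getpid())
os.mkfifo(fifo_path)

# Launch the terminal with os.system (not Popen); the redirect sits inside the
# bash -c string so the FIFO feeds ssh's stdin, not gnome-terminal's.
os.system('gnome-terminal -- bash -c "ssh user@host < {}"'.format(fifo_path))

# Opening a FIFO for writing blocks until the reader (ssh) has opened its end.
fifo = open(fifo_path, 'w')

# Everything written here is executed by the remote shell; flush each command.
fifo.write('cd srsRAN22.04/build\n')
fifo.flush()
fifo.write('echo connected\n')
fifo.flush()
time.sleep(10)

# Closing the file sends EOF, which ends the ssh session and the terminal.
fifo.close()
os.remove(fifo_path)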

Run jar file as Administrator

I am running a jar file from Python.
This jar must be run as administrator in order to work.
The Python script is run in a Jenkins job.
Is there a way to run the jar/Python script as an administrator?
Either from the Jenkins job, or by modifying the Python script.
with subprocess.Popen(command,
                      cwd=tool_dir,
                      stdout=subprocess.PIPE,
                      stderr=subprocess.PIPE,
                      shell=True) as proc:
    try:
        outs, errs = proc.communicate(timeout=1000)
Thank you!

Using $PWD with subprocess.Popen() results in a Docker error, works from shell

I want to run a docker command from python using the subprocess Popen:
proc = subprocess.Popen(
    shlex.split(r'docker run -v $PWD:/data blang/latex pdflatex main.tex'),
    cwd=temp_dir, shell=False, stdout=subprocess.PIPE,
    stdin=subprocess.PIPE, stderr=subprocess.PIPE)
proc.communicate()
While the command works perfectly from the terminal, this returns:
(b'',
b'docker: Error response from daemon: create $PWD: "$PWD" includes invalid characters for a local volume name, only "[a-zA-Z0-9][a-zA-Z0-9_.-]" are allowed.\nSee \'docker run --help\'.\n')
"$PWD" is a shell expansion. If you don't have a shell (as with shell=False), it doesn't get expanded.
'%s:/data' % os.getcwd() is a Python expression which will have the same result as "$PWD:/data" in shell. Thus:
import os, subprocess

proc = subprocess.Popen(
    ['docker', 'run',
     '-v', '%s:/data' % os.getcwd(),
     'blang/latex', 'pdflatex', 'main.tex'],
    cwd=temp_dir, shell=False, stdout=subprocess.PIPE,
    stdin=subprocess.PIPE, stderr=subprocess.PIPE)
It's important not to use shlex.split() in this case: If you did, and were in a directory with spaces in its name, each segment of that directory would become a separate argument.
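One related observation (my note, not part of the original answer): because the call also passes cwd=temp_dir, os.getcwd() refers to the Python process's directory, not to temp_dir. If the intent was to mount the directory the container works in, temp_dir itself can be used for the volume:

proc = subprocess.Popen(
    ['docker', 'run',
     '-v', '%s:/data' % temp_dir,  # mount temp_dir, matching cwd=temp_dir
     'blang/latex', 'pdflatex', 'main.tex'],
    cwd=temp_dir, shell=False, stdout=subprocess.PIPE,
    stdin=subprocess.PIPE, stderr=subprocess.PIPE)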

read -a unknown option

I'm executing shell commands using a Python script. This is the command:
ntpservlist=( $OMC_NTPSERV ) && IFS=',' read -ra ntplist <<< "$ntpservlist" && for i in "${ntplist[@]}" ; do echo "server $i" >> /etc/inet/ntp.conf ; done
When I execute the command using a script, I get the following error:
/bin/sh[1]: read: -a: unknown option
Usage: read [-ACprsv] [-d delim] [-u fd] [-t timeout] [-n count] [-N count]
[var?prompt] [var ...]
But if I execute the same command using the command line, it executes correctly without any errors.
I'm using:
proc = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
(out, err) = proc.communicate()
to execute the command.
Your interactive shell is bash, but your system shell, used by Popen, is some flavor of ksh. To use bash instead, use the executable option:
proc = subprocess.Popen(command,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE,
                        shell=True,
                        executable="/bin/bash")  # or whatever the right path is
(out, err) = proc.communicate()
Most of your command appears to be valid ksh, but one difference is that read -A, not read -a, is used to populate an array.
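If changing the shell is not an option, a sketch of the other route (an assumption based on the note above, not tested against the original system's ksh): keep the default shell and use its spelling of the option, read -A:

command = ('ntpservlist=( $OMC_NTPSERV ) && '
           'IFS="," read -A ntplist <<< "$ntpservlist" && '  # ksh populates an array with -A
           'for i in "${ntplist[@]}" ; do echo "server $i" >> /etc/inet/ntp.conf ; done')
proc = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
(out, err) = proc.communicate()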

Python subprocess output read error

I have a command which works great at the terminal:
sudo tshark -V -l -i "any" -f 'udp port 4729'
I'm trying to read the output from my Python script:
import subprocess
command = ['tshark', '-V', '-l', '-i', '"any"', '-f', '"udp port 4729"'] # the shell command
process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=None)
output, error = process.communicate()
print output
It does not work. Maybe there is some problem with how the command is written in the list.
I receive the error:
gooman#ubuntu:~/workspace/glade_tests/src$ sudo ./main.py
tshark: Lua: Error during loading:
[string "/usr/share/wireshark/init.lua"]:45: dofile has been disabled
Running as user "root" and group "root". This could be dangerous.
Capturing on "any"
tshark: The capture session could not be initiated (No such device exists).
Please check to make sure you have sufficient permissions, and that you have the proper interface or pipe specified.
0 packets captured
Try this:
import subprocess

command = "sudo tshark -V -l -i any -f 'udp port 4729'"
try:
    output = subprocess.check_output(command, shell=True)
except subprocess.CalledProcessError as e:
    print "An error has occurred", e
    raise
print "The subprocess output:", output
It might be necessary to add the stdout=subprocess.PIPE argument.
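Another possibility (my reading of the error output, not part of the answer above): keep the list form but drop the embedded quotes. Without a shell, each list element is passed to tshark verbatim, so '"any"' asks for an interface whose name literally contains quote characters:

import subprocess

# Without shell=True, no extra quoting is needed: each element is one argument.
command = ['tshark', '-V', '-l', '-i', 'any', '-f', 'udp port 4729']
process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=None)
output, error = process.communicate()
print output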
