Cannot use gcloud compute ssh command in python subprocess - python

I have a Compute Engine instance already set up, and I can run the ssh command from a command prompt, like
gcloud compute ssh test-instance@cloud-engine-noapi --zone us-central1-f --command "rm -f test.txt"
and successfully delete test.txt on the server.
However, when I call these in Python:
subprocess.call(['gcloud', 'compute', '--project', 'test-project','ssh', temp_instance, '--zone', zone, '--command', '"cd /home"'], shell=True)
subprocess.call(['gcloud', 'compute', '--project', 'test-project','ssh', temp_instance, '--zone', zone, '--command', '"ls -l"'], shell=True)
subprocess.call(['gcloud', 'compute', '--project', 'test-project','ssh', temp_instance, '--zone', zone, '--command', '"rm -f /home/test.txt"'], shell=True)
The output is always like
bash: /command : No such file or directory
and for a command like ls
bash: /command : command not found
Is there any setup I have to do first?

Probably no one else has this problem... but I finally came up with a method to solve it.
As the problem only occurs when using subprocess, I bypass it by writing the commands to a temporary (bat/sh) file.
Like,
with open(os.path.join(__location__, 'gcloud_command.bat'), 'w') as bat:
    command_arr = []
    for instance in all_instances_name:
        temp_instance = user_name + "@" + instance
        temp_file_path = '/home/' + user_name + '/'
        command_arr.append('call gcloud compute ssh ' + temp_instance + ' --zone ' + zone + ' --command "cd ' + temp_file_path + '; rm -rf ' + temp_file_path + projectname.split('.')[0] + '; rm -f ' + temp_file_path + projectname.split('.')[0] + '*"\n')
        command_arr.append('call gcloud compute scp "' + fullpath_projectname + '" ' + instance + ':' + temp_file_path + '\n')
        if is_zip:
            command_arr.append('call gcloud compute ssh ' + temp_instance + ' --zone ' + zone + ' --command "cd ' + temp_file_path + '; unzip ' + temp_file_path + projectname + '"\n')
    bat.writelines(command_arr)
And execute it with
subprocess.Popen(os.path.join(__location__, 'gcloud_command.bat'))

I know that this is quite old, but it took me some time to figure out.
In my case, avoiding double quotes helped (even though the gcloud CLI help recommends them).
Example:
subprocess.call(['gcloud', 'compute', '--project=test-project', 'ssh', temp_instance, f'--zone={zone}', '--command=cd /home'])
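For reference, the root cause in the question is combining a list of arguments with shell=True on POSIX: only the first list item is passed to /bin/sh, so the --command flag never reaches gcloud. A minimal sketch (the user, instance, and zone names are made-up placeholders) showing how a quoted command line splits cleanly into a list that can be passed with the default shell=False:

```python
import shlex

# Hypothetical command line; user/instance/zone are placeholders.
cmd = 'gcloud compute ssh user@my-instance --zone us-central1-f --command "ls -l"'

# shlex.split keeps the quoted "ls -l" together as a single argument,
# which is what gcloud expects for --command.
argv = shlex.split(cmd)
print(argv[-1])  # ls -l
# The list can then be run with subprocess.call(argv) -- no shell=True.
```

The rule of thumb: pass a single string with shell=True, or a list with the default shell=False, but never a list together with shell=True on POSIX.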


How can I start Tensorboard dev from within Python as a subprocess parallel to the Python session?

I want to monitor the training progress of a CNN that is trained via a slurm process on a server (i.e., the Python script is executed through a bash script whenever the server has resources available; the session is not interactive, so I cannot simply open a terminal and run Tensorboard dev).
So far, I have tried the following without finding a new experiment on my Tensorboard dev site:
mod = "SomeModelType"
logdir = "/some/directory/used/in/Tensorboard/callback"
PARAMETERS = "Some line of text describing the training settings"
subprocess.Popen(["tensorboard", "dev upload --logdir '" + logdir +
                  "' --name Myname_" + mod + " --description '" +
                  PARAMETERS + "'"])
If I insert the text string "tensorboard dev upload --logdir 'some/directory..." in a terminal, Tensorboard will start as expected.
If I include the code showed above, no new Tensorboard experiment will be started.
I also tried this:
subprocess.run(["/pfs/data5/home/kit/ifgg/mp3890/.local/bin/tensorboard",
                "dev", "upload", "--logdir", "'" + logdir +
                "'", "--name", "LeleNet" + mod,
                # "--description", "'" + PARAMETERS + "'"
                ],
               capture_output=False, text=False)
which starts Tensorboard, but the Python script will not continue. Hence, Tensorboard will be listening for output that never comes, because the Python session is stuck waiting on Tensorboard instead of training the CNN.
Edit
This:
subprocess.Popen(["/pfs/data5/home/kit/ifgg/mp3890/.local/bin/tensorboard",
                  "dev", "upload", "--logdir", "'" + logdir +
                  "'", "--name", "LeleNet" + mod,
                  # "--description", "'" + PARAMETERS + "'"
                  ])
led to the message "Listening for new data in the log dir..." popping up constantly in interactive mode, and to cancellation of the slurm job (the job disappeared). Moreover, Tensorboard does not work correctly this way: the experiment is created but never receives any data.
I got it to work as follows:
logdir = "/some/directory"
tbn = "some_name"
DESCRIPTION = "some description of the experiment"
subprocess.call("tensorboard dev upload --logdir '" + logdir +
                "' --name " + tbn + " --description '" +
                DESCRIPTION + "' &", shell=True)
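An alternative to the shell `&` trick, sketched below with `sys.executable` standing in for the real tensorboard argv (the actual command, logdir, name, and description are placeholders): `subprocess.Popen` with a list returns immediately, so the training script continues while the uploader runs, and on POSIX `start_new_session=True` detaches the child from the job's process group.

```python
import subprocess
import sys

# Stand-in for ['tensorboard', 'dev', 'upload', '--logdir', logdir, ...]
cmd = [sys.executable, '-c', 'print("uploader running")']

# Popen returns immediately; the child keeps running in the background.
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True,
                        start_new_session=True)  # POSIX: detach from our process group

# The training loop would continue here; for the demo we just collect output.
out, _ = proc.communicate()
print(out.strip())  # uploader running
```

With the real tensorboard command you would not read the pipe at all (redirect stdout/stderr to subprocess.DEVNULL), since blocking on the uploader's output is exactly the problem described above.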

Python ffmpeg subprocess never exits on Linux, works on Windows

I wonder if someone can help explain what is happening?
I run 2 subprocesses, 1 for ffprobe and 1 for ffmpeg.
popen = subprocess.Popen(ffprobecmd, stderr=subprocess.PIPE, shell=True)
And
popen = subprocess.Popen(ffmpegcmd, shell=True, stdout=subprocess.PIPE)
On both Windows and Linux the ffprobe command fires, finishes, and is removed from Task Manager/htop. But only on Windows does the same happen to ffmpeg; on Linux the process remains in htop...
Can anyone explain what is going on, if it matters and how I can stop it from happening please?
EDIT: Here are the commands...
ffprobecmd = 'ffprobe' + \
' -user_agent "' + request.headers['User-Agent'] + '"' + \
' -headers "Referer: ' + request.headers['Referer'] + '"' + \
' -timeout "5000000"' + \
' -v error -select_streams v -show_entries stream=height -of default=nw=1:nk=1' + \
' -i "' + request.url + '"'
and
ffmpegcmd = 'ffmpeg' + \
' -re' + \
' -user_agent "' + r.headers['User-Agent'] + '"' + \
' -headers "Referer: ' + r.headers['Referer'] + '"' + \
' -timeout "10"' + \
' -i "' + r.url + '"' + \
' -c copy' + \
' -f mpegts' + \
' pipe:'
EDIT: Here is an example that behaves as described...
import flask
from flask import Response
import subprocess

app = flask.Flask(__name__)

@app.route('/', methods=['GET'])
def go():
    def stream(ffmpegcmd):
        popen = subprocess.Popen(ffmpegcmd, stdout=subprocess.PIPE, shell=True)
        try:
            for stdout_line in iter(popen.stdout.readline, ""):
                yield stdout_line
        except GeneratorExit:
            raise
    url = "https://bitdash-a.akamaihd.net/content/MI201109210084_1/m3u8s/f08e80da-bf1d-4e3d-8899-f0f6155f6efa.m3u8"
    ffmpegcmd = 'ffmpeg' + \
        ' -re' + \
        ' -timeout "10"' + \
        ' -i "' + url + '"' + \
        ' -c copy' + \
        ' -f mpegts' + \
        ' pipe:'
    return Response(stream(ffmpegcmd))

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
You have the extra sh process due to shell=True, and your copies of ffmpeg are allowed to try to attach to the original terminal's stdin because you aren't overriding that file handle. To fix both of those issues, and also some security bugs, switch to shell=False and set stdin=subprocess.DEVNULL. To stop zombies from potentially being left behind, note the finally: block below, which calls popen.poll() to see if the child exited and popen.terminate() to tell it to exit if it hasn't:
#!/usr/bin/env python
import flask
from flask import Response
import subprocess

app = flask.Flask(__name__)

@app.route('/', methods=['GET'])
def go():
    def stream(ffmpegcmd):
        popen = subprocess.Popen(ffmpegcmd, stdin=subprocess.DEVNULL, stdout=subprocess.PIPE)
        try:
            # NOTE: consider reading fixed-sized blocks (4kb at least) at a time
            # instead of parsing binary streams into "lines".
            for stdout_line in iter(popen.stdout.readline, b""):  # b"" sentinel: stdout is bytes
                yield stdout_line
        finally:
            if popen.poll() is None:
                popen.terminate()
            popen.wait()  # yes, this can cause things to actually block
    url = "https://bitdash-a.akamaihd.net/content/MI201109210084_1/m3u8s/f08e80da-bf1d-4e3d-8899-f0f6155f6efa.m3u8"
    ffmpegcmd = [
        'ffmpeg',
        '-re',
        '-timeout', '10',
        '-i', url,
        '-c', 'copy',
        '-f', 'mpegts',
        'pipe:'
    ]
    return Response(stream(ffmpegcmd))

if __name__ == '__main__':
    app.run(host='127.0.0.1', port=5000)
Mind, it's not appropriate to be parsing a binary stream as a series of lines at all. It would be much more appropriate to use blocks (and to change your response headers so the browser knows to parse the content as a video).
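The fixed-size-block read recommended above could look like the sketch below; the echo command is a harmless stand-in for ffmpeg writing binary data to its stdout.

```python
import subprocess

def stream_blocks(argv, blocksize=4096):
    """Yield a child process's stdout in fixed-size binary chunks."""
    popen = subprocess.Popen(argv, stdin=subprocess.DEVNULL,
                             stdout=subprocess.PIPE)
    try:
        while True:
            chunk = popen.stdout.read(blocksize)
            if not chunk:  # b'' means EOF
                break
            yield chunk
    finally:
        if popen.poll() is None:
            popen.terminate()
        popen.wait()

# Demo with a stand-in command instead of ffmpeg:
data = b''.join(stream_blocks(['echo', 'hello']))
print(data)  # b'hello\n'
```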
What type is the ffmpegcmd variable? Is it a string or a list/sequence?
Note that Windows and Linux/POSIX behave differently with the shell=True parameter enabled or disabled. It matters whether ffmpegcmd is a string or a list.
Direct excerpt from the documentation:
On POSIX with shell=True, the shell defaults to /bin/sh. If args is a
string, the string specifies the command to execute through the shell.
This means that the string must be formatted exactly as it would be
when typed at the shell prompt. This includes, for example, quoting or
backslash escaping filenames with spaces in them. If args is a
sequence, the first item specifies the command string, and any
additional items will be treated as additional arguments to the shell
itself. That is to say, Popen does the equivalent of:
Popen(['/bin/sh', '-c', args[0], args[1], ...])
On Windows with shell=True, the COMSPEC environment variable specifies
the default shell. The only time you need to specify shell=True on
Windows is when the command you wish to execute is built into the
shell (e.g. dir or copy). You do not need shell=True to run a batch
file or console-based executable.
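The POSIX behavior quoted above can be demonstrated directly: with shell=True and a list, only the first item becomes the shell command, and the remaining items become the shell's own positional parameters ($0, $1, ...), not part of the command.

```python
import subprocess

# On POSIX this runs: /bin/sh -c 'echo first' 'echo second'
# 'echo second' becomes the shell's $0 and is never executed.
result = subprocess.run(['echo first', 'echo second'],
                        shell=True, capture_output=True, text=True)
print(result.stdout.strip())  # first
```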

popen is not executing the command line parameters

I am trying to execute the following command via popen, but I am not sure why it is not working, and no error message is thrown either.
import os

class Pabot():
    def run_pabot(self, folderOrSuiteName, tags=None):
        print("pabot --testlevelsplit -r " + folderOrSuiteName + " --i " + tags + " " + folderOrSuiteName)
        os.popen("pabot --testlevelsplit -r " + folderOrSuiteName + " --i " + tags + " " + folderOrSuiteName)

run = Pabot()
run.run_pabot("foo/boo/test.robot", "Sequence_TC1")
From print statement:
pabot --testlevelsplit -r foo/boo/test.robot --i Sequence_TC1 foo/boo/test.robot
Right after execution, the console window disappears.
Note: The same command (from the print statement) works fine on the command line.
Any idea why popen does not work in this case?
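One likely explanation (an assumption, as no accepted answer appears here): os.popen returns a pipe whose output is never read, and any error text from pabot is simply discarded with the vanishing console window. A sketch using subprocess.run with an argument list instead, which captures both output and errors (the pabot path and tag values are the question's own):

```python
import subprocess

def build_pabot_cmd(folder_or_suite, tags=None):
    """Build the pabot argv as a list, avoiding manual string concatenation."""
    cmd = ['pabot', '--testlevelsplit', '-r', folder_or_suite]
    if tags:
        cmd += ['--i', tags]
    cmd.append(folder_or_suite)
    return cmd

cmd = build_pabot_cmd('foo/boo/test.robot', 'Sequence_TC1')
print(cmd)
# Then run it and surface any errors instead of discarding them:
# result = subprocess.run(cmd, capture_output=True, text=True)
# print(result.stdout, result.stderr)
```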

Is there any difference in using subprocess.check_output() in Windows and OS X?

I want to use subprocess.check_output(cmd, shell=True) to execute cmd on Windows. It turns out that there is no output after executing this statement, but it works on OS X. I want to know if there is some problem with using shell=True. Here's my original source.
paper_name = sheet[location].value
name = '"' + paper_name + '"'
cmd = py + options + name + ' -t'
out_str = subprocess.check_output(cmd,shell=True)
pdb.set_trace()
#a = out_str.split('\n')
fp_str = to_str(out_str)
a = fp_str.split('\n')
cmd is like below
cmd
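One Windows-specific difference worth checking (a guess, not confirmed by the question): console output on Windows ends lines with '\r\n', so the split('\n') above leaves stray carriage returns in every element, whereas str.splitlines() handles both line-ending conventions.

```python
# Simulated check_output result as it would look on Windows:
out = b'paper1\r\npaper2\r\n'

text = out.decode()
print(text.split('\n'))   # ['paper1\r', 'paper2\r', '']
print(text.splitlines())  # ['paper1', 'paper2']
```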

Python raw_input doesn't work after using subprocess module

I'm using the subprocess module to invoke plink and run some commands on a remote server. This works as expected, but after a successful call to subprocess.check_call or subprocess.check_output the raw_input method seems to block forever and doesn't accept input at the command line.
I've reduced it to this simple example:
import subprocess

def execute(command):
    return subprocess.check_call('plink.exe -ssh ' + USER + '@' + HOST + ' -pw ' + PASSWD + ' ' + command)

input = raw_input('Enter some text: ')
print('You entered: ' + input)

execute('echo "Hello, World"')

# I see the following prompt, but it's not accepting input
input = raw_input('Enter some more text: ')
print('You entered: ' + input)
I see the same results with subprocess.check_call and subprocess.check_output. If I replace the final raw_input call with a direct read from stdin (sys.stdin.read(10)) the program does accept input.
This is Python 2.7 on Windows 7 x64. Any ideas what I'm doing wrong?
Edit: If I change execute to call something other than plink it seems to work okay.
def execute(command):
    return subprocess.check_call('cmd.exe /C ' + command)
This suggests that plink might be the problem. However, I can run multiple plink commands directly in a console window without issue.
I was able to resolve this by attaching stdin to devnull:
def execute(command):
    return subprocess.check_call('plink.exe -ssh ' + USER + '@' + HOST + ' -pw ' + PASSWD + ' ' + command, stdin=open(os.devnull))
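On Python 3.3+ the same fix can use subprocess.DEVNULL, which avoids leaving the os.devnull file object open; the command below is a portable stand-in for the plink call.

```python
import subprocess
import sys

# Stand-in child that would otherwise try to consume our stdin:
rc = subprocess.check_call([sys.executable, '-c', 'pass'],
                           stdin=subprocess.DEVNULL)
print(rc)  # 0
```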
