AWS Batch: sending logs for a specific command to CloudWatch - Python

I run AWS Batch jobs using a Python script. This script also runs Bash subprocesses.
I use the following function to put messages on stdout:
def printf(msg):
    print(msg, file=sys.stdout)
    sys.stdout.flush()
For example:
printf('Downloading misc inputs')
With this I can send some messages to AWS CloudWatch.
But I would like to know if there is a way to send the log of one specific Bash command to a specific CloudWatch stream?
To execute my Bash commands I use the function:
def exec_cmd(cmd, shell=True):
    printf("Executing %s" % cmd.strip())
    if not shell:
        p = subprocess.Popen(cmd.split())
    else:
        p = subprocess.Popen(cmd, shell=True, executable='/bin/bash')
    err = p.wait()
    return err
And then:
my_command1 = "/usr/local/bin/mytool arg1 arg2 > /ephemaral/output/log_command1.log 2>&1"
Exit_code1 = exec_cmd(my_command1)
my_command2 = "/usr/local/bin/mytool arg1 arg2 > /ephemaral/output/log_command2.log 2>&1"
Exit_code2 = exec_cmd(my_command2)
I want to know if I can send, in real time, the content of log_command1.log to one CloudWatch stream, and the content of log_command2.log to another CloudWatch stream.
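One possible approach (a sketch, not from the original post, assuming boto3 is available and the job role allows logs:CreateLogStream and logs:PutLogEvents; the log group and stream names below are placeholders): instead of redirecting each command's output to a file, read it in Python and push each line to its own CloudWatch stream with put_log_events.
import time
import subprocess
import boto3

logs = boto3.client("logs")

def exec_cmd_to_stream(cmd, log_group, log_stream):
    """Run a Bash command and forward each output line to one CloudWatch stream."""
    logs.create_log_stream(logGroupName=log_group, logStreamName=log_stream)
    p = subprocess.Popen(cmd, shell=True, executable="/bin/bash",
                         stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True)
    token = None
    for line in p.stdout:  # lines arrive as the tool prints them
        # One event per call keeps the sketch simple; batching lines is more efficient.
        kwargs = dict(logGroupName=log_group, logStreamName=log_stream,
                      logEvents=[{"timestamp": int(time.time() * 1000),
                                  "message": line.rstrip("\n")}])
        if token:
            kwargs["sequenceToken"] = token
        token = logs.put_log_events(**kwargs).get("nextSequenceToken")
    return p.wait()

# Hypothetical usage: one stream per command, in an existing log group.
exit_code1 = exec_cmd_to_stream("/usr/local/bin/mytool arg1 arg2",
                                "my-batch-jobs", "command1")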

Related

Python getting output from running shell command - gcloud create dataproc cluster

I'm trying to get the expire-dataproc-tag from running gcloud dataproc clusters create using Python.
I tried subprocess.Popen; the issue, I think, is that the output is an ERROR (or it takes a long time to retrieve the result), so I end up with an empty string.
command and command_1 worked fine; the issue appears when running command_2.
import subprocess
command = "echo hello world"
command_1 = "gcloud compute images list --project {project-id} --no-standard-images"
command_2 = 'gcloud beta dataproc clusters create cluster-name --bucket {bucket} --region europe-west1 --zone europe-west1-b --subnet {subnet} --tags {tag} --project {project-id} --service-account {service-account} --master-machine-type n1-standard-16 --master-boot-disk-size 100 --worker-machine-type n1-standard-1 --worker-boot-disk-size 100 --image {image} --max-idle 2h --metadata enable-oslogin=true --properties {properties} --optional-components=ANACONDA,JUPYTER,ZEPPELIN --enable-component-gateway --single-node --no-address'.split(' ')
process = subprocess.Popen(command_2, stdout=subprocess.PIPE, shell=True)
# process.wait()
try:
    print('inside-try')
    result, err = process.communicate()
    result = result.decode('utf-8')
except Exception as e:
    print('The Error', e)
print('the result: ', result)
print("the-error: ", err)
The output is:
inside-try
ERROR: (gcloud.beta.dataproc.clusters.create) INVALID_ARGUMENT: Dataproc custom image '{image-name}' has expired. Please rebuild this custom image. To extend the custom image expiration date to '2022-02-11T08:29:58.322549Z', please use this cluster property during cluster creation: 'dataproc:dataproc.custom.image.expiration.token=1.{image-name-properties......}'
the result:
the-error: None
I'm trying to get the ERROR: .... output into the result variable (to be printed after the result).
You're not capturing stderr from the process.
Try:
process = subprocess.Popen(
    command,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    shell=True
)
That is why err wasn't being set by result, err = process.communicate().
With the above change, err will contain the error message that you're receiving.
I strongly encourage you to consider using Google's SDKs to interact with its services. Not only are these easier to use but, instead of shipping strings in/out of sub-processes, you can ship Python objects.
Here's the documentation for Creating a Dataproc cluster in Python.
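For reference, a minimal sketch along the lines of that documentation, using the google-cloud-dataproc package (the project, region, cluster name, and machine types below are placeholders, not the full set of flags from command_2):
from google.cloud import dataproc_v1

def create_cluster(project_id, region, cluster_name):
    # Cluster operations require the regional API endpoint.
    client = dataproc_v1.ClusterControllerClient(
        client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
    )
    cluster = {
        "project_id": project_id,
        "cluster_name": cluster_name,
        "config": {
            "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-2"},
            "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-2"},
        },
    }
    operation = client.create_cluster(
        request={"project_id": project_id, "region": region, "cluster": cluster}
    )
    # Errors such as the expired custom image are raised here as Python exceptions
    # instead of arriving as text on stderr.
    result = operation.result()
    print(f"Cluster created: {result.cluster_name}")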

paramiko equivalent of "cat File.gz | ssh addres script.sh" in python 3.7

Command I'm trying to run using paramiko in Python 3.7:
Windows:
type file.ext4.gz | ssh user@address sudo update.sh
Mac:
cat file.ext4.gz | ssh user@address sudo update.sh
From cmd / terminal and from .bat / .sh this works, after entering the password. I've been working on a simple Python GUI (PySimpleGUI) to allow the user to do this, but without the need to enter the password (it is saved from the initial connection).
I've tried:
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(config["IP_ADDRESS"], username=config["USERNAME"], password=config["PASSWORD"], timeout=5)
a = client.open_sftp()
a.put(file_location, "sh update.sh", callback=sent)
While this works to send the file, it doesn't run it and gives the error:
OSError: Failure
I don't want to do this with subprocess, as this tool is meant to prevent the "end user" from using the terminal.
I've been beating my head against this for 2 days now. Thank you.
EDIT:
Here is the STDIO Code:
def send_ssh(value, input=None):
    if input:
        transport = client.get_transport()
        channel = transport.open_session()
        channel.exec_command(value)
        with open(input, "rb") as file:
            for chunk in iter(functools.partial(file.read, read_size), b''):
                if channel.send_ready():
                    channel.sendall(chunk)
                if channel.recv_ready():
                    print(channel.recv(1024).decode().strip())
                if channel.recv_stderr_ready():
                    print(channel.recv_stderr(1024).decode().strip())
        while not channel.exit_status_ready():
            if channel.recv_ready():
                print(channel.recv(1024).decode().strip())
            if channel.recv_stderr_ready():
                print(channel.recv_stderr(1024).decode().strip())
    else:
        w, r, e = client.exec_command(value, get_pty=True)
        error = e.read().strip().decode()
        if error != "":
            return error
        else:
            return r.read().strip().decode()
Once the file is cat'd to the script, it's then verified by the script. I worked around this by just using SFTP to send the file and then running
cat file | sudo script.sh
This works, but it does require transferring a 600 MB file (thankfully always over a local connection (LAN)) each time. The above code does transfer the file, but it doesn't complete. If I just try sending it via for line in file: the file gets corrupted.
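For context, that SFTP-based workaround could look roughly like the sketch below, assuming the same client object from above; the remote path /tmp/file.ext4.gz is a placeholder.
# Sketch of the workaround: upload the image over SFTP, then run the script on it.
sftp = client.open_sftp()
sftp.put("file.ext4.gz", "/tmp/file.ext4.gz")  # remote path is a placeholder
sftp.close()

stdin, stdout, stderr = client.exec_command("cat /tmp/file.ext4.gz | sudo update.sh")
print(stdout.read().decode())
print(stderr.read().decode())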
Keeping things simpler, below we're using threading to allow synchronous APIs to be used rather than needing to write explicit asynchronous code:
import sys
import shutil
from threading import Thread
from paramiko import SSHClient

client = SSHClient()
client.load_system_host_keys()
client.connect('address', username='user')  # placeholder host and user
# here's the important part: we're using the file handles returned by exec_command()
update_stdin, update_stdout, update_stderr = client.exec_command('sudo update.sh')
# copy stdout and stderr from the remote process to our own process's stdout and stderr
t_out = Thread(target=shutil.copyfileobj, args=[update_stdout, sys.stdout]); t_out.start()
t_err = Thread(target=shutil.copyfileobj, args=[update_stderr, sys.stderr]); t_err.start()
# write your local file to the remote stdin, in the foreground: we don't exit until done.
with open('file.ext4.gz', 'rb') as local_file:
    shutil.copyfileobj(local_file, update_stdin)
update_stdin.close()
# optional, but let's be graceful: wait for the threads to exit, and collect exit status
t_out.join(); t_err.join()
result = update_stdout.channel.recv_exit_status()
print(f"Remote process exited with status {result}")

mailutils not working via subprocess.run in python

I am trying to send a mail via Python's subprocess.run method. Unfortunately, it is not working.
import subprocess
message = "Hello World"
process = subprocess.run(["mail", "-s", "Test, "xyz#xyz.com", "<<<", message],
stdout=subprocess.PIPE,
universal_newlines=True)
print (process.stdout)
I received the following Error:
mail: Cannot parse address `<<<' (while expanding `<<<'): Malformed email address
mail: Cannot parse address `Hello World' (while expanding `Hello World'): Malformed email address
The command is working in the shell though (Linux Mint > 19.0).
The <<< syntax is a feature of bash. If you want to use that you need to run your command as an argument of the bash shell:
import subprocess
message = "Hello World"
command = "mail -s Test abc#def.com <<< "+message
process = subprocess.run(
["bash","-c",command],
stdout=subprocess.PIPE,
universal_newlines=True)
print (process.stdout)
However, using shell expansion on dynamic content can be a security issue. A better way is to use the input feature of subprocess.run (Python 3 only):
import subprocess
message = "Hello World"
command = ["mail", "-s", "Test", "abc#def.com"]
process = subprocess.run(
command,
input=message,
stdout=subprocess.PIPE,
universal_newlines=True)
print (process.stdout)
See also Python - How do I pass a string into subprocess.Popen (using the stdin argument)?

How to use os.system (or alternative) to remotely shutdown any system?

I am trying to use the call below to shut down my system, but I would like it to work on all major OS distros. Is there a catch-all shutdown command?
import os
os.system("shutdown /s /t 1")
Is there any other way to shut down a machine remotely through Python code?
For managing nodes remotely, Ansible is a very nice tool: with gathered facts you can detect the current node's OS and then shut it down with the matching command.
The following function provides a portable way of sending commands to a remote host:
import subprocess

def run_shell_remote_command(remote_host, remote_cmd, pem_file=None, ignore_errors=False):
    remote_cmd = remote_cmd.split(' ')
    # SSH_PATH is assumed to point at the ssh binary
    cmd = [SSH_PATH, '-o ConnectTimeout=30', '-o BatchMode=yes', '-o StrictHostKeyChecking=no']
    if pem_file:
        cmd.extend(['-i', pem_file])
    cmd.extend([remote_host] + remote_cmd)
    print(f"SSH CMD: {cmd}")
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout, stderr = p.communicate()
    if p.returncode != 0:
        if not ignore_errors:
            raise RuntimeError("%r failed, status code %s stdout %r stderr %r" % (
                remote_cmd, p.returncode, stdout, stderr))
    return stdout.strip()  # This is the stdout from the shell command
That way you can run any command on the remote host that is supported by the remote OS.
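As a hypothetical usage for the original question (the host names are placeholders; the commands assume a Linux target with passwordless sudo and a Windows target running an SSH server):
# Pick the shutdown command that matches the remote OS.
run_shell_remote_command("admin@linux-host", "sudo shutdown -h now")
run_shell_remote_command("admin@windows-host", "shutdown /s /t 1")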

subprocess timeout when exceeds a certain time limit

I'm using the subprocess module of Python to run an SSH command over my servers to collect their disk usage. The one thing I'm stuck on is that if SSH is not configured on a server, subprocess prompts for a password, which makes my whole script hang and I have to kill the script myself. I just want it to skip all the servers that ask for a password prompt (where SSH is not configured) and continue processing the rest.
def MyFunction(server):
    msg = ""
    ps = subprocess.Popen("ssh -l mygroup %s 'df -k /some/directory'" % server,
                          stdout=subprocess.PIPE, shell=True)
    out, err = ps.communicate()
    if err != None:
        msg += "\n" + err
    else:
        msg = out
    return msg

server_list = ['server A', 'server B', 'server C', 'server D']
for server in server_list:
    Final_msg += MyFunction(server)
Any help would be appreciated! :)
If it's just that you want to avoid ssh asking you for anything, then you can forbid it from doing so.
You can use the SSH option
BatchMode
If set to “yes”, passphrase/password querying will be disabled.
This option is useful in scripts and other batch jobs where no user is present to supply the password.
The argument must be “yes” or “no”. The default is “no”.
So just add -o BatchMode=yes:
ps = subprocess.Popen("ssh -o BatchMode=yes -l mygroup %s 'df -k /some/directory'" % server, stdout=subprocess.PIPE, shell=True)
BTW, why do you need shell=True here? Better do
ps = subprocess.Popen(["ssh", "-o", "BatchMode=yes", "-l", "mygroup", server, "df -k /some/directory"], stdout=subprocess.PIPE)
as it is cleaner, safer and internally simpler.
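Putting both points together, MyFunction could look roughly like the sketch below; it also captures stderr, so hosts that refuse the connection contribute an error message instead of hanging.
import subprocess

def MyFunction(server):
    ps = subprocess.Popen(
        ["ssh", "-o", "BatchMode=yes", "-l", "mygroup", server,
         "df -k /some/directory"],
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        universal_newlines=True,
    )
    out, err = ps.communicate()
    if ps.returncode != 0:
        # With BatchMode, ssh exits non-zero instead of prompting for a password.
        return "\n" + err
    return out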
