Running a Python script on EC2 via AWS Lambda - python

I am using the paramiko package in Lambda to run a Python script on EC2 and get the output back in Lambda. I was under the impression that when I run the Python script from Lambda, the script gets executed on EC2 and returns the output to Lambda. But this is not happening. I installed pandas on my EC2 instance and ran a simple Python script with import pandas. Lambda gives me an error that the pandas module is not found. But why does Lambda need the pandas module? Shouldn't it just take the output from EC2?
Below is my Lambda function:
import json
import boto3
import paramiko

def lambda_handler(event, context):
    # boto3 clients
    client = boto3.client('ec2')
    s3_client = boto3.client('s3')

    # getting instance information
    describeInstance = client.describe_instances()

    hostPublicIP = ["59.53.239.242"]
    # fetching the public IP address of the running instances
    # for i in describeInstance['Reservations']:
    #     for instance in i['Instances']:
    #         if instance["State"]["Name"] == "running":
    #             hostPublicIP.append(instance['PublicIpAddress'])
    print(hostPublicIP)

    # downloading the pem file from S3
    s3_client.download_file('paramiko', 'ec2key.pem', '/tmp/file.pem')

    # reading the pem file and creating a key object
    key = paramiko.RSAKey.from_private_key_file("/tmp/file.pem")
    # an instance of paramiko.SSHClient
    ssh_client = paramiko.SSHClient()
    # setting policy to connect to unknown hosts
    ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())

    host = hostPublicIP[0]
    print("Connecting to : " + host)
    # connecting to the server
    ssh_client.connect(hostname=host, username="ubuntu", pkey=key)
    print("Connected to : " + host)

    # command list
    commands = ["python3 test.py"]
    # executing the list of commands on the server
    for command in commands:
        print(f"Executing {command}")
        stdin, stdout, stderr = ssh_client.exec_command(command)
        print(stdout.read())
        print(stderr.read())

    return {
        'statusCode': 200,
        'body': json.dumps('Thanks!')
    }
Below is my test.py
import pandas
result = 2+2
print(result)

Related

Azure HTTP Function works locally but not on Azure / Azure URL gives "website can't be found"

I'm working on an Azure HTTP function. What I'm trying to achieve is:
the Azure function (Python-based), after being called via URL, shall connect to a Linux VPS, execute a command, and return the response from the VPS.
It does exactly that when run on localhost via Visual Studio Code.
Then exactly the same code is uploaded via an Azure Pipeline, which runs without issues.
However, after calling the function via the Azure URL it gives a 404 error.
The function IS enabled and the code is uploaded successfully and can be seen in the 'Code + Test' section.
import logging
import sys

import azure.functions as func
import paramiko


def execute_command_on_remote_machine(ip_addr, command):
    try:
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(str(ip_addr), username='username', password='password')
        chan = client.get_transport().open_session()
        # chan.get_pty()
        stdin, stdout, stderr = client.exec_command(command, get_pty=True)
        # err_list = [line for line in stderr.read().splitlines()]
        out_list = [line for line in stdout.read().splitlines()]
        client.close()
        return out_list
    except Exception as e:
        print(str(e))
        sys.exit(1)


def main(req: func.HttpRequest) -> func.HttpResponse:
    output = execute_command_on_remote_machine("IP", "sh /home/script.sh")
    logging.info(str(output))
    return func.HttpResponse(str(output))

How to pass a command-line ssh parameter with Paramiko?

I'm trying to migrate from using Popen to directly run an ssh command to using Paramiko instead, because my code is moving to an environment where the ssh command won't be available.
The current invocation of the command is passed a parameter that is then used by the remote server. In other words:
process = subprocess.Popen(['ssh', '-T', '-i<path to PEM file>', 'user@host', 'parameter'])
and authorized_keys on the remote server has:
command="/home/user/run_this_script.sh $SSH_ORIGINAL_COMMAND", ssh-rsa AAAA...
So I've written the following code to try and emulate that behaviour:
def ssh(host, user, key, timeout):
    """ Connect to the defined SSH host. """

    # Start by converting the (private) key into a RSAKey object. Use
    # StringIO to fake a file ...
    keyfile = io.StringIO(key)
    ssh_key = paramiko.RSAKey.from_private_key(keyfile)
    host_key = paramiko.RSAKey(data=base64.b64decode(HOST_KEYS[host]))

    client = paramiko.SSHClient()
    client.get_host_keys().add(host, "ssh-rsa", host_key)
    print("Connecting to %s" % host)
    client.connect(host, username=user, pkey=ssh_key, allow_agent=False, look_for_keys=False)

    channel = client.invoke_shell()

    # ... code here to receive the data back from the remote host. Removed for relevancy.

    client.close()
What do I need to change in order to pass a parameter to the remote host so that it uses it as $SSH_ORIGINAL_COMMAND?
From an SSH perspective, what you are doing is not passing a parameter but simply executing a command. The fact that the "command" is, in the end, injected as a parameter into some script is irrelevant from the client's perspective.
So use a standard Paramiko code for executing commands:
Python Paramiko - Run command
(stdin, stdout, stderr) = s.exec_command('parameter')
# ... read/process the command output/results
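For example, here is a minimal sketch of how the ssh() helper from the question could be reworked to send the parameter with exec_command so that the server-side forced command receives it as $SSH_ORIGINAL_COMMAND. The name ssh_run and the host_key_b64 argument are assumptions introduced for illustration, not part of either post:
import base64
import io

import paramiko


def ssh_run(host, user, key, host_key_b64, parameter):
    """Send 'parameter' as the SSH command; the remote forced command
    receives it in $SSH_ORIGINAL_COMMAND."""
    ssh_key = paramiko.RSAKey.from_private_key(io.StringIO(key))
    host_key = paramiko.RSAKey(data=base64.b64decode(host_key_b64))

    client = paramiko.SSHClient()
    client.get_host_keys().add(host, "ssh-rsa", host_key)
    client.connect(host, username=user, pkey=ssh_key,
                   allow_agent=False, look_for_keys=False)

    # exec_command sends the string as the SSH "command" for this session,
    # which is what the forced command sees as $SSH_ORIGINAL_COMMAND.
    stdin, stdout, stderr = client.exec_command(parameter)
    output = stdout.read().decode()
    errors = stderr.read().decode()
    client.close()
    return output, errors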

Kubernetes REST API call using python

I'm looking for a way to find a pod by name and run a REST call against it using Python. I thought of using port forwarding and the Kubernetes client.
Can someone share a code sample or any other way to do it?
Here is what I started to do:
from kubernetes import client, config

config.load_kube_config(config_file="my file")
client = client.CoreV1Api()
pods = client.list_namespaced_pod(namespace="my namespace")
# loop pods to find my pod
Next I thought of using:
stream(client.connect_get_namespaced_pod_portforward_with_http_info ...
With the kubectl command-line tool I do the following:
1. List pods
2. Open a port forward
3. Use curl to perform the REST call
I want to do the same in Python.
List all the pods:
from kubernetes import client, config

# Configs can be set in Configuration class directly or using helper utility
config.load_kube_config()

v1 = client.CoreV1Api()
print("Listing pods with their IPs:")
ret = v1.list_pod_for_all_namespaces(watch=False)
for i in ret.items:
    print("%s\t%s\t%s" % (i.status.pod_ip, i.metadata.namespace, i.metadata.name))
Then, in the for loop above, you can check your pod name; if it matches, return it, as in the sketch below.
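For instance, a minimal sketch of that name check; the pod name my-pod and the default namespace are hypothetical placeholders:
from kubernetes import client, config

config.load_kube_config()
v1 = client.CoreV1Api()


def find_pod_by_name(name, namespace="default"):
    """Return the first pod whose metadata.name matches, or None."""
    for pod in v1.list_namespaced_pod(namespace=namespace).items:
        if pod.metadata.name == name:
            return pod
    return None


pod = find_pod_by_name("my-pod")  # "my-pod" is a made-up name
if pod is not None:
    print(pod.status.pod_ip)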
Calling your pod directly through the Kubernetes API is very un-Kubernetes (and un-container-like). You are coupling a microservice with a deployment technology. You should configure a Service and call it with a standard Python REST call.
If you are calling from inside the cluster, use the service name for the URL domain. If you are calling from outside the cluster, use the cluster IP, e.g. with Docker http://localhost:8000/, with the same code and a different argument.
Make sure you configure the Service to expose the port outside correctly.
Like so:
#!/usr/bin/env python
import sys
import json
import requests


def call_service(outside_call=True, protocol='http', domain='localhost', service_name='whisperer',
                 service_port='8002', index='hello', payload=None, headers=None):
    if outside_call:
        url = f'{protocol}://{domain}:{service_port}/{index}'
    else:
        url = f'{protocol}://{service_name}:{service_port}/{index}'

    try:
        g = requests.get(url=url)
        print(f'a is: {g}')
        r = requests.post(f'{url}', data=json.dumps(payload), headers=headers)
        print(f'The text returned from the server: {r.text}')
        return r.text
        # return json.loads(r.content)
    except Exception as e:
        raise Exception(f"Error occurred while trying to call service: {e}")


if __name__ == "__main__":
    args = sys.argv
    l = len(args)
    if l > 1:
        outside_call = True if args[1] == 'true' else False
    else:
        outside_call = False

    a = call_service(payload={'key': 'value'}, outside_call=outside_call)
    print(f'a is: {a}')

Cannot SCP file to AWS using boto

I'm trying to automate some uploading with Python to my AWS EC2 server. I cannot get ssh_client.put_file() to work. It keeps giving me either IOError: Failure or IOError: [Errno 2] No such file.
Any ideas as to what I'm missing? Can this ssh_client not be used for an SCP upload?
import boto
import boto.ec2
from boto.manage.cmdshell import sshclient_from_instance
import argparse

# Parse input
parser = argparse.ArgumentParser(description='Upload and train images for detection')
parser.add_argument('path_to_key', help='Path to Pem key')
parser.add_argument('path_to_tar', help='Path to positives.tar')
args = parser.parse_args()

args_keypath = args.path_to_key
args_tarpath = args.path_to_tar

# Connect to your region of choice
print "Connecting to server..."
access_key = ""
secret_access_key = ""
conn = boto.ec2.connect_to_region('us-east-1',
                                  aws_access_key_id=access_key,
                                  aws_secret_access_key=secret_access_key)

print "Connecting to instance..."
# Connect to an existing instance
reservations = conn.get_all_instances(['i-c8aab576'])
instance = reservations[0].instances[0]

# Create an SSH client for our instance
#   key_path is the path to the SSH private key associated with instance
#   user_name is the user to login as on the instance (e.g. ubuntu, ec2-user, etc.)
print "Creating CommandShell..."
key_path = args_keypath
ssh_client = boto.manage.cmdshell.sshclient_from_instance(instance,
                                                          key_path,
                                                          host_key_file='~/.ssh/known_hosts',
                                                          user_name='ubuntu')
status, stdout, stderr = ssh_client.run('ls -al')
print(status)
print(stdout)
print(stderr)

# Upload positives - WELL THIS ISN'T WORKING
print "Uploading file..."
local_filepath = args_tarpath
remote_filepath = "~/Sharing/"
ssh_client.put_file("/home/willem/.ssh/test.txt", "/home/ubuntu/Sharing/")
# ssh_client.put_file(local_filepath, remote_filepath)
If you have SSH login access, you can use the .pem and run the scp command locally; my solution was:
Create the server by creating the reservation:
reservation = conn.run_instances(my_AMI,
                                 key_name=my_key,
                                 instance_type='c4.xlarge',
                                 security_group_ids=security_group,
                                 placement='us-east-1d')

instance = reservation.instances[0]
print colored("Instance IP: %s" % instance.ip_address, 'yellow')
Then later I could scp the file:
instance_IP = instance.ip_address
os.system('scp -i %s %s ubuntu@%s:~/Sharing' % (key_path, args_tarpath, instance_IP))
One simple solution: specify the file name in the destination path as well. Boto in turn uses the paramiko module in which sftp.put requires the file name to be specified.
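For example, applied to the call from the question, the fix would look something like this (whether this alone resolves the error also depends on the remote directory existing):
# Give put_file the full destination path, including the file name,
# instead of just the target directory.
ssh_client.put_file("/home/willem/.ssh/test.txt", "/home/ubuntu/Sharing/test.txt")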

Amazon Web Service/Boto: Upload and execute remote python/bash script via SSH on localhost

I am able to fire up an AWS Ubuntu EC2 instance with boto. Has anyone tried to upload a script to the remote Ubuntu EC2 instances (more than one) and execute the script via SSH locally?
The main objective is to automate the whole process using a Python script written on localhost. Is there an alternative way, or Amazon API tools, to make this possible?
I'd recommend Fabric; it's made for this kind of thing.
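As a rough illustration only, a minimal sketch using the Fabric 2.x API; the host, key path, file names, and command are placeholders, not taken from the question:
# pip install fabric
from fabric import Connection

conn = Connection(
    host="203.0.113.10",                                      # placeholder EC2 public IP
    user="ubuntu",
    connect_kwargs={"key_filename": "/path/to/ec2key.pem"},   # placeholder .pem path
)

conn.put("script.py", remote="/home/ubuntu/script.py")   # upload the script
result = conn.run("python3 /home/ubuntu/script.py")      # execute it over SSH
print(result.stdout)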
Use the paramiko API.
Here is Paramiko code to connect over SFTP to a remote AWS EC2 instance from Python:
import paramiko

AWS_KEY_PEM = 'ec2key.pem'  # default .pem key file


# Wrapped in a function so the fragment's parameters and 'return sftp' are defined.
def create_sftp_client(host, port, username, keyfilepath='', keyfiletype='RSA',
                       contype='sftp', isprint=False):
    sftp, transport = None, None
    try:
        if keyfilepath == '':
            keyfilepath = AWS_KEY_PEM
        if keyfiletype == 'DSA':
            key = paramiko.DSSKey.from_private_key_file(keyfilepath)
        else:
            key = paramiko.RSAKey.from_private_key_file(keyfilepath)

        if contype == 'sftp':
            transport = paramiko.Transport((host, port))
            transport.connect(None, username, pkey=key)
            sftp = paramiko.SFTPClient.from_transport(transport)
            if isprint:
                print('Root Directory :\n ', sftp.listdir())
            return sftp
    except Exception as e:
        print('An error occurred creating client: %s: %s' % (e.__class__, e))
        if sftp is not None:
            sftp.close()
        if transport is not None:
            transport.close()
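A hedged usage sketch for the snippet above; the host, port, key path, and file names are placeholders introduced here, not part of the original answer:
# Example usage of create_sftp_client from the snippet above.
sftp = create_sftp_client('203.0.113.10', 22, 'ubuntu', keyfilepath='/tmp/ec2key.pem')
if sftp is not None:
    sftp.put('local_script.py', '/home/ubuntu/local_script.py')  # upload a file
    print(sftp.listdir('/home/ubuntu'))                          # list the remote directory
    sftp.close()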
