Send command to Amazon EC2 instance using Boto3 in Python - python

I am able to create an instance from an image on Amazon EC2 using boto3 using the following code:
import boto3

ec2 = boto3.client('ec2', region_name='eu-west-2')
instance = ec2.run_instances(
    ImageId='ami-011c936382e4e2g9c',
    MinCount=1,
    MaxCount=1,
    InstanceType='t2.micro',
    SecurityGroupIds=['sg-0c08ad7b130e3hf3'],
)
id = str(instance['Instances'][0]['InstanceId'])
This works fine, but I then wish to send a simple command to the instance which will execute a Python script stored on the instance. From what I can gather, boto3 has the AWS command line functionality built in, so I shouldn't have to SSH into the instance; I should be able to send a command through boto3. However, I'm struggling to do this, after trying different variations of the code below:
client = boto3.client('ssm', region_name='eu-west-2')
commands = ['echo "hello world" > hello.txt']  # this would be replaced by the command to execute the python script
instance_id = [id]  # id being the instance id established from above
response = client.send_command(
    DocumentName='AWS-RunShellScript',
    Parameters={'commands': commands},
    InstanceIds=instance_id,
)
I'm aware that it takes time for the server to fire up, etc., but this isn't the problem. I have executed this second chunk of code after a large delay, when I know the server is indeed ready to go.
As mentioned, I think this might be to do with the .pem file that I normally need to use in order to PuTTY/SSH into an instance, as this isn't configured anywhere in my code. Any clues would be greatly appreciated!
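For what it's worth, SSM's send_command does not use the .pem key at all; the usual blocker is that the target instance must be running the SSM agent and have an IAM instance profile that allows SSM (for example the AmazonSSMManagedInstanceCore managed policy). A minimal sketch of the corrected flow, with the instance ID, command and region as placeholders:

```python
def build_parameters(commands):
    """Build the Parameters mapping expected by the AWS-RunShellScript document."""
    return {'commands': commands}

def run_commands(instance_id, commands, region='eu-west-2'):
    # Sketch only: assumes the instance runs the SSM agent and has an
    # instance profile permitting SSM (not a .pem key).
    import time
    import boto3  # imported lazily so the sketch reads without boto3 installed

    ssm = boto3.client('ssm', region_name=region)
    response = ssm.send_command(
        InstanceIds=[instance_id],
        DocumentName='AWS-RunShellScript',
        Parameters=build_parameters(commands),
    )
    command_id = response['Command']['CommandId']
    time.sleep(2)  # give the agent a moment before polling for the result
    result = ssm.get_command_invocation(CommandId=command_id,
                                        InstanceId=instance_id)
    return result.get('StandardOutputContent', '')
```

The output of the command on the instance comes back through get_command_invocation, so there is no need to SSH in to see the result.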

Related

How to pass userdata script on lambda python while launching ec2 instance?

I am trying to launch instance using aws lambda using python, but I cannot pass my base64 encoded userdata script.
The script looks like this:
import os
import boto3

AMI = "ami-052efd3df9dad4825"
INSTANCE_TYPE = "c6a.32xlarge"

ec2 = boto3.resource('ec2')

def lambda_handler(event, context):
    instance = ec2.create_instances(
        ImageId=AMI,
        InstanceType=INSTANCE_TYPE,
        MaxCount=1,
        MinCount=1,
        UserData=*my script here*,
    )
    print("New instance created:", instance[0].id)
You should be able to specify it without needing to base64 encode it, such as:
user_data = '''#!/bin/bash
echo 'test' > /home/ec2/test.txt
'''
Make sure the first line starts with a #!.
See also: EC2 User Data not working via python boto command (It's old, but shows examples.)
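Putting the two pieces together, a minimal sketch (the AMI, instance type and file path are placeholders carried over from the question; boto3 base64-encodes a plain UserData string for you):

```python
USER_DATA = """#!/bin/bash
echo 'test' > /home/ec2/test.txt
"""

def launch(ami="ami-052efd3df9dad4825", instance_type="t3.micro"):
    import boto3  # imported lazily so the sketch reads without boto3 installed

    ec2 = boto3.resource('ec2')
    instances = ec2.create_instances(
        ImageId=ami,
        InstanceType=instance_type,
        MinCount=1,
        MaxCount=1,
        UserData=USER_DATA,  # plain string; boto3 base64-encodes it for you
    )
    return instances[0].id
```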

sqs boto3: The address 'https://us-west-2.queue.amazonaws.com/xxxx/my-name' is not valid for this endpoint

I'm having a very hard time trying to find out how to correctly configure sqs in boto3 to be able to send messages to my sqs queue. It looks like there is some confusion around boto3 and legacy endpoints but I'm getting the error message The address 'https://us-west-2.queue.amazonaws.com/xxxx/my-name' is not valid for this endpoint. for each permutation of the config I can imagine. Here's the code.
# Tried both of these
sqs_queue_url = 'https://sqs.us-west-2.amazonaws.com/xxxx/my-queue'
sqs_queue_url = 'https://us-west-2.queue.amazonaws.com/xxxx/my-queue'
# Tried both of these
sqs = boto3.client("sqs", endpoint_url="https://sqs.us-west-2.amazonaws.com")
sqs = boto3.client("sqs")
# _endpoint updates
logger.info("sqs endpoint: %s", sqs._endpoint)
# Keeps failing
sqs.send_message(QueueUrl=sqs_queue_url, MessageBody=message_json)
I'm hoping this is a silly mistake. What config am I missing?
From the docs, the AWS CLI and the SDK for Python use legacy endpoints:
If you use the AWS CLI or SDK for Python, you can use the following legacy endpoints.
Also, when you set the endpoint you need to include the https:// scheme:
sqs = boto3.client("sqs", endpoint_url="https://us-west-2.queue.amazonaws.com")
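A sketch of how this can fit together (queue name and region are placeholders; looking the queue URL up via get_queue_url avoids hand-building a URL that disagrees with the endpoint the client resolved):

```python
def legacy_sqs_endpoint(region):
    """Legacy-style SQS endpoint form the CLI/Python SDK accept, per the AWS docs."""
    return f"https://{region}.queue.amazonaws.com"

def send(queue_name, body, region="us-west-2"):
    import boto3  # imported lazily so the sketch reads without boto3 installed

    sqs = boto3.client("sqs", endpoint_url=legacy_sqs_endpoint(region))
    # Look the queue URL up rather than hand-building it, so it always
    # matches whatever endpoint the client is actually using.
    url = sqs.get_queue_url(QueueName=queue_name)["QueueUrl"]
    return sqs.send_message(QueueUrl=url, MessageBody=body)
```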

boto3 lambda script to shutdown RDS not working

I'm just starting out with boto3 and Lambda and was trying to run the below function via PyCharm.
import boto3

client = boto3.client('rds')
response = client.stop_db_instance(
    DBInstanceIdentifier='dummy-mysql-rds'
)
But I receive the below error:
botocore.errorfactory.DBInstanceNotFoundFault: An error occurred (DBInstanceNotFound) when calling the StopDBInstance operation: DBInstance dummy-mysql-rds not found.
Do you know what may be causing this?
For the record, I have the AWS Toolkit installed for PyCharm and can run simple functions to list and describe EC2 instances, and my AWS profile has admin access.
By explicitly defining the profile name, the below function now works via PyCharm. Thank you #OleksiiDonoha for your help in getting this resolved.
import boto3

boto3.setup_default_session(profile_name='dev')  # returns None; it just sets the default session
client = boto3.client('rds')
response = client.stop_db_instance(
    DBInstanceIdentifier='dev-mysql-rds'
)
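An equivalent, and arguably tidier, approach is a dedicated Session object instead of mutating the module-level default; profile, region and identifier here are placeholders:

```python
def stop_instance(identifier, profile="dev", region="eu-west-2"):
    # Sketch only: a per-call Session avoids global state, which helps when
    # a script has to touch more than one profile.
    from boto3.session import Session  # lazy import, same reason as above

    session = Session(profile_name=profile, region_name=region)
    rds = session.client("rds")
    return rds.stop_db_instance(DBInstanceIdentifier=identifier)
```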

Boto3 Error in AWS SDK: botocore.exceptions.NoCredentialsError: Unable to locate credentials

When I simply run the following code, I always get this error.
import boto3 as boto
import sys
import json

role_to_assume_arn = "arn:aws:iam::xxxxxxxxxxxx:role/AWSxxxx_xxxxxxAdminaccess_xxxxx24fexxx"
role_session_name = 'AssumeRoleSession1'

sts_client = boto.client('sts')
assumed_role_object = sts_client.assume_role(
    RoleArn=role_to_assume_arn,
    RoleSessionName=role_session_name,
)
creds = assumed_role_object['Credentials']
sts_assumed_role = boto.client('sts',
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken'],
)
rds_client = boto.client('rds',
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken'],
)
I don't want to set and change the temporary session keys frequently; instead, I want them to be set directly in code, like I've just written.
Am I wrong? Is there a way to set the credentials like this directly in the program or not?
Or is it mandatory to put the credentials in ~/.aws/credentials?
I assume you are running this code on your local machine.
The STS client you created is expecting an access key and a secret access key.
You have to either configure them using the credentials file, or you can hardcode your access key and secret access key directly, like below (not recommended):
client = boto3.client('sts', aws_access_key_id=key, aws_secret_access_key=sec_key, region_name=region_name)
https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/guide_credentials_profiles.html
If you are running this code on an EC2 instance, install boto3 and run aws configure. Follow the link below.
https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html
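Once the base credentials exist somewhere boto3 can find them, the assume-role pattern from the question can be condensed into a helper; the helper names below are illustrative, not part of boto3:

```python
def client_kwargs(credentials):
    """Map the Credentials dict returned by sts.assume_role onto the
    keyword arguments boto3.client() expects."""
    return {
        "aws_access_key_id": credentials["AccessKeyId"],
        "aws_secret_access_key": credentials["SecretAccessKey"],
        "aws_session_token": credentials["SessionToken"],
    }

def assumed_client(service, role_arn, session_name="AssumeRoleSession1"):
    import boto3  # imported lazily so the sketch reads without boto3 installed

    # This first client call is the one that needs base credentials
    # (env vars, ~/.aws/credentials, or an instance role).
    sts = boto3.client("sts")
    creds = sts.assume_role(RoleArn=role_arn,
                            RoleSessionName=session_name)["Credentials"]
    return boto3.client(service, **client_kwargs(creds))
```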

"TypeError: expected string, tuple found" when passing aws credentials to amazon client constructor

I have a python script that calls the Amazon SES api using boto3. It works when I create the client like this client = boto3.client('ses') and allow the aws credentials to come from ~/.aws/credentials, but I wanted to pass the aws_access_key_id and aws_secret_access_key into the constructor somehow.
I thought I had found somewhere that said it was acceptable to do something like this
client = boto3.client(
    'ses',
    aws_access_key_id=kwargs['aws_access_key_id'],
    aws_secret_access_key=kwargs['aws_secret_access_key'],
    region_name=kwargs['region_name']
)
but then when I try to send an email, it tells me that there is a TypeError: sequence item 0: expected string, tuple found when it tries to return '/'.join(scope) in botocore/auth.py (line 276).
I know it's a bit of a long shot, but I was hoping someone had an idea of how I can pass these credentials to the client from somewhere other than the aws credentials file. I also have the full stack trace from the error, if that's helpful I can post it as well. I just didn't want to clutter up the question initially.
You need to configure your connection info elsewhere and then connect using:
client = boto3.client('ses', AWS_REGION)
An alternative way, using Session can be done like this:
from boto3.session import Session

# create boto session
session = Session(
    aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
    aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
    region_name=settings.AWS_REGION
)

# make connection
client = session.client('s3')
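As an aside, one common way to end up with a tuple inside the signing code is a stray trailing comma turning a credential or region value into a 1-tuple. A hypothetical guard (the function name is illustrative, not part of boto3) that fails fast before the client is ever built:

```python
def validate_client_kwargs(**kwargs):
    """Fail fast if a credential or region value is accidentally a tuple
    (e.g. from a stray trailing comma) instead of a string."""
    for key, value in kwargs.items():
        if not isinstance(value, str):
            raise TypeError(
                f"{key} must be a string, got {type(value).__name__}")
    return kwargs

region = 'us-west-2',  # stray trailing comma: this is a 1-tuple, not a string
```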
