Cannot connect to EC2 using Python boto

I'm a complete noob with Python and boto, and I'm trying to establish a basic connection to EC2.
I'm running the following code:
import boto

ec2Conn = boto.connect_ec2('username', 'password')
group_name = 'python_central'
description = 'Python Central: Test Security Group.'
group = ec2Conn.create_security_group(group_name, description)
group.authorize('tcp', 8888,8888, '0.0.0.0/0')
and getting the following error:
AWS was not able to validate the provided access credentials
I've read some posts saying that this might be due to a time difference between my machine and the EC2 server, but according to the logs they are the same:
host:ec2.us-east-1.amazonaws.com
x-amz-date:20161213T192005Z
host;x-amz-date
515db222f793e7f96aa93818abf3891c7fd858f6b1b9596f20551dcddd5ca1be
2016-12-13 19:20:05,132 boto [DEBUG]:StringToSign:
Any idea how to get this connection running?
Thanks!

Calls made to the AWS API require authentication via an Access Key and Secret Key. These can be obtained from the Identity and Access Management (IAM) console, under the Security Credentials tab for a user. Note that these are not the same as an IAM username and password, which cannot be used for API calls.
See: Getting Your Access Key ID and Secret Access Key
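As an illustration, here is a minimal sketch using the legacy boto 2 library from the question; the key values and region are placeholders, and boto.ec2.connect_to_region is one way to obtain a regional connection:
import boto.ec2

# Placeholders: use the Access Key ID / Secret Access Key from the
# IAM console, not an IAM username and password.
ACCESS_KEY_ID = 'YOUR_ACCESS_KEY_ID'
SECRET_ACCESS_KEY = 'YOUR_SECRET_ACCESS_KEY'

ec2Conn = boto.ec2.connect_to_region(
    'us-east-1',
    aws_access_key_id=ACCESS_KEY_ID,
    aws_secret_access_key=SECRET_ACCESS_KEY,
)

group = ec2Conn.create_security_group('python_central',
                                      'Python Central: Test Security Group.')
group.authorize('tcp', 8888, 8888, '0.0.0.0/0')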
If you are unfamiliar with Python, you might find it easier to call AWS services by using the AWS Command-Line Interface (CLI). For example, this single-line command can launch an Amazon EC2 instance:
aws ec2 run-instances --image-id ami-c2d687ad --key-name joe --security-group-id sg-23cb34f6 --instance-type t1.micro
See: AWS CLI run-instances documentation

Related

ManagedIdentityCredential authentication unavailable, no managed identity endpoint found

I'm trying to allow an App Service (Python) to get secrets from Azure Key Vault without hardcoded client IDs/secrets, so I'm trying to use Managed Identity.
I have enabled system- and user-assigned managed identities on my App Service.
I have created an access policy in the vault that grants the App Service access to the secrets.
code:
from azure.identity import ManagedIdentityCredential
from azure.keyvault.secrets import SecretClient

credentials_object = ManagedIdentityCredential()
client = SecretClient(vault_url=VAULT_URL, credential=credentials_object)
value = client.get_secret('MYKEY').value
error (when app is deployed and when running locally):
azure.identity._exceptions.CredentialUnavailableError: ManagedIdentityCredential authentication unavailable, no managed identity endpoint found.
What am I missing?
Thank you!
It's important to understand that the Managed Identity feature in Azure is ONLY available when, in this case, the App Service is actually deployed to Azure; there is no managed identity endpoint on a local machine, which is exactly what the error says. This means you probably want to use DefaultAzureCredential() from the Azure.Identity library, which works both when running locally and for the deployed web app.
This class runs down a hierarchy of possible authentication methods, and when running locally I prefer to use a service principal, which can be created by running the following in the Azure CLI: az ad sp create-for-rbac --name localtest-sp-rbac --skip-assignment. You then grant the service principal localtest-sp-rbac access in the IAM for the required Azure services.
I recommend reading this article for more information and how to configure your local environment: https://learn.microsoft.com/en-us/azure/developer/python/configure-local-development-environment
You can see the list of credential types that DefaultAzureCredential() goes through in the Azure docs.
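A minimal sketch of that approach, assuming VAULT_URL points at your own Key Vault (when running locally, DefaultAzureCredential can pick up a service principal from the AZURE_CLIENT_ID, AZURE_TENANT_ID and AZURE_CLIENT_SECRET environment variables):
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = 'https://<your-vault-name>.vault.azure.net'  # placeholder

# Tries environment (service principal), managed identity (when deployed),
# Azure CLI login, etc., in order, and uses the first one that works.
credential = DefaultAzureCredential()
client = SecretClient(vault_url=VAULT_URL, credential=credential)
value = client.get_secret('MYKEY').value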
In my case, the issue was having multiple Managed Identities attached to my VMs. I was trying to access an Azure Storage Account from AKS using ManagedIdentityCredential. When I specified the client_id of the MI as:
credentials_object = ManagedIdentityCredential(client_id='XXXXXXXXXXXX')
it started to work! It's also mentioned here that we need to specify the client_id of the MI if the VM or VMSS has multiple identities attached to it.

How to correctly/safely access parameters from the AWS SSM Parameter Store for my Python script on an EC2 instance?

I have a Python script that I want to run and text me a notification if a certain condition is met. I'm using Twilio, so I have a Twilio API token and I want to keep it secret. I have it successfully running locally, and now I'm working on getting it running on an EC2 instance.
Regarding AWS steps, I've created an IAM user with permissions, launched the EC2 instance (and saved the ssh keys), and created some parameters in the AWS SSM Parameter store. Then I ssh'd into the instance and installed boto3. When I try to use boto3 to grab a parameter, I'm unable to locate the credentials:
# test.py
import boto3
ssm = boto3.client('ssm', region_name='us-west-1')
secret = ssm.get_parameter(Name='/test/cli-parameter')
print(secret)
# running the file in the console
>> python test.py
...
raise NoCredentialsError
botocore.exceptions.NoCredentialsError: Unable to locate credentials
I'm pretty sure this means it can't find the credentials that were created when I ran aws configure, which wrote the .aws/credentials file. I believe this is because I ran aws configure on my local machine rather than while ssh'd into the instance. I did that deliberately, to keep my AWS ID and secret key off the EC2 instance, because I thought I'm supposed to keep them private and not put tokens/keys on the instance.
I think I could solve the issue by running aws configure while ssh'd into the instance, but I want to understand what happens if there's a .aws/credentials file on the actual EC2 instance, and whether or not that is dangerous. I'm just not sure how this is all supposed to be structured, or what a safe/correct way of running my script and accessing secret variables looks like.
Any insight at all is helpful!
I suspect the answer you're looking for looks something like:
Create an IAM policy that allows access to the SSM parameter (or consider AWS Secrets Manager for this).
Attach that IAM policy to a role.
Attach the role to your EC2 instance (an instance profile).
boto3 will then automatically obtain temporary credentials from the instance metadata service whenever it needs to talk to the Parameter Store, as in the sketch below.
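Once the instance profile is attached, your original script works unchanged, with no aws configure and no credentials file on the instance. A minimal sketch, assuming the parameter may be a SecureString:
import boto3

# No keys anywhere: boto3 obtains temporary credentials for the attached
# role from the instance metadata service automatically.
ssm = boto3.client('ssm', region_name='us-west-1')
response = ssm.get_parameter(
    Name='/test/cli-parameter',
    WithDecryption=True,  # needed if the parameter is a SecureString
)
print(response['Parameter']['Value'])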

How to instantiate an AWS Linux instance using the Python API?

1) Launch an Amazon Linux micro instance using the AWS Python API (including authentication to AWS)
2) Update the instance with tags: customer=ACME, environment=PROD
3) Assign a security group to the instance
To program in Python on AWS, you should use the boto3 library.
You will need to do the following:
supply credentials to the library (link)
create an EC2 client (link)
use the EC2 client to launch EC2 instances using run_instances (link)
You can specify both tags and security groups in the run_instances call. Additionally, the boto3 documentation provides some Amazon EC2 examples that will help.
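As a rough sketch of all three requirements in one call (the AMI ID, key pair name and security group ID below are placeholders; credentials are assumed to come from the environment, a credentials file, or an instance role):
import boto3

ec2 = boto3.client('ec2', region_name='us-east-1')

response = ec2.run_instances(
    ImageId='ami-xxxxxxxx',            # placeholder: an Amazon Linux AMI in your region
    InstanceType='t2.micro',
    KeyName='my-key-pair',             # placeholder key pair name
    SecurityGroupIds=['sg-xxxxxxxx'],  # placeholder security group ID
    MinCount=1,
    MaxCount=1,
    # Tags are applied at launch, so no separate create_tags call is needed.
    TagSpecifications=[{
        'ResourceType': 'instance',
        'Tags': [
            {'Key': 'customer', 'Value': 'ACME'},
            {'Key': 'environment', 'Value': 'PROD'},
        ],
    }],
)
print(response['Instances'][0]['InstanceId'])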
You might also want to look at this project:
https://github.com/nchammas/flintrock
It's a Hadoop and Apache Spark clustering project, but it can serve as inspiration.
It actually has many of the features you want, such as security groups and filtering by tag name. Just look around the code.

AWS Python IAM API - how to check AWS IAM privileges programmatically?

Is there a way to know if an AWS IAM account has the right privileges to create VPC, EC2, SQS, SNS and CloudTrail?
Given the IAM user's access key and secret access key, I would like to programmatically block it from going on to create VPC, SQS, or SNS resources if the user does not have the right privileges.
Is there an AWS Python API with which I can do this kind of check?
There is a DryRun option for the VPC and EC2 APIs, but there is no such option for the SQS, SNS, S3, and CloudTrail APIs.
Could anyone help? Thanks in advance.
AWS provides a CLI command for that:
aws iam simulate-principal-policy \
    --policy-source-arn arn:aws:iam::123456789:user/SomeUser \
    --action-names sqs:CreateQueue
https://docs.aws.amazon.com/cli/latest/reference/iam/simulate-principal-policy.html
You can use it with Python boto3 package as well:
http://boto3.readthedocs.io/en/latest/reference/services/iam.html#IAM.Client.simulate_principal_policy
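A sketch of the same call through boto3 (the ARN is a placeholder); each entry in EvaluationResults carries an EvalDecision of allowed, explicitDeny or implicitDeny:
import boto3

iam = boto3.client('iam')

response = iam.simulate_principal_policy(
    PolicySourceArn='arn:aws:iam::123456789:user/SomeUser',  # placeholder ARN
    ActionNames=['sqs:CreateQueue', 'sns:CreateTopic', 'ec2:CreateVpc'],
)
for result in response['EvaluationResults']:
    print(result['EvalActionName'], result['EvalDecision'])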
As a workaround, you can also check whether a specific policy is attached to a specific user/role. If your IAM setup is orderly and well structured, for both AWS and Customer Managed Policies this can be as simple as:
aws iam list-attached-user-policies --user-name SomeUser
aws iam list-attached-role-policies --role-name SomeRole
aws iam list-attached-group-policies --group-name SomeGroup
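The same checks can be made from Python; a small sketch for the user case, assuming a user named SomeUser exists:
import boto3

iam = boto3.client('iam')

# Lists the managed policies attached directly to the user.
response = iam.list_attached_user_policies(UserName='SomeUser')
for policy in response['AttachedPolicies']:
    print(policy['PolicyName'], policy['PolicyArn'])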

Boto EC2 and Elastic IPs

Is it possible to associate an Elastic IP address with an EC2 instance using Python boto? I'm trying to automate a deploy. I searched the API documentation in the EC2 section and found nothing.
Don't know what documentation you were looking at, but it's in there:
http://boto.readthedocs.org/en/latest/ref/ec2.html#boto.ec2.address.Address.associate
associate(instance_id=None, network_interface_id=None, private_ip_address=None, allow_reassociation=False, dry_run=False)
Associate this Elastic IP address with a currently running instance. See boto.ec2.connection.EC2Connection.associate_address().
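For completeness, a short sketch with boto 2 (the region and instance ID are placeholders; credentials are assumed to be configured in ~/.boto or the environment):
import boto.ec2

conn = boto.ec2.connect_to_region('us-east-1')

# Allocate a new Elastic IP (use domain='vpc' for VPC instances),
# then attach it to a running instance.
address = conn.allocate_address()
address.associate(instance_id='i-xxxxxxxx')  # placeholder instance ID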
