I'm trying to submit documents to an Elasticsearch index from a Python Lambda script running as a role which has the AmazonESFullAccess policy attached. I am creating the requests with the python elasticsearch library, and signing them with the aws_requests_auth library.
cred = boto3.session.Session().get_credentials()
es_host = '<elasticsearch-server>'
auth = AWSRequestsAuth(
    aws_access_key=cred.access_key,
    aws_secret_access_key=cred.secret_key,
    aws_host=es_host,
    aws_region='us-east-1',
    aws_service='es')

ES_CLIENT = Elasticsearch(
    host=es_host,
    port=80,
    connection_class=RequestsHttpConnection,
    http_auth=auth)
Then sending bulk create requests as follows:
ES_CLIENT.bulk(
    index='test_index',
    body=docs)
This is failing with the following:
TransportError(403, u'{"message": "The security token included in the request is invalid." }'): AuthorizationException ...
Though the same code works when running with admin user access keys.
Why do these requests fail when running as a role with 'full ES access'?
Elasticsearch is a separate product that AWS merely hosts, so it doesn't treat IAM quite the way you'd expect. AWS has request signing in place, but it doesn't work with just an access_key and secret_key: a role's temporary credentials also include a session token, and the signature is rejected without it.
The answer is to pass cred.token along with the access key and secret key into your AWSRequestsAuth object:
auth = AWSRequestsAuth(
    aws_access_key=cred.access_key,
    aws_secret_access_key=cred.secret_key,
    aws_token=cred.token,
    aws_host=es_host,
    aws_region='us-east-1',
    aws_service='es')
My SQL Server DB password is saved in Azure Key Vault, where it has a DATAREF ID as an identifier. I need that password to create a Spark dataframe from a table in SQL Server. I am running this .py file on a Google Dataproc cluster. How can I get that password using Python?
Since you are accessing an Azure service from a non-Azure service, you will need a service principal. You can authenticate it with a certificate or a secret; see THIS link for the different methods. You will need to give the service principal the proper access, and how you grant it depends on whether your key vault uses RBAC or access policies.
So the steps you need to follow are:
Create a key vault and create a secret.
Create a Service principal or application registration. Store the clientid, clientsecret and tenantid.
Give the service principal proper access to the key vault (if you are using access policies) or to the specific secret (if you are using the RBAC model).
The python link for the code is HERE.
The code that will work for you is below:
from azure.identity import ClientSecretCredential
from azure.keyvault.secrets import SecretClient
tenantid = "<your_tenant_id>"
clientsecret = "<your_client_secret>"
clientid = "<your_client_id>"

my_credentials = ClientSecretCredential(
    tenant_id=tenantid, client_id=clientid, client_secret=clientsecret)

secret_client = SecretClient(
    vault_url="https://<your_keyvault_name>.vault.azure.net/",
    credential=my_credentials)
secret = secret_client.get_secret("<your_secret_name>")
print(secret.name)
print(secret.value)
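With the secret in hand, it can be wired into a Spark JDBC read on the Dataproc cluster. The following is a minimal sketch, not a verified recipe: the server, database, table, user, and driver class are placeholders, and secret_value stands in for secret.value fetched by the code above.

```python
# Sketch: wiring the Key Vault secret into Spark JDBC options (names assumed).
# secret_value stands in for secret.value fetched from Key Vault above.
secret_value = "fetched-password"

jdbc_options = {
    "url": "jdbc:sqlserver://<your_server>:1433;databaseName=<your_db>",
    "dbtable": "<your_table>",
    "user": "<your_user>",
    "password": secret_value,
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

# On the cluster, with a SparkSession named spark:
# df = spark.read.format("jdbc").options(**jdbc_options).load()
```

The point is simply that the password never needs to be hard-coded in the .py file; it is fetched at runtime and passed straight into the JDBC options.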
I have my config file set up with multiple profiles and I am trying to assume an IAM role, but all the articles I see about assuming roles start with making an STS client using
import boto3

client = boto3.client('sts')
which makes sense, but it gives me an error when I try to do it like this. However, when I pass a profile that exists in my config file, it works. Here is the code:
import boto3

session = boto3.Session(profile_name="test_profile")
sts = session.client("sts")
response = sts.assume_role(
    RoleArn="arn:aws:iam::xxx:role/role-name",
    RoleSessionName="test-session"
)
new_session = boto3.Session(
    aws_access_key_id=response['Credentials']['AccessKeyId'],
    aws_secret_access_key=response['Credentials']['SecretAccessKey'],
    aws_session_token=response['Credentials']['SessionToken'])
When other people assume roles in their code without passing a profile in, how does that even work? Does boto3 automatically grab the default profile from the config file or something like that?
Yes. This line:

sts = boto3.client('sts')

tells boto3 to create a client using the default credentials.
The credentials can be provided in the ~/.aws/credentials file. If the code is running on an Amazon EC2 instance, boto3 will automatically use the credentials of the IAM role attached to the instance.
Credentials can also be passed via Environment Variables.
See: Credentials — Boto3 documentation
I am trying to migrate my simple python script that accesses an O365 email account using basic auth to modern auth:
Here's the current setup.
import exchangelib as exch
credentials = exch.Credentials('my_username', 'my_password')
configInfo = exch.Configuration(server='smtp.office365.com', credentials=credentials)
tz = exch.EWSTimeZone.localzone()
account = exch.Account(primary_smtp_address='my_email_address', config=configInfo, autodiscover=False, access_type=exch.DELEGATE)
It works fine. From here I can access items such as account.inbox to iterate emails, etc.
Moving to Azure Identity, my devops team assigned me the following:
a registered app in the Azure Portal
an app ID
an object ID
a tenant ID
a secret value ID
a secret key ID
an https://ps.outlook.com/ URL
I've run this...
pip install azure-identity
And now I can run this...
from azure.identity import DefaultAzureCredential
# Now what? A service client perhaps?
I'm at a loss as to what comes next. My goal is to authenticate using the IDs above, then create an account object as before, and continue processing. Can anyone help? Thank you.
I've set up a VM in Azure that has a managed identity.
I follow the guide here https://learn.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/tutorial-linux-vm-access-arm
So now I have an access token. But what I fail to understand is how do I use this token to access my key vault? I'm using the Python SDK. Looking at the docs for the SDK here https://learn.microsoft.com/en-us/python/api/azure-keyvault/azure.keyvault?view=azure-python
There exists an access token class, AccessToken(scheme, token, key).
I assume I can use the token I generated earlier here, but what are scheme and key? The docs do not explain them. Or am I looking at the wrong class to use with the token?
If you're using a VM with a managed identity, then you can create a credential for a Key Vault client using azure-identity's ManagedIdentityCredential class. The credential will fetch and use access tokens for you as you use the Key Vault client:
from azure.identity import ManagedIdentityCredential
from azure.keyvault.secrets import SecretClient
credential = ManagedIdentityCredential()
client = SecretClient("https://{vault-name}.vault.azure.net", credential)
secret = client.get_secret("secret-name")
Note that I'm using a SecretClient to fetch secrets from Key Vault; there are new packages for working with Key Vault in Python that replace azure-keyvault:
azure-keyvault-certificates (Migration guide)
azure-keyvault-keys (Migration guide)
azure-keyvault-secrets (Migration guide)
Clients in each of these packages can use any credential from azure-identity for authentication.
(I work on the Azure SDK in Python)
I wouldn't recommend using the managed identity of a VM to access Key Vault. You should create a service principal if you intend to run scripts or code.
The best way of doing this is with the Azure CLI. See here for instructions on installing the CLI, and refer to this, or this for creating your service principal.
The best way to manage resources in Python is by using ADAL, which is documented:
https://github.com/AzureAD/azure-activedirectory-library-for-python
In your case, however, managing KeyVault is made a little easier since the KeyVault library for Python also provides the means for you to authenticate without directly using ADAL to obtain your access token. See here:
https://learn.microsoft.com/en-us/python/api/overview/azure/key-vault?view=azure-python
from azure.keyvault import KeyVaultClient
from azure.common.credentials import ServicePrincipalCredentials
credentials = ServicePrincipalCredentials(
    client_id='...',
    secret='...',
    tenant='...'
)
client = KeyVaultClient(credentials)
# VAULT_URL must be in the format 'https://<vaultname>.vault.azure.net'
# KEY_VERSION is required, and can be obtained with the KeyVaultClient.get_key_versions(self, vault_url, key_name) API
key_bundle = client.get_key(VAULT_URL, KEY_NAME, KEY_VERSION)
key = key_bundle.key
In the above, client_id, secret, and tenant (id) are all outputs of the az ad sp create-for-rbac --name {APP-NAME} CLI command.
Remember to review and adjust the role assignments for the sp you created. And your KeyVault is only as secure as the devices which have access to your sp's credentials.
This is a follow-up question for this question:
I have successfully created a private key and have read the various pages of Google documentation on the concepts of server to server authentication.
I need to create a JWT to authorize my App Engine application (Python) to access the Google calendar and post events in the calendar. From the source in oauth2client it looks like I need to use oauth2client.client.SignedJwtAssertionCredentials to create the JWT.
What I'm missing at the moment is a stylised bit of sample Python code of the various steps involved to create the JWT and use it to authenticate my App Engine application for Google Calendar. Also, from SignedJwtAssertionCredentials source it looks like I need some App Engine compatible library to perform the signing.
Can anybody shed some light on this?
After some digging I found a couple of samples based on the OAuth2 authentication. From this I cooked up the following simple sample that creates a JWT to access the calendar API:
import httplib2
import pprint
from apiclient.discovery import build
from oauth2client.client import SignedJwtAssertionCredentials
# Get the private key from the Google-supplied private key file.
f = open("your_private_key_file.p12", "rb")
key = f.read()
f.close()
# Create the JWT
credentials = SignedJwtAssertionCredentials(
    "xxxxxxxxxx@developer.gserviceaccount.com", key,
    scope="https://www.googleapis.com/auth/calendar"
)
# Create an authorized http instance
http = httplib2.Http()
http = credentials.authorize(http)
# Create a service call to the calendar API
service = build("calendar", "v3", http=http)
# List all calendars.
lists = service.calendarList().list(pageToken=None).execute(http=http)
pprint.pprint(lists)
For this to work on Google App Engine you will need to enable PyCrypto for your app. This means adding the following to your app.yaml file:
libraries:
- name: pycrypto
  version: "latest"