How to configure CorsRule for CDK using Python

I am trying to figure out the proper syntax for setting up CORS on an S3 bucket using the CDK (Python). The class aws_s3.CorsRule takes three params (allowed_methods, allowed_origins, max_age=None). I am trying to specify allowed_methods, which takes a list of methods, but the base class is enum.Enum. So how do I create a list of these methods? This is what I have tried, but it doesn't pass validation.
s3.Bucket(self, "StaticSiteBucket",
    bucket_name="replaceMeWithBucketName",
    versioned=True,
    removal_policy=core.RemovalPolicy.DESTROY,
    website_index_document="index.html",
    cors=s3.CorsRule(allowed_methods=[s3.HttpMethods.DELETE], allowed_origins=["*"], max_age=3000)
)
The only thing I'm focused on is the cors line:
cors=s3.CorsRule(allowed_methods=[s3.HttpMethods.DELETE],allowed_origins=["*"],max_age=3000)
Trying to read the documentation is like peeling an onion.
https://docs.aws.amazon.com/cdk/api/latest/python/aws_cdk.aws_s3/HttpMethods.html#aws_cdk.aws_s3.HttpMethods
I tried calling each one individually, as you can see with s3.HttpMethods.DELETE, but that fails when it tries to synthesize.

It looks like you at least forgot to wrap the value you pass to cors in a list. I agree that the docs are a bit of a rabbit hole, but you can see that the Bucket docs specify the cors param as Optional[List[CorsRule]].
This is mine:
from aws_cdk import core
from aws_cdk import aws_s3
from aws_cdk import aws_apigateway
aws_s3.Bucket(self,
    'my_bucket',
    bucket_name='my_bucket',
    removal_policy=core.RemovalPolicy.DESTROY,
    cors=[aws_s3.CorsRule(
        allowed_headers=["*"],
        allowed_methods=[aws_s3.HttpMethods.PUT],
        allowed_origins=["*"])
    ])
So yours should be:
cors=[s3.CorsRule(
    allowed_methods=[s3.HttpMethods.DELETE],
    allowed_origins=["*"],
    max_age=3000)]
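Putting that back into the bucket definition from the question gives something like this (the bucket name and the other settings are just the placeholders from the question):
s3.Bucket(self, "StaticSiteBucket",
    bucket_name="replaceMeWithBucketName",
    versioned=True,
    removal_policy=core.RemovalPolicy.DESTROY,
    website_index_document="index.html",
    cors=[s3.CorsRule(            # note the list wrapped around CorsRule
        allowed_methods=[s3.HttpMethods.DELETE],
        allowed_origins=["*"],
        max_age=3000)],
)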

How can I create an API Gateway end point with a dynamic URI using the AWS CDK in Python?

Is it possible to set a dynamic URI in the AWS API Gateway aws_cdk definition?
I currently have:
integration=api_gateway.Integration(
    type=api_gateway.IntegrationType.HTTP,
    integration_http_method='GET',
    uri=my_uri + '/my_service/my_fixed_endpoint',
    ...
Now I would like to use something like:
uri=my_uri+'/my_service/{my_dynamic_endpoint}',
With {my_dynamic_endpoint} being replaced with e.g. "football", "baseball", "tennis".
Is there a way to do this?
I think I found my answer in the aws_apigateway CDK docs. There, they use the following example:
api = apigateway.RestApi(self, "books-api")
api.root.add_method("ANY")
book = books.add_resource("{book_id}")
book.add_method("GET")
book.add_method("DELETE")
So I should write it in an object-based manner, like this:
api = apigateway.RestApi(self, "sports-api")
api.root.add_method("ANY")
sport = api.root.add_resource("{sport}")
sport.add_method("GET")
sport.add_method("DELETE")
What I called "my_dynamic_endpoint" is just a path variable in the URL.
Sorry for the misleading formulation!
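If you instead want to keep the explicit Integration object from the question, the path variable can be forwarded to the HTTP backend with a request parameter mapping. A minimal sketch, assuming apigateway is aws_cdk.aws_apigateway and my_uri is the backend base URL from the question (not verified against a deployed stack):
sport.add_method(
    "GET",
    apigateway.Integration(
        type=apigateway.IntegrationType.HTTP,
        integration_http_method="GET",
        uri=my_uri + "/my_service/{sport}",
        options=apigateway.IntegrationOptions(
            # map the method's {sport} path parameter onto the {sport} placeholder in the URI
            request_parameters={
                "integration.request.path.sport": "method.request.path.sport"
            }
        ),
    ),
    # declare the path parameter as required on the method itself
    request_parameters={"method.request.path.sport": True},
)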

Connect Function App to CosmosDB with Managed Identity

I'm trying to write a function in a Function App that manipulates data in a CosmosDB. I get it working if I drop the read-write key in the environment variables. To make it more robust I wanted it to work as a managed identity app. The app has the role 'DocumentDB Account Contributor' on the Cosmos DB.
However, the CosmosClient constructor doesn't accept a Credential and needs the read-write key. I've been chasing down the rabbit hole of azure.mgmt.cosmosdb.operations where there is a DatabaseAccountsOperations class with a list_keys() method. I can't find a neat way to access that function though. If I try to create that object (which requires poaching the config, serializer and deserializer from my dbmgmt object) it still requires the resourceGroupName and accountName.
I can't help but think that I've taken a wrong turn somewhere, because this has to be possible in a more straightforward manner, especially given that the JavaScript SDK references a more logical class, CosmosDBManagementClient, in line with SubscriptionClient. However, I can't find that class anywhere on the Python side.
Any pointers?
import os

import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.cosmos import CosmosClient
from azure.mgmt.resource import SubscriptionClient
from azure.mgmt.cosmosdb import CosmosDB

from .cred_wrapper import CredentialWrapper


def main(req: func.HttpRequest) -> func.HttpResponse:
    request_body = req.get_body()

    # credential = DefaultAzureCredential()
    # https://gist.github.com/lmazuel/cc683d82ea1d7b40208de7c9fc8de59d
    credential = CredentialWrapper()

    uri = os.environ.get('cosmos-db-uri')
    # db = CosmosClient(url=uri, credential=credential)  # Doesn't work, wants a credential that is a RW/R key.
    # Does work if I replace it with my primary / secondary key, but the goal is to remove dependence on that.

    subscription_client = SubscriptionClient(credential)
    subscription = next(subscription_client.subscriptions.list())
    dbmgmt = CosmosDB(credential, subscription.subscription_id)  # This doesn't accept the DB URI??
    operations = list(dbmgmt.operations.list())  # I see the list_keys() Operation there...
EDIT
A helpful soul provided a response here but removed it before I could even react or accept it as the answer. They pointed out that there is an equivalent python SDK and that from azure.mgmt.cosmosdb import CosmosDBManagementClient would do the trick.
From there, I was on my own as that resulted in
ImportError: cannot import name 'CosmosDBManagementClient' from 'azure.mgmt.cosmosdb'
I believe the root of the problem lies in an incompatibility in the azure-mgmt package. After removing azure-mgmt from my requirements.txt and only loading the cosmos- and identity-related packages, the import error was resolved.
This solved 90% of the problem.
dbmgmt = CosmosDBManagementClient(credential, subscription.subscription_id, c_uri)
print(dbmgmt.database_accounts.list_keys())
TypeError: list_keys() missing 2 required positional arguments: 'resource_group_name' and 'account_name'
Does one really need to collect each of these parameters? Compared to the example that reads a secret from a Vault, it seems so convoluted.
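For reference, the control-plane route does work once both names are supplied. A minimal sketch reusing the uri and dbmgmt from above, assuming the managed identity has a role such as DocumentDB Account Contributor and using placeholder resource group / account names (not values from the original question):
keys = dbmgmt.database_accounts.list_keys(
    resource_group_name="my-resource-group",   # hypothetical name
    account_name="my-cosmos-account",          # hypothetical name
)
# Use the fetched key for the data-plane client instead of storing it in settings.
db = CosmosClient(url=uri, credential=keys.primary_master_key)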
For other unfortunate ones looking to access CosmosDB with Managed Identity, it seems that this is, as of May 2021, not yet possible.
Source: Discussion on Github
Update 12/05/2021 - I came here looking for a solution to this with JavaScript/TypeScript, so I'm leaving the answer here for others. I think a similar approach could work for Python.
You can use RBAC for data plane operations with Managed Identities. Finding the documentation was difficult.
RBAC for Cosmos DB data plane operations with Managed Identities
Important - If you get the error Request blocked by Auth mydb : Request is blocked because principal [xxxxxx-6fad-44e4-98bc-2d423a88b65f] does not have required RBAC permissions to perform action Microsoft.DocumentDB/databaseAccounts/readMetadata on resource [/], don't use the Portal to assign roles; use the Azure CLI for Cosmos DB instead.
How to - creating a role assignment for a user, system-assigned MSI, or user-assigned MSI is done using the Azure Cosmos DB CLI:
# Find the role ID:
resourceGroupName='<myResourceGroup>'
accountName='<myCosmosAccount>'
az cosmosdb sql role definition list --account-name $accountName --resource-group $resourceGroupName

# Assign to the system MSI or user-assigned MSI:
readOnlyRoleDefinitionId='<roleDefinitionId>' # as fetched above
principalId='<aadPrincipalId>'
az cosmosdb sql role assignment create --account-name $accountName --resource-group $resourceGroupName --scope "/" --principal-id $principalId --role-definition-id $readOnlyRoleDefinitionId
Once this step is done, the code for connecting is very easy: use the @azure/identity package's DefaultAzureCredential. This works in an Azure Function App with a managed identity, and on your laptop with VS Code or with az login.
Docs for the @azure/identity SDK
Examples of authentication with @azure/identity to get the credential object
import { CosmosClient } from "@azure/cosmos";
import { DefaultAzureCredential, ManagedIdentityCredential, ChainedTokenCredential } from "@azure/identity";

const defaultCredentials = new DefaultAzureCredential();
const managedCredentials = new ManagedIdentityCredential();
const aadCredentials = new ChainedTokenCredential(managedCredentials, defaultCredentials);

const client = new CosmosClient({
    endpoint: "https://mydb.documents.azure.com:443/",
    aadCredentials
});
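For Python, a similar approach should work with a recent azure-cosmos package (AAD credential support landed around version 4.3.0). A minimal sketch under that assumption; the endpoint, database, and container names are placeholders:
from azure.identity import DefaultAzureCredential
from azure.cosmos import CosmosClient

# DefaultAzureCredential picks up the managed identity inside the Function App
# and falls back to az login / VS Code credentials when running locally.
credential = DefaultAzureCredential()
client = CosmosClient(url="https://mydb.documents.azure.com:443/", credential=credential)

# Data-plane calls then run under the Cosmos DB SQL role assigned above.
database = client.get_database_client("my-database")
container = database.get_container_client("my-container")
for item in container.query_items(query="SELECT * FROM c", enable_cross_partition_query=True):
    print(item)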

boto3 eks client how to generate presigned url

I'm trying to update a Docker image within a deployment in EKS. I'm running Python code from a Lambda function. However, I don't know how to use generate_presigned_url(). What should I pass as the ClientMethod parameter?
import boto3
client = boto3.client("eks")
url = client.generate_presigned_url()
These are the client methods you can use with the EKS client:
'associate_encryption_config'
'associate_identity_provider_config'
'can_paginate'
'create_addon'
'create_cluster'
'create_fargate_profile'
'create_nodegroup'
'delete_addon'
'delete_cluster'
'delete_fargate_profile'
'delete_nodegroup'
'describe_addon'
'describe_addon_versions'
'describe_cluster'
'describe_fargate_profile'
'describe_identity_provider_config'
'describe_nodegroup'
'describe_update'
'disassociate_identity_provider_config'
'generate_presigned_url'
'get_paginator'
'get_waiter'
'list_addons'
'list_clusters'
'list_fargate_profiles'
'list_identity_provider_configs'
'list_nodegroups'
'list_tags_for_resource'
'list_updates'
'tag_resource'
'untag_resource'
'update_addon'
'update_cluster_config'
'update_cluster_version'
'update_nodegroup_config'
'update_nodegroup_version'
You can get more information about these methods in the documentation here: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/eks.html#client
After over two weeks I suppose you've found your answer, but anyway: the ClientMethod mentioned (and not really well explained in the boto3 docs) is just one of the methods you can use with the EKS client itself. I honestly think this is what KnowledgeGainer was trying to say by listing all the methods; basically, you can just pick one. This gives you the presigned URL.
For example, here I'm using a method that doesn't require any additional arguments, list_clusters:
>>> import boto3
>>> client = boto3.client("eks")
>>> client.generate_presigned_url("list_clusters")
'https://eks.eu-west-1.amazonaws.com/clusters?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAQKOXLHHBFT756PNG%2F20210528%2Feu-west-1%2Feks%2Faws4_request&X-Amz-Date=20210528T014603Z&X-Amz-Expires=3600&X-Amz-SignedHeaders=host&X-Amz-Signature=d25dNCC17013ad9bc75c04b6e067105c23199c23cbadbbbeForExample'
If the method requires any additional arguments, you add those into Params as a dictionary:
>>> method_params = {'name': <your_cluster_name>}
>>> client.generate_presigned_url('describe_cluster', Params=method_params)

How to use `not` condition in the gitlab api issue query

I am trying to read the titles of open issues that don't have the label resolved. For that I am referring to the API documentation (https://docs.gitlab.com/ee/api/issues.html), which mentions NOT, but I wasn't able to get NOT to work.
The following Python script is what I have tried so far to read the list of issues; I am not able to find how to use NOT to keep only the issues that don't have the resolved label.
import gitlab

# private token or personal token authentication
gl = gitlab.Gitlab('https://example.com', private_token='XXXYYYZZZ')

# make an API request to create the gl.user object. This is mandatory if you
# use the username/password authentication.
gl.auth()

# list all the issues
issues = gl.issues.list(all=True, scope='all', state='opened', assignee_username='username')
for issue in issues:
    print(issue.title)
From the GitLab issues API documentation, not is of type Hash. It's a special type documented here.
For example, to exclude the labels Category:DAST and devops::secure, and to exclude the milestone 13.11, you would use the following parameters:
not[labels]=Category:DAST,devops::secure
not[milestone]=13.11
api example: https://gitlab.com/api/v4/issues?scope=all&state=opened&assignee_username=derekferguson&not[labels]=Category:DAST,devops::secure&not[milestone]=13.11
Using the python-gitlab module, you need to pass these extra parameters as additional keyword arguments:
import gitlab

gl = gitlab.Gitlab('https://gitlab.com')

extra_params = {
    'not[labels]': "Category:DAST,devops::secure",
    "not[milestone]": "13.11",
}
issues = gl.issues.list(all=True, scope='all', state='opened',
                        assignee_username='derekferguson', **extra_params)
for issue in issues:
    print(issue.title)
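Applied to the original question (excluding the resolved label), the same pattern would look roughly like this; the username is the placeholder from the question:
extra_params = {'not[labels]': 'resolved'}
issues = gl.issues.list(all=True, scope='all', state='opened',
                        assignee_username='username', **extra_params)
for issue in issues:
    print(issue.title)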

How to set segment (top) level AWS Xray annotations in Python Lambda

I am successfully using AWS X-Ray within a Python v2 Lambda. patch_all() is working well to automatically patch a portion of my libraries, such as boto3, for X-Ray.
I am unable to set high-level annotations that persist across the lower-level subsegments. Can annotations in Lambda be set like this? If not, how else should they be set? I've tried getting both the current subsegment and the current segment.
import json
import re
import boto3
import logging
import sys

from aws_xray_sdk.core import xray_recorder
from aws_xray_sdk.core import patch_all

patch_all()


def lambda_handler(event, context):
    subsegment_ref = xray_recorder.current_subsegment()
    subsegment_ref.put_annotation('account_id', 'foo')
The Lambda function segment is not generated by the X-Ray SDK. We are working with the Lambda team to provide a better experience, but currently there is no workaround for annotating the segment.
For annotating a subsegment, you can create a subsegment inside the handler and then add the annotation to it. See the quick start guide at https://github.com/aws/aws-xray-sdk-python for creating custom subsegments.
The easiest way is to use the context manager style:
with xray_recorder.in_subsegment('pick_a_subsegment_name') as subsegment:
    subsegment.put_annotation('key', 'value')
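Applied to the handler from the question, a minimal sketch could look like this ('handler_work' is just an arbitrary subsegment name, not something defined by the SDK):
def lambda_handler(event, context):
    # Annotations go on a subsegment we create ourselves, since the
    # Lambda-managed function segment cannot be annotated.
    with xray_recorder.in_subsegment('handler_work') as subsegment:
        subsegment.put_annotation('account_id', 'foo')
        # ... rest of the handler runs inside this subsegment ...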
