Attaching IAM user to IAM Group using Boto3 - python

I am trying to add an IAM user to an IAM group and attach a policy to the user using Boto3. So far, I am not able to add the IAM user to the IAM group. For "response = iam.add_user_to_group(", I am getting the error:
"Exception has occurred: AttributeError
'iam.ServiceResource' object has no attribute 'add_user_to_group'"
import boto3

iam = boto3.resource('iam')  # using the resource representation of IAM

created_user = iam.create_user(
    UserName='some_random_user'
)
print(created_user)

create_group_response = iam.create_group(GroupName='Tester')

response = iam.add_user_to_group(
    UserName='some_random_user',  # name of the user
    GroupName='Tester'
)

response = iam.attach_user_policy(
    UserName='some_random_user',  # name of the user
    PolicyArn='arn:aws:iam::196687784845:policy/boto-test'
    # policy ARN which you want to assign to the user
)
I'm not sure what the problem is. I am very new to Python and Boto, so it might be something very small.

According to the docs, add_user_to_group is an action on the IAM client, not the resource. Use the corresponding action on the Group resource instead.
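A minimal sketch of both options, using the user and group names from the question; the Group resource exposes add_user, while the low-level client exposes add_user_to_group:

import boto3

# Option 1: use the Group resource's add_user action
iam = boto3.resource('iam')
group = iam.Group('Tester')
group.add_user(UserName='some_random_user')

# Option 2: use the low-level IAM client, which does have add_user_to_group
iam_client = boto3.client('iam')
iam_client.add_user_to_group(
    GroupName='Tester',
    UserName='some_random_user'
)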

Accessing QLDB ledger in another AWS account

I'm having trouble accessing the QLDB ledger in another AWS account.
I have granted the necessary IAM permissions for cross-account access.
I set the credentials on the EC2 instance where my Python script is running using the code below.
sts_client = boto3.client("sts", region_name=region)
response = sts_client.assume_role(
    RoleArn="arn:aws:iam::xxx:role/xxx-ec2",
    RoleSessionName="RoleSessionname",
)
os.environ["AWS_ACCESS_KEY_ID"] = response["Credentials"]["AccessKeyId"]
os.environ["AWS_SECRET_ACCESS_KEY"] = response["Credentials"]["SecretAccessKey"]
os.environ["AWS_SESSION_TOKEN"] = response["Credentials"]["SessionToken"]
os.environ["AWS_DEFAULT_REGION"] = region
but I keep getting the error below:
in _get_session
raise ExecuteError(e, True, True)
pyqldb.errors.ExecuteError: Error containing the context of a failure during execute.
botocore.errorfactory.BadRequestException: An error occurred (BadRequestException) when calling the SendCommand operation: The Ledger with name my-ledger is not found
The error is thrown during execution of the code below.
qldb_driver = QldbDriver(ledger_name='my-ledger', region_name='us-east-1')
result = qldb_driver.execute_lambda(lambda x: read_table(x, table_name))
I found out that the credentials can be passed to the QldbDriver constructor; see https://github.com/awslabs/amazon-qldb-driver-python/blob/master/pyqldb/driver/qldb_driver.py#L103
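A minimal sketch of that approach, assuming the same assume_role response, read_table function, and table_name from above; QldbDriver accepts a boto3_session, so the assumed-role credentials can be handed to it directly instead of going through environment variables:

import boto3
from pyqldb.driver.qldb_driver import QldbDriver

# Build a boto3 session from the assumed-role credentials
creds = response["Credentials"]
session = boto3.Session(
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
    region_name="us-east-1",
)

# Pass the session to the driver so it talks to the ledger in the other account
qldb_driver = QldbDriver(ledger_name='my-ledger', boto3_session=session)
result = qldb_driver.execute_lambda(lambda x: read_table(x, table_name))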

Azure: create storage account with container and upload blob to it in Python

I'm trying to create a storage account in Azure and upload a blob into it using their Python SDK.
I managed to create an account like this:
client = get_client_from_auth_file(StorageManagementClient)
storage_account = client.storage_accounts.create(
    resourceGroup,
    name,
    StorageAccountCreateParameters(
        sku=Sku(name=SkuName.standard_ragrs),
        enable_https_traffic_only=True,
        kind=Kind.storage,
        location=region)).result()
The problem is that later, when I try to create a container, I don't know what to pass as "account_url".
I have tried doing:
client = get_client_from_auth_file(BlobServiceClient, account_url=storage_account.primary_endpoints.blob)
return client.create_container(name)
But I'm getting:
azure.core.exceptions.ResourceNotFoundError: The specified resource does not exist
I did manage to create a container using:
client = get_client_from_auth_file(StorageManagementClient)
return client.blob_containers.create(
    resourceGroup,
    storage_account.name,
    name,
    BlobContainer(),
    public_access=PublicAccess.Container
)
But later, when I try to upload a blob using BlobServiceClient or BlobClient, I still need the "account_url", so I still get the error:
azure.core.exceptions.ResourceNotFoundError: The specified resource does not exist
Can anyone help me understand how to get the account_url for a storage account I created with the SDK?
EDIT:
I managed to find a workaround to the problem by creating the connection string from the storage keys.
storage_client = get_client_from_auth_file(StorageManagementClient)
storage_keys = storage_client.storage_accounts.list_keys(resource_group, account_name)
storage_key = next(v.value for v in storage_keys.keys)
return BlobServiceClient.from_connection_string(
    'DefaultEndpointsProtocol=https;' +
    f'AccountName={account_name};' +
    f'AccountKey={storage_key};' +
    'EndpointSuffix=core.windows.net')
This works, but I think George Chen's answer is more elegant.
I could reproduce this problem. I found that get_client_from_auth_file does not pass the credential on to the BlobServiceClient: if you create a BlobServiceClient with only the account_url and no credential, it can still print the account name, but data operations fail.
So if you want to use a credential to get a BlobServiceClient, you could use the code below and then do the other operations.
credentials = ClientSecretCredential(
    'tenant_id',
    'application_id',
    'application_secret'
)
blobserviceclient = BlobServiceClient(account_url=storage_account.primary_endpoints.blob, credential=credentials)
If you don't want to do it this way, you could create the BlobServiceClient with the account key.
client = get_client_from_auth_file(StorageManagementClient, auth_path='auth')
storage_account = client.storage_accounts.create(
    'group name',
    'account name',
    StorageAccountCreateParameters(
        sku=Sku(name=SkuName.standard_ragrs),
        enable_https_traffic_only=True,
        kind=Kind.storage,
        location='eastus',)).result()
storage_keys = client.storage_accounts.list_keys(resource_group_name='group name', account_name='account name')
storage_keys = {v.key_name: v.value for v in storage_keys.keys}
blobserviceclient = BlobServiceClient(account_url=storage_account.primary_endpoints.blob, credential=storage_keys['key1'])
blobserviceclient.create_container(name='container name')
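To finish the original goal of uploading a blob, a minimal sketch using the blobserviceclient and container created above; the blob name 'example.txt' and the local file 'data.txt' are hypothetical placeholders:

# Get a client for a specific blob in the container created above
blob_client = blobserviceclient.get_blob_client(container='container name', blob='example.txt')

# Upload the local file contents, overwriting the blob if it already exists
with open('data.txt', 'rb') as data:
    blob_client.upload_blob(data, overwrite=True)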

multiuser management in O365 python

I was creating an Office application that holds multiple users' credentials and does tasks like sending email and adding calendar events. I chose O365. Everything was working well, except that I could not save the credentials the way we pickle the creds for Google products.
with open(f'account_data/{account_name}.pickle', 'wb') as stream:
    pickle.dump(account, stream)
but I get an error:
AttributeError: Can't pickle local object 'OAuth2Session.__init__.<locals>.<lambda>'
I need to store keys for multiple users and do some tasks with them. If you know of any other module that can do this, please tell me.
I figured it out myself.
from O365 import Account, MSGraphProtocol, message, FileSystemTokenBackend

def new_account(account_name):
    # Authenticate interactively and save the token to a per-account file
    account = Account(credentials, scopes=scopes)
    token_backend = FileSystemTokenBackend(token_path='account_data', token_filename=f'{account_name}.txt')
    account.con.token_backend = token_backend
    account.authenticate()
    account.con.token_backend.save_token()

def load_account(account_name):
    # Reload a previously saved token and refresh it
    account = Account(credentials, scopes=scopes)
    token_backend = FileSystemTokenBackend(token_path='account_data', token_filename=f'{account_name}.txt')
    account.con.token_backend = token_backend
    account.con.token_backend.load_token()
    if account.con.refresh_token():
        return account
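A minimal usage sketch, assuming credentials and scopes are defined as in the functions above and that a mail scope was requested; the account name and recipient address are placeholders:

# First run: authenticate the user and persist their token
new_account('alice')

# Later runs: restore the saved token and use the account
account = load_account('alice')
if account:
    m = account.new_message()
    m.to.add('someone@example.com')
    m.subject = 'Hello from O365'
    m.body = 'Sent with a token restored from disk.'
    m.send()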

unable to check and create aws security group with boto3

I'm trying to create a security group and get the security group ID as output using boto3. I want something like this:
If the security group exists, get/return/output the group ID.
If the security group doesn't exist, create and authorize the group with the given rule and output the group ID.
This is my code so far:
ec2 = boto3.client('ec2', region_name='us-east-1')
for rds_security_group in ec2.describe_security_groups()['SecurityGroups']:
    if rds_security_group['GroupName'] == 'testgroup':
        print(rds_security_group['GroupId'])
        return (rds_security_group['GroupId'])
    else:
        rds_security_group_name = ec2.create_security_group(
            GroupName='testgroup',
            Description='rds-security-group',
            VpcId='vpc-12345')
        client.authorize_security_group_ingress(
            CidrIp=10.10.10.10/11,
            IpProtocol='tcp',
            FromPort=90,
            ToPort=90,
            GroupId=rds_security_group_name['GroupId'])
        print(rds_security_group_name['GroupId'])
        return (rds_security_group_name['GroupId'])
If the security group doesn't exist, the code works perfectly: it creates the group and returns the group ID. But if the security group already exists, it fails to return the group ID and throws the existing-group error below.
botocore.exceptions.ClientError: An error occurred (InvalidGroup.Duplicate) when calling the CreateSecurityGroup operation: The security group 'testgroup' already exists for VPC 'vpc-12345'
Please help me with this.
Your problem is that you are looping through each security group and checking its group name. If the first security group is not called "testgroup", then you try to create it. Change your code to the following:
ec2 = boto3.client('ec2', region_name='us-east-1')

for rds_security_group in ec2.describe_security_groups()['SecurityGroups']:
    if rds_security_group['GroupName'] == 'testgroup':
        print(rds_security_group['GroupId'])
        return (rds_security_group['GroupId'])

# Security Group was not found, create it
rds_security_group_name = ec2.create_security_group(
    GroupName='testgroup',
    Description='rds-security-group',
    VpcId='vpc-12345')
ec2.authorize_security_group_ingress(
    CidrIp='10.10.10.10/11',
    IpProtocol='tcp',
    FromPort=90,
    ToPort=90,
    GroupId=rds_security_group_name['GroupId'])
print(rds_security_group_name['GroupId'])
return (rds_security_group_name['GroupId'])
I did find the answer to my question with a slight change to the existing code itself.
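One way such a slight change could look (a sketch of the general approach, not necessarily the exact change the author made): ask EC2 for the group by name and VPC up front with a filter instead of scanning every group, and only create it when the lookup comes back empty.

ec2 = boto3.client('ec2', region_name='us-east-1')

# Look up the group by name (and VPC) server-side instead of looping over all groups
existing = ec2.describe_security_groups(
    Filters=[
        {'Name': 'group-name', 'Values': ['testgroup']},
        {'Name': 'vpc-id', 'Values': ['vpc-12345']},
    ]
)['SecurityGroups']

if existing:
    group_id = existing[0]['GroupId']
else:
    created = ec2.create_security_group(
        GroupName='testgroup',
        Description='rds-security-group',
        VpcId='vpc-12345')
    group_id = created['GroupId']
    ec2.authorize_security_group_ingress(
        CidrIp='10.10.10.10/11',
        IpProtocol='tcp',
        FromPort=90,
        ToPort=90,
        GroupId=group_id)

print(group_id)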

How to use create tags Python Azure SDK?

I am trying to create tags in Azure 2.0.0rc2 using Python.
Following is the code I used:
def __update_tags(self):
    username = 'user#xyz.com'
    password = 'user#1234'
    subscription_id = '478-ytehn-47ds5-784aa-4758a'
    credentials = UserPassCredentials(username=username, password=password)
    resource_client = ResourceManagementClient(credentials=credentials)
    tag_operations = TagOperations(client=resource_client)
    tag_operations.create_or_update_value(tag_name='key_1', tag_value='val_1')
On running this code, I am getting an error like:
if self.client.credentials.subscription_id is not None:
AttributeError: 'UserPassCredentials' object has no attribute 'subscription_id'
Does anyone have an idea how to solve this issue?
In your code, subscription_id is specified but not used. You need the subscription_id when creating the resource_client. Please replace "resource_client = ResourceManagementClient(credentials=credentials)" with the code below:
resource_client = ResourceManagementClient(
    ResourceManagementClientConfiguration(
        credentials,
        subscription_id
    )
)
Check here for more information.
Update:
Confirm that ResourceManagementClientConfiguration is imported.
According to the documents (Resource Management and Resource Management Authentication), as #forester123 said, the full code is summarized below.
from azure.common.credentials import UserPassCredentials
from azure.mgmt.resource.resources import ResourceManagementClient, ResourceManagementClientConfiguration

username = 'user#xyz.com'
password = 'user#1234'
subscription_id = '478-ytehn-47ds5-784aa-4758a'

credentials = UserPassCredentials(username, password)
resource_client = ResourceManagementClient(
    ResourceManagementClientConfiguration(
        credentials,
        subscription_id
    )
)
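With the client constructed this way, the original goal of creating a tag can then be attempted through the client's tag operations (a sketch, assuming the tags operation group in this SDK version behaves as documented):

# Create the tag name, then create (or update) a value under it
resource_client.tags.create_or_update(tag_name='key_1')
resource_client.tags.create_or_update_value(tag_name='key_1', tag_value='val_1')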
