I would like to get the location of a given blob storage container. I've instantiated my client:
blob_client = BlockBlobService(account_name='account_name', account_key='account_key')
and I was hoping I could call the get_container_properties() method on it, but it doesn't return very much information.
properties = blob_client.get_container_properties(container_name='container_name')
properties only contains etag, last_modified, lease, and public_access.
How do I get the location of the container?
I think by "the location of the container" you mean the storage URL of the container. I checked the Container class in the Blob Storage Java SDK and the .NET SDK, and both expose a url property.
However, in the Python SDK there is no such url property on the azure.storage.blob.models.Container class.
I also checked the Get Container REST API, and there is still no url property in the response body.
It turns out the URL is actually stitched together in the client code.
You can build the container URL yourself in Python:
container_url = 'https://' + account_name + '.blob.core.windows.net/' + container_name
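If you happen to be on the newer azure-storage-blob v12 package (the question uses the legacy BlockBlobService), the ContainerClient exposes the stitched URL directly. A minimal sketch, with the account name, container name, and key as placeholders:

from azure.storage.blob import ContainerClient

# placeholders: substitute your own account URL, container name, and key
container_client = ContainerClient(
    account_url='https://account_name.blob.core.windows.net',
    container_name='container_name',
    credential='account_key',
)
print(container_client.url)  # https://account_name.blob.core.windows.net/container_name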
Hope it helps you.
You can't use the Python storage client library to get the account region, since that library is for data-access operations only. To get account properties, use the Python Storage Resource Provider client library: https://azure.microsoft.com/pt-br/resources/samples/storage-python-manage/
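If it is really the account's region (location) you are after, a minimal sketch using the management library could look like this (assuming azure-mgmt-storage and azure-identity are installed; the subscription, resource group, and account names are placeholders):

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

# placeholders: substitute your own subscription, resource group, and account names
storage_client = StorageManagementClient(DefaultAzureCredential(), 'subscription_id')
account = storage_client.storage_accounts.get_properties('resource_group_name', 'account_name')
print(account.location)  # the Azure region the account (and its containers) lives in, e.g. 'westeurope'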
I am using the Python SDK to copy blobs from one container to another. Here is the code:
from azure.storage.blob import BlobServiceClient
src_blob = '{0}/{1}'.format(src_url, blob_name)
destination_client = BlobServiceClient.from_connection_string(connectionstring)
copied_blob = destination_client.get_blob_client(dst_container, b_name)
copied_blob.start_copy_from_url(src_blob)
It throws the error below:
Content: <?xml version="1.0" encoding="utf-8"?><Error><Code>CannotVerifyCopySource</Code><Message>Public access is not permitted on this storage account.
I have already gone through this post here, and in my case public access is disabled.
I do not have sufficient privileges to enable public access on the storage account and test. Is there a workaround to accomplish the copy without changing that setting?
Azcopy 409 Public access is not permitted on this storage account
Do I need to change the way I connect to the account?
When copying a blob across storage accounts, the source blob must be publicly accessible so that the Azure Storage service can read it. You were getting the error because you were using just the blob's URL; if the blob is in a private container, the Azure Storage service won't be able to access it using the URL alone.
To fix this issue, you would need to generate a SAS token on the source blob with at least Read permission and use that SAS URL as copy source.
So your code would be something like:
src_blob_sas_token = generate_sas_token_somehow()
src_blob = '{0}/{1}?{2}'.format(src_url, blob_name, src_blob_sas_token)
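For example, here is a minimal sketch of what generate_sas_token_somehow() could be, using the v12 azure-storage-blob package (the source account name, container, and key below are assumptions you would replace with your own):

from datetime import datetime, timedelta
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

def generate_sas_token_somehow():
    # placeholders: substitute the source account, container, and key
    return generate_blob_sas(
        account_name='src_account_name',
        container_name='src_container_name',
        blob_name=blob_name,
        account_key='src_account_key',
        permission=BlobSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(hours=1),
    )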
Check the permissions of your SAS token.
In your example, it doesn't look like you are passing the SAS token.
I'm trying to use Python 3 to set the Azure storage account property Allow Blob public access.
I couldn't find any information online on how to implement this via Python.
I did find a solution via PowerShell: https://learn.microsoft.com/en-us/azure/storage/blobs/anonymous-read-access-configure?tabs=powershell
I'm looking for a solution for Python 3...
Thanks!
The Allow Blob public access feature was newly added in the latest Python SDK, azure-mgmt-storage 16.0.0.
When using this feature, you need to add this line to your code:
from azure.mgmt.storage.v2019_06_01.models import StorageAccountUpdateParameters
Here is an example that works on my side:
from azure.identity import ClientSecretCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.v2019_06_01.models import StorageAccountUpdateParameters

subscription_id = "xxxxxxxx"

creds = ClientSecretCredential(
    tenant_id="xxxxxxxx",
    client_id="xxxxxxxx",
    client_secret="xxxxxxx"
)

resource_group_name = "xxxxx"
storage_account_name = "xxxx"

storage_client = StorageManagementClient(creds, subscription_id)

# set the allow_blob_public_access setting here
p1 = StorageAccountUpdateParameters(allow_blob_public_access=False)

# then use the update method to apply this setting
storage_client.storage_accounts.update(resource_group_name, storage_account_name, p1)
I haven't tried this myself, but looking at the Python Storage Management SDK and the REST API this should be possible.
Look here for an example on how to create a new storage account using the Python SDK. As you can see, the request body seems to be pretty much exactly what gets passed on to the underlying REST API.
That API supports the optional parameter properties.allowBlobPublicAccess, so you should be able to set it directly from Python as well.
I have created an Azure Function that is triggered when a new file is added to my Blob Storage. This part works well!
BUT now I would like to start the Azure "Speech-To-Text" service using the API. So I try to create a URI pointing to my new blob and add it to the API call. To do so, I created a SAS token (from the Azure portal) and appended it to my new blob path.
https://myblobstorage...../my/new/blob.wav?[SAS Token generated]
By doing so, I get an error which says:
Authentification failed Invalid URI
What am I missing here ?
N.B.: When I generate the SAS token manually from Azure Storage Explorer, everything works well. Also, my token was not expired during my test.
Thank you for your help !
You might have generated the SAS token with the wrong permitted resource types.
Make sure the Object option is checked.
Here is the explanation from the docs:
Service (s): Access to service-level APIs (e.g., Get/Set Service Properties, Get Service Stats, List Containers/Queues/Tables/Shares)
Container (c): Access to container-level APIs (e.g., Create/Delete Container, Create/Delete Queue, Create/Delete Table, Create/Delete Share, List Blobs/Files and Directories)
Object (o): Access to object-level APIs for blobs, queue messages, table entities, and files (e.g., Put Blob, Query Entity, Get Messages, Create File, etc.)
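If you prefer to generate the token in code instead of the portal, here is a minimal sketch of an account SAS with the Object resource type enabled, assuming the v12 azure-storage-blob package and placeholder account credentials:

from datetime import datetime, timedelta
from azure.storage.blob import generate_account_sas, ResourceTypes, AccountSasPermissions

# placeholders: substitute your own account name and key
sas_token = generate_account_sas(
    account_name='account_name',
    account_key='account_key',
    resource_types=ResourceTypes(object=True),
    permission=AccountSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(hours=1),
)
blob_url_with_sas = 'https://account_name.blob.core.windows.net/container/blob.wav?' + sas_token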
Right now, to create a Google Storage client, I'm using:
client = storage.Client.from_service_account_json('creds.json')
But I need to change the client dynamically and would prefer not to store auth files on the local filesystem.
So, is there another way to connect by passing the credentials as a variable?
Something like this for AWS and boto3:
iam_client = boto3.client(
    'iam',
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY
)
I guess I'm missing something in the docs and would be happy if someone could point me to where I can find this.
If you want to use built-in methods, an option could be to construct the Cloud Storage Client with explicit credentials; these two links can be helpful for doing that.
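For example, a minimal sketch that builds the credentials from an in-memory dict (here creds_dict is assumed to hold the parsed service-account JSON, e.g. loaded from a secret store rather than the local filesystem):

from google.cloud import storage
from google.oauth2 import service_account

# creds_dict is assumed to be the parsed service-account JSON
credentials = service_account.Credentials.from_service_account_info(creds_dict)
client = storage.Client(project=creds_dict['project_id'], credentials=credentials)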
Another possible option, to avoid storing auth files locally, is to use an environment variable pointing to credentials kept outside of your application code, such as in Cloud Key Management Service. For more context, you can take a look at this article.
I have created a Google bucket with "Fine-grained access control" and a few users have uploaded files to it. Using the Python API, I can't seem to get any information on who uploaded each file. The blob.owner property just returns None:
sclient = storage.Client(project=GCLOUD_PROJECT)
bucket = storage.bucket.Bucket(client=sclient, name=GCLOUD_BUCKET)
blob = bucket.get_blob('foo.bar')
blob.reload()
print(blob.owner)
I'm calling reload() there because the documentation states it's required to pull some attributes from the server. All other properties I try print fine (size, updated, etag, md5_hash, etc.).
How can I recover the uploader identification?
For anyone running into this, I created a ticket and it was fixed.
https://github.com/googleapis/python-storage/issues/136