How do I get the URL of an uploaded file? - python

I have uploaded an mp4 file as follows:
import firebase_admin
from firebase_admin import credentials
from firebase_admin import storage
cred = credentials.Certificate('my-app-service.json')
firebase_admin.initialize_app(cred, {
    'storageBucket': 'amy-app-name.appspot.com'
})
bucket = storage.bucket()
blob = bucket.blob('teamfk.mp4')
blob.upload_from_filename('path/to/teamfk.mp4')
Now I can't find the syntax to get the URL of the uploaded file.
To add, I should be able to view/download it from a browser.
It need not be authenticated; public access is fine.

As per the Google Cloud Storage docs,
the public URL of the file can be retrieved with:
blob.make_public()
blob.public_url
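Putting that together with the upload code from the question, a minimal sketch (it reuses the question's service-account file, bucket and blob names; note that make_public() only works when the bucket allows object ACLs, i.e. uniform bucket-level access is disabled):
import firebase_admin
from firebase_admin import credentials, storage

cred = credentials.Certificate('my-app-service.json')
firebase_admin.initialize_app(cred, {
    'storageBucket': 'amy-app-name.appspot.com'
})

bucket = storage.bucket()
blob = bucket.blob('teamfk.mp4')
blob.upload_from_filename('path/to/teamfk.mp4')

# Make the object publicly readable, then read its public URL.
blob.make_public()
print(blob.public_url)  # e.g. https://storage.googleapis.com/amy-app-name.appspot.com/teamfk.mp4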

Here is another way. If you want a URL that is only valid for a specific time range, you can generate a signed URL:
import datetime

file_url = blob.generate_signed_url(datetime.timedelta(days=1), method='GET')  # this URL is only valid for 1 day
For more details, refer to this link :)
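For completeness, a small runnable sketch; it assumes the app was initialized with the service-account certificate from the question (signing needs a private key) and uses requests only to verify that the URL works:
import datetime
import requests  # assumption: requests is installed, used only to test the URL

from firebase_admin import storage

bucket = storage.bucket()
blob = bucket.blob('teamfk.mp4')

# URL valid for one day; the object itself can stay private.
file_url = blob.generate_signed_url(datetime.timedelta(days=1), method='GET')

response = requests.get(file_url)
response.raise_for_status()
print(len(response.content), 'bytes downloaded')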

How to create a Google Cloud Storage access token programmatically in Python

I need a public URL for a file that I am creating inside a Google Cloud Function.
I therefore want to create an access token.
I am able to upload the file from a Python Cloud Function with blob.upload_from_string(blob_text), but I do not know how I can create a public URL (or an access token) for it.
Could you help me with it?
EDIT with the answer (almost a copy-paste from Marc Anthony B's answer):
from uuid import uuid4

blob = bucket.blob(storage_path)
token = str(uuid4())  # use the string form so it serializes cleanly as metadata
metadata = {"firebaseStorageDownloadTokens": token}
blob.metadata = metadata
download_url = 'https://firebasestorage.googleapis.com/v0/b/{}/o/{}?alt=media&token={}' \
    .format(bucket.name, storage_path.replace("/", "%2F"), token)
with open(video_file_path, 'rb') as f:
    blob.upload_from_file(f)
Firebase Storage for Python still doesn't have its own SDK, but you can use firebase-admin instead. The Firebase Admin SDK depends on the Google Cloud Storage client library to provide Cloud Storage access, so the bucket references returned by the Admin SDK are objects defined in that library.
When uploading an object to Firebase Storage, you must incorporate a custom access token (the firebaseStorageDownloadTokens metadata key) if you want a Firebase-style download URL. You may use UUID4 for this. See the code below:
import firebase_admin
from firebase_admin import credentials
from firebase_admin import storage
from uuid import uuid4

projectId = '<PROJECT-ID>'
storageBucket = '<BUCKET-NAME>'

cred = credentials.ApplicationDefault()
firebase_admin.initialize_app(cred, {
    'projectId': projectId,
    'storageBucket': storageBucket
})

bucket = storage.bucket()

# E.g: "upload/file.txt"
bucket_path = "<BUCKET-PATH>"
blob = bucket.blob(bucket_path)

# Create a token from a UUID. Technically, you can use any string as your token.
token = str(uuid4())
metadata = {"firebaseStorageDownloadTokens": token}

# Assign the token as metadata
blob.metadata = metadata
blob.upload_from_filename(filename="<FILEPATH>")

# Make the file public (OPTIONAL). To be used for the Cloud Storage URL.
blob.make_public()

# Fetch a public URL from GCS.
gcs_storageURL = blob.public_url

# Generate a URL with the access token from Firebase.
firebase_storageURL = 'https://firebasestorage.googleapis.com/v0/b/{}/o/{}?alt=media&token={}'.format(
    storageBucket, bucket_path, token)

print({
    "gcs_storageURL": gcs_storageURL,
    "firebase_storageURL": firebase_storageURL
})
As you can see from the code above, I've shown both the GCS and the Firebase URLs. If you want a public URL from GCS, you should make the object public using the make_public() method. If you want to use the generated access token instead, just build the default Firebase URL from the required variables.
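One detail worth noting, visible in the question's edit above: the object path in the Firebase URL must be URL-encoded, so slashes become %2F. A small sketch with a hypothetical helper (standard library only):
from urllib.parse import quote

def firebase_download_url(bucket_name, object_path, token):
    # quote(..., safe='') also encodes '/', so "upload/file.txt" becomes "upload%2Ffile.txt"
    return ('https://firebasestorage.googleapis.com/v0/b/{}/o/{}?alt=media&token={}'
            .format(bucket_name, quote(object_path, safe=''), token))

print(firebase_download_url('<BUCKET-NAME>', 'upload/file.txt', '<TOKEN>'))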
If the objects are already in Firebase Storage and already have access tokens in their metadata, you can retrieve them by reading the object's metadata. See the code below:
# E.g: "upload/file.txt"
bucket_path = "<BUCKET-PATH>"
blob = bucket.get_blob(bucket_path)
# Fetches object metadata
metadata = blob.metadata
# Firebase Access Token
token = metadata['firebaseStorageDownloadTokens']
firebase_storageURL = 'https://firebasestorage.googleapis.com/v0/b/{}/o/{}?alt=media&token={}'.format(storageBucket, bucket_path, token)
print(firebase_storageURL)
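To confirm the token URL works, you can fetch it over plain HTTP; a small sketch (requests is an assumption, and remember to URL-encode the path as noted above if it contains slashes):
import requests

response = requests.get(firebase_storageURL)
response.raise_for_status()
print(len(response.content), 'bytes downloaded')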
For more information, you may check out this documentation:
Google Cloud Storage Library for Python
Introduction to the Admin Cloud Storage API

Read JSON file from Firebase Storage using Admin SDK in Python without download

I am trying, through Python, to read a JSON file stored in Firebase Storage via the Admin SDK. I have already read the following question:
How to download file from firebase storage in python using firebase_admin
but in that case the file is downloaded to another file. Is it possible instead to read it without downloading it and without making it public? Or, alternatively, is it possible to write it to a temporary file that gets deleted once Python has printed it? Thank you.
My code:
import firebase_admin
from firebase_admin import credentials, storage
cred = credentials.Certificate('serviceAccount.json')
firebase_admin.initialize_app(cred, {
    "storageBucket": "**.appspot.com"
})
source_blob_name = "UQ4ADY.json"
#The path to which the file should be downloaded
destination_file_name = r"temp\ftmp.json"
bucket = storage.bucket()
blob = bucket.blob(source_blob_name)
blob.download_to_filename(destination_file_name)
Be aware that there may be a lot of requests, which is why I prefer not to download the files.
After a comment by John Hanley, I used the download_as_text function:
import firebase_admin
from firebase_admin import credentials, storage

cred = credentials.Certificate('serviceAccount.json')
firebase_admin.initialize_app(cred, {
    "storageBucket": "**.appspot.com"
})

source_blob_name = "UQ4ADY.json"
bucket = storage.bucket()
blob = bucket.blob(source_blob_name)
# Read the file contents directly as a string, without writing anything to disk.
data = blob.download_as_text()
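Since the file is JSON, the returned string can then be parsed in memory, for example:
import json

parsed = json.loads(data)  # 'data' is the string returned by download_as_text()
print(parsed)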

Is there a temporary directory or direct way to upload a file in Azure Storage?

So I'm trying to make a Python API so the user can upload a PDF file, and the API then sends it directly to Azure Storage. What I found is that I must have a local file path, i.e.:
from azure.storage.blob import ContainerClient

container_client = ContainerClient.from_connection_string(conn_str=conn_str, container_name='mycontainer')
with open('mylocalpath/myfile.pdf', "rb") as data:
    container_client.upload_blob(name='myblockblob.pdf', data=data)
Another solution is to store it on the VM and then point the local path at it, but I don't want to fill up my VM.
If you want to upload directly from the client side to an Azure Storage blob, instead of receiving the file in your API, you can use a shared access signature (SAS) on your storage account. Your API can expose a function that generates a pre-signed URL from that SAS and returns it to the client; the client can then upload the file to your blob via that URL.
To generate the URL, you can follow the code below:
from datetime import datetime, timedelta
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

account_name = "<storage-account-name>"
account_key = "<account-key>"  # get this from the Access keys section of the storage account in the Azure portal
container_name = "<container-name>"

def getpushurl(filename):
    token = generate_blob_sas(
        account_name=account_name,
        container_name=container_name,
        account_key=account_key,
        permission=BlobSasPermissions(write=True),
        expiry=datetime.utcnow() + timedelta(seconds=100),
        blob_name=filename,
    )
    url = f"https://{account_name}.blob.core.windows.net/{container_name}/{filename}?{token}"
    return url

pdfpushurl = getpushurl("demo.pdf")
print(pdfpushurl)
So after generating this URL, give it to the client; the client can then send the file to that URL with a PUT request, and it will be uploaded directly to Azure Storage.
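For illustration, a hedged sketch of that client side in Python (requests, the filename and the local path are assumptions; the x-ms-blob-type header is required by the Put Blob operation the URL points at):
import requests

upload_url = getpushurl("demo.pdf")

with open("mylocalpath/demo.pdf", "rb") as data:
    response = requests.put(
        upload_url,
        data=data,
        headers={"x-ms-blob-type": "BlockBlob"},  # required when PUTting straight to a blob URL
    )
response.raise_for_status()
print(response.status_code)  # expect 201 Created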
You can generate a SAS token with write permission for your users so that they can upload .pdf files directly from their side without storing them on the server. For details, please see my previous post here.
Try the code below to generate a SAS token with container write permission:
from azure.storage.blob import BlobServiceClient,ContainerSasPermissions,generate_container_sas
from datetime import datetime, timedelta
storage_connection_string=''
container_name = ''
block_blob_service = BlobServiceClient.from_connection_string(storage_connection_string)
container_client = block_blob_service.get_container_client(container_name)
sasToken = generate_container_sas(account_name=container_client.account_name,
container_name=container_client.container_name,
account_key= container_client.credential.account_key,
#grant write permission only
permission=ContainerSasPermissions(write=True),
start=datetime.utcnow() - timedelta(minutes=1),
#1 hour vaild time
expiry=datetime.utcnow() + timedelta(hours=1)
)
print(sasToken)
After you have returned this SAS token to your user, see this official guide to uploading files from an HTML page; I think it will be helpful if you are developing a web app.
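If the client is itself Python rather than a browser, a hedged sketch of using that container SAS with the SDK (the blob name and local path below are placeholders, not part of the original answer):
from azure.storage.blob import BlobClient

blob_url = (f"https://{container_client.account_name}.blob.core.windows.net/"
            f"{container_name}/myblockblob.pdf?{sasToken}")

# The SAS in the URL is the only credential needed on the client side.
blob_client = BlobClient.from_blob_url(blob_url)
with open("mylocalpath/myfile.pdf", "rb") as data:
    blob_client.upload_blob(data, overwrite=True)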

Trouble with Google Application Credentials

Hi there. First and foremost, this is my first time using Google's services. I'm trying to develop an app with the Google AutoML Vision API (custom model). I have already built a custom model and generated the API keys (I hope I did it correctly, though).
I've made many attempts at developing via Ionic & Android, all failing to connect to the API.
I have now taken the provided prediction-modelling code in Python (on Google Colab), and even with that I still get an error message saying Could not automatically determine credentials. I'm not sure where I have gone wrong in this. Please help. Dying.
# installing & importing libraries
!pip3 install google-cloud-automl

import sys
from google.cloud import automl_v1beta1
from google.cloud.automl_v1beta1.proto import service_pb2

# import key.json file generated by GOOGLE_APPLICATION_CREDENTIALS
from google.colab import files
credentials = files.upload()

# explicit function given by Google:
# https://cloud.google.com/docs/authentication/production#auth-cloud-implicit-python
def explicit():
    from google.cloud import storage
    # Explicitly use service account credentials by specifying the private key file.
    storage_client = storage.Client.from_service_account_json(credentials)
    # Make an authenticated API request
    buckets = list(storage_client.list_buckets())
    print(buckets)

# import image for prediction
from google.colab import files
YOUR_LOCAL_IMAGE_FILE = files.upload()

# prediction code from modelling
def get_prediction(content, project_id, model_id):
    prediction_client = automl_v1beta1.PredictionServiceClient()
    name = 'projects/{}/locations/uscentral1/models/{}'.format(project_id, model_id)
    payload = {'image': {'image_bytes': content}}
    params = {}
    request = prediction_client.predict(name, payload, params)
    return request  # waits till request is returned

# print function substitute with values
content = YOUR_LOCAL_IMAGE_FILE
project_id = "REDACTED_PROJECT_ID"
model_id = "REDACTED_MODEL_ID"
print(get_prediction(content, project_id, model_id))
Error message when I run the last line of code:
credentials = files.upload()
storage_client = storage.Client.from_service_account_json(credentials)
These two lines are the issue, I think.
The first one actually loads the contents of the file, but the second one expects a path to a file, not the contents.
Let's tackle the first line first:
Just passing the credentials you get after calling credentials = files.upload() will not work, as explained in the docs for it. The way you're doing it, credentials doesn't actually contain the contents of the file directly, but rather a dictionary mapping filenames to contents.
Assuming you're only uploading the one credentials file, you can get its contents like this (stolen from this SO answer):
from google.colab import files

uploaded = files.upload()
# dict keys aren't indexable in Python 3, so convert to a list (or use next(iter(uploaded)))
credentials_as_string = uploaded[list(uploaded.keys())[0]]
So now we actually have the contents of the uploaded file as a string; the next step is to create an actual credentials object out of it.
This answer on GitHub shows how to create a credentials object from a string converted to JSON.
import json
from google.oauth2 import service_account
credentials_as_dict = json.loads(credentials_as_string)
credentials = service_account.Credentials.from_service_account_info(credentials_as_dict)
Finally, we can create the storage client object using this credentials object:
from google.cloud import storage

storage_client = storage.Client(credentials=credentials)
Please note I've not tested this though, so please give it a go and see if it actually works.
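For reference, an untested end-to-end sketch of the steps above (it assumes exactly one service-account JSON file is uploaded; passing the project explicitly is an extra precaution, not something from the original snippets):
import json

from google.colab import files
from google.cloud import storage
from google.oauth2 import service_account

uploaded = files.upload()
credentials_as_string = uploaded[list(uploaded.keys())[0]]

credentials = service_account.Credentials.from_service_account_info(
    json.loads(credentials_as_string))

# project_id comes straight from the service-account key file
storage_client = storage.Client(project=credentials.project_id, credentials=credentials)
print(list(storage_client.list_buckets()))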

How to retrieve image from Firebase Storage using Python?

I have already stored my image in Firebase Storage, and I need to fetch it using Python code. Can I retrieve the image by using a URL? Or is there any other way to retrieve it?
Here is an image of how I store it in Firebase Storage:
This is what I use. Hope it helps.
import datetime

import firebase_admin
from firebase_admin import credentials
from firebase_admin import storage

# Fetch the service account key JSON file contents
cred = credentials.Certificate("credentials.json")

# Initialize the app with a service account, granting admin privileges
app = firebase_admin.initialize_app(cred, {
    'storageBucket': '<BUCKET_NAME>.appspot.com',
}, name='storage')

bucket = storage.bucket(app=app)
blob = bucket.blob("<your_blob_path>")
print(blob.generate_signed_url(datetime.timedelta(seconds=300), method='GET'))
It generates a signed URL, valid for 300 seconds, that anyone can use to download the file during that window.
For example, in my case, I use that URL to display stored pictures on my Django website with an <img> tag.
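If you want to fetch the bytes from Python instead of a browser, a hedged sketch using that signed URL (requests and the local filename are assumptions for illustration):
import requests

url = blob.generate_signed_url(datetime.timedelta(seconds=300), method='GET')

response = requests.get(url)
response.raise_for_status()
with open('downloaded_image.jpg', 'wb') as f:  # example filename
    f.write(response.content)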
Here is the doc for more useful functions.
