I want to upload images from my mobile application to Firebase Storage using a Flask API.
The app uploads a list of ints that represents the image. The list is really long, over a thousand elements I believe. I'm sending this list of ints in the request JSON.
How can I upload this list of ints to Firebase Storage using the Firebase Admin SDK from Flask?
Is using a list of ints to upload the image even the correct way? If not, how can I upload the image through the Flask API?
This is the code I have so far, and it's a mess :) :
def post(self):
    request_data = request.get_json()
    baqala_id = request_data["userID"]
    imageBytes = request_data["imageBytes"]
    # imageFile = request_data["imageFile"]
    bucket = storage.bucket("bucket_name.appspot.com")
    blob = bucket.blob(f"{baqala_id}/{imageBytes}")
    # blob = bucket.blob(f"{baqala_id}/{imageFile}")
    upload = blob.upload_from_string(imageBytes)
    # upload = blob.upload_from_file(imageFile)
    return {"message": "upload successful"}, 200
Note: I can't use the Firebase plugin for Flutter because it's too slow and cannot be used in a separate Isolate.
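A minimal sketch of how the JSON int list could be turned back into raw bytes before uploading (the bucket name, blob path, and content type here are assumptions; sending the image as base64 text instead of a JSON int array would also shrink the payload considerably):

```python
# Sketch: rebuild raw image bytes from the JSON list of ints.
int_list = [137, 80, 78, 71, 13, 10, 26, 10]  # example: the PNG file signature
image_bytes = bytes(int_list)                 # ints (0-255) -> raw bytes

# With firebase_admin initialized, the upload itself would then look like:
# bucket = storage.bucket("your-bucket.appspot.com")   # placeholder bucket name
# blob = bucket.blob(f"{baqala_id}/image.png")         # store under a filename, not the bytes
# blob.upload_from_string(image_bytes, content_type="image/png")
print(image_bytes[:4])  # b'\x89PNG'
```

Note that the blob path should be a filename, not the byte list itself as in the snippet above.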
I'm trying to build a Python API so the user can upload a PDF file and the API sends it directly to Azure Storage. What I found is that I must have a local file path, i.e.:
container_client = ContainerClient.from_connection_string(conn_str=conn_str, container_name='mycontainer')
with open('mylocalpath/myfile.pdf', "rb") as data:
    container_client.upload_blob(name='myblockblob.pdf', data=data)
Another solution is to store the file on my VM and then point the local path at it, but I don't want to fill up my VM's disk.
If you want to upload it directly from the client side to an Azure Storage blob, instead of receiving the file in your API, you can use a shared access signature (SAS) on your storage account. From your API, make a function that generates a pre-signed URL using that shared access signature and return that URL to your client; the client can then upload the file to your blob via that URL.
To generate the URL, you can follow the code below:
from datetime import datetime, timedelta
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

account_name = "<accountname>"    # your storage account name
account_key = "<accountkey>"      # get this from the Access keys section of the storage account
container_name = "<containername>"

def getpushurl(filename):
    token = generate_blob_sas(
        account_name=account_name,
        container_name=container_name,
        account_key=account_key,
        permission=BlobSasPermissions(write=True),
        expiry=datetime.utcnow() + timedelta(seconds=100),
        blob_name=filename,
    )
    url = f"https://{account_name}.blob.core.windows.net/{container_name}/{filename}?{token}"
    return url

pdfpushurl = getpushurl("demo.text")
print(pdfpushurl)
So after generating this URL, give it to the client; the client can then send the file to that URL with a PUT request and it will get uploaded directly to Azure Storage.
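A sketch of what that client-side PUT could look like in Python (the SAS URL below is a placeholder; the x-ms-blob-type header is required when creating a block blob with a single PUT):

```python
import urllib.request

# Placeholder SAS URL of the form returned by a getpushurl()-style endpoint
sas_url = "https://myaccount.blob.core.windows.net/mycontainer/demo.pdf?sv=PLACEHOLDER"

req = urllib.request.Request(
    sas_url,
    data=b"%PDF-1.4 ...",                     # the raw file bytes
    method="PUT",
    headers={"x-ms-blob-type": "BlockBlob"},  # required for a single-PUT block blob
)
# urllib.request.urlopen(req) would perform the actual upload
print(req.get_method(), req.get_header("X-ms-blob-type"))  # PUT BlockBlob
```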
You can generate a SAS token with write permission for your users so that they can upload .pdf files directly from their side without storing the files on your server. For details, please see my previous post here.
Try the code below to generate a SAS token with container write permission:
from azure.storage.blob import BlobServiceClient, ContainerSasPermissions, generate_container_sas
from datetime import datetime, timedelta

storage_connection_string = ''
container_name = ''

block_blob_service = BlobServiceClient.from_connection_string(storage_connection_string)
container_client = block_blob_service.get_container_client(container_name)

sasToken = generate_container_sas(account_name=container_client.account_name,
                                  container_name=container_client.container_name,
                                  account_key=container_client.credential.account_key,
                                  # grant write permission only
                                  permission=ContainerSasPermissions(write=True),
                                  start=datetime.utcnow() - timedelta(minutes=1),
                                  # 1 hour valid time
                                  expiry=datetime.utcnow() + timedelta(hours=1)
                                  )
print(sasToken)
After you have returned this SAS token to your user, just see this official guide to upload files from an HTML page; I think it would be helpful if you are developing a web app.
I am trying to use Python, the Reddit API, and Firebase to store images from Reddit in Firebase Storage and then put the URL from Firebase Storage in a doc.
Right now I am just storing the URL that the Reddit API provides in my project's Firebase database. But this is a problem, since sometimes these URLs go bad and expire, meaning that when I grab a doc and display the stored URL I get an error.
This is how I am getting the information before creating the doc in Firebase:
function getJSON(sub){
    var ret;
    var yourUrl = "https://www.reddit.com/r/" + sub + ".json";
    fetch(yourUrl).then(response => {
        return response.json();
    }).then(function(data){
        var i = 5;
        list = data.data.children;
        curlist = list[i].data;
        if (curlist.domain === 'i.redd.it'){
            if (curlist.post_hint === 'image'){
                var url = curlist.url;
                var author = curlist.author;
                var time = curlist.created_utc;
                var score = curlist.score;
                sendToFirebase(url, author, sub, time, score);
            }
        }
    }).catch(err => {
    });
}
Instead of putting the Reddit URL in the doc, I would like to take that URL, store the image it points to in my Firebase Storage, and then put the Firebase Storage image URL in the doc.
The Firebase SDKs require that the image data is available on the local client to be able to upload it to Cloud Storage. There is no method that takes an external image URL and uploads the data it points to.
If the image is not local to the client, you will first need to download it to the client, before uploading it to Cloud Storage with the Firebase SDK.
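A server-side sketch of that download-then-upload pattern (the image URL, bucket, and blob path are placeholders; Reddit generally rejects requests without a User-Agent header):

```python
import urllib.request

# Placeholder image URL of the kind the Reddit API returns
image_url = "https://i.redd.it/example.jpg"

# 1. Download the image to this client first
req = urllib.request.Request(image_url, headers={"User-Agent": "my-image-bot/0.1"})
# image_bytes = urllib.request.urlopen(req).read()   # the actual download

# 2. Then upload those bytes with the Firebase Admin SDK:
# blob = bucket.blob(f"reddit/{author}/{int(time)}.jpg")   # hypothetical path
# blob.upload_from_string(image_bytes, content_type="image/jpeg")
print(req.full_url)  # https://i.redd.it/example.jpg
```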
I have two API endpoints: one that takes a file from an HTTP request and uploads it to a Google Cloud bucket using the Python API, and another that downloads it again. In the first view, I get the file's content type from the HTTP request and upload the file to the bucket, setting that metadata:
from os import path

from google.cloud import storage

file_obj = request.FILES['file']
client = storage.Client.from_service_account_json(path.join(
    path.realpath(path.dirname(__file__)),
    '..',
    'settings',
    'api-key.json'
))
bucket = client.get_bucket('storage-bucket')
blob = bucket.blob(filename)
blob.upload_from_string(
    file_obj.read(),  # the uploaded file's bytes ("file_text" in the original snippet)
    content_type=file_obj.content_type
)
Then in another view, I download the file:
...
bucket = client.get_bucket('storage-bucket')
blob = bucket.blob(filename)
blob.download_to_filename(path)
How can I access the file metadata I set earlier (content_type)? It's not available on the blob object anymore, since a new one was instantiated, but the bucket still holds the file.
You should try:
blob = bucket.get_blob(blob_name)
blob.content_type
Unlike bucket.blob(), which only builds a local reference without contacting the API, bucket.get_blob() makes a request and populates the blob's metadata, including content_type.
Goal: Take/attach pictures in a PhoneGap application and send a public URL for each picture to a Google Cloud SQL database.
Question 1: Is there a way to create a Google Cloud Storage object from a base64 encoded image (in Python), then upload that object to a bucket and return a public link?
I'm looking to use PhoneGap to send images to a Python Google App Engine application, then have that application send the images to a Google Cloud Storage bucket I have set up, then return a public link back to the PhoneGap app. These images can either be taken directly in the app or attached from existing photos on the user's device.
I use PhoneGap's FileTransfer plugin to upload the images to GAE, which are sent as base64 encoded images (this isn't something I can control).
Based on what I've found in Google's docs, I can upload the images to the Blobstore; however, it requires <input type='file'> elements in a form. I don't have 'file' input elements; I just take the image URI returned from PhoneGap's camera object and display a thumbnail of the picture that was taken (or attached).
Question 2: Is it possible to have an <input type='file'> element and control its value? That is, is it possible to set its value based on whether the user chooses a file or takes a picture?
Thanks in advance!
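For the base64 part of Question 1: once the encoded string arrives at the App Engine handler, decoding it back to raw bytes is one line (the payload below is a stand-in, not a real image):

```python
import base64

# Stand-in for the base64 string a PhoneGap FileTransfer upload might deliver
b64_payload = base64.b64encode(b"\xff\xd8\xff\xe0 jpeg bytes").decode("ascii")

image_bytes = base64.b64decode(b64_payload)  # raw bytes, ready to write to a GCS object
print(image_bytes[:4])  # b'\xff\xd8\xff\xe0'
```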
Here's a solution for others who might face this problem. Turns out it's incredibly simple!
Once you have a bucket set up for your GAE project, you can use this Python code to send an image to the bucket:
import cloudstorage as gcs
import webapp2
import cgi
import MySQLdb
import os
import logging
import time
from google.appengine.api import mail
from google.appengine.api import images
from google.appengine.ext import blobstore

class UploadImageHandler(webapp2.RequestHandler):
    def post(self):
        # CORS header (ACCESS_CONTROL was an undefined constant in the original snippet)
        self.response.headers.add_header('Access-Control-Allow-Origin', '*')
        f = self.request.POST['image']
        fname = '/your-bucket-name/%s' % f.filename
        gcs_file = gcs.open(fname, 'w', content_type="image/jpeg")
        gcs_file.write(self.request.get('image'))
        gcs_file.close()
And the code used to upload the file from a PhoneGap application:
// Uploads images in "imageURIs" to the web service specified in "server".
function uploadImages(imageURIs, server) {
    var success = function(data) {
        alert("Successfully uploaded image!");
    };
    var fail = function(error) {
        alert("Failed to upload image: " + error);
    };
    var options = new FileUploadOptions();
    options.fileKey = "image";
    options.mimeType = "image/jpeg";
    var ft = new FileTransfer();
    for (var i = 0; i < imageURIs.length; i++) {
        alert("Uploading " + i);
        options.fileName = imageURIs[i].substr(imageURIs[i].lastIndexOf('/') + 1);
        ft.upload(imageURIs[i], encodeURI(server), success, fail, options);
    }
}
I hope it helps someone else. :)
Yes, that is a fine use for GAE and GCS. You do not need an <input type='file'>, per se; you can just set up POST parameters in your call to your GAE URL. Make sure you send a hidden key as well, and work from SSL-secured URLs, to prevent spammers from posting to your app.
I'm trying to upload an image to the Face.com API. It either takes a url to an image, or images can be uploaded directly. Their website says:
A request that uploads a photo must be formed as a MIME multi-part message sent using POST data. Each argument, including the raw image data, should be specified as a separate chunk of form data.
Problem is, I don't know exactly what that means. Right now my code looks like this:
import urllib
import json
apikey = "[redacted]"
secret = "[redacted]"
img = raw_input("Enter the URL of an image: ")
url = "http://api.face.com/faces/detect.json?api_key=" + apikey + "&api_secret=" + secret + "&urls=" + urllib.quote(img) + "&attributes=all"
data = json.loads(urllib.urlopen(url).read())
How can I convert this to work with a locally stored image?
The easiest way to upload a photo to the face.com API from Python is to use one of the Python client libraries, which can be downloaded from http://developers.face.com/download/.
There are two of them there. Both support uploading by passing a filename to the detect method (as a different param than urls).
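For reference, the multipart format the docs describe can also be built by hand with the standard library; a sketch (field names and the fake JPEG bytes here are illustrative):

```python
import uuid

def build_multipart(fields, file_field, filename, file_bytes):
    """Encode text fields plus one raw file as a multipart/form-data body,
    each argument as a separate chunk, as the face.com docs require."""
    boundary = uuid.uuid4().hex
    parts = []
    for name, value in fields.items():
        parts.append(
            (f'--{boundary}\r\nContent-Disposition: form-data; '
             f'name="{name}"\r\n\r\n{value}\r\n').encode()
        )
    parts.append(
        (f'--{boundary}\r\nContent-Disposition: form-data; '
         f'name="{file_field}"; filename="{filename}"\r\n'
         f'Content-Type: application/octet-stream\r\n\r\n').encode()
        + file_bytes + b"\r\n"
    )
    parts.append(f"--{boundary}--\r\n".encode())
    return b"".join(parts), f"multipart/form-data; boundary={boundary}"

body, ctype = build_multipart({"api_key": "KEY"}, "image", "face.jpg", b"\xff\xd8 fake jpeg")
# urllib.request.Request(url, data=body, headers={"Content-Type": ctype}) would POST it
print(ctype.split(";")[0])  # multipart/form-data
```

In modern code a library such as requests does this encoding for you via its files= parameter.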