How to upload video through Python GAE to Google Cloud Storage? - python

I have a Python GAE application I'm developing locally. I'd like to add a feature so that users can upload an image or video from their computer to Google Cloud Storage.
I've looked over the Google Cloud Storage documentation a few times. Perhaps I'm not smart enough to grasp the workings quickly.
I would really appreciate it if someone could walk through a very simple example of the entire process: a user uploading a file through a POST form, storing it in Google Cloud Storage, saving the path to the file in the NDB datastore, and finally retrieving the file and rendering it to the user.
Thanks a lot

The example here shows a direct upload to GCS using a form POST and a signed URL. After the upload, GCS uses a callback to send you the GCS object path.
A policy document defines what a user (with or without a Google account) can upload with a form POST.
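As a rough sketch of what signing a policy document can look like on the Python 2.7 App Engine runtime, using the app_identity API; the bucket, key prefix, expiry, and size cap below are illustrative assumptions, not values from the question:

```python
import base64
import json
import time

from google.appengine.api import app_identity


def signed_upload_policy(bucket, key_prefix):
    # Policy document: constrains what the form POST may upload.
    policy = {
        'expiration': time.strftime('%Y-%m-%dT%H:%M:%SZ',
                                    time.gmtime(time.time() + 3600)),
        'conditions': [
            {'bucket': bucket},
            ['starts-with', '$key', key_prefix],
            ['content-length-range', 0, 100 * 1024 * 1024],  # 100 MB cap
        ],
    }
    encoded_policy = base64.b64encode(json.dumps(policy))
    _, signature = app_identity.sign_blob(encoded_policy)
    return encoded_policy, base64.b64encode(signature)
```

The encoded policy and signature go into the form's policy and signature fields, GoogleAccessId is the app's service account (app_identity.get_service_account_name()), and the form posts directly to the bucket's storage.googleapis.com endpoint. The GCS object path that comes back in the callback can then be stored in an NDB StringProperty, and the file served later from its public or signed URL.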

Related

Django Google App Engine Upload files greater than 32mb

I have a Django Rest Framework project that I've integrated with django-storages to upload files to GCS. Everything works locally. However, Google App Engine imposes a hard limit of 32 MB on the size of each request, so I cannot upload any files greater than this limit.
I looked into many posts here on Stack Overflow and on the internet. Some of the solutions listed the use of the Blobstore API. However, I cannot find a way to integrate this into Django. Another solution describes the use of django-filetransfers, but that plugin is obsolete.
I would appreciate it if someone can point me towards an approach I can take to fix this problem.
PS: I would like to point out that the current setup works like this: a POST request sends the file up to the server, which then handles the process of storing the file in Google Cloud Storage. Since Google App Engine restricts request size to 32 MB, I cannot get to the point of receiving the file. So my issue is: how can I go about uploading these large files?
According to the official documentation [1], Cloud Storage can manage files of up to 5 TB in size. Nevertheless, it is recommended to take a look at the best practices document [2]. There is also an example of how to upload objects using Python here [3].
[1] https://cloud.google.com/storage/docs/json_api/v1/objects/insert
[2] https://cloud.google.com/storage/docs/best-practices#uploading
[3] https://cloud.google.com/storage/docs/uploading-objects#storage-upload-object-python
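A minimal sketch of the upload sample that [3] points to, using the google-cloud-storage client library (bucket and file names are placeholders):

```python
from google.cloud import storage


def upload_blob(bucket_name, source_file_name, destination_blob_name):
    # Upload a local file to the given GCS bucket.
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(source_file_name)


upload_blob('my-bucket', 'local/video.mp4', 'uploads/video.mp4')
```

Note that this runs server-side, so on its own it does not get around the 32 MB request limit; the bytes would still have to reach App Engine first. Letting the browser upload directly to GCS (for example with a signed URL, as in the first answer above) keeps large files out of the request entirely.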

Public URL to files in Google Cloud Storage from python backend (Google App Engine)

I'm developing an Android application, which communicates with a backend on Google App Engine written in Python. The user uploads and downloads files to Google Cloud Storage. So far, the files were being sent to the GAE backend by POST request and then saved in GCS. I want the user to upload directly to GCS (to avoid sending large files over POST). And (on a download request) I would like to send the user only a public URL to the file. There is a nice tutorial for it in PHP:
https://cloud.google.com/appengine/docs/php/googlestorage/user_upload and
https://cloud.google.com/appengine/docs/php/googlestorage/public_access, with the key sentence there: "Once the file is written to Cloud Storage as publicly readable, you need to get the public URL for the file, using CloudStorageTools::getPublicUrl." How do I do the same in Python?
The public URL of a file in GCS looks like this:
https://storage.googleapis.com/<appname>.appspot.com/<filename>
When I store files in GCS, I explicitly give the file a filename, so I can create a serving URL using the template above.
Are you giving a filename when you store files in GCS? If not, are you able to do so? Maybe provide details of how you are saving the files to GCS in your question to get a better answer.
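For illustration, a sketch of that pattern with the App Engine cloudstorage library, writing the object as publicly readable; the bucket and object names here are made up:

```python
import cloudstorage as gcs

BUCKET = '/myapp.appspot.com'            # hypothetical default GCS bucket
OBJECT = BUCKET + '/uploads/photo.jpg'

image_data = 'raw bytes of the uploaded image'  # placeholder payload

# Write the object with a public-read ACL so it is served without auth.
with gcs.open(OBJECT, 'w',
              content_type='image/jpeg',
              options={'x-goog-acl': 'public-read'}) as f:
    f.write(image_data)

public_url = 'https://storage.googleapis.com' + OBJECT
```

The resulting URL matches the template above, so storing the filename is all you need to reconstruct the serving URL later.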

How to allow users to download images from Google Cloud Storage

My Google App Engine app allows users to upload images to Google Cloud Storage. I use Python on the server.
What is the easiest way to allow users to download the images stored in Google Cloud Storage programmatically? JavaScript on the client side bars users from saving downloaded files.
The Python boto library seems like overkill. I would like to display a list of images a user is allowed to download, with a 'download' button beside each one. Thanks!
Whatever logic your app uses to determine which images the currently logged-in user is allowed to download, you just need to put on each corresponding "download button" a signed URL for the image within (assuming HTML5) an <a href="[[signed url]]" download="[[filename]]">[[image name/description]]</a> tag.
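A minimal sketch of generating such a signed URL with the google-cloud-storage client library; the bucket, object, and expiry are illustrative:

```python
import datetime

from google.cloud import storage

client = storage.Client()
blob = client.bucket('my-bucket').blob('images/photo.jpg')

# Time-limited GET URL the browser can follow without credentials.
url = blob.generate_signed_url(
    version='v4',
    expiration=datetime.timedelta(minutes=15),
    method='GET',
)
# Render as: <a href="{url}" download="photo.jpg">photo.jpg</a>
```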

user upload to my S3 bucket

I would like for a user, without having to have an Amazon account, to be able to upload multi-gigabyte files to an S3 bucket of mine.
How can I go about this? I want to enable a user to do this by giving them a key or perhaps through an upload form, rather than making a bucket world-writeable, obviously.
I'd prefer to use Python on my serverside, but the idea is that a user would need nothing more than their web browser or perhaps opening up their terminal and using built-in executables.
Any thoughts?
You are attempting to proxy the file through your Python backend to S3, and large files at that. Instead you can configure S3 to accept files from the user directly (without proxying through your backend code).
It is explained here: Browser Uploads to S3 using HTML POST Forms. This way your server need not handle any upload load at all.
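A sketch of the server-side half of that, using boto3's presigned POST support; the bucket name, key prefix, and size limit are placeholders:

```python
import boto3

s3 = boto3.client('s3')

# Presigned POST: the browser uploads straight to S3 using these fields.
post = s3.generate_presigned_post(
    Bucket='my-bucket',
    Key='uploads/${filename}',
    Conditions=[
        ['starts-with', '$key', 'uploads/'],
        ['content-length-range', 0, 5 * 1024 ** 3],  # cap at 5 GB
    ],
    ExpiresIn=3600,
)
# post['url'] is the form action; post['fields'] become hidden inputs,
# followed by the <input type="file" name="file"> field.
```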
If you also want your users to use an external identity (Google/Facebook, etc.) to achieve this workflow, that too is possible. They will be able to upload these files to a sub-folder (path) in your bucket without exposing other parts of your bucket. This is detailed here: Web Identity Federation with Mobile Applications. Though it says mobile, you can apply the same to web apps.
Having said all that, as @Ratan points out, large file uploads could break midway when attempted from a browser, and a browser can't retry only the failed parts. This is where the need for a dedicated app comes in. Another option is to ask your users to keep the files in their Dropbox/Box.com account and have your server read from there; these services already take care of large file uploads, with all the retries etc., using their apps.
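For what it's worth, a dedicated uploader would typically lean on S3's multipart upload, which boto3's managed transfer API handles automatically above a size threshold; the names here are illustrative:

```python
import boto3
from boto3.s3.transfer import TransferConfig

# Anything over 64 MB is split into parts, uploaded and retried independently.
config = TransferConfig(multipart_threshold=64 * 1024 * 1024)

s3 = boto3.client('s3')
s3.upload_file('big-video.iso', 'my-bucket', 'uploads/big-video.iso',
               Config=config)
```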
This answer is relevant to .NET as the language.
We had such a requirement, for which we created an executable. The executable internally called a web method, which validated whether the app was authenticated to upload files to AWS S3 or not.
You can do this using a web browser too, but I would not suggest it if you are targeting big files.

Uploading blob data using python in GAE

I have my simple application deployed and working. Instead of using forms, I want to run a script on my PC to upload data into the GAE blobstore. Bulkloader does it well when I upload data to the datastore, but I could not find any way to upload to the blobstore the same way.
I know that Bulkloader does not have the ability to upload blob data to the GAE blobstore. But I wonder: is there a way to upload blobs into the blobstore via Python scripts?
I have done this successfully from form-based applications, but now I am getting data from some other DB and want to upload the contents to the datastore and blobstore. Trying to upload blobs into the datastore directly got totally messed up.
So, is it possible to upload given attribute values of a table to the blobstore via some Python scripts?
You need to:
Create a Google Cloud project for the application. (via Application Settings for your app)
Upload your blobs to Google Cloud Storage.
Access them using the Blobstore API.
Details here:
https://developers.google.com/appengine/docs/python/blobstore/#Python_Using_the_Blobstore_API_with_Google_Cloud_Storage
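Once the object is in Cloud Storage (uploaded by your script, e.g. with gsutil or the client library), step 3 might look like the sketch below; the bucket and object names are hypothetical:

```python
from google.appengine.ext import blobstore

# '/gs/<bucket>/<object>' addresses a GCS object through the Blobstore API.
gs_path = '/gs/my-bucket/uploads/photo.jpg'

blob_key = blobstore.create_gs_key(gs_path)
# blob_key can now be used with BlobReader, send_blob(), etc.
```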
I was considering the same a few weeks ago, but I only had ~90 MB of static files. I ended up including them as static data in my app. The appcfg upload script is smart enough to detect when the same files have previously been uploaded and clones them as needed for subsequent uploads. That might be something you want to consider, depending on your situation.
