How to allow users to download images from Google Cloud Storage - python

My google app engine app allows users to upload images to Google Cloud Storage. I use python on the server.
What is the easiest way to allow users to programmatically download the images stored in Google Cloud Storage? JavaScript on the client side bars users from saving downloaded files.
The python boto library seems like overkill. I would like to display a list of images a user is allowed to download, with a 'download' button beside it. Thanks!

Whatever logic your app uses to determine which images the currently logged-in user is allowed to download, you just need to put on each corresponding "download button" a signed URL for the image within (assuming HTML5) an <a href="[signed url]" download="[filename]">[image name/description]</a> tag.
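A minimal sketch of generating those signed URLs with the google-cloud-storage client library (the bucket name and helper are hypothetical; depending on your environment, signing may require an explicit service-account key or the IAM signBlob permission):

    import datetime
    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("my-app-images")  # hypothetical bucket name

    def signed_download_links(object_names):
        """Return (name, signed_url) pairs to render as download buttons."""
        links = []
        for name in object_names:
            blob = bucket.blob(name)
            url = blob.generate_signed_url(
                version="v4",
                expiration=datetime.timedelta(minutes=15),  # short-lived link
                method="GET",
            )
            links.append((name, url))
        return links

In the template, each pair can then be rendered as <a href="{{ url }}" download="{{ name }}">{{ name }}</a>, which gives you the per-image download button.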

Related

How to allow a user to download a Google Cloud Storage file from Compute Engine without public access

I'm going to try and keep this as short as possible.
I have a compute engine instance, and it is running Python/Flask.
What I am trying to do, is allow a user to download a file from google cloud storage, however I do not want the file to be publicly accessible. Is there a way I can have my Compute instance stream the file from cloud storage for the user to download, and then have the file deleted from the compute instance after the user has finished downloading the file? I'd like the download to start immediately after they click the download button.
I am using the default app credentials.
subprocess is not an option.
SideNote:
Another way I was thinking about doing this was to allow each user, who is logged into the website, access to a specific folder on a bucket. However I am unsure if this would even be possible without having them log in with a Google account. This also seems like it would be a pain to implement.
@jterrace's answer is what you want.
Signed URLs can have a time limit associated with them. In your application you would create a signed URL for the file and do an HTTP redirect to said file.
https://cloud.google.com/storage/docs/access-control/create-signed-urls-program
If you are using the default Compute Engine service account (the one associated with your GCE instance) you should be able to sign just fine. Just follow the instructions on how to create the keys in the URL above.
You can do all kinds of awesome stuff this way, including allowing users to upload DIRECTLY to Google Cloud Storage! :)
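A rough sketch of that redirect flow on a Flask/GCE setup, assuming the google-cloud-storage library; the bucket name and route are illustrative only, and signing with the default Compute Engine service account needs either a downloaded key or the iam.serviceAccounts.signBlob permission:

    import datetime
    from flask import Flask, abort, redirect
    from google.cloud import storage

    app = Flask(__name__)
    client = storage.Client()  # picks up the instance's default credentials

    @app.route("/download/<path:object_name>")
    def download(object_name):
        # ...your own authentication/authorization check goes here...
        blob = client.bucket("my-private-bucket").blob(object_name)
        if not blob.exists():
            abort(404)
        url = blob.generate_signed_url(
            version="v4",
            expiration=datetime.timedelta(minutes=5),  # link expires quickly
            method="GET",
        )
        return redirect(url)  # the browser downloads straight from Cloud Storage

Nothing is ever written to the instance's disk, so there is no file to clean up afterwards.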
It sounds like you're looking for Signed URLs.
The service account associated with your Compute Engine instance will solve the problem.
Service accounts authenticate applications running on your virtual machine instances to other Google Cloud Platform services. For example, if you write an application that reads and writes files on Google Cloud Storage, it must first authenticate to the Google Cloud Storage API. You can create a service account and grant the service account access to the Cloud Storage API.
For historical reasons, all projects come with the Compute Engine default service account, identifiable using this email:
[PROJECT_NUMBER]-compute@developer.gserviceaccount.com
By default, the Compute Engine service account has read-only access to the Google Cloud Storage service, so Compute Engine can access your storage using the GCP client libraries.
gsutil is the command-line tool for Cloud Storage, which is very handy for trying out the various options storage offers.
Start by typing gsutil ls from your Compute Engine instance, which lists all the buckets in your Cloud Storage project.
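A rough Python equivalent of gsutil ls, assuming the google-cloud-storage library is installed on the instance and the default service account has at least read access:

    from google.cloud import storage

    client = storage.Client()
    for bucket in client.list_buckets():  # like: gsutil ls
        print(bucket.name)
        for blob in bucket.list_blobs(max_results=10):  # like: gsutil ls gs://bucket
            print("  gs://{}/{}".format(bucket.name, blob.name))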

How to upload video through Python GAE to Google Cloud Storage?

I have a python GAE application I'm developing locally. I'd like to add the feature that users can upload image or video from computer to Google Cloud Storage.
I've looked over the Google Cloud Storage documentation a few times. Perhaps I'm not smart enough to grasp the workings quickly.
I would really appreciate it if someone can run down a very simple example of the entire process, from user uploading file through a POST form, to storing it in the Google Cloud Storage, and also how to store the path to file in the NDB datastore, and finally how to retrieve file and render it to user.
Thanks a lot
Example here showing a direct upload to GCS using a form POST and a signed URL. After the upload, GCS uses a callback to send you the GCS object path.
A policy document defines what a user (with or without a Google account) can upload with a form POST.
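A sketch of generating such a policy document with a recent google-cloud-storage client (generate_signed_post_policy_v4); the bucket, object name, and size cap are hypothetical, and signing again requires a service-account key or the IAM signBlob permission:

    import datetime
    from google.cloud import storage

    client = storage.Client()
    policy = client.generate_signed_post_policy_v4(
        "my-upload-bucket",        # hypothetical bucket
        "uploads/user-video.mp4",  # hypothetical object name
        expiration=datetime.datetime.now(datetime.timezone.utc)
        + datetime.timedelta(minutes=10),
        conditions=[["content-length-range", 0, 500 * 1024 * 1024]],  # cap at 500 MB
    )

    # policy["url"] is the form action; policy["fields"] become hidden inputs.
    inputs = "".join(
        '<input type="hidden" name="{}" value="{}">'.format(k, v)
        for k, v in policy["fields"].items()
    )
    html_form = (
        '<form action="{}" method="POST" enctype="multipart/form-data">{}'
        '<input type="file" name="file"><input type="submit"></form>'
    ).format(policy["url"], inputs)

The file input must be named "file" and must come last in the form. Once the upload finishes, store the bucket/object path on an NDB entity, and render the image back to the user later via a signed GET URL.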

Secure access of webassets with Flask and AWS S3

I am trying to serve files securely (images in this case) to my users. I would like to do this using Flask and preferably Amazon S3; however, I would be open to another cloud storage solution if required.
I have managed to get my Flask static files like CSS and such on S3, however this is all non-secure, so everyone who has the link can open the static files. This is obviously not what I want for secure content. I can't seem to figure out how I can make a file available to just the authenticated user that 'owns' the file.
For example: when I log into my Dropbox account and copy a random file's download link, then go over to another computer and use this link, it will deny me access, even though I am still logged in and the download link is available to the user on the latter PC.
Make the request to your Flask application, which will authenticate the user and then issue a redirect to the S3 object. The trick is that the redirect should be to a signed temporary URL that expires in a minute or so, so it can't be saved and used later or by others.
You can use the boto.s3.key.Key.generate_url method in your Flask app to create the temporary URL.
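A minimal sketch with the classic boto library named above (boto3's generate_presigned_url gives you the same idea); the bucket name and route are placeholders:

    from boto.s3.connection import S3Connection
    from flask import Flask, abort, redirect

    app = Flask(__name__)
    conn = S3Connection()  # reads AWS credentials from the environment/config

    @app.route("/image/<path:key_name>")
    def image(key_name):
        # ...check here that the logged-in user actually owns key_name...
        bucket = conn.get_bucket("my-private-assets")  # placeholder bucket
        key = bucket.get_key(key_name)
        if key is None:
            abort(404)
        url = key.generate_url(expires_in=60)  # seconds; expires before it can be shared
        return redirect(url)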

user upload to my S3 bucket

I would like for a user, without having to have an Amazon account, to be able to upload multi-gigabyte files to an S3 bucket of mine.
How can I go about this? I want to enable a user to do this by giving them a key or perhaps through an upload form, rather than making a bucket world-writable, obviously.
I'd prefer to use Python on my server side, but the idea is that a user would need nothing more than their web browser or perhaps opening up their terminal and using built-in executables.
Any thoughts?
You are attempting to proxy the file through your Python backend to S3, and large files at that. Instead you can configure S3 to accept files from the user directly (without proxying through your backend code).
It is explained here: Browser Uploads to S3 using HTML POST Forms. This way your server need not handle any upload load at all.
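For that form-POST approach, here is a minimal sketch with boto3's generate_presigned_post (a newer library than the boto mentioned elsewhere on this page); the bucket name, key, and limits are assumptions:

    import boto3

    s3 = boto3.client("s3")
    post = s3.generate_presigned_post(
        Bucket="my-upload-bucket",   # hypothetical bucket
        Key="incoming/${filename}",  # S3 substitutes the uploaded file's name
        Conditions=[["content-length-range", 0, 5 * 1024 ** 3]],  # allow up to 5 GB
        ExpiresIn=3600,              # form stays valid for an hour
    )

    # post["url"] is the form action; post["fields"] become hidden inputs,
    # followed by <input type="file" name="file"> as the last field of the form.

Your server only hands out the signed form; the bytes go from the browser straight to S3.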
If you also want your users to use their existing identity elsewhere (Google/FB etc.) to achieve this workflow, that too is possible. They will be able to upload these files to a sub-folder (path) in your bucket without exposing other parts of your bucket. This is detailed here: Web Identity Federation with Mobile Applications. Though it says mobile, you can apply the same to web apps.
Having said all that, as @Ratan points out, large file uploads could break partway through when you try from a browser and it can't retry "only the failed parts". This is where the need for a dedicated app comes in. Another option is to ask your users to keep the files in their Dropbox/Box.com account and have your server read from there - these services already take care of large file uploads, with all the retries etc., using their apps.
This answer is relevant to .NET as the language.
We had such a requirement, where we created an executable. The executable internally called a web method, which validated whether the app was authorized to upload files to AWS S3 or not.
You can do this using a web browser too, but I would not suggest this, if you are targeting big files.

Uploading blob data using python in GAE

I have my simple application deployed and working. Instead of using forms, I want to run a script on my PC to upload data into the GAE blobstore. Bulkloader does it well when I upload data to the datastore, but I could not find any way to upload to the blobstore the same way.
I know that Bulkloader does not have the ability to upload blob data to the GAE blobstore. But I wonder if there is a way to upload blobs into the blobstore via Python scripts?
I have done it from form applications and succeeded, but now I am getting data from some other db and want to upload the contents to the datastore and blobstore. Trying to upload the blobs into the datastore got totally messed up.
So, is it possible to upload given attribute values of a table to blobstore via some python scripts?
You need to:
1. Create a Google Cloud project for the application (via Application Settings for your app).
2. Upload your blobs to Google Cloud Storage.
3. Access them using the Blobstore API.
Details here:
https://developers.google.com/appengine/docs/python/blobstore/#Python_Using_the_Blobstore_API_with_Google_Cloud_Storage
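A hedged sketch of that flow, with a hypothetical bucket and object path. From the local script, push the blob to Cloud Storage with the google-cloud-storage client (gsutil cp would do the same job):

    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("my-app-bucket")  # hypothetical bucket
    bucket.blob("imports/row-42.jpg").upload_from_filename("row-42.jpg")

Then, inside the App Engine app, the object can be served through the Blobstore API by mapping its /gs/ path to a blob key:

    from google.appengine.ext import blobstore
    from google.appengine.ext.webapp import blobstore_handlers

    class ServeHandler(blobstore_handlers.BlobstoreDownloadHandler):
        def get(self):
            # Turn the GCS object into a key the Blobstore API understands.
            blob_key = blobstore.create_gs_key("/gs/my-app-bucket/imports/row-42.jpg")
            self.send_blob(blob_key)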
I was considering the same a few weeks ago, but I only had ~90 MB of static files. I ended up including them as static data in my app. The appcfg upload script is smart enough to detect when the same files have previously been uploaded and clones them as needed for subsequent uploads. That might be something you want to consider, depending on your situation.
