Uploading blob data using python in GAE - python

I have my simple application deployed and working. Instead of using forms, I want to run a script on my PC to upload data into the GAE blobstore. The bulkloader handles uploading data to the datastore well, but I could not find any way to upload to the blobstore in the same manner.
I know that the bulkloader cannot upload blob data to GAE's blobstore. But I wonder if there is a way to upload blobs into the blobstore via Python scripts?
I have done this successfully from form-based applications, but now I am getting data from some other database and want to upload its contents to the datastore and blobstore. Trying to push the blobs into the datastore instead got totally messed up.
So, is it possible to upload given attribute values of a table to the blobstore via some Python script?

You need to:
Create a Google Cloud project for the application (via Application Settings for your app).
Upload your blobs to Google Cloud Storage.
Access them using the Blobstore API.
Details here:
https://developers.google.com/appengine/docs/python/blobstore/#Python_Using_the_Blobstore_API_with_Google_Cloud_Storage
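For steps 2 and 3, here is a minimal sketch of what the local upload script could look like, assuming the google-cloud-storage client library is installed on your PC; the bucket and object names are placeholders, not taken from the question:
# Sketch: upload a local file to GCS, then reference it via the Blobstore API.
from google.cloud import storage

def upload_to_gcs(local_path, bucket_name, object_name):
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(object_name)
    blob.upload_from_filename(local_path)
    # Return the /gs/... path the Blobstore API expects.
    return "/gs/{}/{}".format(bucket_name, object_name)

# Inside the GAE app you can then reference the object through the Blobstore API:
#   from google.appengine.ext import blobstore
#   blob_key = blobstore.create_gs_key("/gs/my-bucket/my-object")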
I was considering the same a few weeks ago, but I only had ~90 MB of static files. I ended up including them as static data in my app. The appcfg upload script is smart enough to detect when the same files have previously been uploaded and clones them as needed for subsequent uploads. That might be something you want to consider, depending on your situation.

Related

How to serve csv files in my s3 bucket to a python app deployed on heroku

I have a python app (specifically a dash plotly dashboard) that I have deployed on Heroku. I have static files (CSVs, maps in the form of HTML, etc.) that are input files for my app. However, I am unable to get my Python script to read these files when the Heroku app starts.
I have already done the initial authentication piece of allowing Heroku to access my AWS bucket and set permissions.
OK, the steps are like this. This has to be a serverless application. Upon clicking the submit button on your website, an API should be called (GET or POST, depending on your need).
(1)
The API will invoke a Lambda function that takes the CSV file and stores it in S3.
Create a REST API using API Gateway, connect it to Lambda, then store the file in S3 (a sketch of the Lambda side follows after this list).
You can use the boto3 library if you pick Python for the Lambda.
(2) Another way, if you don't need to manipulate the data on the backend: you can create an API that takes a file (less than 6 MB) and stores it directly in the S3 bucket.
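As a rough illustration of option (1), here is a hedged sketch of the Lambda handler that an API Gateway POST could invoke; the bucket and key names are placeholders, not from the question:
# Sketch: Lambda handler that writes the request body to S3 with boto3.
import base64
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # API Gateway proxy integration may base64-encode binary bodies.
    body = event["body"]
    if event.get("isBase64Encoded"):
        body = base64.b64decode(body)
    else:
        body = body.encode("utf-8")
    s3.put_object(Bucket="my-upload-bucket", Key="uploads/data.csv", Body=body)
    return {"statusCode": 200, "body": "stored"}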
If you are familiar with terraform this might help.
Best wishes.

How to upload data file to a Heroku-deployed Dash web app

I have been researching how to upload data (by external users) to my Dash app, and it seems the only way is the dcc.Upload component (a drag-and-drop component on the UI side - https://dash.plotly.com/dash-core-components/upload) … To clarify, it is this uploaded file that will be read into pandas and fed into the callbacks for analysis and visualisation.
I also read about the Heroku simple-file-upload config (https://devcenter.heroku.com/articles/simple-file-upload) and an AWS S3 bucket (https://devcenter.heroku.com/articles/s3) as the necessary way to store static data uploaded to the app. The Dash dcc.Upload docs say nothing about where the uploaded file is stored, i.e. the web server part and the UI are not linked together in any documentation I could find.
Can anyone explain to a total web dev newbie: once deployed to Heroku, does dcc.Upload require the setup of the Heroku simple-file-upload config or an S3 storage bucket? If not, how does it deal with the storage of the file? Is there any other way for a user to upload data to be used in the web app?
PS: I am not even sure whether the data file the user will upload counts as a static file or a dynamic one, as it will obviously be processed within the code for the analysis to happen (i.e. group, sort, filter, etc.).
The Upload component from dash core components holds the data in the browser itself (as a base64 encoded string), i.e. you don't need any storage on Heroku.
Another option would be dash-uploader, which is able to handle larger files. However, this component holds the data on the server.
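To make the first point concrete, here is a minimal sketch of wiring dcc.Upload to a callback that reads the uploaded CSV straight into pandas; no Heroku storage is involved because the contents arrive in the callback as a base64-encoded string. Component ids and layout are placeholders:
# Sketch: parse a CSV uploaded via dcc.Upload into a pandas DataFrame.
import base64
import io
import pandas as pd
from dash import Dash, dcc, html, Input, Output

app = Dash(__name__)
app.layout = html.Div([
    dcc.Upload(id="upload-data", children=html.Button("Upload CSV")),
    html.Div(id="output"),
])

@app.callback(Output("output", "children"), Input("upload-data", "contents"))
def parse_upload(contents):
    if contents is None:
        return "No file uploaded yet."
    # contents looks like "data:text/csv;base64,<encoded bytes>"
    _, content_string = contents.split(",")
    decoded = base64.b64decode(content_string).decode("utf-8")
    df = pd.read_csv(io.StringIO(decoded))
    return "Loaded {} rows.".format(len(df))

if __name__ == "__main__":
    app.run_server(debug=True)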

Flask - Serving user-uploaded images to the webpage

So I am working on a Flask application which is pretty much a property manager that involves allowing users to upload images of their properties. I am new to Flask and have never had to deal with images before. From a lot of Googling I understand that there are various ways to manage static files like images.
One way is to allow users to upload images directly to the file system, and then display them by referencing the file location in the static folder with something like:
<img src="static/images/filename.jpg">
However, is this really efficient, since it means generating and storing the location (URL) of each image in the database? Especially when it comes to deploying the application? Another way I discovered was using base64 encoding and storing the image directly in the database, which doesn't sound very efficient either.
Another way, which I think might be the best to go about this, is to use an AWS S3 bucket. The user would then be able to upload an image directly to that bucket and be assigned a URL to that image. This URL is stored in the database and can then be used to display the image similarly to the file system method. Is my understanding of this correct? Is there a better way to go about this? And is there something similar to django-storages that can be used to connect Flask to S3?
Any input or pointing me in the right direction would be much appreciated. Thank you!
If you want to store the images on the web server, then the best approach is to use nginx as a proxy in front of Flask and let nginx serve the static folder for all the images.
Nginx is pretty much enough for a small website. Don't try to serve the files through Flask; it is too slow.
If you want to store the images in S3, then you just need to store the name of the image in the bucket in the database. You can tell Flask to use the S3 bucket as the static folder. You can use the boto3 library in Python to access S3 (a sketch follows after the link below).
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/quickstart.html
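A hedged sketch of the S3 option: the Flask view uploads the image with boto3 and only the object key goes in the database. The bucket name, form field, and route are placeholders, not from the question:
# Sketch: accept an uploaded image in Flask and push it to S3.
import uuid
import boto3
from flask import Flask, request

app = Flask(__name__)
s3 = boto3.client("s3")
BUCKET = "my-property-images"  # placeholder bucket name

@app.route("/upload", methods=["POST"])
def upload_image():
    file = request.files["image"]
    key = "images/{}-{}".format(uuid.uuid4(), file.filename)
    s3.upload_fileobj(file, BUCKET, key)
    # Store `key` in the database; build the S3 (or CloudFront) URL when rendering.
    return {"key": key}, 201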
If you are concerned about exposing the S3 bucket to users, you can put a CloudFront distribution in front of it. It is cheaper to serve from and also hides your bucket.

Public URL to files in Google Cloud Storage from python backend (Google App Engine)

I'm developing an Android application which communicates with a backend on Google App Engine written in Python. The user uploads and downloads files to Google Cloud Storage. So far, the files were being sent to the GAE backend by POST request, and then saved in GCS. I want the user to do it directly to GCS (to avoid sending large files over POST). And (on a download request) I would like to send the user only a public URL to the file. There is a nice tutorial for it in PHP:
https://cloud.google.com/appengine/docs/php/googlestorage/user_upload and
https://cloud.google.com/appengine/docs/php/googlestorage/public_access and the key sentence there: "Once the file is written to Cloud Storage as publically readable, you need to get the public URL for the file, using CloudStorageTools::getPublicUrl." How do I do the same in Python?
The public URL of a file in GCS looks like this:
https://storage.googleapis.com/<appname>.appspot.com/<filename>
When I store files in GCS, I explicitly give the file a filename, so I can create a serving URL using the template above.
Are you giving a filename when you store files in GCS? If not, are you able to do so? Maybe provide details in your question of how you are saving the files to GCS to get a better answer.
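A minimal sketch of building the public serving URL from the GCS filename, assuming the object was written as publicly readable; the bucket name is a placeholder:
# Sketch: construct the public URL for an object stored in GCS.
BUCKET = "myapp.appspot.com"  # placeholder, usually <appname>.appspot.com

def public_url(filename):
    return "https://storage.googleapis.com/{}/{}".format(BUCKET, filename)

# e.g. after writing uploads/photo.jpg to the bucket, hand the client:
#   public_url("uploads/photo.jpg")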

How to upload video through Python GAE to Google Cloud Storage?

I have a python GAE application I'm developing locally. I'd like to add a feature so users can upload an image or video from their computer to Google Cloud Storage.
I've looked over the Google Cloud Storage documentation a few times. Perhaps I'm not smart enough to grasp the workings quickly.
I would really appreciate it if someone could run down a very simple example of the entire process: from the user uploading a file through a POST form, to storing it in Google Cloud Storage, how to store the path to the file in the NDB datastore, and finally how to retrieve the file and render it to the user.
Thanks a lot
There is an example here showing a direct upload to GCS using a form POST and a signed URL. After the upload, GCS uses a callback to send you the GCS object path.
A policy document defines what a user (with or without a Google account) can upload with a form POST.
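As a related (not identical) approach, here is a hedged sketch using the google-cloud-storage client to mint a signed URL that the client app can PUT the video to directly, bypassing the POST-to-GAE step; the bucket and object names are placeholders, and the credentials used must be able to sign URLs:
# Sketch: generate a V4 signed URL for a direct client upload to GCS.
import datetime
from google.cloud import storage

def make_upload_url(bucket_name, object_name):
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(object_name)
    return blob.generate_signed_url(
        version="v4",
        expiration=datetime.timedelta(minutes=15),
        method="PUT",
        content_type="video/mp4",
    )

# Store object_name (the GCS path) in an ndb property once the client confirms
# the upload, then serve it back via a public or signed read URL.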
