Serving images for HTML from GAE datastore - python

I am developing an application that will take HTML and images from the user and save it in a datastore. So far this part is done. How do I serve these images as resources of the HTML page when a user requests a particular one?

If you're adamant about keeping images in the GAE datastore (not usually the best approach; Google Cloud Storage is), you can serve them e.g. with a handlers: entry such as
handlers:
- url: /img/.*
  script: images.app
and in images.py you have something like
app = webapp2.WSGIApplication([('/img/(.*)', ImgHandler)])
with, earlier in the same file, something like:
import webapp2
from google.appengine.ext import ndb

class ImgHandler(webapp2.RequestHandler):
    def get(self, img_key_urlsafe):
        key = ndb.Key(urlsafe=img_key_urlsafe)
        img = key.get()
        self.response.headers['Content-Type'] = 'image/png'
        self.response.write(img.data)
Of course, you'll have to arrange for the images' URLs on the client side (e.g. in HTML from Jinja2 templates) to be properly prepared as
/img/some_image_key_urlsafe
and I'm assuming the images are PNGs (you could have the content type as one of the image entity's attributes, of course).
Unless the images are really small, this will add substantial load to your GAE app, which could be minimized by stashing the images in Google Cloud Storage and serving them directly from there. Serving them directly from the datastore IS feasible (as long as they're pretty small, since a GAE entity is limited to 1 MB), but it's usually not optimal.
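If you store the original filename as an attribute on the image entity, the Content-Type header can be derived instead of hard-coded. A minimal standard-library sketch (the `filename` attribute is an assumption, not part of the original model):

```python
import mimetypes

def content_type_for(filename, default='application/octet-stream'):
    # Map a stored filename to a Content-Type header value,
    # falling back to a generic binary type for unknown extensions.
    guessed, _ = mimetypes.guess_type(filename)
    return guessed or default
```

In the handler you'd then set self.response.headers['Content-Type'] = content_type_for(img.filename) instead of the hard-coded 'image/png'.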

Related

Storing user images on AWS

I'm implementing a simple app using ionic2, which calls an API built using Flask. When setting up the profile, I give the option to the users to upload their own images.
I thought of storing them in an S3 bucket and serving them through CloudFront.
After some research I can only find information about:
Uploading images from local storage using Python.
Uploading images from an HTML file selector using JavaScript.
I can't find anything about how to deal with blobs/files when a front end interacts with an API. The options I had thought of were:
1. Post the file to Amazon on the client side and return the CloudFront URL directly to the back end. I am not too keen on this one because it would involve having some kind of secret on the client side (maybe it's not that dangerous, but I would rather have it on the back end).
2. Upload the image to the server and somehow tell the back end which file we want it to choose. I am not too keen on this approach either, because the client would need to have knowledge of the server itself (not only the API).
3. Encode the image (I have thought of base64, but with the lack of examples I think it is plain wrong) and post it to the back end, which will handle the S3 upload and store the CloudFront URL.
I feel like all these approaches are plain wrong, but I can't think of (or find) the right way of doing it.
How should I approach it?
Have the server generate a pre-signed URL for the client to upload the image to. That means the server is in control of what the URLs will look like and it doesn't expose any secrets, yet the client can upload the image directly to S3.
Generating a pre-signed URL in Python using boto3 looks something like this:
import boto3

s3 = boto3.client('s3', aws_access_key_id=..., aws_secret_access_key=...)
params = dict(Bucket='my-bucket', Key='myfile.jpg', ContentType='image/jpeg')
url = s3.generate_presigned_url('put_object', Params=params, ExpiresIn=600)
The ContentType is optional, but if you include it the client will have to send the same Content-Type HTTP header during the upload to url; I find it handy for limiting the allowable file types when they're known.
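On the client, the upload is then a plain HTTP PUT to the pre-signed URL with the matching Content-Type header. A sketch using only the Python standard library (a browser client would do the same with fetch or XMLHttpRequest):

```python
import urllib.request

def build_upload_request(presigned_url, data, content_type='image/jpeg'):
    # The Content-Type header must match the ContentType the URL was
    # signed with, or S3 will reject the upload.
    return urllib.request.Request(
        presigned_url,
        data=data,
        headers={'Content-Type': content_type},
        method='PUT',
    )

# To actually send it:
# urllib.request.urlopen(build_upload_request(url, image_bytes))
```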

How/where should I store data in a Django app that is not connected to any HTTP request?

I have created a Django app with the usual components: applications, models, views, templates, etc.
The architecture of a Django app is such that it basically just sits there and does nothing until you call one of the views by hitting a REST endpoint. Then it serves a page (or in my case some JSON) and waits for the next REST request.
I would like to insert into this app some automated tweeting. For this purpose I will be using the python-twitter library. My tweets will contain a URL. Twitter's website says that any URLs inserted into tweets get shortened to 23 characters by Twitter itself. So the remaining characters are available for the non-URL portion of the tweet. But the 23-character size may change. So Twitter recommends checking the current size of shortened URLs when the application is loaded but no more than once daily. This is how I can check the current shortened-URL size using python-twitter:
>>> import twitter
>>> twitter_keys = {
...     "CONSUMER_KEY": "BlahBlahBlah1",
...     "CONSUMER_SECRET": "BlahBlahBlah2",
...     "ACCESS_TOKEN_KEY": "BlahBlahBlah3",
...     "ACCESS_TOKEN_SECRET": "BlahBlahBlah4",
... }
>>> api = twitter.Api(
...     consumer_key=twitter_keys['CONSUMER_KEY'],
...     consumer_secret=twitter_keys['CONSUMER_SECRET'],
...     access_token_key=twitter_keys['ACCESS_TOKEN_KEY'],
...     access_token_secret=twitter_keys['ACCESS_TOKEN_SECRET'],
... )
>>> api.GetShortUrlLength()
23
Where and how should I save this value 23 such that it is retrieved from Twitter only once at the start of the application, but available to my Django models all the time during the execution of my application? Should I put it in the settings.py file? Or somewhere else? Please include a code sample if necessary to make it absolutely clear and explicit.
Lots of different ways, and it's primarily a matter of opinion. The simplest, of course, would be to keep that data in the source file for the module that connects to Twitter, which looks like your existing setup. That works fine as long as this is not an app being committed to a public VCS repository.
If the code gets into a public repository you have two choices: use an 'app_settings' file, or save it in the db. Both approaches are described here: https://stackoverflow.com/a/37266007/267540
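For the fetch-once-at-startup requirement specifically, a memoized module-level getter is one simple pattern. This is a sketch not tied to any particular app layout; the fallback of 23 is just Twitter's documented default:

```python
import functools

@functools.lru_cache(maxsize=1)
def short_url_length(api=None):
    # The first call hits the Twitter API (api is assumed to be a
    # twitter.Api instance); every later call returns the cached value.
    # With no API object supplied, fall back to the documented default.
    if api is None:
        return 23
    return api.GetShortUrlLength()
```

Call it once at startup (e.g. from an AppConfig.ready() hook); every later call anywhere in the app is a cheap cache hit, so your models can use it freely.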

Upload image to google cloud store through a web page form python, google app engine, angularjs

First let me say,
I think I've read every stack exchange post that seemed relevant to this topic.*
What I'm trying to do:
I'm trying to allow certain users to send an image to Google Cloud Storage through a form on my site.
Along with sending that image to Cloud Storage, I need to create a model of that image in the datastore, with its relevant attributes.
I'm not sure of the best way to go about this.
Approaches I've read about:
There's a bunch of approaches for getting images to the cloud storage:
through the blobstore api
through the python client for cloud storage: https://cloud.google.com/appengine/docs/python/googlecloudstorageclient/#Python_About_the_client_library
through the REST API (I was considering a JSON approach): https://cloud.google.com/storage/docs/json_api/v1/how-tos/upload#simple
I've read a lot of conflicting information,
and I've read so much I can't keep it straight,
however,
I think the resolution was:
For blobstore:
1. The Blobstore upload-to-Cloud-Storage method may or may not have a file upload size limit (no, that's not the actual method name).
2. The Blobstore method sounded like it's going to be phased out??
For the JSON API:
"However, the client library includes App Engine optimizations, so using the REST API might require extra development time. Note: The App Engine development server supports the Cloud Storage Client. It doesn't support the REST API."
For the client library:
I don't know. It looks fine so far. Here is the method I'm thinking about using:
cloudstorage.open(filename, mode='r', content_type=None, options=None,
                  read_buffer_size=storage_api.ReadBuffer.DEFAULT_BUFFER_SIZE,
                  retry_params=None)
"In read mode ('r'), opens the specified Cloud Storage object for reading. In write mode ('w'), if the specified file exists, it opens it for overwrite (append is not supported). If the file doesn't exist, it is created in the specified bucket."
The stuff I'm using:
Google App Engine
Python (for handlers, models, etc.)
angularjs and jquery (front end)
json sometimes (backend)
GAE boilerplate (don't ask)
What I think I'm gonna do:
Get the user's image:
So, I found this form which I tested. It does grab a file from your computer:
<form action="<?php echo $upload_url?>" enctype="multipart/form-data" method="post">
Files to upload: <br> <input type="file" name="uploaded_files" size="40"> <input type="submit" value="Send">
</form>
(No, I can't use the PHP junk in the action; I took it from someplace in the docs. I tested it: it does grab a file, but no, it won't send, and it would be silly to expect it to.)
Modify my current handler (which takes care of creating a model of the image in the datastore) to also use
the cloudstorage.open method quoted above.
My Problems so far
I don't know how that form works, or how to get the right information from it to a handler, and what info that handler will need, so that I can process it, create the model (I already know how to create the model in the datastore, and am already doing so), and also send the image to Google Cloud Storage with the cloudstorage.open method quoted above.
Can anyone give me an example? I can't find any good, relevant posts or docs on what to put in the form such that it grabs the file, grabs the other form data the user fills in, and then sends the image to Google Cloud Storage.
--UPDATE-- :
So, right after posting this, I googled gcs.open, which led to this post:
how to upload image to the cloud storage via app engine?
I believe this could do the trick as far as the back end goes, but I do have some questions left.
To repaste the question:
Questions:
I don't know what the image name will be; I'd like to use the existing one, or perhaps a random one. Suggestions?
Where do I stick this: enctype="multipart/form-data" ?
Will number 2. mess with my other form data? I need more info than just this image in the post request.
To answer your questions:
I don't know what the image name will be; I'd like to use the existing one, or perhaps a random one. Suggestions?
I don't think that really matters, just make sure to think about possible name clashes and what would happen then. I think this mostly depends on the needs of your application. You can always use a random filename and retain the actual filename in the datastore next to a reference to the Cloud Storage item.
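As a sketch of the random-name idea (the bucket name below is a placeholder): randomize the object name to avoid clashes, keep the original extension, and store the original filename in the datastore entity next to the GCS path:

```python
import os
import uuid

def random_object_name(original_filename, bucket='my-bucket'):
    # Keep the extension so the content type stays inferable,
    # but randomize the base name so two uploads can never clash.
    _, ext = os.path.splitext(original_filename)
    return '/%s/%s%s' % (bucket, uuid.uuid4().hex, ext.lower())
```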
Where do I stick this: enctype="multipart/form-data" ?
Enctype is a form attribute and so needs to be set on the form element in your HTML:
<form enctype="multipart/form-data" action="/save" ... >
Will number 2. mess with my other form data? I need more info than just this image in the post request.
No it should not. Just add more form input fields and they will all be submitted with the form data.
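To see that the file and the other fields really do travel together in one multipart/form-data body, here is a stdlib-only sketch that builds and parses such a body (in a webapp2 handler you'd instead read the already-parsed fields from self.request.POST):

```python
from email.parser import BytesParser
from email.policy import default

BOUNDARY = 'xYzZY'

# A multipart/form-data body with one ordinary field and one file field.
body = (
    '--{b}\r\nContent-Disposition: form-data; name="title"\r\n\r\nMy pic\r\n'
    '--{b}\r\nContent-Disposition: form-data; name="uploaded_files"; '
    'filename="cat.png"\r\nContent-Type: image/png\r\n\r\nPNGDATA\r\n'
    '--{b}--\r\n'
).format(b=BOUNDARY).encode()

msg = BytesParser(policy=default).parsebytes(
    b'Content-Type: multipart/form-data; boundary=' + BOUNDARY.encode() +
    b'\r\n\r\n' + body)

# Each form field arrives as its own MIME part, keyed by its name.
parts = {p.get_param('name', header='content-disposition'): p
         for p in msg.iter_parts()}
```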
You can also use a signed form for a direct upload to GCS. This way you do not need the Blobstore or a post handler, and you can control the timeout and object name using a policy document. See my answer here.

Upload image to Appengine Datastore using BlobStore and Endpoints

How can I upload a file/image to the Appengine Datastore using blobStore? I'm using Google Cloud Endpoints.
This is my model:
from google.appengine.ext import ndb

class ProductImage(EndpointsModel):
    _message_fields_schema = ('product', 'enable', 'image')
    product = ndb.KeyProperty(Product)
    image = ndb.BlobKeyProperty(required=True)
    enable = ndb.BooleanProperty(default=True)
How can I test it from API Explorer? At the frontend I'm using AngularJS.
I couldn't figure out a way to do this with just Endpoints; I had to have a hybrid server with part-endpoints application, part-webapp2 blobstore_handlers application. If you use the webapp2 stuff as per the Blobstore upload examples for those parts, it works. For example, the flow should be:
1. Client requests an upload URL (use Endpoints for this call, and have it basically do blobstore.create_upload_url(PATH)).
2. Client uploads the image to the given URL, which is handled by your blobstore_handlers.BlobstoreUploadHandler method, which pulls out the upload and dumps the blob_info.key() (in JSON, for example).
3. Client calls createProduct or whatever Endpoint, and passes back the blobkey it just received, along with the rest of your ProductImage model. You may want to call get_serving_url in that method and stash the result in your model; it shouldn't change later.
4. Clients can then use that stashed serving URL to view the image.
Also I had a lot of "fun" with the BlobKeyProperty. In dev deployments, everything worked fine, but in 'production', I'd get invalid image errors while calling get_serving_url() on the stored blobkey. I think this might be due to the blobs actually not being bitmaps, though, and dev not caring.

Serving files from BlobStore in GAE

I want to ask if I can download files from the Blobstore in Google App Engine (zip files especially) without using handler classes. I mean, serve files directly without using the download handler class.
Any ideas?
No (if I understand the question properly). There is no direct URL for Blobstore items, so you can't get at them directly. However, you can serve blobs from URLs that you define with fewer than 10 lines of code.
EDIT: send_blob also takes a save_as argument. Try save_as=True to use the blob's uploaded filename as the attachment filename.
