How to simulate image upload to Google App Engine blobstore - Python

I'm uploading images to the GAE blobstore using create_upload_url:
uploadURL = blobstore.create_upload_url('/upload')
For the purpose of unit testing the GAE code, can you simulate the image upload? Or should I insert the image data into my testbed and assume the upload is successful? If so, how do you upload an image to the testbed?

I agree with #fredrik about what exactly you're testing there.
Anyway, if you're doing functional/black-box/similar testing, you could simply use the WebTest framework (see its post method) and do the actual upload, e.g.:
payload = [(fieldname, filename)]  # WebTest also accepts (fieldname, filename, content) tuples
test_app.post(uploadURL, upload_files=payload)
Have a look at Handler Testing for Python for details on how to initialize the above test_app.
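For instance, a minimal sketch of such a test might look like the following; the /upload path, the application object, and the module name are assumptions, not part of the question:

```python
import unittest
import webtest

from my_app import application  # hypothetical module exposing your WSGI app


class UploadTest(unittest.TestCase):
    def setUp(self):
        # Wrap the WSGI app so we can issue fake HTTP requests against it.
        self.test_app = webtest.TestApp(application)

    def test_image_upload(self):
        # WebTest builds the multipart/form-data body from these tuples.
        payload = [('file', 'test.jpg', b'fake-image-bytes')]
        response = self.test_app.post('/upload', upload_files=payload)
        self.assertEqual(response.status_int, 200)
```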

Could you provide some code showing what your test looks like?
I think you should be able to fake a request to the upload_url using webapp2. Have a look here for some sample code on how to fake requests.
On the other hand, you should think about what the purpose of your test is. Is the purpose to test that the image upload works, or to test how your code behaves after the upload is complete?
When running unit tests, try to break the dependencies on other libraries so that you only test your own code. Then add a separate suite of integration tests, i.e. make a request to a URL and check that you get the expected response. As in test_redirect_if_no_session in the example above, make a request to a page that requires a user and expect a redirect (HTTP response code 302).
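As a rough illustration of faking a request with webapp2 (the route and application name here are hypothetical):

```python
import webapp2

# Request.blank() builds a WSGI request without starting a server.
request = webapp2.Request.blank('/page-that-requires-a-user')
response = request.get_response(my_app)  # my_app: your webapp2.WSGIApplication

assert response.status_int == 302  # expect a redirect when no session exists
```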
..fredrik

Related

How to post a request to an API using only code?

I am developing a DAG to be scheduled on Apache Airflow whose main purpose will be to post survey data (in JSON format) to an API and then get a response (the answers to the surveys). Since this whole process is going to be automated, every part of it has to be programmed in the DAG, so I can't use Postman or any similar app (unless there is a way to automate their usage, but I don't know if this is possible).
I was thinking of using the requests library for Python, and the function I've written for posting the JSON to the API looks like this:
import requests

def postFileToAPI(**context):
    print('postFileToAPI() ------ ')
    # Pull the JSON file produced by a previous task via XCom.
    json_file = context['ti'].xcom_pull(task_ids='toJson')
    print('--------------- Posting survey request to API')
    r = requests.post('https://[request]', data=json_file)
(I haven't finished defining the HTTP link for the request because my source data is incomplete.)
However, since this is my first time working with APIs and the requests library, I don't know if this is enough. For example, I'm unsure whether I need to provide a token from the API to perform the request.
I also don't know if there are other libraries that are better suited for this or that could be good support.
In short: I don't know if what I'm doing will work as intended, what other information I need to provide to my DAG, or if there are any libraries that would make my work easier.
The Python requests package that you're using is all you need, except if you're making a request that needs extra authorisation. In that case, also import the package that matches your authorisation style, for example requests_jwt (then from requests_jwt import JWTAuth) if you're using JSON Web Tokens.
You make POST and GET requests, like all individual requests, separately.
Include the URL and data arguments as you have done, and that should work!
You may also need headers and/or auth arguments to get through security, e.g. for the GitLab API with a private repository you would include these extra arguments, where GITLAB_TOKEN is a GitLab web token:
```
headers={'PRIVATE-TOKEN': GITLAB_TOKEN},
auth=JWTAuth(GITLAB_TOKEN)
```
If you just try it, it should work; if it doesn't, test the API with curl requests directly in the terminal, or let us know :)
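Putting that together, a minimal sketch of such a call might look like this; the URL, token, and Authorization header scheme are placeholders for whatever the asker's API actually expects:

```python
import requests

API_URL = 'https://example.com/surveys'  # placeholder endpoint
API_TOKEN = 'your-token-here'            # placeholder credential

# Post the JSON payload; `json=` also sets the Content-Type header.
response = requests.post(
    API_URL,
    json={'survey_id': 123},
    headers={'Authorization': f'Bearer {API_TOKEN}'},
    timeout=30,
)
response.raise_for_status()  # fail loudly inside the Airflow task
print(response.json())       # the API's answer payload
```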

Storing user images on AWS

I'm implementing a simple app using Ionic 2, which calls an API built using Flask. When setting up the profile, I give users the option to upload their own images.
I thought of storing them in an S3 bucket and serving them through CloudFront.
After some research I can only find information about:
Uploading images from the local storage using python.
Uploading images from a HTML file selector using javascript.
I can't find anything about how to deal with blobs/files when you have a front end interacting with an API. When I started researching, the options I had thought of were:
1. Post the file to Amazon on the client side and return the CloudFront URL directly to the back end. I am not too keen on this one because it would involve having some kind of secret on the client side (maybe it's not that dangerous, but I would rather have it on the back end).
2. Upload the image to the server and somehow tell the back end which file we want it to choose. I am not too keen on this approach either because the client would need to have knowledge about the server itself (not only the API).
3. Encode the image (I have thought of base64, but with the lack of examples I think that is plain wrong) and post it to the back end, which will handle the S3 upload and store the CloudFront URL.
I feel like all these approaches are plain wrong, but I can't think of (or find) the right way of doing it.
How should I approach it?
Have the server generate a pre-signed URL for the client to upload the image to. That means the server is in control of what the URLs will look like and it doesn't expose any secrets, yet the client can upload the image directly to S3.
Generating a pre-signed URL in Python using boto3 looks something like this:
import boto3

# Credentials may also come from the environment or an instance role.
s3 = boto3.client('s3', aws_access_key_id=..., aws_secret_access_key=...)
params = dict(Bucket='my-bucket', Key='myfile.jpg', ContentType='image/jpeg')
url = s3.generate_presigned_url('put_object', Params=params, ExpiresIn=600)
The ContentType is optional, and the client will have to set the same Content-Type HTTP header during upload to url; I find it handy to limit the allowable file types if known.
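On the client side, the upload is then a plain HTTP PUT to that URL; here is a quick sketch using requests to stand in for the Ionic client (the filename is illustrative):

```python
import requests

# `url` is the pre-signed URL returned by the back end above.
with open('myfile.jpg', 'rb') as f:
    resp = requests.put(
        url,
        data=f,
        headers={'Content-Type': 'image/jpeg'},  # must match the signed ContentType
    )
resp.raise_for_status()  # S3 returns 200 on a successful upload
```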

Upload image to App Engine Datastore using Blobstore and Endpoints

How can I upload a file/image to the App Engine Datastore using Blobstore? I'm using Google Cloud Endpoints.
This is my model:
class ProductImage(EndpointsModel):
    _message_fields_schema = ('product', 'enable', 'image')

    product = ndb.KeyProperty(Product)
    image = ndb.BlobKeyProperty(required=True)
    enable = ndb.BooleanProperty(default=True)
How can I test it from API Explorer? At the frontend I'm using AngularJS.
I couldn't figure out a way to do this with just Endpoints; I had to have a hybrid server with part-endpoints application, part-webapp2 blobstore_handlers application. If you use the webapp2 stuff as per the Blobstore upload examples for those parts, it works. For example, the flow should be:
1. Client requests an upload URL (use Endpoints for this call, and have it basically do blobstore.create_upload_url(PATH)).
2. Client uploads the image to the given URL, which is handled by your blobstore_handlers.BlobstoreUploadHandler method, which pulls out the upload and dumps the blob_info.key() (in JSON, for example); a sketch of such a handler follows this list.
3. Client calls createProduct or whatever Endpoint, and passes back the blobkey it just received, along with the rest of your ProductImage model. You may want to call get_serving_url in that method and stash the result in your model; it shouldn't change later.
4. Clients can then use that stashed serving URL to view the image.
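A minimal sketch of the step-2 handler, in the webapp2 style the Blobstore examples use (the route, field name, and JSON shape are assumptions):

```python
import json
import webapp2
from google.appengine.ext.webapp import blobstore_handlers


class UploadHandler(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        # Blobstore has already stored the file; we just report its key.
        blob_info = self.get_uploads('file')[0]
        self.response.headers['Content-Type'] = 'application/json'
        self.response.out.write(json.dumps({'blob_key': str(blob_info.key())}))


app = webapp2.WSGIApplication([('/upload_handler', UploadHandler)])
```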
Also I had a lot of "fun" with the BlobKeyProperty. In dev deployments, everything worked fine, but in 'production', I'd get invalid image errors while calling get_serving_url() on the stored blobkey. I think this might be due to the blobs actually not being bitmaps, though, and dev not caring.

Testing Flask REST server

I have a tiny Flask server that is supposed to load data from a file and run a function on it. This function returns a DataFrame, and I return the JSON version of it. Much to my surprise this all works nicely. However, how would I test this? I have included some attempts below, but I don't understand Flask (or REST) well enough yet:
#!/home/thomas/python
from flask import Flask
from flask.ext.restful import Resource, Api

app = Flask(__name__)
api = Api(app)

class UniverseAPI(Resource):
    def get(self):
        import pandas as pd
        frame = pd.read_csv("//datasrv10//data$//AQ//test.csv", index_col=0, header=0)
        return frame.to_json()

api.add_resource(UniverseAPI, '/data/universe')
I am happy to include a few of my attempts here... I appreciate any hints. I have read the official documentation.
I should specify what I mean by testing. I can run this on my Linux server and can extract all the required information with the requests package. However, I want to create a unit test that works without starting the server on localhost. I think I have managed with the Flask test client. However, the problem now is that the requests response object and the Flask response object treat the underlying JSON strings rather differently, so I guess my problem is more related to JSON string issues than to Flask. Thanks for all your helpful feedback though.
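For reference, a test along those lines with the Flask test client might look like this; the server module name is an assumption:

```python
import json
import unittest

from server import app  # hypothetical module holding the Flask app above


class UniverseAPITest(unittest.TestCase):
    def setUp(self):
        # The test client issues requests without running a real server.
        self.client = app.test_client()

    def test_get_universe(self):
        response = self.client.get('/data/universe')
        self.assertEqual(response.status_code, 200)
        # Flask-RESTful JSON-encodes the string returned by to_json() again,
        # so one json.loads() yields the frame's own JSON as a string.
        payload = json.loads(response.get_data(as_text=True))
        self.assertIsInstance(payload, str)
```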
Well, the basics of writing a REST API are essentially a set of design principles. My understanding of it is based on this article by Miguel Grinberg, http://blog.miguelgrinberg.com/post/designing-a-restful-api-with-python-and-flask .
In it, he talks about how a REST API is:
"Stateless" - All interactions with the service can happen using the information from one request.
Built upon accessing "resources" from URIs using HTTP requests like GET, PUT, and POST. A resource could be an order in a store, a task in a web app, or whatever you like.
There's also a bunch of stuff about how the server should standardize all forms of communication between itself and the client, indicate whether it can do caching, and other things like that. From an initial design standpoint, though, this is "the point" as he put it:
"The task of designing a web service or API that adheres to the REST guidelines then becomes an exercise in identifying the resources that will be exposed and how they will be affected by the different request methods."
If you're looking for an interesting example of a REST API that might be suited to your interests (I know it is to mine), reddit's is open source. It's a relatable example to see how they try and structure the interactions behind requests: http://www.reddit.com/dev/api

How do I make Flask stream a static file with HTTP 206 Partial Content?

I want to use a looping video on a site powered by Flask. Apparently, Chrome will not loop the video unless it was streamed with an HTTP 206 code being returned. Flask, however, always returns this static file with an HTTP 200. How do I stream static content from my Flask project (hosted on Heroku, for the record) so that the video loops correctly in Chrome?
I had the same problem when serving my video files and I found the solution by digging into the source code of Werkzeug. I solved it by adding the flag conditional=True in the send_from_directory function as follows:
from flask import send_from_directory

@app.route('/uploads/<filename>')
def uploaded_file(filename):
    """Endpoint to serve uploaded videos.

    Use `conditional=True` in order to support the range requests necessary
    for seeking videos.
    """
    return send_from_directory(app.config['UPLOAD_FOLDER'], filename,
                               conditional=True)
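To check the behaviour, you can send a Range header yourself; with conditional=True the endpoint should answer 206 (a quick sketch, assuming the app runs locally on port 5000):

```python
import requests

# Ask for the first kilobyte only; a range-aware endpoint replies 206.
resp = requests.get(
    'http://localhost:5000/uploads/video.mp4',
    headers={'Range': 'bytes=0-1023'},
)
print(resp.status_code)               # expect 206 Partial Content
print(resp.headers.get('Content-Range'))
```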
Response objects in Flask have a "status_code" parameter you can pass. See this documentation for more details, but essentially, you may want to subclass the Response object.
Also take a look at make_response() - it may reveal a simpler way, depending on your application structure.
Take a look at the streaming pattern for more details, but it's geared towards generated content as opposed to static.
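For the make_response() route, a generic sketch of setting an explicit status code (not a full Range implementation) looks like this:

```python
from flask import Flask, make_response

app = Flask(__name__)

@app.route('/ping')
def ping():
    # Build a response and override the default 200 status code.
    resp = make_response('pong', 202)
    resp.headers['X-Example'] = 'demo'
    return resp
```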
