I have a POST form that calls Amazon's S3 service. I am doing a direct-from-browser upload to Amazon, and thus have to pass some additional information to Amazon that I will not know until the file is uploaded (file type, name, and a 'signature' that references both). http://aws.amazon.com/articles/1434
It is a large video file that I don't want to upload to my server first, which (to the best of my knowledge) rules out using urllib2.urlopen(...) to pass the additional variables to Amazon as a POST request. I've been working at it for a couple of days now and haven't had any success. What would be the best way to do this?
Note: I will probably be using the uploadify plugin to upload the file. Thank you.
You're probably looking for the Query String Request Authentication Alternative.
You can authenticate certain types of requests by passing the required
information as query-string parameters instead of using the
Authorization HTTP header. This is useful for enabling direct
third-party browser access to your private Amazon S3 data, without
proxying the request. The idea is to construct a "pre-signed" request
and encode it as a URL that an end-user's browser can retrieve.
Additionally, you can limit a pre-signed request by specifying an
expiration time.
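For the browser-upload case in the question, here is a rough sketch of how the signed policy and form fields could be generated with boto3's generate_presigned_post (boto3 itself, the bucket name, key, and expiry below are assumptions for illustration, not part of the original article):

    import boto3

    s3 = boto3.client("s3")

    # Generate the URL and the signed form fields (policy, signature, key, ...)
    # that the browser's POST form needs. Bucket and key are placeholders.
    presigned = s3.generate_presigned_post(
        Bucket="my-upload-bucket",
        Key="uploads/my-video.mp4",
        Conditions=[["starts-with", "$Content-Type", "video/"]],
        ExpiresIn=3600,  # the signed form is only valid for one hour
    )

    print(presigned["url"])     # where the form should POST
    print(presigned["fields"])  # hidden inputs to embed in the form

The server only signs the request; the file itself still goes straight from the browser to S3.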
I'm building a small website that involves users uploading images that will be displayed later. The images are stored in an S3 bucket.
Sometimes, I need to display a lot of these images at once, and I'm not sure how best to accommodate that, without allowing public access to S3.
Currently, when there's a request to the server, the server downloads the object from S3 and then returns the file to the client. This is understandably slow. I would love to just return the S3 URL and have the client load the image from there (so the traffic doesn't have to pass through my server and I don't have to wait for the image to travel S3 -> server -> client), but I also don't want S3 bucket URLs that are unsecured and that anyone can go to.
What is the best architecture to solve this? Is there a way of giving people very brief temporary permission to a bucket? Is it possible to scope that to a specific URL?
I looked around on Stack Overflow and GitHub for similar questions, but most of them seem to deal with how the files are uploaded rather than with accessing them securely.
As suggested by @jarmod, you can pre-sign your objects' URLs.
In this case, whenever you need to share an image, you create a pre-signed URL for the object and share that URL.
Your server will only provide the URL. The user will access the image directly, without your server in the middle of the request.
The AWS documentation explains how to use pre-signed URLs:
https://docs.aws.amazon.com/AmazonS3/latest/userguide/using-presigned-url.html
https://docs.aws.amazon.com/sdk-for-go/v1/developer-guide/s3-example-presigned-urls.html
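A minimal sketch of generating such a URL with boto3 (the bucket and key names are placeholders, and the 5-minute expiry is just an example):

    import boto3

    s3 = boto3.client("s3")

    # Bucket and key below are placeholders; ExpiresIn is in seconds.
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "my-image-bucket", "Key": "images/cat.jpg"},
        ExpiresIn=300,  # the link stops working after 5 minutes
    )
    # Return `url` to the client; the browser then fetches the image
    # directly from S3, bypassing your server.

Your endpoint can return a batch of these URLs at once when many images need to be displayed on a single page.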
I have a Python FastAPI backend that gives me an endpoint for retrieving some OAuth data. The endpoint works when I put it manually in my browser: it first does a redirect, retrieves an access token, adds it to a second link, and opens that. So the functionality works. Now I'm a bit stuck on how to get this data into my frontend. I tried a GET request on the original endpoint, but then I just get the response containing the redirect link. What would be a clever way to handle this? I would like to store the data in the frontend in order to manipulate it.
Thanks for helping me!
After implementing many OAuth2 integrations, I recommend you use a library; the OAuth2 flow is fairly complicated, with interactions between the client, backend, identity, and authorization servers. Using a library helps ensure you are doing all the key exchanges properly and securely; it will also save you lots of time :).
Here is a library I would use in your shoes: https://github.com/manfredsteyer/angular-oauth2-oidc
I'm making an Ajax call in the UI to the API, so the localhost needs to be able to query the API. Users of the platform should be able to access the API, but need to use a token I already provide.
Is there a way to allow anonymous API usage locally only?
I looked into JWT and it does not seem to be the right fit.
As I've pointed out in the comments, JWTs should suffice in this case, since from what I've understood you're not handling any extremely sensitive data (and with JWTs such data can be hashed rather than exposed to the user anyway) but simply want to validate each request. Using the same link, you can check the validity of a token in their debugger.
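If you do go with JWTs, here is a minimal sketch of issuing and validating one with PyJWT (the library choice, secret, and one-hour lifetime are assumptions for illustration):

    import datetime
    import jwt  # PyJWT

    SECRET = "replace-with-a-real-secret"  # hypothetical signing key

    def issue_token(user_id):
        # The token carries the user id and expires after one hour.
        payload = {
            "sub": user_id,
            "exp": datetime.datetime.utcnow() + datetime.timedelta(hours=1),
        }
        return jwt.encode(payload, SECRET, algorithm="HS256")

    def validate_token(token):
        # Returns the decoded claims, or None if the token is invalid or expired.
        try:
            return jwt.decode(token, SECRET, algorithms=["HS256"])
        except jwt.InvalidTokenError:
            return None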
There's a lot out there on how to issue JWT tokens to clients from Django, but I'm looking for a way to store a JWT token that is issued to the Django app for authentication against an external API.
The setup:
Django requests and receives token from external API. It is good for 24 hours.
Every time a client makes a request, the app must make an authenticated call to the external API. Ideally, if 3 clients make 2 requests each, we should only need to request a single JWT.
24 hours later, a fourth client makes a request. Django sees that the token is invalid and requests a new one.
The problems here:
Requests from multiple clients should not each require their own token.
The token must be able to be changed (this rules out sticking it in the settings)
The token must be stored securely.
Ideas so far:
Stick it in the database with a field listing the expiry time. This seems questionable from a security standpoint.
Implement some kind of in-memory storage like https://github.com/waveaccounting/dj-inmemorystorage. This seems like overkill.
Any suggestions as to a better way to do this?
The Django cache was the way to go. See the above link for an example.
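A minimal sketch of what that can look like with Django's cache framework (the cache key, token endpoint, credentials, and 23-hour timeout below are placeholders, not part of the original answer):

    import requests
    from django.core.cache import cache

    TOKEN_CACHE_KEY = "external_api_jwt"  # hypothetical cache key

    def get_external_api_token():
        # Return the shared token, requesting a new one only when it's missing or expired.
        token = cache.get(TOKEN_CACHE_KEY)
        if token is None:
            # Hypothetical token endpoint and credentials; substitute the real API's.
            resp = requests.post(
                "https://external-api.example.com/token",
                data={"client_id": "...", "client_secret": "..."},
            )
            resp.raise_for_status()
            token = resp.json()["access_token"]
            # Cache a bit less than 24 hours so it's refreshed before it expires.
            cache.set(TOKEN_CACHE_KEY, token, timeout=23 * 60 * 60)
        return token

All clients share the single cached token, and with a memcached or Redis cache backend the token lives in memory rather than in the settings or the database.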
I have a REST (or almost-REST) web API.
I want API users to be able to use the whole API even if, for some reason, they can only make GET calls, so the plan is to accept a URL parameter (query string) like request_method that can be GET (the default), POST, PUT, or DELETE, and route the request accordingly.
My question is: other than the standard RequestHandler overrides and checking in each handler's get(self) method whether the request is meant to be a POST, PUT, or DELETE and calling the appropriate function, is there a way to do this "routing" in a more general way, for example in the URL patterns of the application definition, or by overriding a routing function, or something similar?
To make it clear, these requests all come in over GET with a parameter, for example ?request_method=POST
Any suggestions are appreciated.
Possible Solutions:
Only have a ".*" URL pattern and handle all the routing in a single RequestHandler. This should work fine, except that I won't be taking advantage of Tornado's URL pattern matching features.
Add an if to the get(self) method in every request handler, check whether the request should really be handled by get, and if not, call the relevant method (sketched below).
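For illustration only, a rough sketch of what option 2 could look like as a shared base class (the names here are hypothetical, and the answer below explains why tunnelling unsafe methods over GET is dangerous):

    import tornado.web

    class TunnelingHandler(tornado.web.RequestHandler):
        # Hypothetical base class: dispatches GET requests that carry
        # ?request_method=... to the matching handler method.
        def get(self, *args, **kwargs):
            method = self.get_query_argument("request_method", "GET").upper()
            if method == "GET":
                return self.handle_get(*args, **kwargs)
            if method not in ("POST", "PUT", "DELETE"):
                raise tornado.web.HTTPError(405)
            return getattr(self, method.lower())(*args, **kwargs)

        def handle_get(self, *args, **kwargs):
            # Subclasses implement their real GET logic here instead of get().
            raise tornado.web.HTTPError(405)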
This would be a very foolish thing to do. Both Chrome and Firefox, along with many other web user agents, will speculatively fetch (GET) some or all of the links on a page, including your request_method=DELETE URLs. You will find your database has been emptied out just because someone was looking around. Do not deliberately break HTTP. GET is defined to be a "safe" method, meaning it's okay to GET any URL you like and nothing bad will happen.
EDIT for others in similar situations:
The OP says he is using JSONP and is in control of both the API server and the client web app. In such a case, the ideal solution is Cross-Origin Resource Sharing (CORS, spec), although this technology requires IE8+, Firefox 3.5+, Safari 4+, or Chrome 3+. If you need to target earlier browsers and you control both domains, I would recommend merging the content of the two domains, at least for your own client web app. The API domain can remain for external clients, but they would be restricted by the CORS browser requirements.
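If CORS is an option, a minimal sketch of what the Tornado side could look like (the allowed origin is a placeholder):

    import tornado.web

    class CorsAPIHandler(tornado.web.RequestHandler):
        def set_default_headers(self):
            # Placeholder origin; use the real web-app domain in practice.
            self.set_header("Access-Control-Allow-Origin", "https://app.example.com")
            self.set_header("Access-Control-Allow-Methods",
                            "GET, POST, PUT, DELETE, OPTIONS")
            self.set_header("Access-Control-Allow-Headers",
                            "Content-Type, Authorization")

        def options(self, *args, **kwargs):
            # Answer CORS preflight requests with the headers above and no body.
            self.set_status(204)
            self.finish()

With this in place, the client web app can call POST/PUT/DELETE directly instead of tunnelling them over GET.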