Authenticate a server to an AppEngine application - Python

I cannot see how I could authenticate a server with GAE.
Let's say I have an application on GAE which holds some data, and I need this data on another server.
It is easy to enable OAuth authentication on GAE, but I cannot use it here since there is no "account" bound to my server.
Plus, GAE doesn't support client certificates.
I could generate a token for each server that needs to access the GAE application, transfer it to that server, and have it access the GAE application by including the token in the URL (over HTTPS)...
Any other ideas?

That is exactly what you need to do. On the server, generate a key (you choose the length) and store it in the datastore. When the other server makes a request, have it use HTTPS and include the key. It is effectively an API key (it actually is one).
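A minimal sketch of that scheme (the in-memory set below stands in for the datastore, and the function names are illustrative; on GAE you would persist the key in a datastore entity):

```python
import hmac
import secrets

# Stand-in for the datastore: a set of issued keys.
VALID_KEYS = set()

def issue_key(nbytes=32):
    """Generate a random URL-safe API key and persist it."""
    key = secrets.token_urlsafe(nbytes)
    VALID_KEYS.add(key)
    return key

def is_authorized(presented_key):
    """Check a key from an incoming HTTPS request in constant time."""
    return any(hmac.compare_digest(presented_key, k) for k in VALID_KEYS)

key = issue_key()
assert is_authorized(key)
assert not is_authorized("not-a-real-key")
```

One refinement over the question's proposal: send the key in a request header rather than in the URL, since URLs tend to end up in access logs even over HTTPS.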

Related

Share a session between 2 flask apps running on different servers

I am using client-side sessions. The requirement is to redirect from one Flask server, which already has session data for a user, to another Flask app on a different server, and use the same client session information to verify that the user has already logged in; if not, send them back to the first server for authentication.
If possible I would like to keep using client-side sessions. If not, any information about the alternatives would be helpful.
Thank you
Normally there are two options.
First, client-side authentication using a token such as a JWT (JSON Web Token): this approach authenticates every request using a token included in a header and needs no additional server.
Second, a server-side approach with an additional session store, such as Redis, shared by the multiple backends.
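A minimal sketch of the first option using only the standard library, so both Flask apps can verify the same token as long as they share a secret (in practice you would use a JWT library such as PyJWT; the secret and payload fields here are placeholders):

```python
import base64
import hashlib
import hmac
import json

# Assumption: both Flask apps are configured with this same secret.
SECRET = b"shared-secret-between-both-apps"

def sign_token(payload):
    """Serialize the payload and append an HMAC-SHA256 signature."""
    body = base64.urlsafe_b64encode(json.dumps(payload).encode()).decode()
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return body + "." + sig

def verify_token(token):
    """Return the payload if the signature checks out, else None."""
    body, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None
    return json.loads(base64.urlsafe_b64decode(body))

token = sign_token({"user": "alice", "logged_in": True})
assert verify_token(token) == {"user": "alice", "logged_in": True}
assert verify_token(token + "0") is None  # tampering is detected
```

The first server issues the token when it redirects; the second server verifies it without any call back to the first server or to a shared session store.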

Is there a way to use openshift as a oauth authenticator in my web app?

I am creating a web app for my company. I don't want to add a new sign-up process and store credentials for our employees. We already use OpenShift, and everyone with OpenShift credentials can log in to our OpenShift cluster. I want to reuse those credentials to log in to my web app.
I learned that OpenShift supports OAuth 2.0, but most of the guides available on the internet cover using other identity providers, like Google, as auth in OpenShift. None explain using OpenShift as the identity provider for a web app. Any leads would be appreciated.
Based on what I'm seeing in OpenShift 4.1's documentation on Configuring the internal OAuth server, it looks like it may be possible to use the /oauth/authorize endpoint of the control-plane API.
The OpenShift Container Platform master includes a built-in OAuth server. Users obtain OAuth access tokens to authenticate themselves to the API.
When a person requests a new OAuth token, the OAuth server uses the configured identity provider to determine the identity of the person making the request.
It then determines what user that identity maps to, creates an access token for that user, and returns the token for use.
The intention of this endpoint is to grant OAuth tokens specifically for use with the OpenShift cluster, not for third party applications.
Even if it ends up being possible, you'll probably still want to use the OAuth/OIDC mechanisms of the upstream authentication provider OpenShift is using directly, if possible, as that will provide better support and be more intuitive from an application-architecture standpoint.
You can use the OpenShift user API to look up the identity of the user who requested an access token.
The API to call is <api_root>/apis/user.openshift.io/v1/users/~ with an Authorization: Bearer <token> header.
This will give you the k8s user object containing the username and groups of the user.
You can also do this from within an OpenShift pod using https://kubernetes.default.svc as api_root; this requires you to add the CA in the pod to set up a secure connection.
The CA is mounted in every pod at /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
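A short sketch of that call using only the standard library (api_root, the token, and the function names are placeholders; inside a pod you would pass the mounted CA file so the TLS connection can be verified):

```python
import json
import ssl
import urllib.request

def whoami_request(api_root, token, ca_file=None):
    """Build the 'who am I' request against the OpenShift user API."""
    req = urllib.request.Request(
        api_root + "/apis/user.openshift.io/v1/users/~",
        headers={"Authorization": "Bearer " + token},
    )
    ctx = ssl.create_default_context(cafile=ca_file) if ca_file else None
    return req, ctx

def fetch_username(api_root, token, ca_file=None):
    """Perform the request and return the username from the user object."""
    req, ctx = whoami_request(api_root, token, ca_file)
    with urllib.request.urlopen(req, context=ctx) as resp:
        user = json.load(resp)
    # The k8s user object carries the username under metadata.name.
    return user["metadata"]["name"]
```

From inside a pod you would call it as fetch_username("https://kubernetes.default.svc", token, "/var/run/secrets/kubernetes.io/serviceaccount/ca.crt").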
You can use the OAuth mechanism provided by OpenShift to retrieve an access token using the authorization code grant.
The documentation for the OpenShift OAuth internals is sketchy at best; I found it helpful to look up the correct URLs and parameters in the dex OpenShift connector source code: here and here
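For reference, building the first leg of the code grant can look like the following sketch. The /oauth/authorize path comes from the internal OAuth server documentation mentioned above; the parameter names are standard OAuth 2.0, and client_id/redirect_uri are placeholders you would register with the cluster:

```python
import urllib.parse

def authorize_url(oauth_root, client_id, redirect_uri, state):
    """Build the standard OAuth 2.0 authorization-code request URL."""
    params = urllib.parse.urlencode({
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",   # ask for an authorization code
        "state": state,            # CSRF protection, echoed back on the redirect
    })
    return oauth_root + "/oauth/authorize?" + params

url = authorize_url("https://oauth-openshift.apps.example.com", "my-webapp",
                    "https://webapp.example.com/callback", "xyzzy")
```

Your web app redirects the user to this URL, and the cluster's OAuth server sends them back to redirect_uri with a code to exchange for a token.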

REST API in Python over SSL

I am creating a REST API. The basic idea is to send data to a server and have the server return some corresponding data. I want to implement this with SSL; I need an encrypted connection between client and server. Which is the best REST framework in Python to achieve this?
You can choose any framework to develop your API; if you want SSL on your API endpoints, you need to set up SSL on the web server that is hosting your application.
You can obtain a free SSL certificate using Let's Encrypt. You will, however, need a domain in order to get a valid SSL certificate.
The SSL connection between client and server does not depend on the framework you choose. Web servers like Apache HTTPD and Nginx act as the public-facing reverse proxy to your Python web application. Configuring SSL on your web server gives you encrypted communication between client and server.
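As a sketch, a TLS-terminating nginx server block in front of a Python app listening on a local port might look like this (the domain, certificate paths, and upstream port are placeholders):

```nginx
server {
    listen 443 ssl;
    server_name api.example.com;

    # Certificate and key, e.g. as issued by Let's Encrypt / certbot
    ssl_certificate     /etc/letsencrypt/live/api.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/api.example.com/privkey.pem;

    location / {
        # Forward decrypted traffic to the Python app (e.g. gunicorn on :8000)
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
    }
}
```

The Python application itself stays plain HTTP on localhost; nginx handles the encryption, so this works identically for Flask, Django, or any other framework.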
On the assumption that you are talking about communication between REST APIs and some other stack like Flask (a different server):
REST APIs can be used to exchange data with any type of platform, as long as both sides agree on a common protocol to share data.
Data can be shared using XML, YAML, or JSON. Your REST APIs can be on any stack you like.
The architecture will be something like:
Your main site (microservice or monolithic) <=> REST APIs (microservices)
You can use djangorestframework or any other framework you prefer.

API access authentication/application key (django/nginx/gunicorn)

I have a web app created in Django, running in the Gunicorn app server behind an nginx web server/reverse proxy. I need external applications to access some processed data (CSV/JSON), for which I need some sort of authentication. The basic Django auth/login is not optimal, as a simple script (not created by me) needs to pull the data with a simple request, no cookies etc.
For now, I have:
- set up the service to be available over HTTPS/TLS only
- created an IP filter in Django to reduce the "attack surface", using:
  request.META['HTTP_X_REAL_IP']
  with nginx forwarding the client IP via:
  proxy_set_header X-Real-IP $remote_addr;
Next, I was thinking of adding an application key (a hash of a password or something) which needs to be included in the request and is checked against a list of valid keys in the database.
Is this a suitable API authentication scheme, or is there something else which can be used/recommended? Some sort of application-key framework?
There are many authentication methods besides session/cookie-based ones. For your case I suggest simple token authentication. Save the same token in your Django app and in the external app, and on each request from the external app to Django, send an additional header:
Authorization: Token YOUR_TOKEN_KEY
Now all you need to do in Django is fetch that token and check that it matches the one saved locally.
If you want more auth options for API, check Django Rest Framework documentation.
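The check itself is only a few lines. A framework-agnostic sketch of the core (the stored token and function name are placeholders; in Django the header arrives as request.META['HTTP_AUTHORIZATION']):

```python
import hmac

# In practice loaded from settings or the database, not hard-coded.
STORED_TOKEN = "example-locally-saved-token"

def token_is_valid(authorization_header):
    """Validate an 'Authorization: Token <key>' header value."""
    scheme, _, presented = authorization_header.partition(" ")
    if scheme != "Token" or not presented:
        return False
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(presented, STORED_TOKEN)

assert token_is_valid("Token example-locally-saved-token")
assert not token_is_valid("Token wrong")
assert not token_is_valid("Bearer example-locally-saved-token")
```

A Django view would call this on the incoming header and return a 401 response when it fails; Django REST Framework's TokenAuthentication packages the same pattern with per-user tokens stored in the database.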

GAE: Can't Use Google Server Side API's (Whitelisting Issue)

To use Google APIs, after activating them from the Google Developers Console, one needs to generate credentials. In my case, I have a backend that is supposed to consume the API server-side. For this purpose, there is an option to generate what the Google page calls a "Key for server applications". So far so good.
The problem is that in order to generate the key, one has to list the IP addresses of servers that will be whitelisted. But GAE has no static IP address that I could use there.
There is an option to manually get the IPs by executing:
dig -t TXT _netblocks.google.com #ns1.google.com
However, there is no guarantee that the list is static (in fact, it is known to change from time to time), and there is no programmatic way I could automate adding the IPs I get from dig to the Google Developers Console.
This leaves me with two choices:
Forget about GAE for this project; ironically, GAE could not be used as a backend for Google APIs (better to use Amazon or some other solution for that). Or
Program something like a watchdog over the output of the dig command that would notify me of changes so I could manually update the whitelist (no way I am going to do this; too dangerous), or allow all IPs to use the Google API as long as the request carries my API key. Not the most secure solution, but it works.
Is there any other workaround? Can it be that GAE does not support consuming Google APIs server-side?
You can use App Identity to access Google's APIs from AppEngine. See: https://developers.google.com/appengine/docs/python/appidentity/. If you set up your app using the Cloud Console, it should have already added your app's identity with permission to your project, but you can always check. From the "Permissions" tab in the Cloud Console for your project, make sure your service account is listed under "Service Accounts" (in the form your_app_id@appspot.gserviceaccount.com).
Furthermore, if you use something like the JSON API libraries available for Python, you can use the bundled oauth2 library to do all of this for you, using AppAssertionCredentials to authorize the API you wish to use. See: https://developers.google.com/api-client-library/python/guide/google_app_engine#ServiceAccounts
Yes, you should use App Identity. Forget about getting an IP or giving up on GAE :-) Here is an example (in Java) of how to use BigQuery from inside a GAE application:
static {
    // initializes Big Query
    JsonFactory jsonFactory = new JacksonFactory();
    HttpTransport httpTransport = new UrlFetchTransport();
    AppIdentityCredential credential =
        new AppIdentityCredential(Arrays.asList(Constants.BIGQUERY_SCOPE));
    bigquery = new Bigquery.Builder(httpTransport, jsonFactory, credential)
        .setApplicationName(Constants.APPLICATION_NAME)
        .setHttpRequestInitializer(credential)
        .setBigqueryRequestInitializer(new BigqueryRequestInitializer(Constants.API_KEY))
        .build();
}
