REST API in Python over SSL

I am creating a REST API. The basic idea is to send data to a server, and the server gives me some other corresponding data in return. I want to implement this with SSL; I need an encrypted connection between client and server. Which is the best REST framework in Python to achieve this?

You can choose any framework to develop your API. If you want SSL on your API endpoints, you need to set up SSL on the web server that is hosting your application.
You can obtain a free SSL certificate using Let's Encrypt. You will, however, need a domain in order to get a valid certificate.
The SSL connection between client and server does not depend on the framework you choose. Web servers like Apache HTTPD and Nginx act as the public-facing reverse proxy to your Python web application. Configuring SSL on your web server gives you encrypted communication between client and server.
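If you do want the framework to serve HTTPS directly (for example while testing, before putting Nginx or Apache in front), most frameworks can load the certificate themselves. A minimal Flask sketch, assuming fullchain.pem and privkey.pem are the files issued by Let's Encrypt (paths and route names are placeholders):
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/api/data", methods=["POST"])
def data():
    payload = request.get_json()       # data sent by the client
    return jsonify({"echo": payload})  # corresponding data returned

if __name__ == "__main__":
    # In production you would normally terminate SSL at the reverse proxy
    # instead of here; this is just for serving HTTPS straight from Flask.
    app.run(host="0.0.0.0", port=8443, ssl_context=("fullchain.pem", "privkey.pem"))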

This assumes you are talking about communication between your REST APIs and some other stack such as Flask (a different server).
REST APIs can be used to exchange data with any type of platform as long as both sides agree on a common protocol for sharing data.
Data can be shared using XML, YAML, or JSON. Your REST APIs can be on any stack you like.
The architecture will be something like:
Your main site (microservice or monolithic) <=> REST APIs (microservices)
You can use djangorestframework or any other framework you prefer.
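As a rough sketch of the main-site side of that arrangement (the URL and field names are made up), the call is just an HTTPS request exchanging JSON with the microservice:
import requests

# Hypothetical internal endpoint exposed by the REST microservice over HTTPS.
API_URL = "https://api.example.com/v1/lookup"

response = requests.post(API_URL, json={"query": "some input data"}, timeout=5)
response.raise_for_status()
result = response.json()  # the corresponding data returned by the service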

Related

Detect SSL/TLS Client Authentication with Python?

I have multiple Web Servers (mostly IIS & Apache).
Some of them are configured to allow only clients with a specific certificate. Moreover, we have users with multiple certificates for the same web server, which allow different actions to be performed.
For example: certificates that are used only for highly privileged actions and others for everyday use.
We do not use the operating system certificate store, for compliance reasons.
I have written the following Python code, which works by specifying the desired certificate:
import requests
response = requests.get("https://myserver-dns-name.com", cert="./ClientCert.Key")
I tried fetching the server certificate and looking for the enhanced key usage OID 1.3.6.1.5.5.7.3.1 (TLS Server Auth), but not all servers have it specified.
I tried using the Python ssl library to handle the handshake myself, but the library doesn't seem to allow it. What I wanted was a callback for when the server sends a certificate request to the client.
What I want to know is whether it is possible to detect, from the client, that the server has client authentication enforced, in Python or any other language.
I want to implement the same behavior as Chromium in Python (showing a certificate selection dialog, as seen in ssl_client_auth_handler.cc).
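One rough heuristic (not from this thread, and it only catches servers that hard-fail the handshake when no client certificate is offered, not ones where client auth is optional) is to attempt a connection without a certificate and see whether the TLS layer rejects it:
import socket
import ssl

def seems_to_require_client_cert(host, port=443):
    # Connect without offering a client certificate; if the server enforces
    # client authentication, the handshake (or, with TLS 1.3, the first
    # read/write after it) typically fails with an SSL alert. Note that a
    # server-certificate verification failure also raises ssl.SSLError.
    ctx = ssl.create_default_context()
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                tls.sendall(b"HEAD / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
                tls.recv(1)
        return False
    except ssl.SSLError:
        return True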

Dealing with HTTP and websocket connections on the backend

I am working on a game I want to support on iOS/Android/browser, and I am thinking WebSockets is what I want to use for the communication. I use Python, and so found that I should be using Tornado.
I am trying to understand WebSockets a little better and their integration in browsers.
Will the messages over the WebSocket connection also contain the HTTP cookies for the connection? If not, can I send them?
How is the HTTP connection for the web page linked to the WebSocket connection? I mean, how will I know on the server side that they are coming from the same web app?
The Tornado wiki page says, in the performance section, that Tornado can be set up with Nginx as the front end. How does that work? I thought Tornado and Nginx had to run on separate machines, since both listen on port 80, and also because Nginx does not understand the WS protocol. What am I missing?
Also, it would be great if someone could point me to any resources on either Tornado or WebSockets that could help me.
The WebSocket is set up by sending an ordinary HTTP request to the server, and this request will contain all the stored cookies for the domain. If you do a native implementation, e.g. for Android, you can use libraries like Autobahn|Android, whose API allows you to set cookies for the WebSocket handshake.
You can set a cookie when first loading the page to maintain a session identifier.
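A minimal Tornado sketch of that idea (handler names and the cookie name are made up); because the handshake is a plain HTTP request, the WebSocket handler can read the same cookie the page handler set:
import tornado.ioloop
import tornado.web
import tornado.websocket

class PageHandler(tornado.web.RequestHandler):
    def get(self):
        # Set a session identifier when the page is first loaded.
        self.set_cookie("session_id", "some-session-token")
        self.write("<html><!-- page that opens the websocket --></html>")

class GameSocket(tornado.websocket.WebSocketHandler):
    def open(self):
        # The handshake request carried the domain's cookies, so the
        # connection can be tied back to the HTTP session here.
        session_id = self.get_cookie("session_id")
        print("websocket opened for session", session_id)

    def on_message(self, message):
        self.write_message("echo: " + message)

app = tornado.web.Application([(r"/", PageHandler), (r"/ws", GameSocket)])

if __name__ == "__main__":
    app.listen(8888)
    tornado.ioloop.IOLoop.current().start()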
In that scenario you would be running four Tornado instances (on different ports, not port 80) and Nginx on port 80 as a load balancer, spreading the incoming client requests across the Tornado instances; see running Tornado and Nginx on the same server for a configuration example. Recent versions of Nginx do support WebSockets; see e.g. nginx + python + websockets.

Authenticate a server versus an AppEngine application

I cannot see how I could authenticate a server against a GAE application.
Let's say I have an application on GAE which has some data, and I need this data on another server.
It is easy to enable OAuth authentication on GAE, but here I cannot use it since there is no "account" bound to my server.
Plus, GAE doesn't support client certificates.
I could generate a token for each server that needs to access the GAE application, transfer it to that server, and have it access the GAE application by adding the token to the URL (over HTTPS)...
Any other idea?
That is exactly what you need to do. On the server, generate a key (you choose the length) and store it in the datastore. When the other server makes a request, use HTTPS and include the key. It's like an API key (it is one, actually).
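A rough sketch of the GAE side, assuming the classic Python webapp2/ndb stack (the model, handler, and parameter names are made up):
from google.appengine.ext import ndb
import webapp2

class ApiKey(ndb.Model):
    # One entity per server that is allowed to call this application.
    key_value = ndb.StringProperty(required=True)

class DataHandler(webapp2.RequestHandler):
    def get(self):
        supplied = self.request.get("api_key")  # only ever sent over HTTPS
        if not supplied or not ApiKey.query(ApiKey.key_value == supplied).get():
            self.abort(403)
        self.response.write('{"data": "..."}')

app = webapp2.WSGIApplication([("/api/data", DataHandler)])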

URLFetch behind a Proxy Server on App Engine Production

Is there a way to specify a proxy server when using urlfetch on Google App Engine?
Specifically, every time I make a call using urlfetch, I want GAE to go through a proxy server. I want to do this on production, not just dev.
I want to use a proxy because there are problems with using google's outbound IP addresses (rate limiting, no static outbound IP, sometimes blacklisted, etc.). Setting a proxy is normally easy if you can edit the http message itself, but GAE's API does not appear to let you do this.
You can always roll your own:
In the case of a fixed destination: just set up fixed port forwarding on a proxy server, then send requests from GAE to the proxy. If you have multiple destinations, set up forwarding on separate ports, one for each destination.
In the case of dynamic destinations (too many to handle via fixed port forwarding), your GAE app adds a custom HTTP header (X-Something) containing the final destination and then connects to the custom proxy. The proxy inspects this header and forwards the request to the destination.
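A sketch of what the GAE side of that second option might look like with the urlfetch API (the proxy URL and header name are placeholders):
from google.appengine.api import urlfetch

# Hypothetical custom proxy that reads X-Forward-To and relays the request.
PROXY_URL = "https://my-proxy.example.com/forward"

result = urlfetch.fetch(
    url=PROXY_URL,
    method=urlfetch.GET,
    headers={"X-Forward-To": "https://final-destination.example.com/api"},
)
print(result.status_code, result.content)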
We ran into this issue and reached out to Google Cloud support. They suggested we use Google App Engine flexible with some app.yaml settings, custom network, and an ip-forwarding NAT gateway instance.
This didn't work for us because many core features from App Engine Standard are not implemented in App Engine Flexible. In essence we would need to rewrite our product.
So, to make the applicable URL fetch requests appear to have a static IP, we made a custom proxy: https://github.com/csgactuarial/app-engine-proxy
For redundancy reasons, I suggest implementing this as a multi-region, multi-zone, load-balanced system.

How do I implement secure authentication using xml-rpc in python?

I have a basic XML-RPC web service running.
What is the simplest way (I'm a newbie) to implement secure authentication?
I just need some direction.
You could check out this code for a simple XML-RPC server over HTTPS. Authentication can work in any way you wish ... clients could authenticate with some credentials and you provide a cookie for the rest of the session.
The Python docs for xmlrpc include details of using the HTTP 'Authorization' header for passing in credentials.
Here is some code that uses Twisted to implement an XML-RPC auth mechanism, which could easily use HTTPS instead of HTTP.
This guy has written an HTTPS XML-RPC setup with authorization, which you can download.
There are tons of resources and ways of doing this, all easily googleable. It all depends on whether you are using mod_wsgi, for example, or writing a standalone server with Twisted.
Bottom line:
a) Use SSL for communication
b) Use the HTTP authorization mechanism
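For illustration, a minimal client-side sketch of that bottom line using the standard library (Python 3 xmlrpc.client; the URL, credentials, and method name are placeholders). The user:password part of the URL is sent as an HTTP Basic Authorization header, which is why it must only be used over HTTPS:
import xmlrpc.client

# HTTPS provides transport encryption; the credentials in the URL become a
# Basic Authorization header that the server checks before serving calls.
proxy = xmlrpc.client.ServerProxy("https://user:secret@rpc.example.com/RPC2")
print(proxy.some_method("hello"))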
