Swagger Codegen Authorisation for Python-Flask

I have built a Swagger doc and generated the Swagger client (python-flask with Python 2 support).
I've built my code up, tested it, and I'm happy with what I've got. Now I want to secure my API endpoints using HTTPS and Basic Auth.
This is v2 of the OpenAPI Specification (OAS), so I'm setting it up as follows (described at https://swagger.io/docs/specification/2-0/authentication/basic-authentication/):
swagger: "2.0"
securityDefinitions:
basicAuth:
type: "basic"
Whether I give my endpoints individual security settings or set this at the root level of the YAML for all endpoints makes no difference:
security:
  - basicAuth: []
I take my YAML, export to JSON, then run the following to rebuild the swagger_server code:
java -jar swagger-codegen-cli-2.3.1.jar generate -l python-flask -DsupportPython2=true -i swagger.json -a "Authorization: Basic Base64encodedstring"
What I'm expecting is for the controller or model code to validate that a Basic Auth header matching the authorization specified at generation time has been passed, but I see no references to it anywhere. I'm not sure if I've misread this, or if there's an issue with the way I'm doing it or with some of the options I'm using.

The Python server generated by Swagger Codegen uses Connexion, and Connexion only supports OAuth 2 out of the box. As explained in the linked issue,
users can always add custom mechanisms by decorating their handler functions (see https://github.com/zalando/connexion/blob/master/examples/basicauth/app.py).
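A minimal sketch of such a decorator, along the lines of the linked Connexion example; check_credentials is a hypothetical helper you would implement yourself, and get_version stands in for whichever generated controller function you want to protect:

import functools
from flask import request, Response

def requires_basic_auth(handler):
    # Reject the request unless it carries a valid "Authorization: Basic ..." header.
    @functools.wraps(handler)
    def wrapper(*args, **kwargs):
        auth = request.authorization  # parsed Basic Auth credentials, or None
        if auth is None or not check_credentials(auth.username, auth.password):
            return Response("Unauthorized", 401, {"WWW-Authenticate": 'Basic realm="api"'})
        return handler(*args, **kwargs)
    return wrapper

@requires_basic_auth
def get_version():
    return {"version": "1.0.0"}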

Related

Create new API config for GCP API Gateway using Python

I'm trying to create a new API config for an existing API and Gateway, using the Python client library google-cloud-api-gateway. I can't figure out how to specify the api_config parameter. The example in the docs doesn't include this parameter (which is required).
request = apigateway_v1.CreateApiConfigRequest(
    parent="parent_value",
    api_config_id="api_config_id_value",
    api_config="?"
)
What is the correct syntax for providing my config.yaml file? Even better would be if I could provide the config as a Python dict representation instead of loading a .yaml file.
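One sketch of how this might look, assuming the ApiConfig message accepts an openapi_documents list whose document contents are the raw bytes of the spec; the nested type names are taken from the apigateway_v1 types and should be checked against your installed library version:

from google.cloud import apigateway_v1

# contents must be bytes, so instead of reading a file you could also build the
# YAML in memory, e.g. yaml.safe_dump(config_dict).encode("utf-8").
with open("config.yaml", "rb") as f:
    spec_bytes = f.read()

api_config = apigateway_v1.ApiConfig(
    openapi_documents=[
        apigateway_v1.ApiConfig.OpenApiDocument(
            document=apigateway_v1.ApiConfig.File(
                path="config.yaml",
                contents=spec_bytes,
            )
        )
    ]
)

request = apigateway_v1.CreateApiConfigRequest(
    parent="projects/PROJECT/locations/global/apis/API_ID",
    api_config_id="api_config_id_value",
    api_config=api_config,
)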

Swaggerhub returns predefined response, but server stub doesn't

I have an API YAML specification that defines what the response from each endpoint should be. For example, I want /version to return version1.0, as defined here:
openapi: "3.0.3"
info:
title: "TITLE"
description: "DESCRIPTION"
version: "1.0.0"
paths:
/version:
get:
description: "description"
responses:
"200":
description: "description"
content:
application/json:
schema:
$ref: '#/components/schemas/version'
components:
schemas:
version:
type: object
properties:
version:
type: string
example:
"version": "version1.0"
The API server that runs on SwaggerHub works as expected, but the python-flask server stub generated by SwaggerHub and run locally on my machine returns "do some magic!" for every endpoint rather than the example provided in the YAML. The same happens with server stubs generated using swagger-codegen and openapi-generator.
I am mainly interested in making this work for the openapi-generator generated server stub.
This is working as expected.
The example keyword is OpenAPI's way of providing an example object or value to your users to demonstrate the use of your API. The virtual service provided by SwaggerHub (virtserver) is able to interpret example values and return them when you make API calls to the virtualized service.
On the other hand, codegen may or may not handle example values in the same way (it depends on the codegen template). I have two recommendations for resolving this:
Add the example return values yourself (a sketch follows this list). The codegen is not meant to be a complete solution; it's meant to be a jumping-off point for development. Once you've generated the code, implement and add your own business logic.
Create your own codegen template. This is a popular option for anyone who needs Swagger Codegen to do something that isn't currently supported: https://github.com/swagger-api/swagger-codegen/wiki/Building-your-own-Templates
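For the first option, a minimal sketch of what the hand-edited python-flask controller stub might look like; the actual function name depends on your operationId and generator version:

def version_get():  # generated handler for GET /version
    # Replace the generated placeholder ('do some magic!') with the value
    # the stub should serve.
    return {"version": "version1.0"}, 200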

Swagger UI showing HTTP HEAD method which is not supported by the API

I have developed an API gateway using Python's aiohttp module.
Now I am trying to add a Swagger UI for the same server.
For now, I am doing it by specifying the spec as comments (docstrings) in the handler functions.
Below is an example:
async def list_models(request):
    """
    List models API.
    ---
    tags:
    - models
    summary: List models
    description: This API lists models created till date.
    produces:
    - application/json
    responses:
        "200":
            description: List of all the models created.
    """
    url = MODELPERSISTENCE_SERVICE_URL + '/models/'
    return await execute_get_request(url)
However, when I deploy the server and visit the Swagger UI, I see that the HTTP HEAD method is also listed as supported by this API, which is wrong.
As you can see, I am not mentioning the HEAD or GET method anywhere in the specification. How do I prevent the HEAD method from popping up in the Swagger UI?
Do you use add_get to add routes? Then Swagger is not wrong.
aiohttp creates HTTP HEAD handlers by default when using the router's add_get method. If you don't want them, register the routes with the allow_head=False named parameter.
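For example, using the route and handler name from the question:

from aiohttp import web

app = web.Application()
# allow_head=False stops aiohttp from auto-registering a HEAD handler
# alongside the GET route, so it no longer shows up in the generated spec.
app.router.add_get('/models/', list_models, allow_head=False)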

How do I develop against OAuth locally?

I'm building a Python application that needs to communicate with an OAuth service provider. The SP requires me to specify a callback URL. Specifying localhost obviously won't work. I'm unable to set up a public facing server. Any ideas besides paying for server/hosting? Is this even possible?
Two things:
The OAuth Service Provider in question is violating the OAuth spec if it gives you an error when you don't specify a callback URL. callback_url is spec'd to be an OPTIONAL parameter.
But, pedantry aside, you probably want to get a callback when the user's done just so you know you can redeem the Request Token for an Access Token. Yahoo's FireEagle developer docs have lots of great information on how to do this.
Even in the second case, the callback URL doesn't actually have to be visible from the Internet at all. The OAuth Service Provider will redirect the browser that the user uses to provide his username/password to the callback URL.
The two common ways to do this are:
Create a dumb web service from within your application that listens on some port (say, http://localhost:1234/) for the completion callback (see the sketch after this list), or
Register a protocol handler (you'll have to check with the documentation for your OS specifically on how to do such a thing, but it enables things like <a href="skype:555-1212"> to work).
(An example of the flow that I believe you're describing lives here.)
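A minimal sketch of the first option using only the standard library; the port and path are arbitrary, and you would register something like http://localhost:1234/callback with the provider:

from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

class CallbackHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The provider redirects the user's browser here after authorization.
        params = parse_qs(urlparse(self.path).query)
        print("oauth_verifier:", params.get("oauth_verifier"))
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"You can close this window now.")

HTTPServer(("localhost", 1234), CallbackHandler).serve_forever()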
If you are using a *nix-style system, create an alias like 127.0.0.1 mywebsite.dev in /etc/hosts (you need a line similar to that in the file), then use http://mywebsite.dev/callbackurl/for/app as the callback URL during local testing.
This was with the Facebook OAuth - I actually was able to specify 'http://127.0.0.1:8080' as the Site URL and the callback URL. It took several minutes for the changes to the Facebook app to propagate, but then it worked.
This may help you:
http://www.marcworrell.com/article-2990-en.html
It's php so should be pretty straightforward to set up on your dev server.
I've tried this one once:
http://term.ie/oauth/example/
It's pretty simple. You have a link to download the code at the bottom.
localtunnel [port] and voila
http://blogrium.wordpress.com/2010/05/11/making-a-local-web-server-public-with-localtunnel/
http://github.com/progrium/localtunnel
You could create two applications: one for deployment and the other for testing.
Alternatively, you can also include an oauth_callback parameter when requesting a request token. Some providers will redirect to the URL specified by oauth_callback (e.g. Twitter, Google), but some will ignore this callback URL and redirect to the one specified during configuration (e.g. Yahoo).
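For example, with the requests-oauthlib package (not part of the answers above, shown only as one way to send oauth_callback; the request-token URL is provider specific):

from requests_oauthlib import OAuth1Session

oauth = OAuth1Session(
    "CLIENT_KEY",
    client_secret="CLIENT_SECRET",
    callback_uri="http://localhost:1234/callback",  # sent as oauth_callback
)
# Provider-specific endpoint, e.g. Twitter's /oauth/request_token
tokens = oauth.fetch_request_token("https://provider.example/oauth/request_token")
print(tokens["oauth_token"], tokens["oauth_token_secret"])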
The way I solved this issue (using Bitbucket's OAuth interface) was by specifying the callback URL as localhost (or whatever you want, really), and then following the authorisation URL with curl, but with the twist of discarding the body and printing only the HTTP status code and final URL. Example:
curl --user BitbucketUsername:BitbucketPassword -sL -w "%{http_code} %{url_effective}\\n" "AUTH_URL" -o /dev/null
Insert your own credentials and the authorisation URL (remember to escape the exclamation mark!).
What you should get is something like this:
200 http://localhost?dump&oauth_verifier=OATH_VERIFIER&oauth_token=OATH_TOKEN
And you can scrape the oauth_verifier from this.
Doing the same in python:
import pycurl
devnull = open('/dev/null', 'w')
c = pycurl.Curl()
c.setopt(pycurl.WRITEFUNCTION, devnull.write)
c.setopt(c.USERPWD, "BBUSERNAME:BBPASSWORD")
c.setopt(pycurl.URL, authorize_url)
c.setopt(pycurl.FOLLOWLOCATION, 1)
c.perform()
print c.getinfo(pycurl.HTTP_CODE), c.getinfo(pycurl.EFFECTIVE_URL)
I hope this is useful for someone!

SPNEGO (kerberos token generation/validation) for SSO using Python

I'm attempting to implement a simple Single Sign On scenario where some of the participating servers will be windows (IIS) boxes. It looks like SPNEGO is a reasonable path for this.
Here's the scenario:
User logs in to my SSO service using his username and password. I authenticate him using some mechanism.
At some later time the user wants to access App A.
The user's request for App A is intercepted by the SSO service. The SSO service uses SPNEGO to log the user in to App A:
The SSO service hits the App A web page, gets a "WWW-Authenticate: Negotiate" response
The SSO service generates a "Authorization: Negotiate xxx" response on behalf of the user, responds to App A. The user is now logged in to App A.
The SSO service intercepts subsequent user requests for App A, inserting the Authorization header into them before passing them on to App A.
Does that sound right?
I need two things (at least that I can think of now):
the ability to generate the "Authorization: Negotiate xxx" token on behalf of the user, preferably using Python
the ability to validate "Authorization: Negotiate xxx" headers in Python (for a later part of the project)
This is exactly what Apple does with its Calendar Server. They have a python gssapi library for the kerberos part of the process, in order to implement SPNEGO.
Look in CalendarServer/twistedcaldav/authkerb.py for the server auth portion.
The kerberos module (which is a C module) doesn't have any useful docstrings, but PyKerberos/pysrc/kerberos.py has all the function definitions.
Here's the urls for the svn trunks:
http://svn.calendarserver.org/repository/calendarserver/CalendarServer/trunk
http://svn.calendarserver.org/repository/calendarserver/PyKerberos/trunk
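A rough sketch of both needs using the PyKerberos kerberos module (the hostnames are placeholders, result-code checks and error handling are omitted, and a valid Kerberos ticket, e.g. from kinit or a keytab, is assumed):

import kerberos

# Generate an "Authorization: Negotiate <token>" header on behalf of the user.
_, client_ctx = kerberos.authGSSClientInit("HTTP@appa.example.com")
kerberos.authGSSClientStep(client_ctx, "")  # empty challenge on the first step
auth_header = "Negotiate " + kerberos.authGSSClientResponse(client_ctx)

# Validate an incoming "Authorization: Negotiate <token>" header.
_, server_ctx = kerberos.authGSSServerInit("HTTP@sso.example.com")
kerberos.authGSSServerStep(server_ctx, auth_header.split(" ", 1)[1])
print("authenticated principal:", kerberos.authGSSServerUserName(server_ctx))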
Take a look at the http://spnego.sourceforge.net/credential_delegation.html tutorial. It seems to be doing what you are trying to do.
I've been searching for quite some time for something similar (on Linux), which has led me to this page several times without giving an answer. So here is the solution I came up with:
The web server is an Apache with mod_auth_kerb. It has been running in an Active Directory single sign-on setup for quite some time.
What I was already able to do before:
Using Chromium with single sign-on on Linux (with a proper krb5 setup, with a working kinit user@domain)
Having Python connect and do single sign-on using sspi from the pywin32 package (on Windows), with something like sspi.ClientAuth("Negotiate", targetspn="http/%s" % host)
The following code snippet completes the puzzle (and my needs), giving Python single sign-on with Kerberos on Linux (using python-gssapi):
import base64
import gssapi

# neg_value: the base64 Negotiate token received from the server; host: the target server's hostname
in_token = base64.b64decode(neg_value)
service_name = gssapi.Name("HTTP@%s" % host, gssapi.C_NT_HOSTBASED_SERVICE)
spnegoMechOid = gssapi.oids.OID.mech_from_string("1.3.6.1.5.5.2")
ctx = gssapi.InitContext(service_name, mech_type=spnegoMechOid)
out_token = ctx.step(in_token)
buffer = sspi.AuthenticationBuffer()  # only needed for the pywin32/sspi variant above, not on Linux
outStr = base64.b64encode(out_token)
