Authenticating with imaplib.IMAP4 and OAuth without the oauth2 library - python

I'm using web.py to build a simple server that learns from a user's Gmail messages. I've gone through the OAuth flow using the rauth library and I now have the access token. I now want to use imaplib to pull down the messages for processing. However, it's unclear to me how to use the IMAP4.authenticate method. From the documentation:
Authenticate command — requires response processing.
mechanism specifies which authentication mechanism is to be used - it should appear in the instance variable capabilities in the form AUTH=mechanism.
authobject must be a callable object:
data = authobject(response)
It will be called to process server continuation responses. It should return data that will be encoded and sent to server. It should return None if the client abort response * should be sent instead.
All of the examples I can find online doing this use the authenticate method of the oauth2 library or xoauth library, but I've read that oauth2 is deprecated and xoauth is not fit for production. What's the move here? What's the library for my job?
Thanks!

The library I wanted is Google's new OAuth2 python library. I was confused by the naming because oauth2-python, which is deprecated, is also 'import oauth2'. With their library it's dead simple because they have a function called GenerateOAuth2String which just takes an email and a token and generates something you can pass straight to imaplib, which they demo in the function TestImapAuthentication. Perfect.
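For completeness, the string GenerateOAuth2String builds is the SASL XOAUTH2 initial client response, so a minimal sketch of the handshake looks like the following (email_address and access_token are assumed to come from your own OAuth flow; imaplib base64-encodes whatever the authobject returns, so the string is passed unencoded):

import imaplib

# XOAUTH2 initial client response: "user=<email>\1auth=Bearer <token>\1\1"
auth_string = 'user=%s\1auth=Bearer %s\1\1' % (email_address, access_token)

imap = imaplib.IMAP4_SSL('imap.gmail.com')
# imaplib base64-encodes the value returned by the authobject before sending it
imap.authenticate('XOAUTH2', lambda challenge: auth_string.encode())
imap.select('INBOX')
typ, message_ids = imap.search(None, 'ALL')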

Related

Generate the AWS HTTP signature from boto3

I am working with the AWS Transcribe streaming service, which boto3 does not support yet, so to make HTTP/2 requests I need to manually set up the authorization header with "AWS Signature Version 4".
I've found some example implementations, but I was hoping to just call whatever function boto3/botocore have already implemented, using the same configuration object.
Something like
session = boto3.Session(...)
auth = session.generate_signature('POST', '/stream-transcription', ...)
Any pointers in that direction?
Contrary to the AWS SDKs for most other programming languages, boto3/botocore don't offer the functionality to sign arbitrary requests using "AWS Signature Version 4" yet. However there is at least already an open feature request for that: https://github.com/boto/botocore/issues/1784
In this feature request, existing alternatives are discussed as well. One is the third-party Python library aws-requests-auth, which provides a thin wrapper around botocore and requests to sign HTTP-requests. That looks like the following:
import requests
from aws_requests_auth.boto_utils import BotoAWSRequestsAuth

auth = BotoAWSRequestsAuth(aws_host="your-service.domain.tld",
                           aws_region="us-east-1",
                           aws_service="execute-api")
response = requests.get("https://your-service.domain.tld",
                        auth=auth)
Another alternative presented in the feature request is to implement the necessary glue-code on your own, as shown in the following gist: https://gist.github.com/rhboyd/1e01190a6b27ca4ba817bf272d5a5f9a.
Did you check this SDK? Seems very recent but might do what you need.
https://github.com/awslabs/amazon-transcribe-streaming-sdk/tree/master
It looks like it handles the signing: https://github.com/awslabs/amazon-transcribe-streaming-sdk/blob/master/amazon_transcribe/signer.py
I have not tested this, but you can likely accomplish this by following along with this SigV4 unit test:
https://github.com/boto/botocore/blob/master/tests/unit/test_auth_sigv4.py
Note that this constructs a request using the botocore.awsrequest.AWSRequest helper. You'll likely need to dig around to figure out how to send the actual HTTP request (perhaps with httpsession.py).
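If you want to stay on plain botocore, the glue-code approach mentioned above (signing an AWSRequest yourself) boils down to something like this sketch; the endpoint URL, the service name 'transcribe' and the region are assumptions you would need to adjust for the streaming API:

import botocore.session
from botocore.auth import SigV4Auth
from botocore.awsrequest import AWSRequest

# Reuse the normal boto3/botocore credential chain
session = botocore.session.Session()
credentials = session.get_credentials()

# Build and sign the request (endpoint/service/region below are placeholders)
request = AWSRequest(method='POST',
                     url='https://transcribestreaming.us-east-1.amazonaws.com/stream-transcription',
                     data=b'')
SigV4Auth(credentials, 'transcribe', 'us-east-1').add_auth(request)

# The signed headers (Authorization, X-Amz-Date, ...) can now be attached to your HTTP/2 request
print(dict(request.headers))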

How to use google python oauth libraries to implement OpenID Connect?

I am evaluating different options for authentication in a python App Engine flex environment, for apps that run within a G Suite domain.
I am trying to put together the OpenID Connect "Server flow" instructions here with how google-auth-library-python implements the general OAuth2 instructions here.
I kind of follow things up until 4. Exchange code for access token and ID token, which looks like flow.fetch_token, except it says "response to this request contains the following fields in a JSON array," and it includes not just the access token but the id token and other things. I did see this patch to the library. Does that mean I could use some flow.fetch_token to create an IDTokenCredentials (how?) and then use this to build an OpenID Connect API client (and where is that API documented)? And what about validating the id token, is there a separate python library to help with that or is that part of the API library?
It is all very confusing. A great deal would be cleared up with some actual "soup to nuts" example code but I haven't found anything anywhere on the internet, which makes me think (a) perhaps this is not a viable way to do authentication, or (b) it is so recent the python libraries have not caught up? I would however much rather do authentication on the server than in the client with Google Sign-In.
Any suggestions or links to code are much appreciated.
It seems Google's python library contains a module for ID token validation: the google.oauth2.id_token module. Once validated, it will return the decoded token, which you can use to obtain user information.
from google.oauth2 import id_token
from google.auth.transport import requests

request = requests.Request()
id_info = id_token.verify_oauth2_token(
    token, request, 'my-client-id.example.com')

if id_info['iss'] != 'https://accounts.google.com':
    raise ValueError('Wrong issuer.')

userid = id_info['sub']
Once you obtain user information, you should follow the authentication process described in the Authenticate the user section.
OK, I think I found my answer in the source code now.
google.oauth2.credentials.Credentials exposes id_token:
Depending on the authorization server and the scopes requested, this may be populated when credentials are obtained and updated when refresh is called. This token is a JWT. It can be verified and decoded [as @kavindu-dodanduwa pointed out] using google.oauth2.id_token.verify_oauth2_token.
And several layers down the call stack we can see fetch_token does some minimal validation of the response JSON (checking that an access token was returned, etc.) but basically passes through whatever it gets from the token endpoint, including (i.e. if an OpenID Connect scope is included) the id token as a JWT.
EDIT:
And the final piece of the puzzle is the translation of tokens from the (generic) OAuthSession to (Google-specific) credentials in google_auth_oauthlib.helpers, where the id_token is grabbed, if it exists.
Note that the generic oauthlib library does seem to implement OpenID Connect now, but that looks to be very recent and still a work in progress (as of July 2018). Google doesn't seem to use any of it at the moment (this threw me off a bit).
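Tying those pieces together, a rough sketch of the server flow with google_auth_oauthlib follows; the client_secret.json path, redirect URI, client ID and the auth_code variable are all placeholders:

from google_auth_oauthlib.flow import Flow
from google.oauth2 import id_token
from google.auth.transport import requests

# 'openid' in the scopes asks the token endpoint to return an ID token as well
flow = Flow.from_client_secrets_file(
    'client_secret.json',
    scopes=['openid', 'email', 'profile'],
    redirect_uri='https://example.com/oauth2callback')

auth_url, state = flow.authorization_url()  # redirect the user here

# Later, in the redirect handler, exchange the code for tokens
flow.fetch_token(code=auth_code)
credentials = flow.credentials  # google.oauth2.credentials.Credentials

# credentials.id_token is the raw JWT; verify and decode it as in the snippet above
id_info = id_token.verify_oauth2_token(
    credentials.id_token, requests.Request(), 'my-client-id.example.com')
print(id_info['sub'], id_info.get('email'))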

Is there a boto3 function to convert authorization_code into authorization_token

My project is python and using boto3 lib.
I'm using the AWS Cognito Authorization code grant flow with response_type=code instead of response_type=token (implicit flow). Once my user is authorized, my redirect URL is injected with the query string parameter code=4d55a121-8ffc-4058-844b-xxxx.
outlined here
I need to be able to verify this code, because of course someone could take the redirect URL, make up a fake code, and paste it into the browser. According to this doc I can exchange the code for a token. This works as expected via a REST client: I get the token and can continue to pass it as the Authorization header. But what I'm asking is, is there a boto3 method that takes this code and converts it into a token for me? If I have to use the requests lib I will.
I have tried for days. get_user isn't the answer, as that requires a token, not the code.
For reference on what I'm trying to do, here's my repo. The focus is in def edit(). I'm currently using requests to achieve the same thing but would like to use the boto library:
https://github.com/knittledan/python-lambda-cognito
Nope, I believe you should use an HTTPS client to exchange the authorization code for tokens with the token endpoint provided:
https://docs.aws.amazon.com/cognito/latest/developerguide/token-endpoint.html
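A minimal sketch of that exchange with the requests library; the Cognito domain, client ID/secret and redirect URI below are placeholders:

import requests

token_url = "https://your-domain.auth.us-east-1.amazoncognito.com/oauth2/token"
payload = {
    "grant_type": "authorization_code",
    "client_id": "your-app-client-id",
    "code": code,  # the ?code=... value from the redirect
    "redirect_uri": "https://your-app.example.com/callback",
}
# If the app client has a secret, Cognito expects HTTP Basic auth with client_id:client_secret
resp = requests.post(token_url, data=payload,
                     auth=("your-app-client-id", "your-app-client-secret"))
tokens = resp.json()  # id_token, access_token and refresh_token on success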

Sending an order to oanda

I want to send an order to Oanda to make a transaction. I use an IPython notebook to run my code. This is my code:
import oandapy
from datetime import datetime, timedelta

trade_expire = datetime.now() + timedelta(days=1)
trade_expire = trade_expire.isoformat("T") + "Z"
oanda = oandapy.API(environment='practice', access_token='XXXX....')
account_id = xxxxxxx
response = oanda.create_order(account_id, instrument='USD_EUR', units=1000, side='buy',
                              type='limit', price=1.105, expire=trade_expire)
But the error is:
OandaError: OANDA API returned error code 4 (The access token provided does
not allow this request to be made)
How can I solve this problem?
I had the same problem, but when sending orders via curl commands.
The problem has to do with which API you are using from which account.
I notice in your python it says "practice," so you'll want to make sure the API token you generated is from within your practice account. Live accounts and practice accounts each use their own API tokens, and your commands will need to match.
You might also look elsewhere in your Python, at where it actually hits Oanda's server.
For example, when using curl, a live account uses
"https://api-fxtrade.oanda.com/v3/accounts/<ACCOUNT>/orders"
and a practice account uses
"https://api-fxpractice.oanda.com/v3/accounts/<ACCOUNT>/orders"
Using your API token generated on your live account in a practice account will produce the error you're asking about.
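To illustrate the host/token pairing with the v3 REST API in Python (the token and account ID below are placeholders):

import requests

# A practice-account token must go to api-fxpractice; a live token goes to api-fxtrade
headers = {"Authorization": "Bearer YOUR-PRACTICE-API-TOKEN"}
url = "https://api-fxpractice.oanda.com/v3/accounts/YOUR-ACCOUNT-ID/orders"
resp = requests.get(url, headers=headers)
print(resp.status_code, resp.json())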

SPNEGO (kerberos token generation/validation) for SSO using Python

I'm attempting to implement a simple Single Sign On scenario where some of the participating servers will be windows (IIS) boxes. It looks like SPNEGO is a reasonable path for this.
Here's the scenario:
User logs in to my SSO service using his username and password. I authenticate him using some mechanism.
At some later time the user wants to access App A.
The user's request for App A is intercepted by the SSO service. The SSO service uses SPNEGO to log the user in to App A:
The SSO service hits the App A web page, gets a "WWW-Authenticate: Negotiate" response
The SSO service generates a "Authorization: Negotiate xxx" response on behalf of the user, responds to App A. The user is now logged in to App A.
The SSO service intercepts subsequent user requests for App A, inserting the Authorization header into them before passing them on to App A.
Does that sound right?
I need two things (at least that I can think of now):
the ability to generate the "Authorization: Negotiate xxx" token on behalf of the user, preferably using Python
the ability to validate "Authorization: Negotiate xxx" headers in Python (for a later part of the project)
This is exactly what Apple does with its Calendar Server. They have a python gssapi library for the kerberos part of the process, in order to implement SPNEGO.
Look in CalendarServer/twistedcaldav/authkerb.py for the server auth portion.
The kerberos module (which is a C module) doesn't have any useful docstrings, but PyKerberos/pysrc/kerberos.py has all the function definitions.
Here's the urls for the svn trunks:
http://svn.calendarserver.org/repository/calendarserver/CalendarServer/trunk
http://svn.calendarserver.org/repository/calendarserver/PyKerberos/trunk
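For reference, a minimal sketch of what the kerberos module's GSSAPI helpers look like on both sides of the exchange (the service principal HTTP@appa.example.com is a placeholder):

import kerberos

# Client side: produce the value for the "Authorization: Negotiate <token>" header
result, ctx = kerberos.authGSSClientInit("HTTP@appa.example.com")
kerberos.authGSSClientStep(ctx, "")  # empty challenge for the first round trip
negotiate_token = kerberos.authGSSClientResponse(ctx)

# Server side: validate an incoming "Authorization: Negotiate <token>" header
result, srv_ctx = kerberos.authGSSServerInit("HTTP@appa.example.com")
kerberos.authGSSServerStep(srv_ctx, negotiate_token)
print(kerberos.authGSSServerUserName(srv_ctx))  # the authenticated principal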
Take a look at the http://spnego.sourceforge.net/credential_delegation.html tutorial. It seems to be doing what you are trying to do.
I've been searching quite some time for something similar (on Linux), and that search has led me to this page several times without giving an answer. So here is the solution I came up with:
The web server is Apache with mod_auth_kerb. It has already been running in an Active Directory, single sign-on setup for quite some time.
What I was already able to do before:
Using Chromium with single sign-on on Linux (with a proper krb5 setup, with a working kinit user@domain)
Having Python connect and single sign on using sspi from the pywin32 package, with something like sspi.ClientAuth("Negotiate", targetspn="http/%s" % host)
The following code snippet completes the puzzle (and meets my needs), giving Python single sign-on with Kerberos on Linux (using python-gssapi):
import base64
import gssapi

# Decode the server's "WWW-Authenticate: Negotiate <token>" challenge
in_token = base64.b64decode(neg_value)
# Target the HTTP service on the remote host and select the SPNEGO mechanism OID
service_name = gssapi.Name("HTTP@%s" % host, gssapi.C_NT_HOSTBASED_SERVICE)
spnegoMechOid = gssapi.oids.OID.mech_from_string("1.3.6.1.5.5.2")
# Step the security context to get the token for the "Authorization: Negotiate <token>" header
ctx = gssapi.InitContext(service_name, mech_type=spnegoMechOid)
out_token = ctx.step(in_token)
buffer = sspi.AuthenticationBuffer()  # sspi is the pywin32 module from the Windows path above
outStr = base64.b64encode(out_token)
