I am building a command line tool in Python that interfaces with a RESTful API. The API uses OAuth2 for authentication. Rather than asking the user for an access_token every time they run the tool, can I store the access_token somewhere and reuse it for its whole lifespan? If so, how safe is that?
You can store the access token in a file on the user's machine. You can do so using a Storage object. Assuming you use oauth2client:
import oauth2client.file
from oauth2client import client

# Reading credentials
store = oauth2client.file.Storage(credential_path)
credentials = store.get()

# Writing credentials
creds = client.AccessTokenCredentials(access_token, user_agent)
creds.access_token = access_token
creds.refresh_token = refresh_token
creds.client_id = client_id
creds.client_secret = client_secret
# Storage does not save all of the fields above,
# so write them to a JSON file manually instead
with open(credential_path, "w") as f:
    f.write(creds.to_json())
In terms of security, I don't see much of a threat here, since these access tokens live on the user's own machine. To steal a token, someone would need read access to that file while the token is still valid. However, anyone who can already do that could most likely also modify your script to send them a copy of the user's access token every time it is obtained. Take my word lightly, though, as I'm not a professional in that area.
A post on Information Security Stack Exchange did touch on this:
these tokens give access to some fairly privileged information about your users.
However, that question concerned storing tokens in a database rather than on a client machine.
In conclusion, you can keep it in a file (but take my word with a grain of salt).
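If you want something safer than a plain file, a hedged alternative is the third-party keyring package (pip install keyring), which stores secrets in the OS keychain instead. The service and key names below are illustrative, and access_token is the token from your OAuth flow:

import keyring

# Store the token in the OS keychain
keyring.set_password('my-cli-tool', 'access_token', access_token)

# On later runs, read it back; returns None if nothing is stored yet
token = keyring.get_password('my-cli-tool', 'access_token')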
Do you want to store it on the service side or locally?
Since your tool talks to a RESTful API, which is stateless (no information is stored between requests), you actually need to provide the access token on every request your client makes to any of the REST endpoints. I may be missing some details of your design, but access tokens should be used only for authorization, since a user who holds a token is already authenticated. This is why tokens are valid only for a limited time, usually one hour.
So you need to provide state either via a cookie (web interface) or by storing the token locally (which is what you meant). However, you should trigger the full OAuth flow every time a user logs in to your client (authenticating the user and issuing a new token); otherwise you are not getting the benefits of OAuth.
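To make the stateless point concrete, here is a minimal sketch of sending the token on every call, assuming the requests package; the URL is a placeholder:

import requests

# Every request carries the token, typically as a Bearer header
resp = requests.get(
    'https://api.example.com/v1/resource',
    headers={'Authorization': 'Bearer ' + access_token},
)
resp.raise_for_status()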
I am developing an application that is supposed to help a friend of mine better organize his YouTube channels. He has multiple channels on different Google accounts. I'm developing this in Python and I currently don't have too much experience with the YouTube Data API, which I'm planning on using, since it seems like the only option.
The application itself isn't very complicated. The only things it needs to be able to do is upload videos, with a specified title, description and other properties and it should also be possible to write comments on videos. I started a simple application in the Google Developers Console, enabled the YouTube Data API and created an API key and an OAUTH-Client ID.
So far I've managed to post comments on videos, but it seems like every time I run the Python script (currently it's just a simple script that posts a single comment), Google wants me to explicitly choose which account I want to use, and I have to grant the script permission every time I run it.
Is there a way I can just run the script once and tell Google which account I want to use to post the comment, give all the permissions and Google then remembers that so I don't have to explicitly give permissions every time?
Also, how would I then be able to switch accounts and upload with a different one? Currently I always need to choose one when the Google consent screen pops up while running the script.
I've heard you can get an application authorized by Google, would that help with this or is it fine if I just keep my app in test and not in production?
If you have N accounts and want to upload videos to each of them, you'll have to run N OAuth 2 authorization/authentication flows to successful completion.
For each of those N flows, upon successful completion, you'll have to persist the obtained credentials data to a separate file in local storage.
This can be thought of as an initialization step of your app (although you may repeat it at any later stage for any additional channel your app needs to know about). Your code would look like:
# Run an OAuth flow; then obtain credentials data
# for the channel the app's user chose during that flow
from google_auth_oauthlib.flow import InstalledAppFlow

scopes = ['https://www.googleapis.com/auth/youtube']
flow = InstalledAppFlow.from_client_secrets_file(
    client_secret_file, scopes)
cred = flow.run_console()

# Build a YouTube service object so that we can retrieve
# the ID of the channel the app's user chose during the flow
from googleapiclient.discovery import build

youtube = build('youtube', 'v3', credentials=cred)
response = youtube.channels().list(
    part='id',
    mine=True
).execute()
channel_id = response['items'][0]['id']

# Save the credentials data to a JSON text file
cred_file = f"/path/to/credentials/data/dir/{channel_id}.json"
with open(cred_file, 'w', encoding='UTF-8') as json_file:
    json_file.write(cred.to_json())
Above, client_secret_file is the full path to your app's client secret JSON file, the one you obtained from the Google Developers Console.
Subsequently, each time you want to upload a video, you'll have to choose from within the app which channel to upload it to. In terms of program logic, this means: say you've chosen the channel whose ID is channel_id; read in the credentials data file associated with channel_id and pass its content to the YouTube service object youtube, constructed as shown below:
# Read in the credentials data associated with
# the channel identified by 'channel_id'
from google.oauth2.credentials import Credentials

cred_file = f"/path/to/credentials/data/dir/{channel_id}.json"
cred = Credentials.from_authorized_user_file(cred_file)

# The access token needs to be refreshed when the
# previously saved one has already expired (note that
# cred.valid is False for an expired token, so it must
# not be asserted before refreshing)
from google.auth.transport.requests import Request

assert cred and cred.refresh_token
if cred.expired:
    cred.refresh(Request())
    # Save the credentials data again once refreshed
    with open(cred_file, 'w', encoding='UTF-8') as json_file:
        json_file.write(cred.to_json())

# Construct a YouTube service object through which
# all API invocations are authorized on behalf
# of the channel with ID 'channel_id'
from googleapiclient.discovery import build

youtube = build('youtube', 'v3', credentials=cred)
Upon running this code, the YouTube service object youtube will be initialized in such a way that each and every API endpoint call issued through it makes an authorized request on behalf of the channel identified by channel_id.
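For example, since your goal is uploading videos, an upload through this youtube object might look like the following sketch of the videos.insert endpoint; the file path, title, and description are placeholders:

from googleapiclient.http import MediaFileUpload

# Upload a video on behalf of the chosen channel
request = youtube.videos().insert(
    part='snippet,status',
    body={
        'snippet': {'title': 'A title', 'description': 'A description'},
        'status': {'privacyStatus': 'private'},
    },
    media_body=MediaFileUpload('/path/to/video.mp4', resumable=True),
)
response = request.execute()
print(response['id'])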
An important note: for the credentials object cred to be saved to and loaded from JSON text files, you need the Google Authentication Library for Python, google-auth, at version >= 1.21.3 (v1.3.0 introduced Credentials.from_authorized_user_file, v1.8.0 introduced Credentials.to_json, and v1.21.3 fixed the latter function with respect to the class's expiry member).
Also an important note: the code above is simplified as much as possible; error conditions are not handled at all. For example, it does not handle the case where cred_file already exists when writing out a new credentials data file, or where cred_file does not exist when reading in credentials data that is supposed to already exist.
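As a hedged sketch of what minimal handling could look like (load_credentials is a hypothetical helper; RefreshError is google-auth's exception for failed refreshes):

import os
from google.auth.exceptions import RefreshError
from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials

def load_credentials(cred_file):
    """Load saved credentials, refreshing the access token if needed."""
    if not os.path.isfile(cred_file):
        # No saved credentials: the OAuth flow has to be run first
        raise FileNotFoundError(cred_file)
    cred = Credentials.from_authorized_user_file(cred_file)
    if cred.expired and cred.refresh_token:
        try:
            cred.refresh(Request())
        except RefreshError:
            # The refresh token was revoked or expired: drop the stale
            # file and re-run the OAuth flow for this channel
            os.remove(cred_file)
            raise
    return cred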
I have a Python service which imports a library that talks to the PayPal API. There is a config file that is passed into the library __init__() which contains the PayPal API username and password.
Calling the PayPal API token endpoint with the username and password will return a token used to authenticate during the pay call. However, this token lasts for 90 minutes and should be reused.
There are multiple instances of this service running on different servers and they need to all share this one secret token.
What would be the best way of storing this 90-minute token?
While you could persist this in a database, since it's only valid for 90 minutes, you might consider using an in-memory data store like Redis. It's very simple to set up and there are various Python clients available.
Redis in particular supports expiration time when setting a value, so you can make sure it'll only be kept for a set amount of time. Of course, you should still have exception handling in place in case for some reason the key is invalidated early.
While this may introduce a software dependency if you're not already using a key-value store, it's not clear from your question how this library is intended to be used and thus whether this is an issue.
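A minimal sketch with the redis Python client (pip install redis); the key name and TTL are illustrative, with the TTL set a little under 90 minutes to absorb clock and network slack, and fetch_token_from_paypal() standing in for your actual token request:

import redis

r = redis.Redis(host='localhost', port=6379, db=0)

def get_paypal_token():
    token = r.get('paypal:token')
    if token is not None:
        return token.decode('utf-8')
    token = fetch_token_from_paypal()  # hypothetical helper
    # setex stores the value with an expiration time in seconds
    r.setex('paypal:token', 85 * 60, token)
    return token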
If installing other software is not an option, you could use a temporary file. However, because Python's tempfile doesn't seem to support directly setting a temporary file's name, you might have to handle file management manually. For example:
import os
import time
import tempfile

# 90 minutes in seconds. Setting this a little lower would
# probably be better to account for network latency.
MAX_AGE = 90 * 60

# /tmp/libname/ needs to exist for this to work; creating it
# if necessary shouldn't give you much trouble.
TOKEN_PATH = os.path.join(
    tempfile.gettempdir(),
    'libname',
    'paypal.token',
)

def get_paypal_token():
    token = None
    if os.path.isfile(TOKEN_PATH):
        token_age = time.time() - os.path.getmtime(TOKEN_PATH)
        if token_age < MAX_AGE:
            with open(TOKEN_PATH, 'r') as infile:
                # You might consider a test API call to establish token validity here.
                token = infile.read()
    if not token:
        # Get a token from the PayPal API and write it to TOKEN_PATH.
        token = 'dummy'
        with open(TOKEN_PATH, 'w') as outfile:
            outfile.write(token)
    return token
Depending on the environment, you would probably want to look into restricting permissions on this temp file. Regardless of how you persist the token, though, this code should be a useful example. I wouldn't be thrilled about sticking something like this on the file system, but if you already have the PayPal credentials used to request a token on disk, writing the token to temporary storage probably won't be a big deal.
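As a sketch of restricting permissions when writing the file (POSIX; 0o600 means owner read/write only, and the mode applies only when the file is first created):

import os

# Create/truncate the token file readable and writable only by
# the current user before writing the token into it
fd = os.open(TOKEN_PATH, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
with os.fdopen(fd, 'w') as outfile:
    outfile.write(token)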
You could store the token as a system variable.
import os
# Store token
os.environ['PAYPAL_API_TOKEN'] = <...>
# Retrieve token
token = os.environ['PAYPAL_API_TOKEN']
Be aware of the security implications, though: other processes running as the same user may be able to read the token from the process environment. Also note that a variable set via os.environ lives only for the current process and its children, so it won't be shared across separate servers or survive between runs.
When an OAuth2Credential object tries to refresh its access_token, it sometimes gets an invalid_grant error, after which it can no longer be refreshed. The code I use is based on Google's Python API and Mirror API examples.
Background:
Using the oauth2client module for authentication and its OAuth2Credential object
Storing the OAuth2Credential object pickled and base64-encoded in the database, like Google's own example code
Using the apiclient module to make calls to the Mirror API
This code runs on 3 different servers, all exhibiting the same issue when trying to send timeline items
The scopes I ask for are "https://www.googleapis.com/auth/glass.timeline" and "https://www.googleapis.com/auth/userinfo.profile"
I can confirm that access_type is set to "offline"
I ask for approval_prompt to be "force" just in case
Here is the code being used to call the Mirror API:
import httplib2
from apiclient.discovery import build

http = credential.authorize(http=httplib2.Http())
service = build("mirror", "v1", http=http)
payload = <JSON_PAYLOAD_HERE>
service.timeline().insert(body=payload).execute()
When the service is called, there is the potential for it to return a 401, meaning the access_token needs to be refreshed. The refresh method is then called, which raises AccessTokenRefreshError with the error invalid_grant. At this point the credential is as good as bunk: the access_token is expired and the refresh_token only yields the same error.
I have seen pages that say this can happen due to NTP problems, but I have confirmed (and even switched NTP servers) that my servers are in sync. The other documented possibility is that only 25 refresh tokens can be outstanding before older ones get recycled, but I have implemented a store() method on the Credential object so that when it is refreshed, the new credentials are saved in place (I can confirm this works, as I see new information in the database when it refreshes).
Since I can't get a user's credentials to start exhibiting this problem on demand, I can't explain any other conditions to recreate the issue other than "waiting some time". I have seen the issue happen soon after authenticating and sending one call, all the way to a week's worth of time after a hundred calls.
The only way to resolve the issue for now is to ask the user to reauthorize, but that isn't a solution, since I expect to use the APIs offline without user interaction. I'd also have no way to notify the user that they need to reauthorize.
Answer from the comment thread: the user had toggled off the Glassware from the MyGlass website, which resulted in the token being revoked.
The user needs to go through the authorization flow again to be able to use the Glassware, either by visiting the Glassware's authorization endpoint or by toggling it back "on" in MyGlass if available.
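A minimal sketch of detecting this server-side so the affected user can be flagged for reauthorization; credential and payload are as in the question's code, while mark_user_needs_reauth() and user_id are hypothetical:

import httplib2
from apiclient.discovery import build
from oauth2client.client import AccessTokenRefreshError

try:
    http = credential.authorize(http=httplib2.Http())
    service = build("mirror", "v1", http=http)
    service.timeline().insert(body=payload).execute()
except AccessTokenRefreshError:
    # The grant was revoked (e.g. the Glassware was toggled off), so
    # the stored credential is dead and a new OAuth flow is required
    mark_user_needs_reauth(user_id)  # hypothetical helper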
Can someone please give me a clear explanation of how to get the Google Calendar API v3 working with the Python Client? Specifically, the initial OAuth stage is greatly confusing me. All I need to do is access my own calendar, read it, and make changes to it. Google provides this code for configuring my app:
import gflags
import httplib2

from apiclient.discovery import build
from oauth2client.file import Storage
from oauth2client.client import OAuth2WebServerFlow
from oauth2client.tools import run

FLAGS = gflags.FLAGS

# Set up a Flow object to be used if we need to authenticate. This
# sample uses OAuth 2.0, and we set up the OAuth2WebServerFlow with
# the information it needs to authenticate. Note that it is called
# the Web Server Flow, but it can also handle the flow for native
# applications.
# The client_id and client_secret are copied from the API Access tab on
# the Google APIs Console.
FLOW = OAuth2WebServerFlow(
    client_id='YOUR_CLIENT_ID',
    client_secret='YOUR_CLIENT_SECRET',
    scope='https://www.googleapis.com/auth/calendar',
    user_agent='YOUR_APPLICATION_NAME/YOUR_APPLICATION_VERSION')

# To disable the local server feature, uncomment the following line:
# FLAGS.auth_local_webserver = False

# If the Credentials don't exist or are invalid, run through the native client
# flow. The Storage object will ensure that if successful the good
# Credentials will get written back to a file.
storage = Storage('calendar.dat')
credentials = storage.get()
if credentials is None or credentials.invalid == True:
    credentials = run(FLOW, storage)

# Create an httplib2.Http object to handle our HTTP requests and authorize it
# with our good Credentials.
http = httplib2.Http()
http = credentials.authorize(http)

# Build a service object for interacting with the API. Visit
# the Google APIs Console
# to get a developerKey for your own application.
service = build(serviceName='calendar', version='v3', http=http,
                developerKey='YOUR_DEVELOPER_KEY')
But (a) it makes absolutely no sense to me; the comment explanations are terrible; and (b) I don't know what to put in the variables. I've registered my program with Google and signed up for a Service Account key, but all that gave me was an encrypted key file to download and a client ID. I have no idea what a "developerKey" or a "client_secret" is. Is that the key? If so, how do I get it, since it is actually contained in an encrypted file? Finally, given the relatively simple goals of my API use (i.e., it's not a multi-user, multi-access operation), is there a simpler way to do this? Thanks.
A simple (read: way I've done it) way to do this is to create a web application instead of a service account. This may sound weird since you don't need any sort of web application, but I use this in the same way you do - make some queries to my own calendar/add events/etc. - all from the command line and without any sort of web-app interaction. There are ways to do it with a service account (I'll tinker around if you do in fact want to go on that route), but this has worked for me thus far.
After you create a web application, you will then have all of the information indicated above (side note: the sample code above is based on a web application - to use a service account your FLOW needs to call flow_from_clientsecrets and further adjustments need to be made - see here). Therefore you will be able to fill out this section:
FLOW = OAuth2WebServerFlow(
    client_id='YOUR_CLIENT_ID',
    client_secret='YOUR_CLIENT_SECRET',
    scope='https://www.googleapis.com/auth/calendar',
    user_agent='YOUR_APPLICATION_NAME/YOUR_APPLICATION_VERSION')
You can now fill this out with the values you see in the API console (client_id is the entire Client ID string, client_secret is the client secret, scope stays the same, and user_agent can be whatever you want). As for the service line, developerKey is the API key found under the Simple API Access section of the API console (labeled "API key"):
service = build(serviceName='calendar', version='v3', http=http,
                developerKey='<your_API_key>')
You can then add in a simple check like the following to see if it worked:
events = service.events().list(calendarId='<your_email_here>').execute()
print events
Now when you run this, a browser window will pop up that lets you complete the authentication flow. This means all authentication is handled by Google, and the authentication response is stored in calendar.dat. That file (saved in the same directory as your script) contains the authentication info that the service will now use. That is what is going on here:
storage = Storage('calendar.dat')
credentials = storage.get()
if credentials is None or credentials.invalid == True:
    credentials = run(FLOW, storage)
It checks for the existence of valid credentials by looking for that file and verifying its contents (this is all abstracted away from you to make it easier to implement). After you authenticate, the if statement will evaluate to False and you will be able to access your data without needing to authenticate again.
Hopefully that sheds a bit more light on the process. Long story short: make a web application, use its parameters, authenticate once, and then forget about it. I'm sure there are various points I'm overlooking, but hopefully it works for your situation.
Google now has a good sample application that gets you up and running without too much fuss. It is available as the "5 minute experience - Quickstart" on their Getting Started page.
It will give you a URL to visit directly if you are working on a remote server without a browser.
I am currently using django-social-auth to manage OAuth2 registration with google-oauth2 for access to Google Drive. I have added offline access to my extra_arguments, so Google returns a refresh token and django-social-auth stores it. The problem is that django-social-auth never uses this refresh token to update the access token, so the access token expires after one hour and I can't use it to perform offline requests. I want to keep the access_token valid 24/7 so I can keep my database synced with each user's Google Drive.
GOOGLE_OAUTH2_AUTH_EXTRA_ARGUMENTS = {'access_type':'offline'}
GOOGLE_OAUTH_EXTRA_SCOPE = ['https://www.googleapis.com/auth/drive https://www.googleapis.com/auth/userinfo.profile']
SOCIAL_AUTH_USER_MODEL = 'accounts.GoogleDriveUser'
SOCIAL_AUTH_EXTRA_DATA = True
SOCIAL_AUTH_SESSION_EXPIRATION = False
Is there a way to force django-social-auth to update the access_token every time it expires, using the refresh_token? I would love to see an example of how this problem could be solved.
It looks like UserSocialAuth objects now have a .refresh_token() method, which allows you to use .tokens and get the updated token.
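A hedged usage sketch (the exact call may vary between django-social-auth versions; user stands in for a Django user instance):

from social_auth.models import UserSocialAuth

social = UserSocialAuth.objects.get(user=user, provider='google-oauth2')
social.refresh_token()  # uses the stored refresh_token; signature may vary by version
access_token = social.tokens['access_token']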
There's no way to do this directly in django-social-auth at the moment (I've raised a ticket to track it: https://github.com/omab/django-social-auth/issues/492). Meanwhile, this snippet will do the work; it just needs to be tweaked a little to suit your needs.