I'm trying to get a very simple Python script to talk to Freebase.
All the examples I've found use the simple API-key authorization model. So I made a Google Developer account, created a project, and tried to get a key as Google instructs. It demands a list of numeric IP addresses that I'll be calling from. That's not feasible, since I don't have a fixed IP (I do have dyndns set up, but that doesn't help, since Google won't take a domain name, only numeric addresses).
So I tried OAuth2, which is overkill for what I need (I'm not accessing any non-public user data). But I couldn't find even one online example of using OAuth2 for Freebase. I tried adjusting other examples, but after bouncing around between appengine, Decorator, several obsolete Python libraries, and several other approaches, I got nowhere.
Can anyone either explain or point to a good example of how to do this (without spending 10x more time on authorization than on the app I'm trying to authorize)? A working example with OAuth2, preferably without many layers of "simplifying" APIs, or a tip on how to get around the fixed-IP requirement for API-key authorization, would be fantastic. Thanks!
Steve
I had to do this for Google Drive, but as far as I know this should work for any Google API.
When you create a new Client ID in the developer console, you should have the option to create a Service Account. This will create a public/private key pair, and you can use that to authenticate without any OAuth nonsense.
I stole this code out of our GDrive library, so it may be broken, and it is GDrive specific; you will need to replace anything that says "drive" with whatever Freebase wants.
But I hope it's enough to get you started.
# Sample code that connects to Google Drive
from apiclient.discovery import build
import httplib2
from oauth2client.client import SignedJwtAssertionCredentials, VerifyJwtTokenError
SERVICE_EMAIL = "you@gmail.com"
PRIVATE_KEY_PATH ="./private_key.p12"
# Load private key
key = open(PRIVATE_KEY_PATH, 'rb').read()
# Build the credentials object
credentials = SignedJwtAssertionCredentials(SERVICE_EMAIL, key, scope='https://www.googleapis.com/auth/drive')
try:
    http = httplib2.Http()
    http = credentials.authorize(http)
except VerifyJwtTokenError as e:
    print(u"Unable to authorize using our private key: VerifyJwtTokenError, {0}".format(e))
    raise
connection = build('drive', 'v2', http=http)
# You can now use connection to call anything you need for freebase - see their API docs for more info.
Working from @Rachel's sample code, with a bit of fiddling I got to this, which works and illustrates the topic, search, and query features.
Requires the urllib and json modules (both standard library), plus the google-api-python-client code from https://code.google.com/p/google-api-python-client/downloads/list
Must enable billing from 'settings' for the specific project
The mqlread() interface for Python is broken as of April 2014.
The documented 'freebase.readonly' scope doesn't work.
import sys
import json
import urllib
import httplib2
from apiclient.discovery import build
from oauth2client.client import SignedJwtAssertionCredentials, VerifyJwtTokenError
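# NOTE: the original post does not show the argument parsing that produces `args`;
# the following is one plausible reconstruction (flag names are assumed to match
# the attributes used below). Example invocation:
#   python freebase_query.py --serviceEmail ... --privateKeyFile key.p12 --topicID /m/... --query "some text" --verbose
import argparse
parser = argparse.ArgumentParser(description='Query Freebase using a service account')
parser.add_argument('--serviceEmail', required=True)
parser.add_argument('--privateKeyFile', required=True)
parser.add_argument('--topicID', required=True)
parser.add_argument('--query', required=True)
parser.add_argument('--verbose', action='store_true')
args = parser.parse_args()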
# Set up needed constants
#
SERVICE_EMAIL = args.serviceEmail
PRIVATE_KEY_PATH = args.privateKeyFile
topicID = args.topicID
query = args.query
search_url = 'https://www.googleapis.com/freebase/v1/search'
topic_url = 'https://www.googleapis.com/freebase/v1/topic'
mql_url = "https://www.googleapis.com/freebase/v1/mqlread"
key = open(PRIVATE_KEY_PATH, 'rb').read()
credentials = SignedJwtAssertionCredentials(SERVICE_EMAIL, key,
scope='https://www.googleapis.com/auth/freebase')
try:
    http = httplib2.Http()
    http = credentials.authorize(http)
except VerifyJwtTokenError as e:
    print(u"Unable to authorize via private key: VerifyJwtTokenError, {0}".format(e))
    raise
connection = build('freebase', 'v1', http=http)
# Search for a topic by Freebase topic ID
# https://developers.google.com/freebase/v1/topic-overview
#
params = { 'filter': 'suggest' }
url = topic_url + topicID + '?' + urllib.urlencode(params)
if (args.verbose): print("URL: " + url)
resp = urllib.urlopen(url).read()
if (args.verbose): print("Response: " + resp)
respJ = json.loads(resp)
print("Topic property(s) for '%s': " % topicID)
for property in respJ['property']:
    print(' ' + property + ':')
    for value in respJ['property'][property]['values']:
        print(' - ' + value['text'])
print("\n")
# Do a regular search
# https://developers.google.com/freebase/v1/search-overview
#
params = { 'query': query }
url = search_url + '?' + urllib.urlencode(params)
if (args.verbose): print("URL: " + url)
resp = urllib.urlopen(url).read()
if (args.verbose): print("Response: " + resp)
respJ = json.loads(resp)
print("Search result for '%s': " % query)
theKeys = {}
for res in respJ['result']:
    print ("%-40s %-15s %10.5f" %
        (res['name'], res['mid'], res['score']))
    params = '{ "id": "%s", "type": []}' % (res['mid'])
    # Run a query on the retrieved ID, to get its types:
    url = mql_url + '?query=' + params
    resp = urllib.urlopen(url).read()
    respJ = json.loads(resp)
    print(" Type(s): " + `respJ['result']['type']`)
    otherKeys = []
    for k in res:
        if (k not in ['name', 'mid', 'score']): otherKeys.append(k)
    if (len(otherKeys)): print(" Other keys: " + ", ".join(otherKeys))
sys.exit(0)
Related
I am trying to use OAuth2 to access the Azure DevOps API, to query work items.
But I am unable to get the access token.
I am using Python and Flask. My approach is based on these resources:
Microsoft documentation, where currently Step 3 is relevant
OAuth Tutorial, which worked fine for Github, but is not working for Azure.
Relevant libraries:
from requests_oauthlib import OAuth2Session
from flask import Flask, request, redirect, session, url_for
Parameters:
client_id = "..."
client_secret = "..."
authorization_base_url = "https://app.vssps.visualstudio.com/oauth2/authorize"
token_url = "https://app.vssps.visualstudio.com/oauth2/token"
callback_url = "..."
Step 1: User Authorization. (works fine)
#app.route("/")
def demo():
azure = OAuth2Session(client_id)
authorization_url, state = azure.authorization_url(authorization_base_url)
session['oauth_state'] = state
authorization_url += "&scope=" + authorized_scopes + "&redirect_uri=" + callback_url
print(authorization_url)
return redirect(authorization_url)
Step 2: Retrieving an access token (generates an error)
#app.route("/callback", methods=["GET"])
def callback():
fetch_body = "client_assertion_type=urn:ietf:params:oauth:client-assertion-type:jwt-bearer" \
"&client_assertion=" + client_secret + \
"&grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer" \
"&assertion=" + request.args["code"] + \
"&redirect_uri=" + callback_url
azure = OAuth2Session(client_id, state=session['oauth_state'])
token = azure.fetch_token(token_url=token_url, client_secret=client_secret,
body=fetch_body,
authorization_response=request.url)
azure.request()
session['oauth_token'] = token
return redirect(url_for('.profile'))
The application registration and ad-hoc SSL certification are working fine (I'm using it just temporarily).
When I use the client_assertion in Postman, I get a correct response from Azure.
But when I execute the code, this error is thrown:
oauthlib.oauth2.rfc6749.errors.MissingTokenError: (missing_token) Missing access token parameter.
This only tells me that no token was received.
There is one issue in the generated request body, where the grant_type is added twice:
grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer
grant_type=authorization_code
The first value is expected by Azure, but the second one is generated automatically by the library.
Now when I specify the grant_type in the fetch_token call, like this:
token = azure.fetch_token(token_url=token_url, client_secret=client_secret,
body=fetch_body, grant_type="urn:ietf:params:oauth:grant-type:jwt-bearer",
authorization_response=request.url)
I get this error
TypeError: prepare_token_request() got multiple values for argument 'grant_type'
And the actual request to Azure is not even sent.
I see in the web_application.py that is used by oauth2_session.py that grant_type='authorization_code' is hard-coded, so I guess this library is generally incompatible with Azure.
Is that the case?
If so, what would be the simplest way to connect to Azure-OAuth with Python (Flask)?
I would be very grateful for any advice and help that point me in the right direction.
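(A possible workaround, just a sketch that I have not verified against Azure, would be to skip fetch_token and POST the same form fields directly with requests inside the callback, which avoids the library's hard-coded grant_type:)
import requests

token_response = requests.post(
    token_url,
    data={
        "client_assertion_type": "urn:ietf:params:oauth:client-assertion-type:jwt-bearer",
        "client_assertion": client_secret,
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": request.args["code"],
        "redirect_uri": callback_url,
    },
)
token = token_response.json()  # should contain access_token / refresh_token on success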
I just found the azure.devops library that solves my problem.
Resources
https://github.com/Microsoft/azure-devops-python-api
https://github.com/microsoft/azure-devops-python-samples/blob/main/src/samples/work_item_tracking.py
azure-devops-python-api query for work item where field == string
from azure.devops.connection import Connection
from azure.devops.v5_1.work_item_tracking import Wiql
from msrest.authentication import BasicAuthentication
import pprint
# Fill in with your personal access token and org URL
personal_access_token = '... PAT'
organization_url = 'https://dev.azure.com/....'
# Create a connection to the org
credentials = BasicAuthentication('', personal_access_token)
connection = Connection(base_url=organization_url, creds=credentials)
# Get a client (the "core" client provides access to projects, teams, etc)
core_client = connection.clients.get_core_client()
wit_client = connection.clients.get_work_item_tracking_client()
query = "SELECT [System.Id], [System.WorkItemType], [System.Title], [System.AssignedTo], [System.State]," \
"[System.Tags] FROM workitems WHERE [System.TeamProject] = 'Test'"
wiql = Wiql(query=query)
query_results = wit_client.query_by_wiql(wiql).work_items
for item in query_results:
    work_item = wit_client.get_work_item(item.id)
    pprint.pprint(work_item.fields['System.Title'])
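As a follow-up to the "field == string" question linked above, adding a string condition is just another WHERE clause in the WIQL. A minimal sketch, reusing the wit_client from the sample (the state value is only an example):
target_state = "Active"  # example value
wiql_filtered = Wiql(
    query="SELECT [System.Id], [System.Title] FROM workitems "
          "WHERE [System.TeamProject] = 'Test' "
          "AND [System.State] = '{}'".format(target_state))
for item in wit_client.query_by_wiql(wiql_filtered).work_items:
    print(item.id)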
I am able to get a token using the default scope for Power BI ("scope": ["https://analysis.windows.net/powerbi/api/.default"]).
With that token, I am able to read the workspaces my user has access to ("https://api.powerbi.com/v1.0/myorg/groups") and the reports information inside each of those workspaces ("https://api.powerbi.com/v1.0/myorg/reports/").
But no matter whether I reuse the same token or acquire a brand new one, if I try to export a specific report I get a 401 error. This is the way I am issuing the requests.get:
token_ = <new token or reused from previous get requests>
reports = requests.get( # Use token to call downstream service
config['reports']+report_id+'/Export',
headers={'Authorization': 'Bearer ' + token_ },)
Now, if I go to https://learn.microsoft.com/en-us/rest/api/power-bi/reports/getreportsingroup and sign in (with the same user I am using in my Python script), get the token from that page, and use it in my script, it works. If I use it in Postman, it works too.
If I try to use the token acquired by my script in Postman, I also get a 401 error. So, yes, my script is not getting the correct token for this particular endpoint, though it is good enough for the groups and reports endpoints.
Is there anything I need to add to the request for the token for this particular endpoint?
Thank you very much,
Andres
Here is the full script I am using, there is also a params.json that looks like this:
{
"authority": "https://login.microsoftonline.com/1abcdefg-abcd-48b6-9b3c-bd5123456",
"client_id": "5d2545-abcd-4765-8fbb-53555f2fa91",
"username":"myusername#tenant",
"password": "mypass",
"scope" : ["https://analysis.windows.net/powerbi/api/.default"],
"workspaces" : "https://api.powerbi.com/v1.0/myorg/groups",
"reports": "https://api.powerbi.com/v1.0/myorg/reports/"
}
#script based on msal github library sample
import sys # For simplicity, we'll read config file from 1st CLI param sys.argv[1]
import json
import logging
import requests
import msal
def exportReport(report_id, token_):
    result = app.acquire_token_by_username_password(config["username"], config["password"], scopes=config["scope"])
    token_ = result['access_token']
    print(f'Using token: {token_}')
    reports = requests.get(  # Use token to call downstream service
        config['reports'] + report_id + '/Export',
        headers={'Authorization': 'Bearer ' + token_},)
    print(f'-reports: {reports.status_code}')
def list_reports(workspace_id, ws_id, ws_name, token_):
    print(f'reports id for workspace {ws_name}')
    for rp in workspace_id['value']:
        if rp["id"] == "1d509119-76a1-42ce-8afd-bd3c420dd62d":
            exportReport("1d509119-76a1-42ce-8afd-bd0c420dd62d", token_)
def list_workspaces(workspaces_dict):
    for ws in workspaces_dict['value']:
        yield (ws['id'], ws['name'])
config = json.load(open('params.json'))
app = msal.PublicClientApplication(
config["client_id"], authority=config["authority"],
)
result = None
if not result:
    logging.info("No suitable token exists in cache. Let's get a new one from AAD.")
    result = app.acquire_token_by_username_password(
        config["username"], config["password"], scopes=config["scope"])
if "access_token" in result:
workspaces = requests.get( # Use token to call downstream service
config['workspaces'],
headers={'Authorization': 'Bearer ' + result['access_token']},).json()
ids=list_workspaces(workspaces) #prepare workspace generator
headers = {'Authorization': 'Bearer ' + result['access_token']}
while True:
try:
ws_id,ws_name=next(ids)
reports = requests.get( # Use token to call downstream service
config['workspaces']+'/'+ws_id+'/reports',
headers={'Authorization': 'Bearer ' + result['access_token']},).json()
list_reports(reports,ws_id,ws_name,result['access_token'])
except StopIteration:
exit(0)
else:
print(result.get("error"))
print(result.get("error_description"))
print(result.get("correlation_id")) # You may need this when reporting a bug
if 65001 in result.get("error_codes", []):
# AAD requires user consent for U/P flow
print("Visit this to consent:", app.get_authorization_request_url(config["scope"]))
From your description, I suppose you didn't grant the correct permission for the AD app used to sign in your user account in the code. Please follow the steps below.
Navigate to the Azure portal -> Azure Active Directory -> App registrations -> find the AD app used in the code (filter with All applications) -> API permissions -> add the Report.Read.All delegated permission under the Power BI Service API (this permission covers only read actions; if you need write operations as well, choose Report.ReadWrite.All) -> finally, click the Grant admin consent for xxx button.
Update:
Using the application ID from the access token obtained with Get-PowerBIAccessToken solved the issue.
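If it helps to confirm which application and scopes a token was actually issued for, here is a small sketch (not Power BI specific) that decodes the JWT payload locally and prints the appid and scp claims, assuming result['access_token'] from the script above:
import base64
import json

def jwt_claims(token):
    # A JWT is three dot-separated base64url segments; the middle one holds the claims.
    payload = token.split('.')[1]
    payload += '=' * (-len(payload) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload).decode('utf-8'))

claims = jwt_claims(result['access_token'])
print(claims.get('appid'), claims.get('scp'))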
In the Python code for requesting data from Google Analytics via the API (https://developers.google.com/analytics/devguides/reporting/core/v4/quickstart/service-py), oauth2client is being used. The code was last updated in July 2018, and oauth2client has since been deprecated. My question is: can I get the same code with google-auth or oauthlib used instead of oauth2client?
I googled for a way to replace the parts of the code where oauth2client is used, but since I am not a developer I didn't succeed. This is how I tried to adapt the code from that link to google-auth. Any idea how to fix this?
import argparse
from apiclient.discovery import build
from google.oauth2 import service_account
from google.auth.transport.urllib3 import AuthorizedHttp
SCOPES = ['...']
DISCOVERY_URI = ('...')
CLIENT_SECRETS_PATH = 'client_secrets.json' # Path to client_secrets.json file.
VIEW_ID = '...'
def initialize_analyticsreporting():
    """Initializes the analyticsreporting service object.
    Returns:
      analytics: an authorized analyticsreporting service object.
    """
    # Parse command-line arguments.
    credentials = service_account.Credentials.from_service_account_file(CLIENT_SECRETS_PATH)
    # Prepare credentials, and authorize HTTP object with them.
    # If the credentials don't exist or are invalid run through the native client
    # flow. The Storage object will ensure that if successful the good
    # credentials will get written back to a file.
    authed_http = AuthorizedHttp(credentials)
    response = authed_http.request(
        'GET', SCOPES)
    # Build the service object.
    analytics = build('analytics', 'v4', http=http, discoveryServiceUrl=DISCOVERY_URI)
    return analytics
def get_report(analytics):
    # Use the Analytics Service Object to query the Analytics Reporting API V4.
    return analytics.reports().batchGet(
        body={
            "reportRequests": [
                {
                    "viewId": VIEW_ID,
                    "dateRanges": [
                        {
                            "startDate": "2019-01-01",
                            "endDate": "yesterday"
                        }],
                    "dimensions": [
                        {"name": "ga:transactionId"},
                        {"name": "ga:sourceMedium"},
                        {"name": "ga:date"}],
                    "metrics": [
                        {"expression": "ga:transactionRevenue"}]
                }]
        }
    ).execute()
def printResults(response):
    for report in response.get("reports", []):
        columnHeader = report.get("columnHeader", {})
        dimensionHeaders = columnHeader.get("dimensions", [])
        metricHeaders = columnHeader.get("metricHeader", {}).get("metricHeaderEntries", [])
        rows = report.get("data", {}).get("rows", [])
        for row in rows:
            dimensions = row.get("dimensions", [])
            dateRangeValues = row.get("metrics", [])
            for header, dimension in zip(dimensionHeaders, dimensions):
                print(header + ": " + dimension)
            for i, values in enumerate(dateRangeValues):
                for metric, value in zip(metricHeaders, values.get("values")):
                    print(metric.get("name") + ": " + value)
def main():
    analytics = initialize_analyticsreporting()
    response = get_report(analytics)
    printResults(response)

if __name__ == '__main__':
    main()
I need to obtain the response as JSON with the given dimensions and metrics from Google Analytics.
For those running into this problem who wish to port to the newer auth libraries, do a diff between the two versions of the short/simple Google Drive API sample in the code repo for the G Suite APIs intro codelab to see what needs to be updated (and what can stay as-is). The bottom line is that the API client library code can remain the same; all you do is swap out the auth libraries underneath.
Note that sample is only for user account auth... for service account auth, the update is similar, but I don't have an example of that yet (working on one though... will update this once it's published).
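As a rough sketch of what the service-account swap can look like for the Analytics Reporting API (assuming a JSON key file rather than the older .p12; the file name is a placeholder), only the credential setup changes while the reporting calls stay the same:
from googleapiclient.discovery import build
from google.oauth2 import service_account

SCOPES = ['https://www.googleapis.com/auth/analytics.readonly']
KEY_FILE = 'service_account.json'  # placeholder path to the service-account key

credentials = service_account.Credentials.from_service_account_file(KEY_FILE, scopes=SCOPES)
analytics = build('analyticsreporting', 'v4', credentials=credentials)
# get_report(analytics) and printResults(response) from the question can then be used unchanged.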
I'm using Python 2.6 and the Google API client library, which I am trying to use to get authenticated access to email settings:
f = file(SERVICE_ACCOUNT_PKCS12_FILE_PATH, 'rb')
key = f.read()
f.close()
credentials = client.SignedJwtAssertionCredentials(SERVICE_ACCOUNT_EMAIL, key, scope='https://apps-apis.google.com/a/feeds/emailsettings/2.0/', sub=user_email)
http = httplib2.Http()
http = credentials.authorize(http)
return discovery.build('email-settings', 'v2', http=http)
When I execute this code, I get the following error:
UnknownApiNameOrVersion: name: email-settings version: v2
What's the api name and version for email settingsV2?
Is it possible to use it with service account?
Regards
I found the solution for getting email settings using service account OAuth2.
Here is an example:
SERVICE_ACCOUNT_EMAIL = ''
SERVICE_ACCOUNT_PKCS12_FILE_PATH = ''
EMAIL_SETTING_URI = "https://apps-apis.google.com/a/feeds/emailsettings/2.0/%s/%s/%s"
def fctEmailSettings():
    user_email = "user@mail.com"
    f = file(SERVICE_ACCOUNT_PKCS12_FILE_PATH, 'rb')
    key = f.read()
    f.close()
    credentials = client.SignedJwtAssertionCredentials(SERVICE_ACCOUNT_EMAIL, key, scope='https://apps-apis.google.com/a/feeds/emailsettings/2.0/', sub=user_email)
    auth2token = OAuth2TokenFromCredentials(credentials)
    ESclient = EmailSettingsClient(domain='domain.com')
    auth2token.authorize(ESclient)
    username = 'username'
    setting = 'forwarding'
    uri = ESclient.MakeEmailSettingsUri(username, setting)
    entry = ESclient.get_entry(uri=uri, desired_class=GS.gdata.apps.emailsettings.data.EmailSettingsEntry)
It appears that the Email Settings API is not available via the Discovery API. The APIs Discovery Service returns the details of an API: what methods are available, etc.
See the following issue raised on the PHP client API:
https://github.com/google/google-api-php-client/issues/246
I'm unclear as to why the Email Settings API is not available via the Discovery API, or whether there are plans to add it. Really, it feels like a lot of these systems and libraries are unmaintained.
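If you want to see for yourself what the Discovery service does expose, here is a quick sketch using the same apiclient library (no credentials needed for this call, as far as I know):
from apiclient.discovery import build

discovery_service = build('discovery', 'v1')
apis = discovery_service.apis().list().execute()
for item in apis.get('items', []):
    print(item['name'], item['version'])
# 'email-settings' does not appear in this list, which is why
# build('email-settings', 'v2') fails with UnknownApiNameOrVersion.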
The deprecated gdata client library does have support. Try the following example, which I can confirm works ok.
https://code.google.com/p/gdata-python-client/source/browse/samples/apps/emailsettings_example.py
In case you have multiple entry points in your app that need to access the EmailSettings API, here's a re-usable function that returns a "client" object:
def google_get_emailsettings_credentials():
    '''
    Google's EmailSettings API is not yet service-based, so delegation data
    has to be accessed differently from our other Google functions.
    TODO: Refactor when API is updated.
    '''
    with open(settings.GOOGLE_PATH_TO_KEYFILE) as f:
        private_key = f.read()
    client = EmailSettingsClient(domain='example.com')
    credentials = SignedJwtAssertionCredentials(
        settings.GOOGLE_CLIENT_EMAIL,
        private_key,
        scope='https://apps-apis.google.com/a/feeds/emailsettings/2.0/',
        sub=settings.GOOGLE_SUB_USER)
    auth2token = gdata.gauth.OAuth2TokenFromCredentials(credentials)
    auth2token.authorize(client)
    return client
It can then be called from elsewhere, e.g. to reach the DelegationFeed:
client = google_get_emailsettings_credentials()
uri = client.MakeEmailSettingsUri(username, 'delegation')
delegates_xml = client.get_entry(
uri=uri,
desired_class=gdata.apps.emailsettings.data.EmailSettingsDelegationFeed)
This code is available online to map your LinkedIn connections.
It uses the LinkedIn API.
I'm able to connect fine, and everything runs okay until the last part of the script that actually writes the data to a CSV.
Whenever I run the code
import oauth2 as oauth
import urlparse
import simplejson
import codecs
CONSUMER_KEY = "xxx"
CONSUMER_SECRET = "xxx"
OAUTH_TOKEN = "xxx"
OAUTH_TOKEN_SECRET = "xxx"
OUTPUT = "linked.csv"
def linkedin_connections():
    # Use your credentials to build the oauth client
    consumer = oauth.Consumer(key=CONSUMER_KEY, secret=CONSUMER_SECRET)
    token = oauth.Token(key=OAUTH_TOKEN, secret=OAUTH_TOKEN_SECRET)
    client = oauth.Client(consumer, token)
    # Fetch first degree connections
    resp, content = client.request('http://api.linkedin.com/v1/people/~/connections?format=json')
    results = simplejson.loads(content)
    # File that will store the results
    output = codecs.open(OUTPUT, 'w', 'utf-8')
    # Loop thru the 1st degree connections and see how they connect to each other
    for result in results["values"]:
        con = "%s %s" % (result["firstName"].replace(",", " "), result["lastName"].replace(",", " "))
        print >>output, "%s,%s" % ("John Henry", con)
        # This is the trick, use the search API to get related connections
        u = "https://api.linkedin.com/v1/people/%s:(relation-to-viewer:(related-connections))?format=json" % result["id"]
        resp, content = client.request(u)
        rels = simplejson.loads(content)
        try:
            for rel in rels['relationToViewer']['relatedConnections']['values']:
                sec = "%s %s" % (rel["firstName"].replace(",", " "), rel["lastName"].replace(",", " "))
                print >>output, "%s,%s" % (con, sec)
        except:
            pass

if __name__ == '__main__':
    linkedin_connections()
When I run this I get an error message:
Traceback (most recent call last):
  File "linkedin-2-query.py", line 51, in <module>
    linkedin_connections()
  File "linkedin-2-query.py", line 35, in linkedin_connections
    for result in results["values"]:
KeyError: 'values'
Any suggestions or help would be greatly appreciated!
I encountered the same issue working through the post Visualizing your LinkedIn graph using Gephi – Part 1.
Python raises a KeyError whenever a dict() object is accessed (using the form a = adict[key]) and the key is not in the dictionary (see KeyError on the Python Wiki).
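For example:
adict = {"name": "John"}
adict["values"]  # raises KeyError: 'values'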
After searching a bit and adding some print statements, I realized that my OAuth session had expired, so the OAuth token in my linkedin-2-query.py script was no longer valid.
Since the OAuth token is invalid, the LinkedIn API does not return a dictionary with the key "values" like the script expects. Instead, the API returns the string 'N'. Python tries to find the dict key "values" in the string 'N', fails, and generates the KeyError: 'values'.
So a new, valid OAuth token & secret should get the API to return a dict containing connection data.
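(As a side note, a small defensive check in linkedin-2-query.py, just a sketch against the code above, would make this failure mode more obvious than a bare KeyError:)
results = simplejson.loads(content)
if "values" not in results:
    # An expired or invalid OAuth token makes the API return an error payload
    # instead of the expected connection data.
    raise SystemExit("Unexpected API response (is the OAuth token still valid?): %r" % results)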
I run the linkedin-1-oauth.py script again, and then visit the LinkedIn Application details page to find my new OAuth token. (The screenshot omits the values for my app. You should see alphanumeric values for each Key, Token, & Secret.)
...
I then update my linkedin-2-query.py script with the new OAuth User Token and OAuth User Secret
OAUTH_TOKEN = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" # your updated OAuth User Token
OAUTH_TOKEN_SECRET = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" # your updated OAuth User Secret
After updating the OAuth token & secret, I immediately run my linkedin-2-query.py script. Hooray, it runs without errors and retrieves my connection data from the API.