I'm trying to call a Google Cloud function from within Python using the following:
import requests
url = "MY_CLOUD_FUNCTON_URL"
data = {'name': 'example'}
response = requests.post(url, data=data)
but I get back the error: Your client does not have permission to get URL MY_CLOUD_FUNCTON from this server
Does anyone know how I can avoid this error? I am assuming I should be passing credentials as part of the request somehow?
Also note that if I instead call the function via gcloud from the command line, as below, it works, but I want to do this from within Python:
gcloud functions call MY_CLOUD_FUNCTON --data '{"name": "example"}'
Any help would be really appreciated!
Assuming you have a working Cloud Function in HTTP mode that requires authentication in order to be triggered, you need to generate an identity token and pass it in the Authorization header, as shown below:
import os
import json
import requests
import google.oauth2.id_token
import google.auth.transport.requests

# Point the Google auth library at the service account key file.
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = './my-service-account.json'

# Fetch an identity token whose audience is the Cloud Function URL.
request = google.auth.transport.requests.Request()
audience = 'https://mylocation-myprojectname.cloudfunctions.net/MyFunctionName'
TOKEN = google.oauth2.id_token.fetch_id_token(request, audience)

# Call the function, passing the token in the Authorization header.
r = requests.post(
    'https://mylocation-myprojectname.cloudfunctions.net/MyFunctionName',
    headers={'Authorization': f"Bearer {TOKEN}", 'Content-Type': 'application/json'},
    data=json.dumps({"key": "value"})  # possible request parameters
)
print(r.status_code, r.reason)
You have a few options here: either open the function to the public so that anyone can call it, or take the more secure route, which requires a few more steps. I will cover the second option, since it's the one I would suggest for security reasons; if you are satisfied with simply opening the function to the public (which is especially useful if you are trying to create a public endpoint after all), see this documentation.
If you want to limit who can invoke your GCF, however, you have to perform a few more steps.
Create a service account and give it the Cloud Functions Invoker role (if you simply want to restrict its permissions to only invoking the GCF).
After you assign the Service Account a role(s), the next page will give you the option to create a key
After creating the Service Account Key and downloading it as credentials.json, the next step is straightforward. You would simply populate the environment variable GOOGLE_APPLICATION_CREDENTIALS with the path to the credentials.json file.
Once these steps are done, you can invoke the GCF as you did before, only this time it will be invoked as the service account that you created, which has all the permissions necessary to invoke a GCF. A minimal sketch of that flow follows below.
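Here is a hedged sketch of how the pieces fit together, reusing the identity-token approach from the answer above; the key file name and function URL are placeholders, not values from your project:

import os
import requests
import google.auth.transport.requests
import google.oauth2.id_token

# Assumption: the service account key was downloaded as credentials.json.
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = './credentials.json'

# The audience must be the URL of the Cloud Function itself (placeholder).
url = 'https://REGION-PROJECT.cloudfunctions.net/MY_FUNCTION'
auth_request = google.auth.transport.requests.Request()
id_token = google.oauth2.id_token.fetch_id_token(auth_request, url)

# Invoke the function as the service account.
response = requests.post(url, json={'name': 'example'},
                         headers={'Authorization': f'Bearer {id_token}'})
print(response.status_code, response.text)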
This may be obvious to many, but to add to Marco's answer (I can't comment yet):
Make sure to install the google-auth package, not the google package. More details in the documentation and the requirements.txt for the code on GitHub.
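For reference, the dependencies used in the snippets above can be installed with pip (requests is only needed for the HTTP call itself):

pip install google-auth requests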
I have a Python script that is running periodically on an AWS EC2 Ubuntu machine.
This script reads data from some files and sometimes changes data in them.
I want to download these files from OneDrive, do my own thing with them, and upload them back to OneDrive.
I want this to be done automatically, without the need for a user to approve any login or credentials. I'm ok with doing it once (i.e. approving the login on the first run) but the rest has to run automatically, without asking ever again for approvals (unless the permissions change, of course).
What is the best way to do this?
I've been reading the documentation on the Microsoft Graph API but I'm struggling with the authentication part. I've created an application in Azure AD, given it the sample permissions (to test), and created a secret credential.
I managed to do it. I'm not sure if it's the best way but it is working now. It's running automatically every hour and I don't need to touch it.
I followed the information on https://learn.microsoft.com/en-gb/azure/active-directory/develop/v2-oauth2-auth-code-flow
This is what I did.
Azure Portal
Create an application. Azure Active Directory -> App Registrations -> Applications from personal account
In Supported account types, choose the one that has personal Microsoft accounts.
In Redirect URI, choose Public client/native. We'll add the specific URI later.
In the application details, in the section Overview, take note of the Application (client) ID. We'll need this later.
In the section Authentication, click Add a Platform and choose Desktop + devices. You can use your own, I chose one of the suggested: https://login.microsoftonline.com/common/oauth2/nativeclient
In the section API permissions, you have to add all the permissions that your app will use. I added User.Read, Files.ReadWrite and offline_access. The offline_access is to be able to get the refresh token, which will be crucial to keep the app running without asking the user to login.
I did not create any Certificate or Secret.
Web
It looks like, to get a token for the first time, we have to use a browser or emulate something like that.
There must be a programmatic way to do this, but I had no idea how to do it. I also thought about using Selenium for this, but since it's only one time and my app will request tokens every hour (keeping the tokens fresh), I dropped that idea.
If we add new permissions, the tokens that we have will become invalid and we have to do this manual part again.
Open a browser and go to the URL below. Use the Scopes and the Redirect URI that you set up in Azure Portal.
https://login.microsoftonline.com/common/oauth2/v2.0/authorize?client_id=your_app_client_id&response_type=code&redirect_uri=https%3A%2F%2Flogin.microsoftonline.com%2Fcommon%2Foauth2%2Fnativeclient&response_mode=query&scope=User.Read%20offline_access%20Files.ReadWrite
That URL will redirect you to the Redirect URI that you set up, with a code=something in the query string. Copy that something.
Do a POST request with a form-URL-encoded body. I used https://reqbin.com/ for this.
Endpoint: https://login.microsoftonline.com/common/oauth2/v2.0/token
Form-URL-encoded body: grant_type=authorization_code&client_id=your_app_client_id&code=use_the_code_returned_in_the_previous_step
This will return an Access Token and a Refresh Token. Store the Refresh Token somewhere. I'm saving it in a file.
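If you ever want to script this one-time exchange instead of using reqbin, a minimal Python sketch of the same POST could look like the following; the variable names are placeholders, and the redirect_uri must match the one configured in the Azure Portal:

import requests

token_url = 'https://login.microsoftonline.com/common/oauth2/v2.0/token'
params = {
    'grant_type': 'authorization_code',
    'client_id': your_app_client_id,
    'code': code_copied_from_the_redirect,
    # Must match the Redirect URI configured in the Azure Portal.
    'redirect_uri': 'https://login.microsoftonline.com/common/oauth2/nativeclient',
}
response = requests.post(token_url, data=params)  # requests sends this form-URL-encoded
tokens = response.json()
access_token = tokens['access_token']
refresh_token = tokens['refresh_token']  # store this somewhere, e.g. a file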
Python
import requests

# Build the POST parameters
params = {
    'grant_type': 'refresh_token',
    'client_id': your_app_client_id,
    'refresh_token': refresh_token_that_you_got_in_the_previous_step
}

response = requests.post('https://login.microsoftonline.com/common/oauth2/v2.0/token', data=params)
access_token = response.json()['access_token']
new_refresh_token = response.json()['refresh_token']
# ^ Save the new refresh token somewhere.
# I just overwrite the file with the new one.
# This new one will be used next time.

header = {'Authorization': 'Bearer ' + access_token}

# Download the file
response = requests.get('https://graph.microsoft.com/v1.0/me/drive/root:' +
                        PATH_TO_FILE + '/' + FILE_NAME + ':/content', headers=header)

# Save the file to disk
with open(file_name, 'wb') as file:
    file.write(response.content)
So basically, I have the Refresh Token always updated.
I call the Token endpoint using that Refresh Token, and the API gives me an Access Token to use during the current session and a new Refresh Token.
I use this new Refresh Token the next time I run the program, and so on.
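For completeness, uploading the changed file back follows the same pattern. This is a hedged sketch using Graph's simple upload endpoint (fine for small files), reusing the placeholder names and the header variable from the code above:

# Upload the (possibly modified) file back to OneDrive.
with open(file_name, 'rb') as file:
    content = file.read()

response = requests.put('https://graph.microsoft.com/v1.0/me/drive/root:' +
                        PATH_TO_FILE + '/' + FILE_NAME + ':/content',
                        headers=header, data=content)
print(response.status_code)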
I've just published a repo which does this. Contributions and pull requests welcome:
https://github.com/stevemurch/onedrive-download
I'm trying to get this example to work from https://github.com/ozgur/python-linkedin. I'm using his example. When I run this code, I don't get the RETURN_URL and authorization_code talked about in the example. I'm not sure why; I think it is because I'm not setting up the HTTP API example correctly. I can't find http_api.py, and when I visit http://localhost:8080, I get a "this site can't be reached" error.
from linkedin import linkedin
API_KEY = 'wFNJekVpDCJtRPFX812pQsJee-gt0zO4X5XmG6wcfSOSlLocxodAXNMbl0_hw3Vl'
API_SECRET = 'daJDa6_8UcnGMw1yuq9TjoO_PMKukXMo8vEMo7Qv5J-G3SPgrAV0FqFCd0TNjQyG'
RETURN_URL = 'http://localhost:8000'
authentication = linkedin.LinkedInAuthentication(API_KEY, API_SECRET, RETURN_URL, linkedin.PERMISSIONS.enums.values())
# Optionally one can send custom "state" value that will be returned from OAuth server
# It can be used to track your user state or something else (it's up to you)
# Be aware that this value is sent to OAuth server AS IS - make sure to encode or hash it
#authorization.state = 'your_encoded_message'
print(authentication.authorization_url)  # open this URL in your browser
application = linkedin.LinkedInApplication(authentication)
http_api.py is one of the examples provided in the package. This is an HTTP server that will handle the response from LinkedIn's OAuth end point, so you'll need to boot it up for the example to work.
As stated in the guide, you'll need to execute that example file to get the server working. Note you'll also need to supply the following environment variables: LINKEDIN_API_KEY and LINKEDIN_API_SECRET.
You can run the example file by downloading the repo and calling LINKEDIN_API_KEY=yourkey LINKEDIN_API_SECRET=yoursecret python examples/http_api.py. Note you'll need Python 3.4 for it to work.
I have created an API Gateway API from my existing API using the boto3 import command.
apiClient = boto3.client('apigateway', awsregion)
api_response = apiClient.import_rest_api(
    failOnWarnings=True,
    body=open('apifileswagger.json', 'rb').read()
)
But I can't modify the integration request. I tried with the following boto3 command:
apiClient = boto3.client('apigateway', awsregion)
api_response = apiClient.put_integration(
    restApiId=apiName,
    resourceId='/api/v1/hub',
    httpMethod='GET',
    integrationHttpMethod='GET',
    type='AWS',
    uri='arn:aws:lambda:us-east-1:141697213513:function:test-lambda',
)
But I got an error like this:
Unexpected error: An error occurred () when calling the PutIntegration operation:
I need to change the Lambda function region and name using a boto3 command. Is that possible?
If it is possible, what is the actual issue with this command?
In the put_integration() call listed above, your restApiId and resourceId look incorrect. Here's what you should do.
After importing your rest API, check to see if it is available by calling your apiClient's get_rest_apis(). If the API was imported correctly, you should see it listed in the response along with the API's ID (which is generated by AWS). Capture this ID for future operations.
Next, you'll need to look at all of the resources associated with this API by calling your apiClient's get_resources(). Capture the resource ID for the resource you wish to modify.
Using the API ID and resource ID, check to see if an integration config exists by calling your apiClient's get_integration(). If it does exist you can modify the integration request by calling update_integration(); if it does not exist, you need to create a new integration by calling put_integration() and passing the integration request as a parameter.
Here's an example of how that might look in code:
# Import API
api_response1 = apiClient.import_rest_api(failOnWarnings=True, body=open('apifileswagger.json', 'rb').read())
print(api_response1)

# Get API ID
api_response2 = apiClient.get_rest_apis()
for endpoint in api_response2['items']:
    if endpoint['name'] == "YOUR_API_NAME":
        api_ID = endpoint['id']

# Get Resource ID
api_response3 = apiClient.get_resources(restApiId=api_ID)
for resource in api_response3['items']:
    if resource['path'] == "YOUR_PATH":
        resource_ID = resource['id']

# Check for Existing Integrations
api_response4 = apiClient.get_integration(restApiId=api_ID, resourceId=resource_ID, httpMethod='GET')
print(api_response4)

# Create Integration with Request
integration_request = {'application/json': '{\r\n "body" : $input.json(\'$\'),\r\n}'}
api_response5 = apiClient.put_integration(restApiId=api_ID, resourceId=resource_ID, httpMethod='GET', type='AWS',
                                          integrationHttpMethod='GET', uri="YOUR_LAMBDA_URI", requestTemplates=integration_request)
print(api_response5)
All the methods listed above are explained in the Boto3 Documentation found here.
As with most API Gateway updates to API definitions, in order to update an integration request, you have to do a PATCH and pass a body with a patch document using the expected format. See documentation here
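For reference, a hedged sketch of what such an update could look like with boto3's update_integration, which takes the patch document as patchOperations; the IDs and URI below are placeholders:

# Update an existing integration with a patch document (placeholder IDs/URI).
api_response6 = apiClient.update_integration(
    restApiId=api_ID,
    resourceId=resource_ID,
    httpMethod='GET',
    patchOperations=[
        {'op': 'replace', 'path': '/uri', 'value': 'YOUR_NEW_LAMBDA_URI'}
    ]
)
print(api_response6)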
I can't seem to get the EMBED-API Server-side Authorization demo to work:
https://ga-dev-tools.appspot.com/embed-api/server-side-authorization/
In the demo it says the following:
Once the library is installed you can add the following python module
to your project and invoke the get_access_token() method to get an
access token that you can use to authorize the Embed API.
# service-account.py

from oauth2client.service_account import ServiceAccountCredentials

# The scope for the OAuth2 request.
SCOPE = 'https://www.googleapis.com/auth/analytics.readonly'

# The location of the key file with the key data.
KEY_FILEPATH = 'path/to/json-key.json'

# Defines a method to get an access token from the ServiceAccount object.
def get_access_token():
    return ServiceAccountCredentials.from_json_keyfile_name(
        KEY_FILEPATH, SCOPE).get_access_token().access_token
I've successfully done all the previous steps, but this one I just can't get my head around. Where do I put this code? It seems as if it should be put in a .py file.
Can someone please help?
It depends on your implementation, but basically you want to run your service account code on your server, and have the access token passed to your client application so it can make authorized requests from the browser.
The whole app is open sourced and you can see where the service account code is in the source code.
As in the demo, if you are using Django or App Engine, it is easy to put Python server code in your site that returns the token and replaces the value in the template code.
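To illustrate the idea, here is a minimal sketch of such a server-side endpoint using Flask; Flask is my choice for the example and not part of the demo, and it assumes the question's module was saved under an importable name (service_account.py):

from flask import Flask, jsonify

# Assumption: the code from the question lives in service_account.py.
from service_account import get_access_token

app = Flask(__name__)

@app.route('/token')
def token():
    # Hand the access token to the client-side Embed API code.
    return jsonify(token=get_access_token())

if __name__ == '__main__':
    app.run()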
Add the service-account.py code from the question to a file and upload it to your server using FTP. I saved the code using Dreamweaver, updated the path, and added the following line at the end of the service-account.py file:
print(get_access_token())
Then upload the .json key file to the same directory and run python service-account.py to get the access token.
I created 2 applications in my Azure directory, 1 for my API Server and one for my API client. I am using the Python ADAL Library and can successfully obtain a token using the following code:
import adal
import requests

tenant_id = "abc123-abc123-abc123"
context = adal.AuthenticationContext('https://login.microsoftonline.com/' + tenant_id)
token = context.acquire_token_with_username_password(
    'https://myapiserver.azurewebsites.net/',
    'myuser',
    'mypassword',
    'my_apiclient_client_id'
)
I then try to send a request to my API app using the following method but keep getting 'unauthorized':
at = token['accessToken']
id_token = "Bearer {0}".format(at)
response = requests.get('https://myapiserver.azurewebsites.net/', headers={"Authorization": id_token})
I am able to successfully login using myuser/mypass from the loginurl. I have also given the client app access to the server app in Azure AD.
Although the question was posted a long time ago, I'll try to provide an answer. I stumbled across the question because we had the exact same problem here. We could successfully obtain a token with the adal library, but then we were not able to access the resource we had obtained the token for.
To make things worse, we set up a simple console app in .NET, used the exact same parameters, and it worked. We could also copy the token obtained through the .NET app and use it in our Python request, and that worked too (this one is kind of obvious, but it made us confident that the problem was not related to how we assembled the request).
The source of the problem turned out to be the oauth2_client module of the adal Python package. When I compared the actual HTTP requests sent by the .NET and the Python apps, a subtle difference was that the Python app sent a POST request explicitly asking for api-version=1.0:
POST https://login.microsoftonline.com/common//oauth2/token?api-version=1.0
Once I changed the following line in oauth2_client.py in the adal library, I could access my resource.
Changed
return urlparse('{}?{}'.format(self._token_endpoint, urlencode(parameters)))
in the method _create_token_url, to
return urlparse(self._token_endpoint)
We are working on a pull request to patch the library in github.
The current release of the Azure Python SDK supports authentication with a service principal. It does not support authentication using the ADAL library yet; maybe it will in future releases.
See https://azure-sdk-for-python.readthedocs.io/en/latest/resourcemanagement.html#authentication for details.
See also Azure Active Directory Authentication Libraries for the platforms ADAL is available on.
@Derek,
Could you check the Issuer URL you set on the Azure Portal? When I set the wrong Issuer URL, I get the same error as you. It seems that your code is right.
Based on my experience, you need to add your application to Azure AD and get a client ID (I am sure you have done this already). Then you can get the tenant ID and enter it in the Issuer URL textbox on the Azure portal.
NOTE:
On the old portal (manage.windowsazure.com), in the bottom command bar, click View Endpoints, then copy the Federation Metadata Document URL and download that document or navigate to it in a browser.
Within the root EntityDescriptor element, there should be an entityID attribute of the form https://sts.windows.net/ followed by a GUID specific to your tenant (called a "tenant ID"). Copy this value; it will serve as your Issuer URL. You will configure your application to use this later.
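If you prefer to extract the Issuer URL programmatically rather than reading the XML by hand, here is a small sketch; the metadata URL is a placeholder for the Federation Metadata Document URL you copied from View Endpoints:

import requests
import xml.etree.ElementTree as ET

# Placeholder: the Federation Metadata Document URL copied from "View Endpoints".
metadata_url = 'YOUR_FEDERATION_METADATA_DOCUMENT_URL'

root = ET.fromstring(requests.get(metadata_url).content)
# The root EntityDescriptor element carries the entityID attribute,
# e.g. https://sts.windows.net/<tenant-id>/ -- this is your Issuer URL.
issuer_url = root.attrib['entityID']
print(issuer_url)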
My demo is as follows:
import adal
import requests

TenantURL = 'https://login.microsoftonline.com/*******'
context = adal.AuthenticationContext(TenantURL)

RESOURCE = 'http://wi****.azurewebsites.net'
ClientID = '****'
ClientSect = '7****'

token_response = context.acquire_token_with_client_credentials(
    RESOURCE,
    ClientID,
    ClientSect
)

access_token = token_response.get('accessToken')
print(access_token)

id_token = "Bearer {0}".format(access_token)
response = requests.get(RESOURCE, headers={"Authorization": id_token})
print(response)
Please try to modify it accordingly. If there are any updates, please let me know.