Cloud SQL instance not being stopped using Cloud Scheduler - python

Cloud SQL instance is not being stopped by Cloud Scheduler after these steps:
1. Create a Pub/Sub topic that is supposed to trigger the Cloud Function.
2. Deploy a Cloud Function subscribed to the topic created in step 1, with the Python 3.8 code and requirements below. (Entry point: start_stop)
3. Create a Cloud Scheduler job that publishes to the topic created in step 1 on a regular basis.
The payload is set to start [CloudSQL instance name] or stop [CloudSQL instance name] to start or stop the specified instance.
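(For a quick test without the scheduler, the same payload can be published straight to the topic; a minimal sketch, where the topic name and instance name are placeholders:)
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path('projectID', 'start-stop-sql-topic')
# The function expects "<command> <instance name>" as the message body.
publisher.publish(topic_path, b'stop my-sql-instance').result()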
Main.py:
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials
import base64
from pprint import pprint

credentials = GoogleCredentials.get_application_default()
service = discovery.build('sqladmin', 'v1beta4', credentials=credentials, cache_discovery=False)
project = 'projectID'

def start_stop(event, context):
    print(event)
    pubsub_message = base64.b64decode(event['data']).decode('utf-8')
    print(pubsub_message)
    command, instance_name = pubsub_message.split(' ', 1)
    if command == 'start':
        start(instance_name)
    elif command == 'stop':
        stop(instance_name)
    else:
        print("unknown command " + command)

def start(instance_name):
    print("starting " + instance_name)
    patch(instance_name, "ALWAYS")

def stop(instance_name):
    print("stopping " + instance_name)
    patch(instance_name, "NEVER")

def patch(instance, activation_policy):
    # Read the current settings so the update carries a valid settingsVersion.
    request = service.instances().get(project=project, instance=instance)
    response = request.execute()
    settingsVersion = int(response["settings"]["settingsVersion"])
    dbinstancebody = {
        "settings": {
            "settingsVersion": settingsVersion,
            "activationPolicy": activation_policy
        }
    }
    request = service.instances().update(
        project=project,
        instance=instance,
        body=dbinstancebody)
    response = request.execute()
    pprint(response)
requirements.txt:
google-api-python-client==1.10.0
google-auth-httplib2==0.0.4
google-auth==1.21.1
oauth2client==4.1.3
When I click the RUN NOW button on the stop scheduler job, it executes successfully, but when I navigate to the SQL instance, it is not stopped.
Can someone spot what I am missing? If you need more details, just let me know. I have just started with GCP. :)

The tier configuration was missing from the body sent to the GCP API:
dbinstancebody = {
    "settings": {
        "settingsVersion": settingsVersion,
        "tier": "db-custom-2-13312",
        "activationPolicy": activation_policy
    }
}
If you click on the deployed function you will see all its details (along with the graphs), and at the very end the errors are displayed as well. (My screen didn't fit the whole page, which is why I only noticed this section later on 😅)
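As a further tweak, the tier doesn't have to be hardcoded; it can be copied from the instances().get response already fetched in patch() (assuming the response contains settings.tier, which it normally does):
dbinstancebody = {
    "settings": {
        "settingsVersion": settingsVersion,
        "tier": response["settings"]["tier"],  # reuse the instance's current tier
        "activationPolicy": activation_policy
    }
}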

Related

Access CosmosDB Data from Azure App Service by using managed identity (Failed)

A FastAPI-based API written in Python has been deployed as an Azure App Service. The API needs to read and write data from CosmosDB, and I attempted to use Managed Identity for this purpose, but encountered an error stating "Unrecognized credential type".
These are the key steps that I took toward that goal:
Step One: I used Terraform to configure the managed identity for the Azure App Service and assigned the 'Contributor' role to the identity so that it can access and write data to CosmosDB. The role assignment is carried out in the file where the Azure App Service is provisioned.
resource "azurerm_linux_web_app" "this" {
name = var.appname
location = var.location
resource_group_name = var.rg_name
service_plan_id = azurerm_service_plan.this.id
app_settings = {
"PROD" = false
"DOCKER_ENABLE_CI" = true
"DOCKER_REGISTRY_SERVER_URL" = data.azurerm_container_registry.this.login_server
"WEBSITE_HTTPLOGGING_RETENTION_DAYS" = "30"
"WEBSITE_ENABLE_APP_SERVICE_STORAGE" = false
}
lifecycle {
ignore_changes = [
app_settings["WEBSITE_HTTPLOGGING_RETENTION_DAYS"]
]
}
https_only = true
identity {
type = "SystemAssigned"
}
data "azurerm_cosmosdb_account" "this" {
name = var.cosmosdb_account_name
resource_group_name = var.cosmosdb_resource_group_name
}
// built-in role that allow the app-service to read and write to an Azure Cosmos DB
resource "azurerm_role_assignment" "cosmosdbContributor" {
scope = data.azurerm_cosmosdb_account.this.id
principal_id = azurerm_linux_web_app.this.identity.0.principal_id
role_definition_name = "Contributor"
}
Step Two: I used the managed identity library to fetch the necessary credentials in the Python code:
from azure.identity import ManagedIdentityCredential
from azure.cosmos.cosmos_client import CosmosClient

client = CosmosClient(get_endpoint(), credential=ManagedIdentityCredential())
database = client.get_database_client(DB_NAME)
container = database.get_container_client(CONTAINER_NAME)
container.query_items(query)
I received the following error when running the code locally and from Azure (the error can be viewed from the Log stream of the Azure App Service):
raise TypeError(
TypeError: Unrecognized credential type. Please supply the master key as str, or a dictionary or resource tokens, or a list of permissions.
Any help or discussion is welcome
If you are using the Python SDK, you can pass Azure AD credentials directly; check the sample here:
from azure.identity import ClientSecretCredential
from azure.cosmos import CosmosClient

aad_credentials = ClientSecretCredential(
    tenant_id="<azure-ad-tenant-id>",
    client_id="<client-application-id>",
    client_secret="<client-application-secret>")

client = CosmosClient("<account-endpoint>", aad_credentials)
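A note on the original error: if I recall correctly, the azure-cosmos SDK only accepts Azure AD token credentials (including ManagedIdentityCredential) from around version 4.3.0 onwards, so "Unrecognized credential type" usually points at an older SDK. A minimal sketch under that assumption (the endpoint is a placeholder):
from azure.identity import ManagedIdentityCredential
from azure.cosmos import CosmosClient

# Assumes an azure-cosmos release with AAD support, and a Cosmos DB
# data-plane role assignment for the identity (the ARM "Contributor"
# role alone does not grant data access).
endpoint = "https://<account-name>.documents.azure.com:443/"
client = CosmosClient(endpoint, credential=ManagedIdentityCredential())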

Use iot_v1 in a GCP Cloud Function

I'm attempting to write a GCP Cloud Function in Python that calls the API for creating an IoT device. The initial challenge seems to be getting the appropriate module (specifically iot_v1) loaded within Cloud Functions so that it can make the call.
Example Python code from Google is located at https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/iot/api-client/manager/manager.py. The specific call desired is shown in "create_es_device". Trying to repurpose that into a Cloud Function (code below) errors out with "ImportError: cannot import name 'iot_v1' from 'google.cloud' (unknown location)"
Any thoughts?
import base64
import json
import logging
from google.cloud import iot_v1

def handle_notification(event, context):
    """Triggered from a message on a Cloud Pub/Sub topic.
    Args:
        event (dict): Event payload.
        context (google.cloud.functions.Context): Metadata for the event.
    """
    pubsub_message = base64.b64decode(event['data']).decode('utf-8')
    logging.info('New device registration info: {}'.format(pubsub_message))
    certData = json.loads(pubsub_message)['certs']
    deviceID = certData['device-id']
    certKey = certData['certificate']
    projectID = certData['project-id']
    cloudRegion = certData['cloud-region']
    registryID = certData['registry-id']
    newDevice = create_device(projectID, cloudRegion, registryID, deviceID, certKey)
    logging.info('New device: {}'.format(newDevice))

def create_device(project_id, cloud_region, registry_id, device_id, public_key):
    # from https://cloud.google.com/iot/docs/how-tos/devices#api_1
    client = iot_v1.DeviceManagerClient()
    parent = client.registry_path(project_id, cloud_region, registry_id)
    # Note: You can have multiple credentials associated with a device.
    device_template = {
        #'id': device_id,
        'id': 'testing_device',
        'credentials': [{
            'public_key': {
                'format': 'ES256_PEM',
                'key': public_key
            }
        }]
    }
    return client.create_device(parent, device_template)
You need to have the google-cloud-iot package listed in your requirements.txt file.
See https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/iot/api-client/manager/requirements.txt
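For reference, a matching requirements.txt could be as small as this (the version pin is an assumption; any release that ships iot_v1 should do):
google-cloud-iot==1.0.0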

how to get locationId in Google App Engine not using terminal

To use scheduler_v1.CloudSchedulerClient().location_path() in Python I need a parent with projectId and locationId. I know how to get the projectId in code and the locationId from the terminal, but how do I get the locationId in my code?
I've checked this page (https://cloud.google.com/functions/docs/reference/rpc/google.cloud.location), but there are no examples and I don't know what to do with it.
import googleapiclient.discovery
from google.cloud import scheduler_v1
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file('/home/myname/folder/service_account.json')
service = googleapiclient.discovery.build('cloudresourcemanager', 'v1', credentials=credentials)
request = service.projects().list()
response = request.execute()

client = scheduler_v1.CloudSchedulerClient()
for project in response['projects']:
    # 'LOCATION_ID' is the part I don't know how to fill in programmatically
    parent = client.location_path(project['projectId'], 'LOCATION_ID')
    for element in client.list_jobs(parent):
        print(element)
Thank you!
I've recently begun using the http://metadata.google.internal/computeMetadata/v1/instance/region endpoint. While it works in App Engine, it's not explicitly documented right now (but it is documented for Cloud Run: https://cloud.google.com/run/docs/reference/container-contract).
The result from that endpoint will be something that looks like this:
projects/12345678901234/regions/us-central1
Obviously, the region is the last part (us-central1).
Here's a sample that I have in Go (remember to set the Metadata-Flavor header to Google):
var region string
{
    domain := "http://metadata.google.internal"
    path := "computeMetadata/v1/instance/region"
    request, err := http.NewRequest(http.MethodGet, domain+"/"+path, nil)
    if err != nil {
        logrus.Errorf("Could not create request: %v", err)
        panic(err)
    }
    request.Header.Set("Metadata-Flavor", "Google")
    response, err := http.DefaultClient.Do(request)
    if err != nil {
        logrus.Errorf("Could not perform request: %v", err)
        panic(err)
    }
    contents, err := ioutil.ReadAll(response.Body)
    if err != nil {
        logrus.Errorf("Could not read contents: %v", err)
        panic(err)
    }
    originalRegion := string(contents)
    logrus.Infof("Contents of %q: %s", path, originalRegion)
    parts := strings.Split(originalRegion, "/")
    if len(parts) > 0 {
        region = parts[len(parts)-1]
    }
}
logrus.Infof("Region: %s", region)
It is not necessary to use cloudresourcemanager to get the project ID; instead you can use the App Engine environment variable GOOGLE_CLOUD_PROJECT.
You can use the App Engine Admin API to get the location ID; please check this code snippet:
import logging
import os
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
# start discovery service and use the App Engine Admin API
service = discovery.build('appengine', 'v1', credentials=credentials, cache_discovery=False)
# disable the discovery cache for App Engine standard (avoids noise in the logs)
logging.getLogger('googleapiclient.discovery_cache').setLevel(logging.ERROR)
# take the project ID from the environment variables
project = os.environ['GOOGLE_CLOUD_PROJECT']
# get the App Engine application details
req = service.apps().get(appsId=project)
response = req.execute()
# this is the application's location ID
location_id = response["locationId"]
In Google App Engine, some environment variables are set by the runtime environment.
You can read them with os.environ.get(variable_name).
https://cloud.google.com/appengine/docs/standard/python3/runtime#environment_variables
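For example (both variables are documented on the page above):
import os

project_id = os.environ.get("GOOGLE_CLOUD_PROJECT")  # e.g. "my-project"
service_name = os.environ.get("GAE_SERVICE")         # e.g. "default"
print(project_id, service_name)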

oauth2client is now deprecated

In the Python code for requesting data from Google Analytics (https://developers.google.com/analytics/devguides/reporting/core/v4/quickstart/service-py) via the API, oauth2client is used. The code was last updated in July 2018, and by now oauth2client is deprecated. My question is: can I get the same code where google-auth or oauthlib is used instead of oauth2client?
I was googling for a way to replace the parts of the code where oauth2client is used, but since I am not a developer I didn't succeed. This is how I tried to adapt the code from that link (https://developers.google.com/analytics/devguides/reporting/core/v4/quickstart/service-py) to google-auth. Any idea how to fix this?
from apiclient.discovery import build
from google.oauth2 import service_account
from google.auth.transport.urllib3 import AuthorizedHttp

SCOPES = ['...']
DISCOVERY_URI = ('...')
CLIENT_SECRETS_PATH = 'client_secrets.json'  # Path to client_secrets.json file.
VIEW_ID = '...'

def initialize_analyticsreporting():
    """Initializes the analyticsreporting service object.
    Returns:
        analytics, an authorized analyticsreporting service object.
    """
    credentials = service_account.Credentials.from_service_account_file(CLIENT_SECRETS_PATH)
    # Prepare credentials, and authorize an HTTP object with them.
    authed_http = AuthorizedHttp(credentials)
    response = authed_http.request('GET', SCOPES)
    # Build the service object.
    # NOTE: this is the line that fails -- `http` is never defined.
    analytics = build('analytics', 'v4', http=http, discoveryServiceUrl=DISCOVERY_URI)
    return analytics

def get_report(analytics):
    # Use the Analytics Service Object to query the Analytics Reporting API V4.
    return analytics.reports().batchGet(
        body={
            "reportRequests": [
                {
                    "viewId": VIEW_ID,
                    "dateRanges": [
                        {"startDate": "2019-01-01", "endDate": "yesterday"}
                    ],
                    "dimensions": [
                        {"name": "ga:transactionId"},
                        {"name": "ga:sourceMedium"},
                        {"name": "ga:date"}
                    ],
                    "metrics": [
                        {"expression": "ga:transactionRevenue"}
                    ]
                }
            ]
        }
    ).execute()

def printResults(response):
    for report in response.get("reports", []):
        columnHeader = report.get("columnHeader", {})
        dimensionHeaders = columnHeader.get("dimensions", [])
        metricHeaders = columnHeader.get("metricHeader", {}).get("metricHeaderEntries", [])
        rows = report.get("data", {}).get("rows", [])
        for row in rows:
            dimensions = row.get("dimensions", [])
            dateRangeValues = row.get("metrics", [])
            for header, dimension in zip(dimensionHeaders, dimensions):
                print(header + ": " + dimension)
            for i, values in enumerate(dateRangeValues):
                for metric, value in zip(metricHeaders, values.get("values")):
                    print(metric.get("name") + ": " + value)

def main():
    analytics = initialize_analyticsreporting()
    response = get_report(analytics)
    printResults(response)

if __name__ == '__main__':
    main()
I need to obtain the response in the form of JSON with the given dimensions and metrics from Google Analytics.
For those running into this problem who wish to port to the newer auth libraries: do a diff between the two versions of the short/simple Google Drive API sample in the code repo for the G Suite APIs intro codelab to see what needs to be updated (and what can stay as-is). The bottom line is that the API client library code can remain the same while you only swap out the auth libraries underneath.
Note that sample is only for user account auth... for service account auth the update is similar, but I don't have an example of that yet (working on one though... will update this once it's published).
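In the meantime, a minimal sketch of the service-account variant with google-auth (the key file name is a placeholder and the read-only Analytics scope is an assumption):
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/analytics.readonly']
credentials = service_account.Credentials.from_service_account_file(
    'service_account.json', scopes=SCOPES)
# The client library accepts google-auth credentials directly,
# so no authorized Http object is needed.
analytics = build('analyticsreporting', 'v4', credentials=credentials)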

Google Calendar Integration with Django

Is there a fully fledged Django-based example of a Google Calendar integration? I was reading through Google's example page, but their link at the bottom is outdated.
I'm specifically struggling with the refresh token, as Google's examples focus solely on how to get the access token. This is what I have so far:
@staff_member_required
def authorize_access(request):
    return redirect(get_flow(request).step1_get_authorize_url())

@staff_member_required
def oauth2_callback(request):
    credentials = get_flow(request).step2_exchange(request.GET['code'])
    store = get_store()
    store.put(credentials)
    credentials.set_store(store)
    return redirect('...')

def get_flow(request):
    flow = client.flow_from_clientsecrets(
        os.path.join(CREDENTIAL_DIR, CLIENT_SECRET_FILE),
        SCOPES,
        redirect_uri='%s://%s/google-calendar/oauth2-callback/' % (request.META['wsgi.url_scheme'], request.META['HTTP_HOST'],))
    flow.params['access_type'] = 'offline'
    flow.params['approval_prompt'] = 'force'
    return flow

def get_store():
    return oauth2client.file.Storage(os.path.join(CREDENTIAL_DIR, ACCESS_TOKEN_FILE))

def has_valid_api_credentials():
    credentials = get_store().get()
    return credentials is not None

def build_service():
    credentials = get_store().get()
    if not credentials:
        return None
    elif credentials.access_token_expired:
        http = credentials.refresh(httplib2.Http())
        http = get_store().get().authorize(http)
    else:
        http = credentials.authorize(httplib2.Http())
    service = discovery.build('calendar', 'v3', http=http)
    return service

def create_events(rental_request):
    service = build_service()
    event = service.events().insert(...).execute()
Researching a lot of different approaches, I found out that server-to-server authentication is what I wanted. This way no user has to explicitly grant permissions, and the acquired auth tokens don't have to be renewed. Instead, using a service account, a server can make calls itself.
Before you can start coding, you have to set up such a service account and add it to the calendar that you want the service account to access. Google has written down the three steps to create an account here. Afterwards, go to https://calendar.google.com, locate on the left side of the screen the calendar you want to share with your new service account, and click the triangle next to it. From the drop-down menu choose "calendar settings". This takes you to a screen where you'll find the calendar ID, which you'll need later (so write it down), and a tab at the top for sharing access to the calendar. As "person", insert the email address of the service account, give it the respective permissions, and click save (if you don't click save, the service account won't be added).
The code for this is actually pretty elegant:
import datetime
from datetime import timedelta

import httplib2
import pytz
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

service_account_email = 'XXX@YYY.iam.gserviceaccount.com'
CLIENT_SECRET_FILE = 'creds.p12'
SCOPES = ['https://www.googleapis.com/auth/calendar']

def build_service():
    credentials = ServiceAccountCredentials.from_p12_keyfile(
        service_account_email=service_account_email,
        filename=CLIENT_SECRET_FILE,
        scopes=SCOPES
    )
    http = credentials.authorize(httplib2.Http())
    service = build('calendar', 'v3', http=http)
    return service

def create_event():
    service = build_service()
    start_datetime = datetime.datetime.now(tz=pytz.utc)
    event = service.events().insert(calendarId='<YOUR EMAIL HERE>@gmail.com', body={
        'summary': 'Foo',
        'description': 'Bar',
        'start': {'dateTime': start_datetime.isoformat()},
        'end': {'dateTime': (start_datetime + timedelta(minutes=15)).isoformat()},
    }).execute()
    print(event)
I'm using oauth2client version 2.2.0 (pip install oauth2client).
I hope this answer helps :)
Just a note here: although the code works, as per https://github.com/googleapis/google-auth-library-python/blob/7a8641a7f0718c0dce413436f23691e8590face1/docs/index.rst, oauth2client has recently been deprecated in favour of the google-auth library - https://github.com/googleapis/google-auth-library-python/tree/edfe24602051969e32917e82bcedd2bace43e260
You can find the documentation of the new library here - https://google-auth.readthedocs.io/en/latest/user-guide.html
To use the new library, the code can be written as:
import datetime
from datetime import timedelta

import pytz
from google.oauth2 import service_account
from googleapiclient.discovery import build

service_account_email = "app-calendar@xxxxxxxxx.iam.gserviceaccount.com"
SCOPES = ["https://www.googleapis.com/auth/calendar"]

credentials = service_account.Credentials.from_service_account_file('google_calendar_credential.json')
scoped_credentials = credentials.with_scopes(SCOPES)

def build_service():
    service = build("calendar", "v3", credentials=scoped_credentials)
    return service

def create_event():
    service = build_service()
    start_datetime = datetime.datetime.now(tz=pytz.utc)
    event = (
        service.events()
        .insert(
            calendarId="primary",
            body={
                "summary": "Foo 2",
                "description": "Bar",
                "start": {"dateTime": start_datetime.isoformat()},
                "end": {
                    "dateTime": (start_datetime + timedelta(minutes=15)).isoformat()
                },
            },
        )
        .execute()
    )
    print(event)

create_event()
As I do not have enough reputation to post this as a comment, I am posting it as a separate answer.
As this post is from quite a while ago, I wanted to share my 2020 version of it. Thanks for this post; it helped me a lot to achieve my goal.
import datetime
from datetime import timedelta

import pytz
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

service_account_email = "INSERT_HERE"
SCOPES = ["https://www.googleapis.com/auth/calendar"]

credentials = ServiceAccountCredentials.from_json_keyfile_name(
    filename="FILENAME.json", scopes=SCOPES
)

def build_service():
    service = build("calendar", "v3", credentials=credentials)
    return service

def create_event():
    service = build_service()
    start_datetime = datetime.datetime.now(tz=pytz.utc)
    event = (
        service.events()
        .insert(
            calendarId="CALENDARID@group.calendar.google.com",
            body={
                "summary": "Foo",
                "description": "Bar",
                "start": {"dateTime": start_datetime.isoformat()},
                "end": {
                    "dateTime": (start_datetime + timedelta(minutes=15)).isoformat()
                },
            },
        )
        .execute()
    )
    print(event)

create_event()
2022
Credits to @joey Coder (I would have added this as a comment, but it's too long).
If you want your website or app to create events or calendars without having to use the Google accounts of your users, you should use service accounts:
1. In https://console.cloud.google.com/ choose your project or start a new one.
2. In the navigation menu choose "APIs & Services".
3. Enable new APIs, then look up "Calendar API" and enable it.
4. Under "APIs & Services" > "Credentials", select "Create Credentials", click "Service account", fill in the desired name, and continue. Set the role to Owner or another desired role (Owner gives full access, so you might want to switch to something less powerful). Click "Done". This redirects you to the credentials page.
5. Under "Service accounts", click on the desired account (this redirects you to the IAM & Admin panel).
6. Under the "Keys" tab, click "ADD KEY" and select JSON; this downloads a JSON file to your computer.
7. In the Calendar page in Google, get the calendar ID and add it to the admin panel (under AgendaClients "CalendarId"), and add the service account to the people shared with, as an admin ("make changes to events").
This is how it looks in my Django project:
signals.py
import datetime
import json
import os

from django.db.models.signals import post_delete, post_save
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError
from oauth2client.service_account import ServiceAccountCredentials

from .models import Event

# If modifying these scopes, delete the file token.json.
SCOPES = ["https://www.googleapis.com/auth/calendar"]

def get_service(refresh=False):
    credentials = ServiceAccountCredentials.from_json_keyfile_dict(
        json.loads(os.environ.get("client_secret")), scopes=SCOPES
    )
    # or, if you have a file:
    # credentials = ServiceAccountCredentials.from_json_keyfile_name(
    #     filename="file.json", scopes=SCOPES
    # )
    service = build("calendar", "v3", credentials=credentials)
    return service

def handle_event(sender, created, instance, **kwargs):
    """Creates the events in the Google agenda and updates them if changed on the website."""
    service = get_service()
    event = instance
    if not event.end_date:
        event.end_date = event.start_date
    if not event.end_time and event.start_time:
        event.end_time = event.start_time
    elif not event.end_time:
        event.end_time = datetime.datetime.min.time()
    if not event.start_time:
        event.start_time = datetime.datetime.min.time()
    if event.end_date < event.start_date:
        event.end_date, event.start_date = event.start_date, event.end_date
    # A queryset .update() saves without re-firing this post_save signal
    # (signals fire every time an object is saved), see
    # https://stackoverflow.com/questions/1555060/how-to-save-a-model-without-sending-a-signal
    queryset = Event.objects.filter(id=event.id)
    event = {
        "summary": event.description,
        "location": event.location or "",
        "description": (event.description + " " + event.summary),
        "start": {
            "dateTime": datetime.datetime.combine(
                event.start_date, event.start_time
            ).isoformat(),
            "timeZone": "Europe/Amsterdam",
        },
        "end": {
            "dateTime": datetime.datetime.combine(
                event.end_date, event.end_time
            ).isoformat(),
            "timeZone": "Europe/Amsterdam",
        },
        "recurrence": [],
        "reminders": {},
    }
    if created or not instance.google_link:
        try:
            event = (
                service.events()
                .insert(
                    calendarId=os.environ.get("calendarId"),
                    body=event,
                )
                .execute()
            )
            queryset.update(google_link=event["id"])
        except HttpError as error:
            # print("An error occurred: %s" % error)
            pass
    else:
        try:
            event = (
                service.events()
                .update(
                    calendarId=os.environ.get("calendarId"),
                    body=event,
                    eventId=instance.google_link,
                )
                .execute()
            )
            queryset.update(google_link=event["id"])
        except HttpError as error:
            # print("An error occurred: %s" % error)
            pass

def delete_event(sender, instance, **kwargs):
    """Deletes an event from the Google agenda when it is deleted on the website."""
    try:
        service = get_service()
        service.events().delete(
            calendarId=os.environ.get("calendarId"),  # key must match the one used above
            eventId=instance.google_link,
        ).execute()
    except Exception:
        pass

post_save.connect(handle_event, sender=Event)
post_delete.connect(delete_event, sender=Event)
models.py
class Event(models.Model):
    summary = models.CharField(max_length=50)
    description = models.CharField(max_length=50, null=True, blank=True)
    start_date = models.DateField()
    start_time = models.TimeField(null=True, blank=True)
    end_date = models.DateField(null=True, blank=True)
    end_time = models.TimeField(null=True, blank=True)
    location = models.CharField(max_length=50, null=True, blank=True)
    google_link = models.CharField(max_length=150, null=True, blank=True)
    # google_link is used to edit events in Google if you change them on your website
Feel free to ask any questions or point anything out.
