Logging Python and Google Cloud - python

I am creating an application which can be run locally or on Google Cloud. To set up Google Cloud logging I've used the Google Cloud Logging library, made a cloud logger, and basically log using the class below:
# Imports assumed by this snippet (google-cloud-logging 2.x layout);
# LOGGER_INSTANCE_ID, LOGGER_INSTANCE_ZONE, CREDENTIAL_FILE and PROJECT are
# module-level constants defined elsewhere.
import logging

from google.cloud import logging as gcp_logging
from google.cloud.logging_v2.handlers import CloudLoggingHandler
from google.cloud.logging_v2.resource import Resource
from google.oauth2 import service_account


class CloudLogger():
    def __init__(self, instance_id: str = LOGGER_INSTANCE_ID, instance_zone: str = LOGGER_INSTANCE_ZONE) -> None:
        self.instance_id = instance_id
        self.instance_zone = instance_zone
        self.cred = service_account.Credentials.from_service_account_file(CREDENTIAL_FILE)
        self.client = gcp_logging.Client(project=PROJECT, credentials=self.cred)
        self.res = Resource(type="gce_instance",
                            labels={
                                "instance_id": self.instance_id,
                                "zone": self.instance_zone
                            })
        self.hdlr = CloudLoggingHandler(self.client, resource=self.res)
        self.logger = logging.getLogger('my_gcp_logger')
        self.hdlr.setFormatter(logging.Formatter('%(message)s'))
        if not self.logger.handlers:
            self.logger.addHandler(self.hdlr)
            self.logger.setLevel(logging.INFO)

    def info(self, log_this):
        self.logger.setLevel(logging.INFO)
        self.logger.info(log_this)
I want to have this so that if it is running on the cloud, it uses the GCP logger, and if run locally, it uses standard Python logging. I can either pass in an argument ("Cloud", "Local") or make it intelligent enough to figure it out on its own. But I want the underlying logic to be the same so that I can log to cloud/local seamlessly. How would I go about doing this?
I'm wondering if (maybe) there's some way to create a local logger and have those local logs forwarded to GCP when running on the cloud.

I ran into this issue too, and Google is aware of it (the same applies to Go). My helper does this:
It performs a request to the metadata server: a GET to http://metadata.google.internal/computeMetadata/v1/project/ with the header Metadata-Flavor: Google.
If I get a 404, I'm running locally, and I set up my logger as a local one.
If I get a 2XX, I'm on GCP and I set up the logger to use the FluentD format (GCP Cloud Logging).
Not perfect, but enough for me!
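A minimal sketch of that check, assuming the requests library is available and treating any connection error or non-2xx status as "running locally" (the function names are illustrative, not from the original post):
import logging
import requests

METADATA_URL = "http://metadata.google.internal/computeMetadata/v1/project/"


def running_on_gcp(timeout: float = 1.0) -> bool:
    """Return True when the GCP metadata server answers with a 2xx response."""
    try:
        resp = requests.get(METADATA_URL, headers={"Metadata-Flavor": "Google"}, timeout=timeout)
        return 200 <= resp.status_code < 300
    except requests.exceptions.RequestException:
        # No metadata server reachable: assume we are running locally.
        return False


def build_logger(name: str = "my_gcp_logger") -> logging.Logger:
    logger = logging.getLogger(name)
    if running_on_gcp():
        # On GCP: reuse the CloudLoggingHandler built by the CloudLogger class above.
        handler = CloudLogger().hdlr
    else:
        # Locally: a plain stream handler with the same message format.
        handler = logging.StreamHandler()
        handler.setFormatter(logging.Formatter('%(message)s'))
    if not logger.handlers:
        logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    return logger
The rest of the application then just calls logging.getLogger("my_gcp_logger") and does not need to care where the records end up.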

Related

How can I access secrets from Hashicorp Vault deployed on OpenShift/Kubernetes directly in my Python code?

We can deploy HashiCorp Vault on an OpenShift cluster using Helm (see this and this). Following the same links, this allows us to enable Kubernetes authentication and to request secrets directly from Vault in these environments:
vault kv put foo/bar username="static-user" password="static-password"
vault kv get foo/bar
While this is ideal for an OpenShift/Kubernetes environment, I would prefer to work with a Secret Manager client that I can access within my Python code as a library, for example something like this:
from typing import Union


class SomeSecretManager():
    """Implementation of the contract of SecretManager"""

    def __init__(self):
        self.client = None

    def initialize_connection(self):
        self.client = SecretsManagerV1(authenticator="how do I make this link with an OpenShift cluster")

    def get_secret(self, secret_id: str) -> Union[str, dict]:
        response = self.client.get_secret(id=secret_id)
        return response.get_result()["resources"][0]["secret_data"]["payload"]
I am wondering now how I can make this transition to access the deployed Vault pod on an OpenShift cluster directly in my Python code (both locally and when deployed on OpenShift), using a client similar to what is illustrated above.
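Not from the original post, but one common way to bridge this gap is the hvac client together with Vault's Kubernetes auth method; a rough sketch, where the Vault address, role name, and KV mount/path are assumptions:
import hvac


def build_vault_client(vault_addr: str = "http://vault.vault.svc:8200", role: str = "my-app-role") -> hvac.Client:
    client = hvac.Client(url=vault_addr)
    # Inside an OpenShift/Kubernetes pod the service-account JWT is mounted here;
    # locally you would authenticate differently (e.g. with a token from `vault login`).
    with open("/var/run/secrets/kubernetes.io/serviceaccount/token") as f:
        jwt = f.read()
    client.auth.kubernetes.login(role=role, jwt=jwt)
    return client


def get_secret(client: hvac.Client, path: str = "bar", mount_point: str = "foo") -> dict:
    # KV v2 read, mirroring `vault kv get foo/bar`; for a KV v1 mount use
    # client.secrets.kv.v1.read_secret instead.
    response = client.secrets.kv.v2.read_secret_version(path=path, mount_point=mount_point)
    return response["data"]["data"]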

Having trouble sending long Python log messages from Google Cloud Functions out of Google Cloud Platform

I was wondering whether Cloud Functions has restrictions on exporting Python logging out of Google Cloud Platform for long messages (over 1500 characters).
Below is one of several examples from Papertrail on how to send log messages from Python. The one I tried worked for sending messages to Papertrail from Google Compute Engine, but did not work on Google Cloud Functions, which instead logs them to the Google Cloud Operations logging service.
import logging
import socket
from logging.handlers import SysLogHandler


class ContextFilter(logging.Filter):
    hostname = socket.gethostname()

    def filter(self, record):
        record.hostname = ContextFilter.hostname
        return True


syslog = SysLogHandler(address=('logsN.papertrailapp.com', XXXXX))
syslog.addFilter(ContextFilter())

format = '%(asctime)s %(hostname)s YOUR_APP: %(message)s'
formatter = logging.Formatter(format, datefmt='%b %d %H:%M:%S')
syslog.setFormatter(formatter)

logger = logging.getLogger()
logger.addHandler(syslog)
logger.setLevel(logging.INFO)

logger.info("This is a message")
In particular, it seems that for the above code, the address I placed in the handler does not work in Google Cloud Functions if the message I pass is too long.
syslog = SysLogHandler(address=('logsN.papertrailapp.com', XXXXX))
Is this normal, or does my cloud function have some issue?
Is there a way to pass an address to Python logging within Google Cloud Functions that is compatible with long messages?
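One thing worth checking (an assumption on my part, not from the original post): SysLogHandler uses UDP by default, so very long records can exceed the datagram size that actually makes it out, and switching the handler to TCP is a cheap experiment. The port value below is hypothetical; use the one from your Papertrail account.
import logging
import socket
from logging.handlers import SysLogHandler

PAPERTRAIL_HOST = 'logsN.papertrailapp.com'  # placeholder host, as in the snippet above
PAPERTRAIL_PORT = 12345                      # hypothetical port

# Same handler as above, but over a TCP stream instead of UDP datagrams.
syslog_tcp = SysLogHandler(address=(PAPERTRAIL_HOST, PAPERTRAIL_PORT),
                           socktype=socket.SOCK_STREAM)
logging.getLogger().addHandler(syslog_tcp)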

Unable to have a Cloud Function trigger an HTTP-triggered Cloud Function that doesn't allow unauthenticated invocations?

I have a situation where I am trying to create two Cloud Functions, CF1 and CF2, and I have one Cloud Scheduler job. Both Cloud Functions have authenticated invocation enabled. My flow is: Cloud Scheduler triggers CF1, and on completion CF1 triggers CF2 via an HTTP call. I have referred to "Cannot invoke Google Cloud Function from GCP Scheduler" to access the authenticated CF1 from Cloud Scheduler, and I am able to access CF1. But I am running into a problem when accessing CF2 from CF1: CF1 does not trigger CF2 and also gives no error message. Do we need to follow any other technique when accessing an authenticated Cloud Function from another authenticated Cloud Function?
CF1 code:
import json
import logging

from requests_futures.sessions import FuturesSession


def main(request):
    # To read parameter values from request (url arguments or Json body).
    raw_request_data = request.data
    string_request_data = raw_request_data.decode("utf-8")
    request_json: dict = json.loads(string_request_data)
    request_args = request.args

    if request_json and 'cf2_endpoint' in request_json:
        cf2_endpoint = request_json['cf2_endpoint']
    elif request_args and 'cf2_endpoint' in request_args:
        cf2_endpoint = request_args['cf2_endpoint']
    else:
        cf2_endpoint = 'Invalid endpoint for CF2'

    logger = logging.getLogger('test')
    try:
        session = FuturesSession()
        session.get("{}".format(cf2_endpoint))
        logger.info("First cloud function executed successfully.")
    except RuntimeError:
        logger.error("Exception occurred {}".format(RuntimeError))
CF2 code:
import logging


def main(request):
    logger = logging.getLogger('test')
    logger.info("second cloud function executed successfully.")
Current output logs:
First cloud function executed successfully.
Expected output logs:
First cloud function executed successfully.
second cloud function executed successfully.
Note: the same flow works if I use unauthenticated access to both Cloud Functions.
Two things are happening here:
You're not using requests-futures entirely correctly. Since the request is made asynchronously, you need to block on the result before the function implicitly returns; otherwise the function might return before your HTTP request completes (although the request probably is completing in this example):
session = FuturesSession()
future = session.get("{}".format(cf2_endpoint))
resp = future.result() # Block on the request completing
The request you're making to the second function is not actually an authenticated request. Outbound requests from a Cloud Function are not authenticated by default. If you looked at what the actual response is above, you would see:
>>> resp.status_code
403
>>> resp.content
b'\n<html><head>\n<meta http-equiv="content-type" content="text/html;charset=utf-8">\n<title>403 Forbidden</title>\n</head>\n<body text=#000000 bgcolor=#ffffff>\n<h1>Error: Forbidden</h1>\n<h2>Your client does not have permission to get URL <code>/function_two</code> from this server.</h2>\n<h2></h2>\n</body></html>\n'
You could jump through a lot of hoops to properly authenticate this request, as detailed in the docs: https://cloud.google.com/functions/docs/securing/authenticating#function-to-function
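Roughly, the approach described in those docs amounts to fetching an identity token for the receiving function's URL from the metadata server and sending it as a Bearer token; a sketch (the helper name is mine, and cf2_endpoint is the variable from CF1 above):
import requests


def call_authenticated(cf2_endpoint: str) -> requests.Response:
    # Ask the metadata server for an ID token whose audience is the target function's URL.
    metadata_url = (
        'http://metadata.google.internal/computeMetadata/v1/'
        'instance/service-accounts/default/identity?audience=' + cf2_endpoint
    )
    token = requests.get(metadata_url, headers={'Metadata-Flavor': 'Google'}).text
    # Pass the token as a Bearer token so CF2's IAM check accepts the call.
    return requests.get(cf2_endpoint, headers={'Authorization': 'Bearer ' + token})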
However, a better alternative would be to make your second function a "background" function and invoke it via a PubSub message published from the first function instead:
from google.cloud import pubsub

publisher = pubsub.PublisherClient()
topic_name = 'projects/{project_id}/topics/{topic}'.format(
    project_id=<your project id>,
    topic='MY_TOPIC_NAME',  # Set this to something appropriate.
)


def function_one(request):
    message = b'My first message!'
    publisher.publish(topic_name, message)


def function_two(event, context):
    message = event['data'].decode('utf-8')
    print(message)
As long as your functions have the permissions to publish PubSub messages, this avoids the need to add authorization to the HTTP requests, and also ensures at-least-once delivery.
Google Cloud Functions provides a REST API interface that includes a call method, which can be used for HTTP invocation from another Cloud Function.
Although the documentation mentions using Google-provided client libraries, there is still no dedicated one for Cloud Functions in Python.
Instead you need to use the general Google API Client Library. [This is the Python one].
The main difficulty with this approach is probably understanding the authentication process.
Generally you need to provide two things to build a client service: credentials and scopes.
The simplest way to get credentials is to rely on the Application Default Credentials (ADC) library. The right documentation about that is:
https://cloud.google.com/docs/authentication/production
https://github.com/googleapis/google-api-python-client/blob/master/docs/auth.md
The place to get scopes is each REST API function's documentation page.
For example, OAuth scope: https://www.googleapis.com/auth/cloud-platform
The complete code example of calling a 'hello-world' cloud function is below.
Before running:
Create a default Cloud Function on GCP in your project.
Keep and note the default service account it uses.
Keep the default body.
Note the project_id, the function name, and the location where you deploy the function.
If you will call the function from outside the Cloud Functions environment (locally, for instance), set the environment variable GOOGLE_APPLICATION_CREDENTIALS according to the doc mentioned above.
If you will actually call it from another Cloud Function, you don't need to configure credentials at all.
from googleapiclient.discovery import build
from googleapiclient.discovery_cache.base import Cache
import google.auth
import pprint as pp


def get_cloud_function_api_service():
    class MemoryCache(Cache):
        _CACHE = {}

        def get(self, url):
            return MemoryCache._CACHE.get(url)

        def set(self, url, content):
            MemoryCache._CACHE[url] = content

    scopes = ['https://www.googleapis.com/auth/cloud-platform']

    # If the environment variable GOOGLE_APPLICATION_CREDENTIALS is set,
    # ADC uses the service account file that the variable points to.
    #
    # If the environment variable GOOGLE_APPLICATION_CREDENTIALS isn't set,
    # ADC uses the default service account that Compute Engine, Google Kubernetes Engine,
    # App Engine, Cloud Run, and Cloud Functions provide.
    #
    # See more on https://cloud.google.com/docs/authentication/production
    credentials, project_id = google.auth.default(scopes)

    service = build('cloudfunctions', 'v1', credentials=credentials, cache=MemoryCache())
    return service


google_api_service = get_cloud_function_api_service()

# Replace {project_id} with your project id, and adjust the location and function name.
name = 'projects/{project_id}/locations/us-central1/functions/function-1'
body = {
    'data': '{ "message": "It is awesome, you are develop on Stack Overflow language!"}'  # json passed as a string
}

result_call = google_api_service.projects().locations().functions().call(name=name, body=body).execute()
pp.pprint(result_call)

# expected output is:
# {'executionId': '3h4c8cb1kwe2', 'result': 'It is awesome, you are develop on Stack Overflow language!'}

Python logging to Azure

I am using Python and I was wondering if there is any package/simple way for logging directly to Azure?
I found a package (azure-storage-logging) that would be really nice, however it is not being maintained and not compatible with the new Azure API.
Any help is welcome.
You should use Application Insights which will send the logs to Azure Monitor (previously Log Analytics).
https://learn.microsoft.com/en-us/azure/azure-monitor/app/opencensus-python
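A minimal sketch of what that looks like with the opencensus-python Azure extension described in that link; the connection string is a placeholder you get from your Application Insights resource:
import logging

from opencensus.ext.azure.log_exporter import AzureLogHandler

logger = logging.getLogger(__name__)
# Ship records to Azure Monitor / Application Insights.
logger.addHandler(AzureLogHandler(connection_string='InstrumentationKey=<your-instrumentation-key>'))
logger.warning('Hello, Azure Monitor!')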
I had the same requirement to log error and debug messages for a small application and store the logs in Azure Data Lake. We did not want to use Application Insights, as ours was not a web application and we just needed logs to debug the code.
To solve this I created a temp.log file:
logging.basicConfig(filename='temp.log',
                    format='%(asctime)s %(levelname)-8s [%(filename)s:%(lineno)d] %(message)s',
                    datefmt='%Y-%m-%d:%H:%M:%S')
At the end of the program I uploaded temp.log to Azure using DataLakeFileClient.append_data:
# file_client is an azure.storage.filedatalake.DataLakeFileClient pointing at the target file.
local_file = open("temp.log", 'r')
file_contents = local_file.read()
file_client.append_data(data=file_contents, offset=0, length=len(file_contents))
file_client.flush_data(len(file_contents))
https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-directory-file-acl-python
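A rough sketch of how the file_client above could be obtained (the connection string and the "logs" file system name are assumptions, not from the original answer):
from azure.storage.filedatalake import DataLakeServiceClient

service_client = DataLakeServiceClient.from_connection_string('<your-connection-string>')
file_system_client = service_client.get_file_system_client(file_system='logs')
file_client = file_system_client.get_file_client('temp.log')
file_client.create_file()  # start a fresh file before appending the log contents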
You can just create your own handler. I'll show you how to log to an Azure table; storing in a blob would be similar. The biggest benefit is that you emit each log record as you log it, instead of sending all logs at the end of a process.
Create a table in Azure Table Storage and then run the following code.
import logging
from datetime import datetime
from logging import Logger, getLogger

from azure.core.credentials import AzureSasCredential
from azure.data.tables import TableClient, TableEntity


class _AzureTableHandler(logging.Handler):

    def __init__(self, *args, **kwargs):
        super(_AzureTableHandler, self).__init__(*args, **kwargs)
        credentials: AzureSasCredential = AzureSasCredential(signature=<sas-token>)
        self._table_client: TableClient = TableClient(endpoint=<storage-account-url>, table_name=<table-name>, credential=credentials)

    def emit(self, record):
        level = record.levelname
        message = record.getMessage()
        self._table_client.create_entity(TableEntity({'Severity': level,
                                                      'Message': message,
                                                      'PartitionKey': f'{datetime.now().date()}',
                                                      'RowKey': f'{datetime.now().microsecond}'}))


if __name__ == "__main__":
    logger: Logger = getLogger(__name__)
    logger.addHandler(_AzureTableHandler())
    logger.warning('testing azure logging')
With this approach, you also get the benefit of creating custom columns for your table. For example, you can have separate columns for the name of the project that is logging, or the username of the dev who is running the script.
logger.addHandler(_AzureTableHandler(Username="Hesam", Project="Deployment-support-tools-client-portal"))
Make sure to add your custom column names to the table entity dictionary, or you can put the project name in the partition key.
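A rough sketch of that idea, subclassing the handler above (it relies on the _AzureTableHandler class and imports from the previous snippet; the Username/Project column names are illustrative):
class AzureTableHandlerWithColumns(_AzureTableHandler):
    """Adds per-project / per-user columns to every logged entity."""

    def __init__(self, username: str = "", project: str = "", *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._username = username
        self._project = project

    def emit(self, record):
        self._table_client.create_entity(TableEntity({'Severity': record.levelname,
                                                      'Message': record.getMessage(),
                                                      'Username': self._username,   # custom column
                                                      'Project': self._project,     # custom column
                                                      'PartitionKey': f'{datetime.now().date()}',
                                                      'RowKey': f'{datetime.now().microsecond}'}))


getLogger(__name__).addHandler(AzureTableHandlerWithColumns(username="Hesam", project="Deployment-support-tools-client-portal"))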

How to access a remote datastore when running dev_appserver.py?

I'm attempting to run a localhost web server that has remote api access to a remote datastore using the remote_api_stub method ConfigureRemoteApiForOAuth.
I have been using the following Google doc for reference but find it rather sparse:
https://cloud.google.com/appengine/docs/python/tools/remoteapi
I believe I'm missing the authentication bit, but can't find a concrete resource to guide me. What would be the easiest way, given the follow code example, to access a remote datastore while running dev_appserver.py?
import webapp2
from google.appengine.ext import ndb
from google.appengine.ext.remote_api import remote_api_stub


class Topic(ndb.Model):
    created_by = ndb.StringProperty()
    subject = ndb.StringProperty()

    @classmethod
    def query_by_creator(cls, creator):
        return cls.query(Topic.created_by == creator)


class MainPage(webapp2.RequestHandler):
    def get(self):
        remote_api_stub.ConfigureRemoteApiForOAuth(
            '#####.appspot.com',
            '/_ah/remote_api'
        )

        topics = Topic.query_by_creator('bill')

        self.response.headers['Content-Type'] = 'text/plain'
        self.response.out.write('<html><body>')
        self.response.out.write('<h1>TOPIC SUBJECTS:<h1>')
        for topic in topics.fetch(10):
            self.response.out.write('<h3>' + topic.subject + '<h3>')
        self.response.out.write('</body></html>')


app = webapp2.WSGIApplication([
    ('/', MainPage)
], debug=True)
This gets asked a lot, simply because you can't use App Engine's libraries outside of the SDK. However, there is also an easier way to do it from within the App Engine SDK as well.
I would use gcloud for this. Here's how to set it up:
If you want to interact with Google Cloud storage services inside or outside of the App Engine environment, you may use gcloud (https://googlecloudplatform.github.io/gcloud-python/stable/) to do so.
You need a service account on your application as well as its downloaded JSON credentials file. You do this in the App Engine console under the authentication tab: create it, and then download it. Call it client_secret.json or something.
With those, once you install the proper packages for gcloud with pip, you'll be able to make queries as well as write data.
Here is an example of authenticating yourself to use the library:
import os

from gcloud import datastore

# The location of the JSON credentials file on your local machine
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/location/client_secret.json"

# Project ID from the Developers Console
projectID = "THE_ID_OF_YOUR_PROJECT"
os.environ["GCLOUD_TESTS_PROJECT_ID"] = projectID
os.environ["GCLOUD_TESTS_DATASET_ID"] = projectID

client = datastore.Client(dataset_id=projectID)
Once that's done, you can make queries like this:
query = client.query(kind='Model').fetch()
It's actually super easy. Any who, that's how I would do that! Cheers.
