Cloud Function logging on execution ID from Django

Stefan Norman
Oct 27, 2020

We have multiple Cloud Scheduler jobs triggering the same Cloud Function running Django. The jobs differ only in payload, which makes it hard to get an overview of a single function execution and to see your own log messages grouped with the output GCP emits.

Ideally you want all your output logged with the same execution_id as the “Function execution started” messages that GCP itself emits.

To solve this problem you have to create a custom cloud logging handler. The reason is that Django’s logging setup converts the Resource passed in settings.LOGGING into a ConvertingTuple, which the Google client cannot use. Extending CloudLoggingHandler works around this by rebuilding the Resource inside the handler.
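The conversion is easy to see in isolation. Django’s LOGGING setup goes through logging.config.dictConfig, which wraps dict and tuple values handed to a handler’s constructor in Converting* wrappers; with a plain dict the value arrives as a ConvertingDict (the Resource namedtuple was similarly wrapped as a ConvertingTuple on the Python versions the post targets). A minimal stand-alone demonstration with a stand-in handler (DemoHandler is illustrative, not part of the real setup):

```python
import logging
import logging.config

captured = {}

class DemoHandler(logging.StreamHandler):
    """Stand-in handler that records what type its `resource` kwarg has."""
    def __init__(self, resource=None):
        super().__init__()
        captured['resource_type'] = type(resource).__name__

logging.config.dictConfig({
    'version': 1,
    'handlers': {
        'demo': {
            'class': f'{__name__}.DemoHandler',
            # A plain dict in the config...
            'resource': {'type': 'cloud_function', 'labels': {}},
        },
    },
})

# ...arrives in the handler as a ConvertingDict, not a plain dict.
print(captured['resource_type'])
```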

from google.cloud.logging.handlers import CloudLoggingHandler
from google.cloud.logging.handlers.transports import BackgroundThreadTransport
from google.cloud.logging.resource import Resource

DEFAULT_LOGGER_NAME = "python"


class CustomCloudLoggingHandler(CloudLoggingHandler):
    """
    Custom Cloud Logging handler to work around the issue of `resource` being
    transformed to a ConvertingTuple if initialized in settings.LOGGING.
    See: https://stackoverflow.com/questions/56281319/
    app-engine-stackdriver-logging-to-global-log-instead-of-service-log#comment99366233_56287031
    """

    def __init__(
        self,
        client,
        name=DEFAULT_LOGGER_NAME,
        transport=BackgroundThreadTransport,
        resource=None,
        labels=None,
        stream=None,
    ):
        # Bypass CloudLoggingHandler.__init__ and call logging.StreamHandler
        # directly, so we can rebuild the Resource ourselves.
        super(CloudLoggingHandler, self).__init__(stream)
        self.name = name
        self.client = client
        self.transport = transport(client, name)
        # Rebuild a proper Resource from the (possibly converted) mapping.
        self.resource = Resource(type=resource['type'], labels=resource['labels'])
        self.labels = labels

Django can then be configured with the custom handler in settings.py.

# Stackdriver setup
import os

from google.cloud import logging as cloud_logging

log_client = cloud_logging.Client()

# App Engine setup
resource = {
    'type': 'gae_app',
    'labels': {
        'project_id': os.getenv('GOOGLE_CLOUD_PROJECT'),
        'module_id': os.getenv('GAE_SERVICE'),
        'version_id': os.getenv('GAE_VERSION'),
        'zone': GOOGLE_CLOUD_LOCATION,
    }
}

# Cloud Function setup
if os.getenv('GCP_PROJECT'):
    resource = {
        'type': 'cloud_function',
        'labels': {
            'project_id': os.getenv('GCP_PROJECT'),
            'function_name': os.getenv('FUNCTION_NAME'),
            'region': GOOGLE_CLOUD_LOCATION,
        }
    }

LOGGING['handlers']['stackdriver'] = {
    'class': 'logging_handlers.CustomCloudLoggingHandler',
    'client': log_client,
    'level': 'DEBUG',
    'resource': resource,
}
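Note that the handler still has to be attached to a logger before anything reaches Stackdriver. A minimal sketch, assuming the rest of LOGGING is already defined in settings.py (the logger name and level here are illustrative, adjust to your project):

```python
# Route Django's own logger through the new handler.
LOGGING['loggers']['django'] = {
    'handlers': ['stackdriver'],
    'level': 'INFO',
}
```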

The above configuration will log to Stackdriver, but the main problem is still not solved: to group log messages by execution_id, we need to reconfigure logging on the fly inside the Cloud Function.

When the Cloud Function is triggered over HTTP, GCP sends the Function-Execution-Id header. In main.py we can grab that header and pass it as a label to our custom handler, along with the cloud-functions log name, to get log grouping to work.

from django.conf import settings


def run_background_job(request):
    """
    Google Cloud Function for running various background jobs.
    """
    # Reconfigure the logging handler to add the Cloud Function execution id.
    from django.utils.log import configure_logging

    settings.LOGGING['handlers']['stackdriver'][
        'name'] = "cloudfunctions.googleapis.com%2Fcloud-functions"
    settings.LOGGING['handlers']['stackdriver']['labels'] = {
        'execution_id': request.headers['Function-Execution-Id']
    }
    configure_logging(settings.LOGGING_CONFIG, settings.LOGGING)
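Django’s configure_logging is, with default settings, a thin wrapper around logging.config.dictConfig, so the per-request pattern can be sketched self-contained with a stand-in handler instead of the Cloud client (CapturingHandler and the execution id value are illustrative):

```python
import logging
import logging.config

records = []

class CapturingHandler(logging.Handler):
    """Stand-in for CustomCloudLoggingHandler: stores messages with labels."""
    def __init__(self, labels=None):
        super().__init__()
        self.labels = labels or {}

    def emit(self, record):
        records.append((self.labels.get('execution_id'), record.getMessage()))

LOGGING = {
    'version': 1,
    'handlers': {'stackdriver': {'class': f'{__name__}.CapturingHandler'}},
    'root': {'handlers': ['stackdriver'], 'level': 'INFO'},
}

# Per-request reconfiguration, as in run_background_job above:
LOGGING['handlers']['stackdriver']['labels'] = {'execution_id': 'abc123'}
logging.config.dictConfig(LOGGING)

# Every subsequent log call now carries the execution id label.
logging.getLogger().info("job started")
print(records)  # [('abc123', 'job started')]
```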

This will group the logs nicely.
