
Add support for custom JSONEncoder #656

Closed
@thiromi

Description


Is your feature request related to a problem? Please describe.
My company is moving from structlog to this package, since it is meant to integrate better with Google products. We use StructuredLogHandler to send log records to Google Cloud Logging. One thing structlog did out of the box was automatically convert the extra data we pass to the logger before sending it to the stream. Without that, we have to audit every call to logger.info(), debug(), warning(), etc. to verify that the data types are supported by JSONEncoder; otherwise the call to handler.format() raises TypeError (for example, "Object of type PosixPath is not JSON serializable", because the default encoder cannot encode pathlib.Path instances) and the log record is never pushed to Cloud Logging.
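For context, the failure is easy to reproduce with the standard library alone; this sketch (using pathlib.Path, as in the error above) assumes nothing about the handler itself:

```python
import json
import pathlib

# The stdlib encoder rejects pathlib.Path values, which is the same failure
# that occurs when a handler serializes json_fields with the default encoder.
try:
    json.dumps({"path": pathlib.Path("/etc/hosts")})
except TypeError as exc:
    print(exc)  # e.g. "Object of type PosixPath is not JSON serializable"
```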
Describe the solution you'd like
I'd like to be able to specify a custom class to encode the message to JSON, so that the formatting is handled by the encoder rather than by the caller of logger.info() and its siblings.
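A minimal sketch of what such an encoder could look like. The `json_encoder_cls` parameter name in the comment below is purely hypothetical (the handler does not accept it today); the encoder itself is plain stdlib:

```python
import json
import pathlib

class PathAwareEncoder(json.JSONEncoder):
    """Encode values the default encoder rejects, e.g. pathlib.Path."""

    def default(self, o):
        if isinstance(o, pathlib.PurePath):
            return str(o)  # serialize paths as plain strings
        return super().default(o)

# Hypothetical wiring: StructuredLogHandler(json_encoder_cls=PathAwareEncoder)
# Plain json usage already works:
json.dumps({"path": pathlib.Path("/var/log")}, cls=PathAwareEncoder)
```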
Describe alternatives you've considered
An alternative would be a filter that prepares the data before the handler uses it; I even wrote a filter to handle that:

import logging
from typing import Any, Dict, List, Union

# Plain JSON-compatible types produced by the converter
TypeJSON = Union[None, bool, int, float, str, List["TypeJSON"], Dict[str, "TypeJSON"]]


class JsonableConverterFilter(logging.Filter):
    """Prepare a log record to be JSON encoded.

    :class:`google.cloud.logging_v2.handlers.StructuredLogHandler` encodes with the
    default JSON encoder and offers no way to set up a custom one, so we need to
    sanitize the log record up front to avoid breaking when emitting
    (because broken logs break silently).

    The ideal solution would be a log formatter, but since GCloud's log handler
    formats to JSON inside the handler itself, we have to work around the issue
    with a :class:`logging.Filter`.
    """

    def filter(self, record: logging.LogRecord) -> bool:
        fields: Dict[str, Any] = getattr(record, "json_fields", {})

        for key, value in fields.items():
            fields[key] = JsonableConverterFilter.prepare_field(value)

        return True

    @classmethod
    def prepare_field(cls, value: Any) -> TypeJSON:
        """Convert the provided value into a plain JSON type."""
        # basic types
        if isinstance(value, (int, str, float, bool)) or value is None:
            return value

        # hashmaps (keys coerced to str, as JSON requires)
        if hasattr(value, "keys") and hasattr(value, "items"):
            return {str(key): cls.prepare_field(val) for key, val in value.items()}

        try:
            iterator = iter(value)
        except TypeError:
            # not iterable: probably a custom class; recurse into its
            # attributes, or fall back to its string representation
            if not hasattr(value, "__dict__"):
                return str(value)
            return cls.prepare_field(vars(value))

        # sequences
        return [cls.prepare_field(item) for item in iterator]

but it seems to me that doing this deviates from the intended purpose of log filters.
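For completeness, attaching such a filter takes one line. The stand-in below is a deliberately simplified version of the prepare_field logic above (it just stringifies anything non-primitive), so the example is self-contained:

```python
import logging
import pathlib

class StringifyFilter(logging.Filter):
    """Simplified stand-in for JsonableConverterFilter: coerce any
    non-primitive json_fields value to str before the handler formats it."""

    def filter(self, record: logging.LogRecord) -> bool:
        fields = getattr(record, "json_fields", {})
        for key, value in fields.items():
            if not isinstance(value, (int, str, float, bool, type(None))):
                fields[key] = str(value)
        return True

logger = logging.getLogger("demo")
logger.addFilter(StringifyFilter())
# Any handler attached to `logger` now sees JSON-safe json_fields values:
logger.warning("saving file", extra={"json_fields": {"path": pathlib.Path("/tmp/report.csv")}})
```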
Additional context
n/a for now

Metadata

Labels

api: logging - Issues related to the googleapis/python-logging API.
priority: p3 - Desirable enhancement or fix. May not be included in next release.
type: feature request - 'Nice-to-have' improvement, new feature or different behavior or design.
