Is there a way to make loguru automatically print the trace_id in logs like logging does? #3615
What I ended up doing was making a custom wrapper for my loguru logger, which automatically binds the trace ID and span ID for me:

```python
import sys
from typing import Any, Dict

from loguru import logger
from opentelemetry import trace

tracing_log_format = (
    "<green>{time:YYYY-MM-DD HH:mm:ss}</green> | "
    "<level>{level: <8}</level> | "
    "<yellow>req_id:{extra[request_id]}</yellow> | "
    "<red>t:{extra[otelTraceID]} s:{extra[otelSpanID]}</red> | "
    "<cyan>{name}</cyan>:<cyan>{function}</cyan>:<cyan>{line}</cyan> - <level>{message}</level>"
    "<blue>context: {extra}</blue>"
)

# Note: the format also references extra[request_id], which is expected to be
# bound elsewhere (e.g. in per-request middleware) before logging.
logger.configure(extra={"otelSpanID": 0, "otelTraceID": 0})
logger.add(sys.stderr, format=tracing_log_format, level="INFO")


def _prepare_logger_extras() -> Dict[str, Any]:
    """Collect the current OpenTelemetry span and trace IDs as loguru extras."""
    extra_data = {}
    span = trace.get_current_span()
    extra_data["otelSpanID"] = trace.format_span_id(span.get_span_context().span_id)
    extra_data["otelTraceID"] = trace.format_trace_id(span.get_span_context().trace_id)
    return extra_data


def info(__message: str, *args: Any, **kwargs: Any):
    """Wrapper around logger.info that binds the current trace context to the record."""
    logger.opt(depth=1).bind(**_prepare_logger_extras()).info(__message, *args, **kwargs)
```

Though I'm not sure whether this can be considered good practice; maybe others can help enlighten me on this.
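A lighter-weight variation on the same idea (just a sketch, not an official integration) is loguru's `patcher` hook, which attaches the current span context to every record without wrapping each logging call; the `otelTraceID`/`otelSpanID` field names below simply mirror the format above:

```python
import sys

from loguru import logger
from opentelemetry import trace


def _inject_trace_context(record):
    """Copy the active span's trace and span IDs into the record's extra dict."""
    ctx = trace.get_current_span().get_span_context()
    record["extra"]["otelTraceID"] = trace.format_trace_id(ctx.trace_id)
    record["extra"]["otelSpanID"] = trace.format_span_id(ctx.span_id)


# The patcher runs for every record, so plain logger.info(...) calls
# automatically carry the IDs referenced in the format string.
logger.configure(patcher=_inject_trace_context)
logger.add(
    sys.stderr,
    format="{time:YYYY-MM-DD HH:mm:ss} | t:{extra[otelTraceID]} s:{extra[otelSpanID]} | {message}",
    level="INFO",
)
```

With the patcher in place, a plain `logger.info("handled request")` prints the trace and span IDs of whatever span is active at the call site.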
Only the standard Python logging library is currently supported by OpenTelemetry. Converting this to a feature request to support loguru.
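Until loguru is supported directly, one possible workaround (a sketch based on loguru's documented propagation recipe, not part of OpenTelemetry) is to forward loguru records to the standard `logging` module so that the existing `LoggingInstrumentor` can inject the trace context:

```python
import logging

from loguru import logger
from opentelemetry.instrumentation.logging import LoggingInstrumentor


class PropagateHandler(logging.Handler):
    """Forward records emitted by loguru to the standard logging module."""

    def emit(self, record: logging.LogRecord) -> None:
        logging.getLogger(record.name).handle(record)


# set_logging_format=True makes the stdlib log format include otelTraceID/otelSpanID.
LoggingInstrumentor().instrument(set_logging_format=True)

# Make sure INFO-level records pass the stdlib root logger.
logging.getLogger().setLevel(logging.INFO)

# Route every loguru message through standard logging so it is formatted there.
logger.add(PropagateHandler(), format="{message}")

logger.info("this message is formatted by stdlib logging and carries the trace_id")
```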
Related: #3806
If anyone more knowledgeable on OTel can comment, the creator of
I can print the trace_id using the standard logging module, but I can't do it with loguru. Is there a way to make loguru print the trace_id as well?
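For reference, the standard-library setup the question alludes to looks roughly like this (a sketch; the `otelTraceID`/`otelSpanID` record attributes are added by `opentelemetry-instrumentation-logging`):

```python
import logging

from opentelemetry.instrumentation.logging import LoggingInstrumentor

# Adds otelTraceID, otelSpanID (and related) attributes to every stdlib LogRecord.
LoggingInstrumentor().instrument()

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s [trace_id=%(otelTraceID)s span_id=%(otelSpanID)s] %(message)s",
)

logging.getLogger(__name__).info("prints the trace_id of the active span")
```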