Opinionated Defaults for Structlog

Logging is really important. Getting logging to work well in Python feels like black magic: there's a ton of configuration across structlog, warnings, stdlib loggers, FastAPI + Celery context, JSON logging in production, etc. that requires a lot of fiddling and testing to get right. I finally got this working in my project template and extracted it into this package.

Here are the main goals:

  • High performance JSON logging in production
  • All loggers, even plugin or system loggers, should route through the same formatter
  • Structured logging everywhere
  • Ability to easily set thread-local log context
  • Nice log formatters for stack traces, ORM (ActiveModel/SQLModel), etc
  • Ability to set the log level and output (i.e. a file path) per logger for easy development debugging
  • If you are using FastAPI, structured logging for access logs

Installation

pip install structlog-config

Or with uv:

uv add structlog-config

Usage

from structlog_config import configure_logger

log = configure_logger()

log.info("the log", key="value")

JSON Logging for Production

JSON logging is automatically enabled in production and staging environments (PYTHON_ENV=production or PYTHON_ENV=staging):

from structlog_config import configure_logger

# Automatic JSON logging in production
log = configure_logger()
log.info("User login", user_id="123", action="login")
# Output: {"action":"login","event":"User login","level":"info","timestamp":"2025-09-24T18:03:00Z","user_id":"123"}

# Force JSON logging regardless of environment
log = configure_logger(json_logger=True)

# Force console logging regardless of environment
log = configure_logger(json_logger=False)

JSON logs use orjson for performance, include sorted keys and ISO timestamps, and serialize exceptions cleanly. Note that PYTHON_LOG_PATH is ignored with JSON logging (stdout only).

TRACE Logging Level

This package adds support for a custom TRACE logging level (level 5) that's even more verbose than DEBUG. This is useful for extremely detailed debugging scenarios.

The TRACE level is automatically set up when you call configure_logger(). You can use it like any other logging level:

import logging
from structlog_config import configure_logger

log = configure_logger()

# Using structlog
log.info("This is info")
log.debug("This is debug") 
log.trace("This is trace")  # Most verbose

# Using stdlib logging
logging.trace("Module-level trace message")
logger = logging.getLogger(__name__)
logger.trace("Instance trace message")

Set the log level to TRACE using the environment variable:

LOG_LEVEL=TRACE

Stdlib Log Management

By default, all stdlib loggers:

  1. Are given the same global logging level, with some default adjustments for noisy loggers (looking at you, httpx)
  2. Use a structlog formatter, so you get structured logging, context, etc. in any stdlib logger call (see the sketch below)
  3. Inherit the overwritten root processor, so any child loggers created after initialization use the same formatter
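
For example (a minimal sketch; the logger name and message are placeholders), a plain stdlib logger created after configuration renders through the same structlog pipeline:

import logging

from structlog_config import configure_logger

configure_logger()

# a stdlib logger created after configure_logger() still uses the shared formatter
logger = logging.getLogger("myapp.payments")  # placeholder logger name
logger.info("payment processed")  # rendered with the same structured output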

You can customize loggers by name (i.e. the name used in logging.getLogger(__name__)) using environment variables.

For example, if you wanted to mimic OPENAI_LOG functionality:

  • LOG_LEVEL_OPENAI=DEBUG
  • LOG_PATH_OPENAI=tmp/openai.log
  • LOG_LEVEL_HTTPX=DEBUG
  • LOG_PATH_HTTPX=tmp/openai.log
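
For instance, set these in your shell before starting the app (the entrypoint below is a placeholder):

export LOG_LEVEL_OPENAI=DEBUG
export LOG_PATH_OPENAI=tmp/openai.log
export LOG_LEVEL_HTTPX=DEBUG
export LOG_PATH_HTTPX=tmp/openai.log
python -m myapp  # placeholder entrypoint; configure_logger() picks these up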

Custom Formatters

This package includes several custom formatters that automatically clean up log output:

Path Prettifier

Automatically formats pathlib.Path and PosixPath objects to show relative paths when possible, removing the wrapper class names:

from pathlib import Path
log.info("Processing file", file_path=Path.cwd() / "data" / "users.csv")
# Output: file_path=data/users.csv (instead of PosixPath('/home/user/data/users.csv'))

Whenever Datetime Formatter

Note: Requires the whenever package (pip install whenever).

Formats whenever datetime objects without their class wrappers for cleaner output:

from whenever import ZonedDateTime

log.info("Event scheduled", event_time=ZonedDateTime(2025, 11, 2, 0, 0, 0, tz="UTC"))
# Output: event_time=2025-11-02T00:00:00+00:00[UTC]
# Instead of: event_time=ZonedDateTime("2025-11-02T00:00:00+00:00[UTC]")

Supports all whenever datetime types: ZonedDateTime, Instant, LocalDateTime, PlainDateTime, etc.

ActiveModel Object Formatter

Note: Requires the activemodel and typeid-python packages (pip install activemodel typeid-python).

Automatically converts ActiveModel BaseModel instances to their ID representation and TypeID objects to strings:

from activemodel import BaseModel

# User is an ActiveModel model (a BaseModel subclass) defined in your application
user = User(id="user_123", name="Alice")
log.info("User action", user=user)
# Output: user_id=user_123 (instead of the full object representation)

FastAPI Context

Note: Requires the starlette-context package (pip install starlette-context).

Automatically includes all context data from starlette-context in your logs, useful for request tracing:

# Context data (request_id, correlation_id, etc.) automatically included in all logs
log.info("Processing request")
# Output includes: request_id=abc-123 correlation_id=xyz-789 ...
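
A minimal sketch of the wiring, assuming a standard starlette-context setup (the middleware and plugins below come from starlette-context itself):

from fastapi import FastAPI
from starlette_context import plugins
from starlette_context.middleware import RawContextMiddleware

from structlog_config import configure_logger

log = configure_logger()

app = FastAPI()
# starlette-context populates request_id / correlation_id per request;
# structlog-config then includes that context in every log line
app.add_middleware(
    RawContextMiddleware,
    plugins=(plugins.RequestIdPlugin(), plugins.CorrelationIdPlugin()),
)

@app.get("/")
async def index():
    log.info("Processing request")  # includes request_id and correlation_id
    return {"ok": True}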

All formatters are optional and automatically enabled when their respective dependencies are installed. They work seamlessly in both development (console) and production (JSON) logging modes.

FastAPI Access Logger

Note: Requires pip install structlog-config[fastapi] for FastAPI dependencies.

A simple, structured access log with request timing that replaces the default FastAPI access log. Why?

  1. It's less verbose
  2. It uses structured logging params instead of string interpolation
  3. Requests for static assets are logged at the debug level

Here's how to use it:

  1. Disable FastAPI's default access logging.
  2. Add the middleware to your FastAPI app, as sketched below.
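
A rough sketch of the wiring. The access-log middleware's import path below is an assumption (check the package's FastAPI integration for the actual name); disabling uvicorn's built-in access log is standard uvicorn behavior:

import uvicorn
from fastapi import FastAPI

from structlog_config import configure_logger
# from structlog_config.fastapi import AccessLoggerMiddleware  # assumed import path

log = configure_logger()
app = FastAPI()

# 2. add the structured access logger (uncomment once you've confirmed the import)
# app.add_middleware(AccessLoggerMiddleware)

if __name__ == "__main__":
    # 1. disable uvicorn's default access log so requests aren't logged twice
    uvicorn.run(app, host="0.0.0.0", port=8000, access_log=False)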

Pytest Plugin: Capture Logs on Failure

A pytest plugin that captures logs per-test and displays them only when tests fail. This keeps your test output clean while ensuring you have all the debugging information you need when something goes wrong.

Features

  • Only shows logs for failing tests (keeps output clean)
  • Captures logs from all test phases (setup, call, teardown)
  • Unique log file per test
  • Optional persistent log storage for debugging
  • Automatically handles the PYTHON_LOG_PATH environment variable

Usage

Enable the plugin with the --capture-logs-on-fail flag:

pytest --capture-logs-on-fail

Or enable it permanently in your pyproject.toml (or the equivalent addopts line in pytest.ini):

[tool.pytest.ini_options]
addopts = ["--capture-logs-on-fail"]

Persist Logs to Directory

To keep all test logs for later inspection (useful for CI/CD debugging):

pytest --capture-logs-dir=./test-logs

This creates a log file for each test and disables automatic cleanup.

How It Works

  1. Sets the PYTHON_LOG_PATH environment variable to a unique temp file for each test
  2. Your application logs (via configure_logger()) write to this file
  3. On test failure, the plugin prints the captured logs to stdout
  4. Log files are cleaned up after the test session (unless --capture-logs-dir is used)
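
For instance, a failing test like this hypothetical one (matching the example output below) writes its logs to the per-test file, and the plugin prints them when the assertion fails:

# tests/test_user.py  (hypothetical test)
from structlog_config import configure_logger

def test_user_login():
    log = configure_logger()  # writes to the per-test PYTHON_LOG_PATH set by the plugin
    log.info("User login started", user_id=123)
    log.error("Database connection failed", timeout=5.0)
    assert False  # the failure triggers the captured-log output shown below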

Example Output

When a test fails, you'll see:

FAILED tests/test_user.py::test_user_login

--- Captured logs for failed test (call): tests/test_user.py::test_user_login ---
2025-11-01 18:30:00 [info] User login started user_id=123
2025-11-01 18:30:01 [error] Database connection failed timeout=5.0

For passing tests, no log output is shown, keeping your test output clean and focused.

Beautiful Traceback Support

Optional support for beautiful-traceback provides enhanced exception formatting with improved readability, smart coloring, path aliasing (e.g., <pwd>, <site>), and better alignment. Automatically activates when installed:

uv add beautiful-traceback --group dev

No configuration is needed: just install it and configure_logger() will use it automatically.
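
For example (a minimal sketch), any exception logged the usual way is rendered with the enhanced formatting when beautiful-traceback is installed:

from structlog_config import configure_logger

log = configure_logger()

try:
    1 / 0
except ZeroDivisionError:
    # the traceback below is formatted by beautiful-traceback when available
    log.exception("division failed")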

IPython

Often it's helpful to change the logging level within an IPython session. You can do this and make sure all loggers pick up the change:

%env LOG_LEVEL=DEBUG
from structlog_config import configure_logger
configure_logger()

