# dbt-forge

Scaffold production-ready dbt projects in seconds.
Installation · Quick Start · Commands · Adapters · Contributing
Starting a new dbt project means creating dozens of files, configuring profiles, setting up CI, adding linters, and writing boilerplate YAML. dbt-forge does all of this in one command — with interactive prompts or sensible defaults.
- One command to scaffold a complete, production-ready dbt project
- 8 database adapters supported out of the box
- 13 `add` subcommands to extend projects after init
- 11 health checks via `doctor` with actionable remediation hints
- 6 architectural lint rules via `lint` — fan-out, source-to-mart, complexity, duplicates, cycles, drift
- Impact analysis — see which models break when you change an upstream model
- Query cost estimation — identify expensive models from warehouse usage data
- Data contracts — auto-generate dbt contracts by introspecting warehouse column types
- Model changelog — detect breaking and non-breaking schema changes between git refs
- Team presets to enforce standards across projects
- SQL migration — convert legacy SQL scripts into a dbt project with `ref()` and `source()`
- Warehouse introspection — generate sources and staging models from live database metadata
- dbt Mesh — scaffold multi-project setups with access controls and contracts
- AI documentation — generate model and column descriptions using Claude, OpenAI, or Ollama
## Installation

```bash
# With pip
pip install dbt-forge

# With uv (recommended)
uv tool install dbt-forge

# Or run directly without installing
uvx dbt-forge init my_project
```

Requirements: Python 3.11, 3.12, or 3.13
## Quick Start

```bash
# Interactive — choose adapter, marts, packages, CI, and features
dbt-forge init

# Non-interactive — use opinionated defaults
dbt-forge init my_project --defaults

# Preview what would be generated
dbt-forge init my_project --defaults --dry-run

# Use a team preset
dbt-forge init my_project --preset company-standard.yml

# Scaffold a multi-project dbt Mesh setup
dbt-forge init my_mesh --mesh
```

Example generated project structure:
```
my_project/
├── dbt_project.yml
├── packages.yml
├── selectors.yml
├── profiles/
│   └── profiles.yml            # adapter-specific
├── models/
│   ├── staging/
│   │   └── example_source/
│   │       ├── stg_example_source__orders.sql
│   │       ├── _example_source__models.yml
│   │       └── _example_source__sources.yml
│   ├── intermediate/
│   │   └── int_example.sql
│   └── marts/
│       ├── orders.sql
│       └── __mart__models.yml
├── tests/
├── macros/
├── seeds/
├── snapshots/
├── .sqlfluff                   # if enabled
├── .pre-commit-config.yaml     # if enabled
├── .github/workflows/          # if GitHub Actions selected
└── README.md
```
## Commands

### init

```bash
dbt-forge init [PROJECT_NAME] [--defaults] [--dry-run] [--preset FILE] [--output DIR]
dbt-forge init my_mesh --mesh   # multi-project dbt Mesh setup
```

### add

Run from inside a dbt project directory:
```bash
# Structural components
dbt-forge add mart finance
dbt-forge add source salesforce
dbt-forge add source raw --from-database   # introspect warehouse for real metadata
dbt-forge add snapshot orders
dbt-forge add seed dim_country
dbt-forge add exposure weekly_revenue
dbt-forge add macro cents_to_dollars

# Interactive generators
dbt-forge add model users         # prompts for layer, materialization, columns
dbt-forge add test stg_orders     # data, unit, or schema test
dbt-forge add package dbt-utils   # curated registry of 20 packages

# Tooling
dbt-forge add ci github           # also: gitlab, bitbucket
dbt-forge add pre-commit          # hooks + .editorconfig
dbt-forge add project analytics   # add sub-project to a dbt Mesh
```

### migrate

```bash
dbt-forge migrate ./legacy_sql/             # convert SQL scripts to dbt models
dbt-forge migrate ./legacy_sql/ --dry-run   # preview without writing
```

### docs generate

```bash
dbt-forge docs generate                      # generate docs for all undocumented models
dbt-forge docs generate --model stg_orders   # single model
dbt-forge docs generate --provider ollama    # use local Ollama
```

### lint

```bash
dbt-forge lint                       # run all 6 rules
dbt-forge lint --rule fan-out        # single rule
dbt-forge lint --ci                  # exit 1 on warnings
dbt-forge lint --config custom.yml   # custom thresholds
dbt-forge lint --format json         # machine-readable output
```

All 6 lint rules:
| Rule | What it checks |
|---|---|
| `fan-out` | Models with too many downstream dependents |
| `source-to-mart` | Marts referencing `source()` directly (no staging layer) |
| `complexity` | CTE count, JOIN count, or line count exceeding thresholds |
| `duplicate-logic` | Identical CTE bodies across different models |
| `circular-deps` | Circular `ref()` dependencies in the DAG |
| `yaml-sql-drift` | Columns in YAML not matching the SQL SELECT clause |
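The `--config` flag points `lint` at a thresholds file. A hypothetical sketch of what such a file could look like; the key names below are illustrative assumptions, not the tool's documented schema (check `dbt-forge lint --help` for the real one):

```yaml
# Hypothetical custom.yml for `dbt-forge lint --config custom.yml`.
# Keys are illustrative; consult the CLI help for the actual schema.
rules:
  fan-out:
    max_dependents: 10
  complexity:
    max_ctes: 8
    max_joins: 6
    max_lines: 300
  duplicate-logic:
    enabled: false
```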
### impact

```bash
dbt-forge impact stg_orders           # downstream tree for one model
dbt-forge impact --diff               # detect changed models from git diff
dbt-forge impact --diff --base main   # custom base ref
dbt-forge impact --pr                 # markdown output for PR descriptions
dbt-forge impact --format json        # machine-readable output
```

### cost

```bash
dbt-forge cost                     # connect + show top 10
dbt-forge cost --days 7 --top 20   # last 7 days, top 20 models
dbt-forge cost --report            # markdown report
dbt-forge cost --target prod       # use a specific dbt target
dbt-forge cost --format json       # machine-readable output
```

### contracts

```bash
dbt-forge contracts generate orders         # single model
dbt-forge contracts generate --all-public   # all public models
dbt-forge contracts generate --dry-run      # preview without writing
dbt-forge contracts generate --yes          # auto-accept
```

### changelog

```bash
dbt-forge changelog generate                         # latest tag to HEAD
dbt-forge changelog generate --from v1.0 --to v2.0   # between specific refs
dbt-forge changelog generate --format json           # machine-readable
dbt-forge changelog generate --breaking-only         # only breaking changes
dbt-forge changelog generate -o CHANGELOG.md         # write to file
```

### doctor

```bash
dbt-forge doctor                         # run all 11 checks
dbt-forge doctor --fix                   # auto-fix schema stubs + contract config
dbt-forge doctor --ci                    # exit code 1 on failures (for CI)
dbt-forge doctor --check test-coverage   # run a single check
dbt-forge doctor --format json           # machine-readable output
```

All 11 checks:
| Check | What it verifies |
|---|---|
| `naming-conventions` | Models follow `stg_`, `int_`, mart naming |
| `schema-coverage` | Models are documented in YAML |
| `test-coverage` | Models have at least one test |
| `hardcoded-refs` | No hardcoded `database.schema.table` references |
| `packages-pinned` | `packages.yml` has version pins |
| `source-freshness` | Sources have freshness config |
| `orphaned-yml` | No YAML files without corresponding models |
| `sqlfluff-config` | `.sqlfluff` file exists |
| `gitignore` | `.gitignore` is configured |
| `disabled-models` | No disabled models in production |
| `contract-enforcement` | Mart models have `contract: { enforced: true }` |
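Because `doctor --ci` and `lint --ci` exit non-zero on problems, they slot into any CI runner. A minimal GitHub Actions sketch (this workflow layout is an assumption for illustration; `dbt-forge add ci github` generates the real one):

```yaml
# Hypothetical workflow; use `dbt-forge add ci github` for the generated version.
name: dbt-forge-checks
on: [pull_request]
jobs:
  checks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: pip install dbt-forge
      - run: dbt-forge doctor --ci   # fails the job on any failed check
      - run: dbt-forge lint --ci     # fails the job on lint warnings
```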
### status

```bash
dbt-forge status   # model counts, test/doc coverage, sources, packages
```

### update

```bash
dbt-forge update --dry-run   # preview changes
dbt-forge update             # interactively accept/skip each file
```

### preset

```bash
dbt-forge preset validate company-standard.yml
```

Preset file format:
```yaml
name: "Company Standard"
description: "Enforced dbt project defaults"
defaults:
  adapter: "Snowflake"
  marts: ["finance", "marketing"]
  add_sqlfluff: true
  ci_providers: ["GitHub Actions"]
locked:
  - adapter
  - ci_providers
```

Locked fields cannot be overridden during init.
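The locked-field semantics can be pictured as a simple merge: interactive answers override preset defaults for every key except those listed under `locked`. A minimal Python sketch of that idea (illustrative only; dbt-forge's actual implementation may differ):

```python
# Sketch of locked-field preset merging; not dbt-forge's actual code.
def merge_preset(defaults: dict, locked: list[str], answers: dict) -> dict:
    """Answers override defaults, except for keys listed in `locked`."""
    merged = dict(defaults)
    for key, value in answers.items():
        if key in locked:
            continue  # locked fields keep the preset's value
        merged[key] = value
    return merged

preset = {
    "defaults": {"adapter": "Snowflake", "add_sqlfluff": True},
    "locked": ["adapter"],
}
result = merge_preset(preset["defaults"], preset["locked"],
                      {"adapter": "DuckDB", "add_sqlfluff": False})
# adapter stays locked to the preset; add_sqlfluff is overridden
```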
## Adapters

| Adapter | Profile Template | Package |
|---|---|---|
| BigQuery | `profiles/bigquery.yml` | `dbt-bigquery` |
| Snowflake | `profiles/snowflake.yml` | `dbt-snowflake` |
| PostgreSQL | `profiles/postgresql.yml` | `dbt-postgres` |
| DuckDB | `profiles/duckdb.yml` | `dbt-duckdb` |
| Databricks | `profiles/databricks.yml` | `dbt-databricks` |
| Redshift | `profiles/redshift.yml` | `dbt-redshift` |
| Trino | `profiles/trino.yml` | `dbt-trino` |
| Spark | `profiles/spark.yml` | `dbt-spark` |
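For reference, a generated profile follows standard dbt `profiles.yml` conventions. A sketch of what a DuckDB profile typically looks like (values are illustrative; dbt-forge's template may differ):

```yaml
# Typical dbt profiles.yml shape for DuckDB; illustrative values.
my_project:
  target: dev
  outputs:
    dev:
      type: duckdb
      path: dev.duckdb
      threads: 4
```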
## Contributing

This is a monorepo:

| Directory | Purpose |
|---|---|
| `cli/` | Python package — published to PyPI |
| `website/` | Docs site — Astro + Starlight (deployed separately) |
See CONTRIBUTING.md for development setup, test structure, and commit conventions.
```bash
cd cli
uv sync --all-groups
uv run ruff check .
uv run pytest -m "not integration"
```

This project uses Conventional Commits — install the hook with:

```bash
uv run pre-commit install --hook-type commit-msg
```

## License

MIT © Marouane