Commit 61a086b - "added some notes" (parent 2e3d395)

File tree: 1 file changed (+35, -0 lines)


notes.md

Setting up dbt

- used dbt-core, not dbt-cloud
- warehouse (BigQuery in this case) - structured-app-test
- GitHub repo - shivam-singhal/dbt-tutorial
- ran `dbt init jaffle_shop` - this creates a `jaffle_shop` and a `logs` dir in `dbt-tutorial`
- dbt has models (which are select SQL statements)
- These models can be composed of other models - hence a DAG structure. Each node can be run independently (given its dependencies are run too)
- Testing is built-in
- Version control is "built-in" via storing dbt configs in git (GitHub)
- commands
    - `dbt run` - run the SQL queries against the data in the warehouse
    - `dbt run --full-refresh` - rebuild incremental models from scratch instead of incrementally
    - `dbt run --select <>` - to only run (or test) specific models
    - `dbt test` - validate that data has certain properties (e.g. non-null, unique, consists of certain values, etc.)
    - `dbt debug` - test the .dbt/profiles.yml configuration (where the BigQuery connection information is stored)
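
A sketch of how models compose into a DAG - the model and column names here are hypothetical, not from the tutorial:

```sql
-- models/orders_summary.sql (hypothetical model name)
-- `ref` declares a dependency on another model; dbt uses these refs
-- to build the DAG and run nodes in dependency order.
select
    customer_id,
    count(*) as order_count
from {{ ref('stg_orders') }}  -- hypothetical upstream model
group by customer_id
```

`dbt run --select orders_summary` would build just this node, and `dbt run --select +orders_summary` would include its upstream dependencies too.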
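
The properties that `dbt test` checks live in `schema.yml` next to the models. A minimal sketch (model and column names hypothetical):

```yaml
# models/schema.yml (hypothetical example)
version: 2
models:
  - name: orders_summary
    columns:
      - name: customer_id
        tests:
          - not_null
          - unique
      - name: status
        tests:
          - accepted_values:
              values: ['placed', 'shipped', 'returned']
```
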
What's missing are **metrics**. Lightdash takes the dbt models, and each column of the dbt model becomes a dimension of the table.

How are Lightdash tables different from dbt models? Is it 1:1?

`docker-compose -f docker-compose.yml --env-file .env up --detach --remove-orphans` to run lightdash locally from the repo

`docker exec -it <container_id> psql -U postgres -d postgres` to run psql from inside the postgres container on my local machine to inspect the postgres table

`lightdash preview` allows me to update `schema.yml` and have it updated in preview mode - so I don't always have to push to GitHub and refresh

lightdash defines its own metrics via the `schema.yml` file in the dbt project - these are `meta` entries, just like dimensions. The other way to add metrics is via dbt's semantic layer (available in dbt Cloud).

- This is what we'd be replacing with our own metrics layer
- This is done using the `meta` tag in the `schema.yml` file that dbt uses.
- this kinda sucks - it's mixing SQL w/ YAML

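A sketch of what a Lightdash metric in the `meta` tag looks like (model and metric names hypothetical; the exact keys are in the Lightdash docs):

```yaml
# models/schema.yml - Lightdash reads metrics out of dbt's `meta` key
models:
  - name: orders_summary
    columns:
      - name: order_count
        meta:
          metrics:
            total_orders:   # hypothetical metric name
              type: sum
```
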
MetricFlow

`dbt-metricflow` - the pip package that bundles dbt with the MetricFlow CLI
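
For comparison, the semantic layer route also defines metrics in YAML, but separately from the columns, roughly like this (all names hypothetical; a sketch of the MetricFlow spec, worth checking against the dbt docs):

```yaml
# models/semantic_models.yml (hypothetical example)
semantic_models:
  - name: orders
    model: ref('orders')
    defaults:
      agg_time_dimension: ordered_at
    entities:
      - name: order_id
        type: primary
    dimensions:
      - name: ordered_at
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: order_count
        expr: 1
        agg: sum

metrics:
  - name: order_count
    type: simple
    type_params:
      measure: order_count
```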
