This application is a containerized analytics suite for an imaginary company selling postcards. The company sells both directly and through resellers in most European countries. The stack consists of:
- Docker (docker compose)
- DuckDB
- SQLMesh (using the dbt adapter)
- Superset
Generation of the example data and the underlying dbt-core model is available in the postcard-company-datamart project. Related portable-data-stack projects:
- portable-data-stack-dagster
- portable-data-stack-airflow
- portable-data-stack-mage
- portable-data-stack-bruin
To get started:

- Rename the `.env.example` file to `.env` and set your desired Superset password. Remember to never commit files containing passwords or any other sensitive information.
- Rename `shared/db/datamart.duckdb.example` to `shared/db/datamart.duckdb`, or initialize an empty database file there with that name.
- With Docker Engine installed, change directory to the root folder of the project (the one that contains `docker-compose.yml`) and run `docker compose up --build`.
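The two file-preparation steps above can also be scripted. Below is a minimal Python sketch, assuming the repository layout described in this README; the function name `prepare_local_files` is made up for illustration:

```python
from pathlib import Path
import shutil

def prepare_local_files(root: Path) -> None:
    """Copy the example files to their live names, as in the setup steps.

    Paths follow the layout described in the README; existing live files
    are never overwritten.
    """
    env_example = root / ".env.example"
    env_file = root / ".env"
    if env_example.exists() and not env_file.exists():
        shutil.copy(env_example, env_file)

    db_example = root / "shared" / "db" / "datamart.duckdb.example"
    db_file = root / "shared" / "db" / "datamart.duckdb"
    db_file.parent.mkdir(parents=True, exist_ok=True)
    if db_example.exists() and not db_file.exists():
        shutil.copy(db_example, db_file)
    elif not db_file.exists():
        db_file.touch()  # fall back to initializing an empty database file
```

Remember to still edit `.env` afterwards and set your own Superset password.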
Once the Docker suite has finished starting, you will see the output of the SQLMesh plan and apply commands, which run the dbt-core models against DuckDB.
To explore the data and build dashboards, open the Superset interface; demo credentials are set in the `.env` file mentioned above.

Exposed ports:
- SQLMesh: 8000
- Superset: 8088
Generated parquet files are saved in the shared/parquet folder.
The data is fictional and automatically generated. Any similarities with existing persons, entities, products or businesses are purely coincidental.
- Test data is generated as parquet files using Python (the generator)
- Data is imported from the parquet files into the staging area of the Data Warehouse (DuckDB)
- The data is modelled into fact and dimension tables, and the Data Warehouse is loaded, using SQLMesh (with the dbt adapter for model compatibility)
- The data is analyzed and visually explored using Superset, or by querying the datamart directly via the SQL IDE provided by SQLMesh
For Superset, the default credentials are set in the .env file: user = admin, password = admin
The Docker process will begin building the application suite. The suite is made up of the following components, each within its own Docker container:
- generator: a collection of Python scripts that generate and export the example data, using the postcard-company-datamart project
- sqlmesh-dbt: the data model, sourced from the postcard-company-datamart project
- superset: the web-based Business Intelligence application used to explore the data; exposed on port 8088.
After the models have been applied you can either analyze the data using the querying and visualization tools provided by Superset (available locally on port 8088), or query the Data Warehouse (available as a DuckDB database).


