add bigquery integration test shim (#16)
Just realized that the BigQuery integration tests were failing because the
`fullstory_events_integration_tests` table didn't previously exist. This
creates a BigQuery shim that is very similar to the Snowflake one, except
that this one doesn't do any JSON parsing; it just takes the seed file
as-is.

This also makes the tests for each platform work the same way, to
increase clarity.
huttotw authored Jan 23, 2024
1 parent 01ccec8 commit 3140ac9
Showing 4 changed files with 39 additions and 0 deletions.
1 change: 1 addition & 0 deletions .circleci/config.yml
@@ -84,6 +84,7 @@ jobs:
cd integration_tests
dbt seed --target bigquery --full-refresh
dbt compile --target bigquery
dbt run --target bigquery --full-refresh --select bigquery_events_shim
dbt run --target bigquery --full-refresh
dbt test --target bigquery
28 changes: 28 additions & 0 deletions README.md
@@ -163,3 +163,31 @@ Remember, fine tuning model performance and costs is a balancing act. Incrementa

### Other models
Although we often find the incrementalization of the `events` model to be sufficient, you can customize the materialization method of any model in this package. Enabling additional incrementalization works the same way as for the `events` table: simply add a configuration block to your `dbt_project.yml`.
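
As a sketch, such a configuration block might look like the following, assuming the package's models are nested under a `dbt_fullstory` key in your project (adjust the nesting to match your own setup):
```
models:
  dbt_fullstory:
    # Materialize this model incrementally instead of the default.
    events:
      +materialized: incremental
```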

## Running Integration Tests
The `integration_tests` directory is itself a dbt project that depends on `dbt_fullstory`. We use this package to test how our models execute in the real world: it simulates a live environment and is run in CI against actual databases. If you wish, you can run these tests locally. All you need is a target configured in your `profiles.yml` that is authenticated to a supported warehouse type.

> Internally, we name our profiles after the type of warehouse we are connecting to (e.g. `bigquery`, `snowflake`, etc.). It makes the command clearer, like: `dbt run --target bigquery`.
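
As a sketch, a matching BigQuery target in `profiles.yml` might look like the following; the profile name, project, dataset, and keyfile path are all placeholders to replace with your own values:
```
my_profile:
  target: bigquery
  outputs:
    bigquery:
      type: bigquery
      method: service-account
      project: my-gcp-project        # placeholder
      dataset: dbt_integration_tests # placeholder
      keyfile: /path/to/keyfile.json # placeholder
      threads: 4
```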

To create the test data in your database:
```
dbt seed --target my-target
```

To run the shim for your warehouse:
> The shim will emulate how data is synced for your particular warehouse. As an example, data is loaded in JSON columns in Snowflake but as strings in BigQuery. You can choose from:
> - bigquery_events_shim
> - snowflake_events_shim
```
dbt run --target my-target --select <my-warehouse>_events_shim
```
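
For contrast with the BigQuery shim (a plain `select` from the seed), a purely hypothetical sketch of what a JSON-parsing shim such as `snowflake_events_shim` could look like — the `properties` column name here is illustrative, not taken from the actual project:
```
-- Hypothetical sketch only: the `properties` column name is illustrative.
select
    * exclude (properties),
    -- Parse the seeded string into a VARIANT so downstream models see
    -- JSON, as they would after a live Snowflake data sync.
    parse_json(properties) as properties
from {{ ref('fullstory_events_integration_seeds') }}
```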
To run the models:
```
dbt run --target my-target
```
To run the tests:
```
dbt test --target my-target
```
7 changes: 7 additions & 0 deletions integration_tests/models/_bigquery_events_shim.yml
@@ -0,0 +1,7 @@
version: 2

models:
- name: bigquery_events_shim
config:
enabled: "{{ (target.type == 'bigquery') | as_bool }}"
alias: fullstory_events_integration_tests
3 changes: 3 additions & 0 deletions integration_tests/models/bigquery_events_shim.sql
@@ -0,0 +1,3 @@
select
*
from {{ ref('fullstory_events_integration_seeds') }}
