🐛 BigQuery source: Fix nested arrays (#4981)
* unfinished jdbcsource separation
* creation of AbstractRelation
* Migrate StateManager to new abstract level (JdbcSource -> RelationalSource)
* fix imports
* move configs to Database level + fix MySql source
* bring jdbc source in line with the new impl
* Fix ScaffoldJavaJdbcSource template
* rename `AbstractField` to `CommonField`; it is no longer an abstract class + add default implementation for `AbstractRelationalDbSource.getFullyQualifiedTableName`
* format
* rename generated files in line with their location
* bonus renaming
* move utility methods specific to jdbc source to a proper module
* internal review update
* BigQueryDatabase impl without row transformation
* add static method for BigQueryDatabase instantiation
* remove data type parameter limitation + rename class parameters
* Move DataTypeUtils from jdbc to common + impl basic types in BigQueryUtils
* make DB2 in line with new relational abstract classes
* add missing import
* cover all bigquery classes + add type transformation method from StandardSQLTypeName to JsonSchemaPrimitive
* close unused connections
* add table list extract method
* bigquery source connector
* return all tables for a whole project instead of a dataset
* impl incremental fetch
* bigquery source connector
* bigquery source connector
* remove unnecessary databaseid
* add primitive type filtering
* add temporary workaround for test database
* add dataset location
* fix table info retrieving
* handle dataset config
* Add working comprehensive test without data cases
* minor changes in the source processing
* acceptance tests; discover method fix
* discover method fix
* first comprehensive test
* Comprehensive tests for the BigQuery source + database timeout config
* bigquery acceptance tests fix; formatting
* fix incremental sync using date, datetime, time and timestamp types
* Implement source checks: basic and dataset
* format
* revert: airbyte_protocol.py
* internal review update
* Add possibility to get list of comprehensive tests in a Markdown table format.
* Update airbyte-integrations/connectors/source-bigquery/src/main/resources/spec.json (Co-authored-by: Sherif A. Nada <snadalive@gmail.com>)
* review update
* Implement processing for arrays and structures
* format
* added bigquery secrets
* added bigquery secrets
* spec fix
* test configs fix
* extend mapping for Arrays and Structs
* Process nested arrays
* handle arrays of records properly
* format
* BigQuery source docs
* docs readme update
* hide evidences
* fix changelog order
* Add bigquery to source_definitions yaml

Co-authored-by: heade <danildubinin2@gmail.com>
Co-authored-by: Sherif A. Nada <snadalive@gmail.com>
1 parent e1b4957, commit 9151d83. Showing 10 changed files with 168 additions and 5 deletions.
7 changes: 7 additions & 0 deletions in `...ain/resources/config/STANDARD_SOURCE_DEFINITION/bfd1ddf8-ae8a-4620-b1d7-55597d2ba08c.json`
{
  "sourceDefinitionId": "bfd1ddf8-ae8a-4620-b1d7-55597d2ba08c",
  "name": "BigQuery",
  "dockerRepository": "airbyte/source-bigquery",
  "dockerImageTag": "0.1.1",
  "documentationUrl": "https://docs.airbyte.io/integrations/sources/bigquery"
}
# BigQuery Test Configuration

In order to test the BigQuery source, you need a service account key file.

## Community Contributor

As a community contributor, you will need access to a GCP project and BigQuery to run tests.

1. Go to the `Service Accounts` page on the GCP console
1. Click on the `+ Create Service Account` button
1. Fill out a descriptive name/id/description
1. Click the edit icon next to the service account you created on the `IAM` page
1. Add the `BigQuery Data Editor` and `BigQuery User` roles
1. Go back to the `Service Accounts` page and use the actions modal to `Create Key`
1. Download this key as a JSON file
1. Move and rename this file to `secrets/credentials.json`

## Airbyte Employee

1. Access the `BigQuery Integration Test User` secret on Rippling under the `Engineering` folder
1. Create a file with the contents at `secrets/credentials.json`
---
description: >-
  BigQuery is a serverless, highly scalable, and cost-effective data warehouse
  offered by Google Cloud Platform.
---

# BigQuery

## Overview

The BigQuery source supports both Full Refresh and Incremental syncs. You can choose whether this connector copies only new or updated data, or all rows in the tables and columns you set up for replication, every time a sync runs.

### Resulting schema

The BigQuery source does not alter the schema present in your database. Depending on the destination connected to this source, however, the schema may be altered. See the destination's documentation for more details.

### Data type mapping

BigQuery data types map as follows:
| BigQuery Type | Resulting Type | Notes |
| :--- | :--- | :--- |
| `BOOL` | Boolean | |
| `INT64` | Number | |
| `FLOAT64` | Number | |
| `NUMERIC` | Number | |
| `BIGNUMERIC` | Number | |
| `STRING` | String | |
| `BYTES` | String | |
| `DATE` | String | In ISO8601 format |
| `DATETIME` | String | In ISO8601 format |
| `TIMESTAMP` | String | In ISO8601 format |
| `TIME` | String | |
| `ARRAY` | Array | |
| `STRUCT` | Object | |
| `GEOGRAPHY` | String | |
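The table above can be sketched as a simple lookup. This is an illustrative Python mirror of the mapping, not the connector's actual code (the connector implements this in Java via a StandardSQLTypeName-to-JsonSchemaPrimitive transformation); the names here are hypothetical:

```python
# Illustrative mapping of BigQuery standard SQL types to JSON schema types,
# mirroring the data type table above. A sketch, not the connector's code.
BIGQUERY_TO_JSON_TYPE = {
    "BOOL": "boolean",
    "INT64": "number",
    "FLOAT64": "number",
    "NUMERIC": "number",
    "BIGNUMERIC": "number",
    "STRING": "string",
    "BYTES": "string",
    "DATE": "string",       # emitted in ISO8601 format
    "DATETIME": "string",   # emitted in ISO8601 format
    "TIMESTAMP": "string",  # emitted in ISO8601 format
    "TIME": "string",
    "ARRAY": "array",
    "STRUCT": "object",
    "GEOGRAPHY": "string",
}

def to_json_type(bigquery_type: str) -> str:
    """Return the JSON schema type for a BigQuery type, defaulting to string."""
    return BIGQUERY_TO_JSON_TYPE.get(bigquery_type.upper(), "string")
```

Note that `ARRAY` and `STRUCT` map to JSON arrays and objects, so nested structures (including arrays of records, fixed in this release) survive the sync.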
### Features

| Feature | Supported | Notes |
| :--- | :--- | :--- |
| Full Refresh Sync | Yes | |
| Incremental Sync | Yes | |
| Change Data Capture | No | |
| SSL Support | Yes | |

## Getting started

### Requirements

To use the BigQuery source, you'll need:

* A Google Cloud Project with BigQuery enabled
* A Google Cloud Service Account with the "BigQuery User" and "BigQuery Data Editor" roles in your GCP project
* A Service Account Key to authenticate into your Service Account

See the setup guide for more information about how to create the required resources.
#### Service account

In order for Airbyte to sync data from BigQuery, it needs credentials for a [Service Account](https://cloud.google.com/iam/docs/service-accounts) with the "BigQuery User" and "BigQuery Data Editor" roles, which grant permissions to run BigQuery jobs, write to BigQuery Datasets, and read table metadata. We highly recommend that this Service Account be exclusive to Airbyte for ease of permissioning and auditing. However, you can use a pre-existing Service Account if you already have one with the correct permissions.

The easiest way to create a Service Account is to follow GCP's guide for [Creating a Service Account](https://cloud.google.com/iam/docs/creating-managing-service-accounts). Once you've created the Service Account, make sure to keep its ID handy, as you will need to reference it when granting roles. Service Account IDs typically take the form `<account-name>@<project-name>.iam.gserviceaccount.com`.

Then, add the service account as a Member in your Google Cloud Project with the "BigQuery User" role. To do this, follow the instructions for [Granting Access](https://cloud.google.com/iam/docs/granting-changing-revoking-access#granting-console) in the Google documentation. The email address of the member you are adding is the same as the Service Account ID you just created.

At this point you should have a service account with the "BigQuery User" project-level permission.

#### Service account key

Service Account Keys are used to authenticate as Google Service Accounts. For Airbyte to leverage the permissions you granted to the Service Account in the previous step, you'll need to provide its Service Account Key. See the [Google documentation](https://cloud.google.com/iam/docs/service-accounts#service_account_keys) for more information about Keys.

Follow the [Creating and Managing Service Account Keys](https://cloud.google.com/iam/docs/creating-managing-service-account-keys) guide to create a key. Airbyte currently supports JSON Keys only, so make sure you create your key in that format. As soon as you create the key, download it, as that is the only time Google will allow you to see its contents. Once you've successfully configured BigQuery as a source in Airbyte, delete this key from your computer.
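Before pasting the key into Airbyte, it can help to sanity-check the downloaded file. Here is a minimal standard-library sketch: the field names checked are those found in Google's JSON key format, but the helper itself is hypothetical and not part of Airbyte:

```python
import json

# Fields present in a Google service account JSON key file.
# (A hypothetical validation helper; not part of Airbyte.)
REQUIRED_KEY_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_service_account_key(raw_json: str) -> list:
    """Return a list of problems found in a key file; an empty list means it looks valid."""
    try:
        key = json.loads(raw_json)
    except json.JSONDecodeError:
        return ["file is not valid JSON"]
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_KEY_FIELDS - key.keys())]
    if key.get("type") not in (None, "service_account"):
        problems.append("'type' should be 'service_account'")
    return problems
```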
### Setup the BigQuery source in Airbyte

You should now have all the requirements needed to configure BigQuery as a source in the UI. You'll need the following information to configure the BigQuery source:

* **Project ID**
* **Default Dataset ID [Optional]**: the dataset (schema) name, if you are only interested in a single dataset. Setting this dramatically speeds up the source's discover operation.
* **Credentials JSON**: the contents of your Service Account Key JSON file

Once you've configured BigQuery as a source, delete the Service Account Key from your computer.
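Assembling the three values above could look like the following sketch. The field names (`project_id`, `dataset_id`, `credentials_json`) are assumptions here; check the connector's `spec.json` for the authoritative names:

```python
# Hypothetical helper assembling a BigQuery source config from the three
# values above. Field names are assumptions; consult the connector's spec.json.
def build_bigquery_source_config(project_id, credentials_path, dataset_id=None):
    with open(credentials_path) as f:
        credentials_json = f.read()  # key file contents, passed as a string
    config = {
        "project_id": project_id,
        "credentials_json": credentials_json,
    }
    if dataset_id is not None:
        # Optional: restricts discovery to one dataset, which is much faster.
        config["dataset_id"] = dataset_id
    return config
```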
## CHANGELOG

### source-bigquery

| Version | Date | Pull Request | Subject |
| :--- | :--- | :--- | :--- |
| 0.1.1 | 2021-07-28 | [#4981](https://github.com/airbytehq/airbyte/pull/4981) | 🐛 BigQuery source: Fix nested arrays |
| 0.1.0 | 2021-07-22 | [#4457](https://github.com/airbytehq/airbyte/pull/4457) | 🎉 New Source: Big Query |