chore(doc): Update BigQuery Connection database connection UI into doc (#17191)

* Update google-bigquery.mdx

Update BigQuery Connection database connection UI

* fix grammar

Co-authored-by: Geido <60598000+geido@users.noreply.github.com>

* fix grammar

Co-authored-by: Geido <60598000+geido@users.noreply.github.com>

* pre-commit prettier

Co-authored-by: Geido <60598000+geido@users.noreply.github.com>
rosemarie-chiu and geido authored Oct 29, 2021
1 parent f0c0ef7 commit ca6a1ec
Showing 1 changed file with 45 additions and 17 deletions.
62 changes: 45 additions & 17 deletions docs/src/pages/docs/Connecting to Databases/google-bigquery.mdx
@@ -11,31 +11,55 @@ version: 1
The recommended connector library for BigQuery is
[pybigquery](https://github.com/mxmzdlv/pybigquery).

### Install BigQuery Driver

Follow the steps [here](/docs/databases/dockeradddrivers) to install new database drivers
when setting up Superset locally via docker-compose.

```
echo "pybigquery" >> ./docker/requirements-local.txt
```
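
Note that the containers typically need to be rebuilt before the new requirement is picked up. A minimal sketch, assuming the standard docker-compose workflow from the guide linked above:

```
# rebuild the images so the new Python requirement is installed
docker-compose down
docker-compose up --build
```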

### Connecting to BigQuery

When adding a new BigQuery connection in Superset, you'll need to add the GCP Service Account
credentials file (as JSON).

1. Create your Service Account via the Google Cloud Platform control panel, provide it access to the
appropriate BigQuery datasets, and download the JSON configuration file for the service account.
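
If you prefer the command line over the GCP console, the same setup can be sketched with gcloud (the account name `superset-bq` and the `roles/bigquery.user` role are illustrative assumptions; grant whatever access your datasets actually require):

```
# create the service account
gcloud iam service-accounts create superset-bq --display-name "Superset"

# grant it BigQuery access (example role; scope this to your needs)
gcloud projects add-iam-policy-binding {project_id} \
  --member "serviceAccount:superset-bq@{project_id}.iam.gserviceaccount.com" \
  --role "roles/bigquery.user"

# download the JSON key file used in the next step
gcloud iam service-accounts keys create credentials.json \
  --iam-account superset-bq@{project_id}.iam.gserviceaccount.com
```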

2. In Superset, you can either upload that JSON file or add the JSON blob in the following format (this should be the content of your credentials JSON file):
```
{
  "type": "service_account",
  "project_id": "...",
  "private_key_id": "...",
  "private_key": "...",
  "client_email": "...",
  "client_id": "...",
  "auth_uri": "...",
  "token_uri": "...",
  "auth_provider_x509_cert_url": "...",
  "client_x509_cert_url": "..."
}
```

![CleanShot 2021-10-22 at 04 18 11](https://user-images.githubusercontent.com/52086618/138352958-a18ef9cb-8880-4ef1-88c1-452a9f1b8105.gif)

3. Additionally, you can connect via SQLAlchemy URI instead.

The connection string for BigQuery looks like:

```
bigquery://{project_id}
```
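
If you want queries to default to a particular dataset, pybigquery also accepts one in the URI (per the pybigquery README; `{dataset_id}` is a placeholder):

```
bigquery://{project_id}/{dataset_id}
```
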
Go to the **Advanced** tab, and add a JSON blob to the **Secure Extra** field in the database configuration form with
the following format:
```
{
  "credentials_info": <contents of credentials JSON file>
}
```
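
If you'd rather not assemble this blob by hand, one option is to wrap the downloaded key file with jq (a sketch, assuming jq is installed and the key file is named `credentials.json`):

```
# wraps the key file's contents under a top-level "credentials_info" key
jq '{credentials_info: .}' credentials.json
```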

The resulting file should have this structure:
```
{
  "credentials_info": {
    "type": "service_account",
    "project_id": "...",
    "private_key_id": "...",
    "private_key": "...",
    "client_email": "...",
    "client_id": "...",
    "auth_uri": "...",
    "token_uri": "...",
    "auth_provider_x509_cert_url": "...",
    "client_x509_cert_url": "..."
  }
}
```

You should then be able to connect to your BigQuery datasets.

![CleanShot 2021-10-22 at 04 47 08](https://user-images.githubusercontent.com/52086618/138354340-df57f477-d3e5-42d4-b032-d901c69d2213.gif)
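
To sanity-check the service account outside of Superset, you can list the project's datasets with the bq CLI (a sketch, assuming the Google Cloud SDK is installed and `credentials.json` is the key file from step 1):

```
export GOOGLE_APPLICATION_CREDENTIALS=./credentials.json
bq ls --project_id={project_id}
```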

To be able to upload CSV or Excel files to BigQuery in Superset, you'll also need to add the
[pandas_gbq](https://github.com/pydata/pandas-gbq) library.
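
In a docker-compose setup, this follows the same pattern as the driver install above (assuming the same local-requirements workflow):

```
echo "pandas_gbq" >> ./docker/requirements-local.txt
```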
