📖 Clarify staging setup guide for bq & gcs destination (airbytehq#9255)
* clarify confusing parts of setting up staging for bq destination

* Added Storage Admin

* update gcs destination docs too

* fix indentation

* Update required permission list

Co-authored-by: Liren Tu <tuliren.git@outlook.com>
cgardens and tuliren authored Jan 6, 2022
1 parent 80695ad commit e80d614
Showing 2 changed files with 26 additions and 12 deletions.
19 changes: 13 additions & 6 deletions docs/integrations/destinations/bigquery.md
@@ -111,15 +111,22 @@ This is the recommended configuration for uploading data to BigQuery. It works b
* **GCS Bucket Path**
* **Block Size (MB) for GCS multipart upload**
* **GCS Bucket Keep files after migration**
* See [this](https://cloud.google.com/storage/docs/creating-buckets) for instructions on how to create a GCS bucket. The bucket cannot have a retention policy; when creating it, set Protection Tools to "None" or "Object versioning".
* **HMAC Key Access ID**
* See [this](https://cloud.google.com/storage/docs/authentication/managing-hmackeys) for instructions on how to generate an access key. For more information on HMAC keys, see the [GCP docs](https://cloud.google.com/storage/docs/authentication/hmackeys).
* We recommend creating an Airbyte-specific user or service account. This user or account will require the following permissions for the bucket:
```
storage.multipartUploads.abort
storage.multipartUploads.create
storage.objects.create
storage.objects.delete
storage.objects.get
storage.objects.list
```
You can set these by going to the Permissions tab of the GCS bucket, adding the email address of the service account or user, and granting it the aforementioned permissions.
* **Secret Access Key**
* Corresponding key to the above access ID.
* Make sure your GCS bucket is accessible from the machine running Airbyte. This depends on your networking setup. The easiest way to verify that Airbyte can connect to your GCS bucket is via the check connection tool in the UI.
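If you prefer the command line over the Cloud Console, the permission and HMAC-key steps above can be sketched roughly as follows. Note that `PROJECT_ID`, `BUCKET`, `SA_EMAIL`, and the role name `airbyteGcsStaging` are all placeholders, not values the connector prescribes:

```shell
# Create a custom role carrying exactly the permissions listed above.
gcloud iam roles create airbyteGcsStaging --project=PROJECT_ID \
  --title="Airbyte GCS Staging" \
  --permissions=storage.multipartUploads.abort,storage.multipartUploads.create,storage.objects.create,storage.objects.delete,storage.objects.get,storage.objects.list

# Grant the role to the service account on this bucket only.
gsutil iam ch \
  "serviceAccount:SA_EMAIL:projects/PROJECT_ID/roles/airbyteGcsStaging" \
  gs://BUCKET

# Create an HMAC key pair for the service account; the command prints the
# Access ID and Secret to paste into the connector configuration.
gsutil hmac create SA_EMAIL
```

The custom role keeps the grant scoped to the minimum permission set; granting a broad predefined role at the project level would also work but gives the key more access than the connector needs.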
### `Standard` uploads
This uploads data directly from your source to BigQuery. While this is faster to set up initially, **we strongly recommend that you do not use this option for anything other than a quick demo**. It is more than 10x slower than the GCS uploading option and will fail for many datasets. Be aware that you may see failures for big datasets and slow sources, e.g. if reading from the source takes more than 10-12 hours. This is caused by Google BigQuery SDK client limitations. For more details, please check [https://github.com/airbytehq/airbyte/issues/3549](https://github.com/airbytehq/airbyte/issues/3549)
19 changes: 13 additions & 6 deletions docs/integrations/destinations/gcs.md
@@ -207,16 +207,23 @@ Under the hood, an Airbyte data stream in Json schema is first converted to an A

* Fill in the GCS info
* **GCS Bucket Name**
* See [this](https://cloud.google.com/storage/docs/creating-buckets) for instructions on how to create a GCS bucket. The bucket cannot have a retention policy; when creating it, set Protection Tools to "None" or "Object versioning".
* **GCS Bucket Region**
* **HMAC Key Access ID**
* See [this](https://cloud.google.com/storage/docs/authentication/managing-hmackeys) for instructions on how to generate an access key. For more information on HMAC keys, see the [GCP docs](https://cloud.google.com/storage/docs/authentication/hmackeys).
* We recommend creating an Airbyte-specific user or service account. This user or account will require the following permissions for the bucket:
```
storage.multipartUploads.abort
storage.multipartUploads.create
storage.objects.create
storage.objects.delete
storage.objects.get
storage.objects.list
```
You can set these by going to the Permissions tab of the GCS bucket, adding the email address of the service account or user, and granting it the aforementioned permissions.
* **Secret Access Key**
* Corresponding key to the above access ID.
* Make sure your GCS bucket is accessible from the machine running Airbyte. This depends on your networking setup. The easiest way to verify that Airbyte can connect to your GCS bucket is via the check connection tool in the UI.
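To check bucket access from the Airbyte host without going through the UI, one rough option (with `BUCKET` as a placeholder for your bucket name) is to configure `gsutil` with the same HMAC credentials the connector will use and list the bucket:

```shell
# `gsutil config -a` interactively stores an HMAC access ID and secret in a
# .boto file, so the listing below exercises the same credentials the
# connector will use rather than your own gcloud login.
gsutil config -a
gsutil ls gs://BUCKET
```

If the listing fails, check both the network path from the Airbyte host and the permissions granted to the service account that owns the HMAC key.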
## CHANGELOG