diff --git a/CHANGELOG.md b/CHANGELOG.md
index 06565d3424..3fb15cc602 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -4,6 +4,14 @@ All notable changes to this project will be documented in this file.
## [Unreleased]
+## [5.0.0] - 2021-06-17
+
+- Fix `message_retention_duration` variable type in `pubsub` module
+- Move `bq` robot service account into the robot service account project output
+- Add IAM cryptDecrypt role to robot service account on specified keys
+- Add Service Identity creation on `project` module if secretmanager enabled
+- Add Data Foundation end-to-end example
+
## [4.9.0] - 2021-06-04
- **incompatible change** updated resource name for `google_dns_policy` on the `net-vpc` module
@@ -316,8 +324,9 @@ All notable changes to this project will be documented in this file.
- merge development branch with suite of new modules and end-to-end examples
-[Unreleased]: https://github.com/terraform-google-modules/cloud-foundation-fabric/compare/v4.9.0...HEAD
-[4.9.0]: https://github.com/terraform-google-modules/cloud-foundation-fabric/compare/v4.8.0...v4..0
+[Unreleased]: https://github.com/terraform-google-modules/cloud-foundation-fabric/compare/v5.0.0...HEAD
+[5.0.0]: https://github.com/terraform-google-modules/cloud-foundation-fabric/compare/v4.9.0...v5.0.0
+[4.9.0]: https://github.com/terraform-google-modules/cloud-foundation-fabric/compare/v4.8.0...v4.9.0
[4.8.0]: https://github.com/terraform-google-modules/cloud-foundation-fabric/compare/v4.7.0...v4.8.0
[4.7.0]: https://github.com/terraform-google-modules/cloud-foundation-fabric/compare/v4.6.1...v4.7.0
[4.6.1]: https://github.com/terraform-google-modules/cloud-foundation-fabric/compare/v4.6.0...v4.6.1
diff --git a/data-solutions/README.md b/data-solutions/README.md
index 18e4382510..d3007f85fb 100644
--- a/data-solutions/README.md
+++ b/data-solutions/README.md
@@ -14,3 +14,11 @@ They are meant to be used as minimal but complete starting points to create actu
### Cloud Storage to Bigquery with Cloud Dataflow
This [example](./gcs-to-bq-with-dataflow/) implements [Cloud Storage](https://cloud.google.com/kms/docs/cmek) to Bigquery data import using Cloud Dataflow.
All resources use CMEK hosted in Cloud KMS running in a centralized project. The example shows the basic resources and permissions for the typical use case to read, transform and import data from Cloud Storage to Bigquery.
+
+
+### Data Platform Foundations
+
+
+This [example](./data-platform-foundations/) implements a robust and flexible Data Foundation on GCP that provides opinionated defaults, allowing customers to build and scale out additional data pipelines quickly and reliably.
+
+
diff --git a/data-solutions/data-platform-foundations/01-environment/README.md b/data-solutions/data-platform-foundations/01-environment/README.md
new file mode 100644
index 0000000000..60d4e71594
--- /dev/null
+++ b/data-solutions/data-platform-foundations/01-environment/README.md
@@ -0,0 +1,53 @@
+# Data Platform Foundations - Environment (Step 1)
+
+This is the first step needed to deploy Data Platform Foundations, which creates projects and service accounts. Please refer to the [top-level Data Platform README](../README.md) for prerequisites.
+
+The projects that will be created are:
+
+- Common services
+- Landing
+- Orchestration & Transformation
+- DWH
+- Datamart
+
+A main service account named `data-platform-main` will be created under the common services project, and it will be granted editor permissions on all the projects in scope.
+
+This is a high level diagram of the created resources:
+
+![High-level diagram](./diagram.png)
+
+## Running the example
+
+To create the infrastructure:
+
+- specify your variables in a `terraform.tfvars`
+
+```hcl
+billing_account_id = "1234-1234-1234"
+root_node          = "folders/12345678"
+```
+
+- make sure you have the right authentication setup (application default credentials, or a service account key)
+- **The output of this stage contains the values for the resources stage**
+- run `terraform init` and `terraform apply`
+
+Once done testing, you can clean up resources by running `terraform destroy`.
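+
+The outputs needed by the resources stage can be printed again at any time with the standard Terraform CLI:
+
+```bash
+terraform output -json project_ids
+```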
+
+
+## Variables
+
+| name | description | type | required | default |
+|---|---|:---:|:---:|:---:|
+| billing_account_id | Billing account id. | string | ✓ | |
+| root_node | Parent folder or organization in 'folders/folder_id' or 'organizations/org_id' format. | string | ✓ | |
+| *prefix* | Prefix used to generate project id and name. | string | | null |
+| *project_names* | Override this variable if you need non-standard names. | object({...}) | | ... |
+| *service_account_names* | Override this variable if you need non-standard names. | object({...}) | | ... |
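+
+For example, *project_names* can be overridden in `terraform.tfvars`; the values below are placeholders, and the object type requires all five keys:
+
+```hcl
+project_names = {
+  datamart       = "my-datamart"
+  dwh            = "my-dwh"
+  landing        = "my-landing"
+  services       = "my-services"
+  transformation = "my-transformation"
+}
+```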
+
+## Outputs
+
+| name | description | sensitive |
+|---|---|:---:|
+| project_ids | Project ids for created projects. | |
+| service_account | Main service account. | |
+
diff --git a/data-solutions/data-platform-foundations/01-environment/diagram.png b/data-solutions/data-platform-foundations/01-environment/diagram.png
new file mode 100644
index 0000000000..eb9508d8e4
Binary files /dev/null and b/data-solutions/data-platform-foundations/01-environment/diagram.png differ
diff --git a/data-solutions/data-platform-foundations/01-environment/main.tf b/data-solutions/data-platform-foundations/01-environment/main.tf
new file mode 100644
index 0000000000..49be4a50dc
--- /dev/null
+++ b/data-solutions/data-platform-foundations/01-environment/main.tf
@@ -0,0 +1,115 @@
+/**
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+###############################################################################
+# projects #
+###############################################################################
+
+module "project-datamart" {
+ source = "../../../modules/project"
+ parent = var.root_node
+ billing_account = var.billing_account_id
+ prefix = var.prefix
+ name = var.project_names.datamart
+ services = [
+ "bigtable.googleapis.com",
+ "bigtableadmin.googleapis.com",
+ "bigquery.googleapis.com",
+ "bigquerystorage.googleapis.com",
+ "bigqueryreservation.googleapis.com",
+ "storage-component.googleapis.com",
+ ]
+ iam = {
+ "roles/editor" = [module.sa-services-main.iam_email]
+ }
+}
+
+module "project-dwh" {
+ source = "../../../modules/project"
+ parent = var.root_node
+ billing_account = var.billing_account_id
+ prefix = var.prefix
+ name = var.project_names.dwh
+ services = [
+ "bigquery.googleapis.com",
+ "bigquerystorage.googleapis.com",
+ "bigqueryreservation.googleapis.com",
+ "storage-component.googleapis.com",
+ ]
+ iam = {
+ "roles/editor" = [module.sa-services-main.iam_email]
+ }
+}
+
+module "project-landing" {
+ source = "../../../modules/project"
+ parent = var.root_node
+ billing_account = var.billing_account_id
+ prefix = var.prefix
+ name = var.project_names.landing
+ services = [
+ "pubsub.googleapis.com",
+ "storage-component.googleapis.com",
+ ]
+ iam = {
+ "roles/editor" = [module.sa-services-main.iam_email]
+ }
+}
+
+module "project-services" {
+ source = "../../../modules/project"
+ parent = var.root_node
+ billing_account = var.billing_account_id
+ prefix = var.prefix
+ name = var.project_names.services
+ services = [
+ "storage-component.googleapis.com",
+ "sourcerepo.googleapis.com",
+ "stackdriver.googleapis.com",
+ "cloudasset.googleapis.com",
+ ]
+ iam = {
+ "roles/editor" = [module.sa-services-main.iam_email]
+ }
+}
+
+module "project-transformation" {
+ source = "../../../modules/project"
+ parent = var.root_node
+ billing_account = var.billing_account_id
+ prefix = var.prefix
+ name = var.project_names.transformation
+ services = [
+ "cloudbuild.googleapis.com",
+ "compute.googleapis.com",
+ "dataflow.googleapis.com",
+ "servicenetworking.googleapis.com",
+ "storage-component.googleapis.com",
+ ]
+ iam = {
+ "roles/editor" = [module.sa-services-main.iam_email]
+ }
+}
+
+###############################################################################
+# service accounts #
+###############################################################################
+
+module "sa-services-main" {
+ source = "../../../modules/iam-service-account"
+ project_id = module.project-services.project_id
+ name = var.service_account_names.main
+}
diff --git a/data-solutions/data-platform-foundations/01-environment/outputs.tf b/data-solutions/data-platform-foundations/01-environment/outputs.tf
new file mode 100644
index 0000000000..b13d8fe073
--- /dev/null
+++ b/data-solutions/data-platform-foundations/01-environment/outputs.tf
@@ -0,0 +1,31 @@
+/**
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+output "project_ids" {
+ description = "Project ids for created projects."
+ value = {
+ datamart = module.project-datamart.project_id
+ dwh = module.project-dwh.project_id
+ landing = module.project-landing.project_id
+ services = module.project-services.project_id
+ transformation = module.project-transformation.project_id
+ }
+}
+
+output "service_account" {
+ description = "Main service account."
+ value = module.sa-services-main.email
+}
diff --git a/data-solutions/data-platform-foundations/01-environment/variables.tf b/data-solutions/data-platform-foundations/01-environment/variables.tf
new file mode 100644
index 0000000000..596a4340d9
--- /dev/null
+++ b/data-solutions/data-platform-foundations/01-environment/variables.tf
@@ -0,0 +1,57 @@
+# Copyright 2020 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+variable "billing_account_id" {
+ description = "Billing account id."
+ type = string
+}
+
+variable "prefix" {
+ description = "Prefix used to generate project id and name."
+ type = string
+ default = null
+}
+
+variable "project_names" {
+ description = "Override this variable if you need non-standard names."
+ type = object({
+ datamart = string
+ dwh = string
+ landing = string
+ services = string
+ transformation = string
+ })
+ default = {
+ datamart = "datamart"
+ dwh = "datawh"
+ landing = "landing"
+ services = "services"
+ transformation = "transformation"
+ }
+}
+
+variable "root_node" {
+ description = "Parent folder or organization in 'folders/folder_id' or 'organizations/org_id' format."
+ type = string
+}
+
+variable "service_account_names" {
+ description = "Override this variable if you need non-standard names."
+ type = object({
+ main = string
+ })
+ default = {
+ main = "data-platform-main"
+ }
+}
diff --git a/data-solutions/data-platform-foundations/01-environment/versions.tf b/data-solutions/data-platform-foundations/01-environment/versions.tf
new file mode 100644
index 0000000000..ab35a81c81
--- /dev/null
+++ b/data-solutions/data-platform-foundations/01-environment/versions.tf
@@ -0,0 +1,17 @@
+# Copyright 2020 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+terraform {
+ required_version = ">= 0.13"
+}
diff --git a/data-solutions/data-platform-foundations/02-resources/README.md b/data-solutions/data-platform-foundations/02-resources/README.md
new file mode 100644
index 0000000000..3f1e6e9ab9
--- /dev/null
+++ b/data-solutions/data-platform-foundations/02-resources/README.md
@@ -0,0 +1,78 @@
+# Data Platform Foundations - Resources (Step 2)
+
+This is the second step needed to deploy Data Platform Foundations, which creates the resources needed to store and process the data, in the projects created in the [previous step](../01-environment/README.md). Please refer to the [top-level README](../README.md) for prerequisites and how to run the first step.
+
+![High-level diagram](./diagram.png)
+
+The resources that will be created in each project are:
+
+- Common
+- Landing
+ - [x] GCS
+ - [x] Pub/Sub
+- Orchestration & Transformation
+ - [x] Dataflow
+- DWH
+ - [x] Bigquery (L0/1/2)
+ - [x] GCS
+- Datamart
+ - [x] Bigquery (views/table)
+ - [x] GCS
+ - [ ] BigTable
+
+## Running the example
+
+In the previous step, we created the environment (projects and service account) which we are going to use in this step.
+
+To create the resources, copy the output of the environment step (**project_ids**) and paste it into `terraform.tfvars`:
+
+- Specify your variables in a `terraform.tfvars`; you can use the output from the environment stage
+
+```hcl
+project_ids = {
+ datamart = "datamart-project_id"
+ dwh = "dwh-project_id"
+ landing = "landing-project_id"
+ services = "services-project_id"
+ transformation = "transformation-project_id"
+}
+```
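+
+As an alternative to copying values by hand, they can be read straight from the environment stage's state (assuming the two stages sit in sibling folders, and Terraform >= 0.14 for `-chdir`):
+
+```bash
+terraform -chdir=../01-environment output -json project_ids
+```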
+
+- Get a key for the service account created in the environment stage:
+  - Go to the services project
+  - Open the IAM page
+  - Go to the service accounts section
+  - Create a new key for the service account created in the previous step (**service_account**)
+  - Download the JSON key into the current folder
+- make sure you have the right authentication setup: `export GOOGLE_APPLICATION_CREDENTIALS=PATH_TO_SERVICE_ACCOUNT_KEY.json`
+- run `terraform init` and `terraform apply`
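+
+The console steps above for key creation can also be scripted with `gcloud`; the account name below assumes step 1's default service account name, and `SERVICES_PROJECT_ID` is a placeholder for your services project id:
+
+```bash
+gcloud iam service-accounts keys create sa-key.json \
+  --iam-account=data-platform-main@SERVICES_PROJECT_ID.iam.gserviceaccount.com
+export GOOGLE_APPLICATION_CREDENTIALS=$PWD/sa-key.json
+```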
+
+Once done testing, you can clean up resources by running `terraform destroy`.
+
+
+## Variables
+
+| name | description | type | required | default |
+|---|---|:---:|:---:|:---:|
+| project_ids | Project IDs. | object({...}) | ✓ | |
+| *datamart_bq_datasets* | Datamart Bigquery datasets | map(object({...})) | | ... |
+| *dwh_bq_datasets* | DWH Bigquery datasets | map(object({...})) | | ... |
+| *landing_buckets* | List of landing buckets to create | map(object({...})) | | ... |
+| *landing_pubsub* | List of landing pubsub topics and subscriptions to create | map(map(object({...}))) | | ... |
+| *landing_service_account* | Landing service account name. | string | | sa-landing |
+| *service_account_names* | Project service accounts list. | object({...}) | | ... |
+| *transformation_buckets* | List of transformation buckets to create | map(object({...})) | | ... |
+| *transformation_subnets* | List of subnets to create in the transformation Project. | list(object({...})) | | ... |
+| *transformation_vpc_name* | Name of the VPC created in the transformation Project. | string | | transformation-vpc |
+
+## Outputs
+
+| name | description | sensitive |
+|---|---|:---:|
+| datamart-datasets | List of bigquery datasets created for the datamart project. | |
+| dwh-datasets | List of bigquery datasets created for the dwh project. | |
+| landing-buckets | List of buckets created for the landing project. | |
+| landing-pubsub | List of pubsub topics and subscriptions created for the landing project. | |
+| transformation-buckets | List of buckets created for the transformation project. | |
+| transformation-vpc | Transformation VPC details | |
+
diff --git a/data-solutions/data-platform-foundations/02-resources/diagram.png b/data-solutions/data-platform-foundations/02-resources/diagram.png
new file mode 100644
index 0000000000..7a3393212d
Binary files /dev/null and b/data-solutions/data-platform-foundations/02-resources/diagram.png differ
diff --git a/data-solutions/data-platform-foundations/02-resources/main.tf b/data-solutions/data-platform-foundations/02-resources/main.tf
new file mode 100644
index 0000000000..19816cbcbb
--- /dev/null
+++ b/data-solutions/data-platform-foundations/02-resources/main.tf
@@ -0,0 +1,163 @@
+/**
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+###############################################################################
+# IAM #
+###############################################################################
+
+module "datamart-sa" {
+ source = "../../../modules/iam-service-account"
+ project_id = var.project_ids.datamart
+ name = var.service_account_names.datamart
+ iam_project_roles = {
+ "${var.project_ids.datamart}" = ["roles/editor"]
+ }
+}
+
+module "dwh-sa" {
+ source = "../../../modules/iam-service-account"
+ project_id = var.project_ids.dwh
+ name = var.service_account_names.dwh
+}
+
+module "landing-sa" {
+ source = "../../../modules/iam-service-account"
+ project_id = var.project_ids.landing
+ name = var.service_account_names.landing
+ iam_project_roles = {
+ "${var.project_ids.landing}" = ["roles/pubsub.publisher"]
+ }
+}
+
+module "services-sa" {
+ source = "../../../modules/iam-service-account"
+ project_id = var.project_ids.services
+ name = var.service_account_names.services
+ iam_project_roles = {
+ "${var.project_ids.services}" = ["roles/editor"]
+ }
+}
+
+module "transformation-sa" {
+ source = "../../../modules/iam-service-account"
+ project_id = var.project_ids.transformation
+ name = var.service_account_names.transformation
+ iam_project_roles = {
+ "${var.project_ids.transformation}" = [
+ "roles/logging.logWriter",
+ "roles/monitoring.metricWriter",
+ "roles/dataflow.admin",
+ "roles/iam.serviceAccountUser",
+ "roles/bigquery.dataOwner",
+ "roles/bigquery.jobUser",
+ "roles/dataflow.worker",
+ "roles/bigquery.metadataViewer",
+ "roles/storage.objectViewer",
+ ]
+ }
+}
+
+###############################################################################
+# GCS #
+###############################################################################
+
+module "landing-buckets" {
+ source = "../../../modules/gcs"
+ for_each = var.landing_buckets
+ project_id = var.project_ids.landing
+ prefix = var.project_ids.landing
+ name = each.value.name
+ location = each.value.location
+ iam = {
+ "roles/storage.objectCreator" = [module.landing-sa.iam_email]
+ "roles/storage.admin" = [module.transformation-sa.iam_email]
+ }
+}
+
+module "transformation-buckets" {
+ source = "../../../modules/gcs"
+ for_each = var.transformation_buckets
+ project_id = var.project_ids.transformation
+ prefix = var.project_ids.transformation
+ name = each.value.name
+ location = each.value.location
+ iam = {
+ "roles/storage.admin" = [module.transformation-sa.iam_email]
+ }
+}
+
+###############################################################################
+# Bigquery #
+###############################################################################
+
+module "datamart-bq" {
+ source = "../../../modules/bigquery-dataset"
+ for_each = var.datamart_bq_datasets
+ project_id = var.project_ids.datamart
+ id = each.key
+ location = each.value.location
+ iam = {
+ for k, v in each.value.iam : k => (
+ k == "roles/bigquery.dataOwner"
+ ? concat(v, [module.datamart-sa.iam_email])
+ : v
+ )
+ }
+}
+
+module "dwh-bq" {
+ source = "../../../modules/bigquery-dataset"
+ for_each = var.dwh_bq_datasets
+ project_id = var.project_ids.dwh
+ id = each.key
+ location = each.value.location
+ iam = {
+ for k, v in each.value.iam : k => (
+ k == "roles/bigquery.dataOwner"
+ ? concat(v, [module.dwh-sa.iam_email])
+ : v
+ )
+ }
+}
+
+###############################################################################
+# Network #
+###############################################################################
+module "vpc-transformation" {
+ source = "../../../modules/net-vpc"
+ project_id = var.project_ids.transformation
+ name = var.transformation_vpc_name
+ subnets = var.transformation_subnets
+}
+
+###############################################################################
+# Pub/Sub #
+###############################################################################
+
+module "landing-pubsub" {
+ source = "../../../modules/pubsub"
+ for_each = var.landing_pubsub
+ project_id = var.project_ids.landing
+ name = each.key
+ subscriptions = {
+ for k, v in each.value : k => { labels = v.labels, options = v.options }
+ }
+ subscription_iam = {
+ for k, v in each.value : k => merge(v.iam, {
+ "roles/pubsub.subscriber" = [module.transformation-sa.iam_email]
+ })
+ }
+}
diff --git a/data-solutions/data-platform-foundations/02-resources/outputs.tf b/data-solutions/data-platform-foundations/02-resources/outputs.tf
new file mode 100644
index 0000000000..3023587d90
--- /dev/null
+++ b/data-solutions/data-platform-foundations/02-resources/outputs.tf
@@ -0,0 +1,60 @@
+/**
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+output "datamart-datasets" {
+ description = "List of bigquery datasets created for the datamart project."
+ value = [
+ for k, datasets in module.datamart-bq : datasets.dataset_id
+ ]
+}
+
+output "dwh-datasets" {
+ description = "List of bigquery datasets created for the dwh project."
+ value = [for k, datasets in module.dwh-bq : datasets.dataset_id]
+}
+
+output "landing-buckets" {
+ description = "List of buckets created for the landing project."
+ value = [for k, bucket in module.landing-buckets : bucket.name]
+}
+
+output "landing-pubsub" {
+ description = "List of pubsub topics and subscriptions created for the landing project."
+ value = {
+ for t in module.landing-pubsub : t.topic.name => {
+ id = t.topic.id
+ subscriptions = { for s in t.subscriptions : s.name => s.id }
+ }
+ }
+}
+
+output "transformation-buckets" {
+ description = "List of buckets created for the transformation project."
+ value = [for k, bucket in module.transformation-buckets : bucket.name]
+}
+
+output "transformation-vpc" {
+ description = "Transformation VPC details"
+ value = {
+ name = module.vpc-transformation.name
+ subnets = {
+ for k, s in module.vpc-transformation.subnets : k => {
+ ip_cidr_range = s.ip_cidr_range
+ region = s.region
+ }
+ }
+ }
+}
diff --git a/data-solutions/data-platform-foundations/02-resources/variables.tf b/data-solutions/data-platform-foundations/02-resources/variables.tf
new file mode 100644
index 0000000000..bd139724f7
--- /dev/null
+++ b/data-solutions/data-platform-foundations/02-resources/variables.tf
@@ -0,0 +1,171 @@
+# Copyright 2020 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+variable "datamart_bq_datasets" {
+ description = "Datamart Bigquery datasets"
+ type = map(object({
+ iam = map(list(string))
+ location = string
+ }))
+ default = {
+ bq_datamart_dataset = {
+ location = "EU"
+ iam = {
+ # "roles/bigquery.dataOwner" = []
+ # "roles/bigquery.dataEditor" = []
+ # "roles/bigquery.dataViewer" = []
+ }
+ }
+ }
+}
+
+variable "dwh_bq_datasets" {
+ description = "DWH Bigquery datasets"
+ type = map(object({
+ location = string
+ iam = map(list(string))
+ }))
+ default = {
+ bq_raw_dataset = {
+ iam = {}
+ location = "EU"
+ }
+ }
+}
+
+variable "landing_buckets" {
+ description = "List of landing buckets to create"
+ type = map(object({
+ location = string
+ name = string
+ }))
+ default = {
+ raw-data = {
+ location = "EU"
+ name = "raw-data"
+ }
+ data-schema = {
+ location = "EU"
+ name = "data-schema"
+ }
+ }
+}
+
+variable "landing_pubsub" {
+ description = "List of landing pubsub topics and subscriptions to create"
+ type = map(map(object({
+ iam = map(list(string))
+ labels = map(string)
+ options = object({
+ ack_deadline_seconds = number
+ message_retention_duration = number
+ retain_acked_messages = bool
+ expiration_policy_ttl = number
+ })
+ })))
+ default = {
+ landing-1 = {
+ sub1 = {
+ iam = {
+ # "roles/pubsub.subscriber" = []
+ }
+ labels = {}
+ options = null
+ }
+ sub2 = {
+ iam = {}
+ labels = {},
+ options = null
+ },
+ }
+ }
+}
+
+variable "landing_service_account" {
+  description = "Landing service account name."
+ type = string
+ default = "sa-landing"
+}
+
+variable "project_ids" {
+ description = "Project IDs."
+ type = object({
+ datamart = string
+ dwh = string
+ landing = string
+ services = string
+ transformation = string
+ })
+}
+
+
+variable "service_account_names" {
+ description = "Project service accounts list."
+ type = object({
+ datamart = string
+ dwh = string
+ landing = string
+ services = string
+ transformation = string
+ })
+ default = {
+ datamart = "sa-datamart"
+ dwh = "sa-datawh"
+ landing = "sa-landing"
+ services = "sa-services"
+ transformation = "sa-transformation"
+ }
+}
+
+variable "transformation_buckets" {
+ description = "List of transformation buckets to create"
+ type = map(object({
+ location = string
+ name = string
+ }))
+ default = {
+ temp = {
+ location = "EU"
+ name = "temp"
+ },
+ templates = {
+ location = "EU"
+ name = "templates"
+ },
+ }
+}
+
+variable "transformation_subnets" {
+ description = "List of subnets to create in the transformation Project."
+ type = list(object({
+ ip_cidr_range = string
+ name = string
+ region = string
+ secondary_ip_range = map(string)
+ }))
+ default = [
+ {
+ ip_cidr_range = "10.1.0.0/20"
+ name = "transformation-subnet"
+ region = "europe-west3"
+ secondary_ip_range = {}
+ },
+ ]
+}
+
+variable "transformation_vpc_name" {
+ description = "Name of the VPC created in the transformation Project."
+ type = string
+ default = "transformation-vpc"
+}
diff --git a/data-solutions/data-platform-foundations/02-resources/versions.tf b/data-solutions/data-platform-foundations/02-resources/versions.tf
new file mode 100644
index 0000000000..a9701d5bae
--- /dev/null
+++ b/data-solutions/data-platform-foundations/02-resources/versions.tf
@@ -0,0 +1,17 @@
+# Copyright 2020 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+terraform {
+ required_version = ">= 0.13"
+}
\ No newline at end of file
diff --git a/data-solutions/data-platform-foundations/03-pipeline/README.md b/data-solutions/data-platform-foundations/03-pipeline/README.md
new file mode 100644
index 0000000000..52662ee25d
--- /dev/null
+++ b/data-solutions/data-platform-foundations/03-pipeline/README.md
@@ -0,0 +1,8 @@
+# Manual pipeline Example
+
+Once you have deployed the projects ([step 1](../01-environment/README.md)) and the resources ([step 2](../02-resources/README.md)), you can use them to run your data pipeline.
+
+Here we will demo two pipelines:
+
+* [GCS to Bigquery](./gcs_to_bigquery.md)
+* [PubSub to Bigquery](./pubsub_to_bigquery.md)
diff --git a/data-solutions/data-platform-foundations/03-pipeline/gcs_to_bigquery.md b/data-solutions/data-platform-foundations/03-pipeline/gcs_to_bigquery.md
new file mode 100644
index 0000000000..785bc0b0cd
--- /dev/null
+++ b/data-solutions/data-platform-foundations/03-pipeline/gcs_to_bigquery.md
@@ -0,0 +1,151 @@
+# Manual pipeline Example: GCS to Bigquery
+
+In this example we will publish person messages in the following format:
+
+```bash
+Lorenzo,Caggioni,1617898199
+```
+
+A Dataflow pipeline will read those messages and import them into a Bigquery table in the DWH project.
+
+[TODO] An authorized view will be created in the datamart project to expose the table.
+[TODO] Remove the hardcoded 'lcaggio' values and replace them with an environment variable.
+[TODO] Further automation is expected in future.
+
+Create and download keys for the service accounts you created.
+
+## Create BQ table
+
+These steps should be performed as the DWH service account:
+
+```bash
+gcloud auth activate-service-account sa-dwh@dwh-lc01.iam.gserviceaccount.com --key-file=sa-dwh.json --project=dwh-lc01
+```
+
+then run the following command to create the table:
+
+```bash
+bq mk \
+-t \
+--description "This is a Test Person table" \
+dwh-lc01:bq_raw_dataset.person \
+name:STRING,surname:STRING,timestamp:TIMESTAMP
+```
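+
+You can verify that the table was created with the expected schema (same table path as above):
+
+```bash
+bq show --schema --format=prettyjson dwh-lc01:bq_raw_dataset.person
+```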
+
+## Produce CSV data file, JSON schema file and UDF JS file
+
+These steps should be performed as the landing service account:
+
+```bash
+gcloud auth activate-service-account sa-landing@landing-lc01.iam.gserviceaccount.com --key-file=sa-landing.json --project=landing-lc01
+```
+
+Let's now create a series of messages to import:
+
+```bash
+for i in {0..10}
+do
+ echo "Lorenzo,Caggioni,$(date +%s)" >> person.csv
+done
+```
+
+and copy the file to the GCS bucket:
+
+```bash
+gsutil cp person.csv gs://landing-lc01-eu-raw-data
+```
+
+Let's create the data JSON schema:
+
+```bash
+cat <<'EOF' > person_schema.json
+{
+ "BigQuery Schema": [
+ {
+ "name": "name",
+ "type": "STRING"
+ },
+ {
+ "name": "surname",
+ "type": "STRING"
+ },
+ {
+ "name": "timestamp",
+ "type": "TIMESTAMP"
+ }
+ ]
+}
+EOF
+```
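+
+Optionally, validate that the schema file is well-formed JSON before uploading:
+
+```bash
+python3 -m json.tool person_schema.json
+```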
+
+and copy the file to the GCS bucket:
+
+```bash
+gsutil cp person_schema.json gs://landing-lc01-eu-data-schema
+```
+
+Let's create the data UDF function to transform message data:
+
+```bash
+cat <<'EOF' > person_udf.js
+function transform(line) {
+ var values = line.split(',');
+
+ var obj = new Object();
+ obj.name = values[0];
+ obj.surname = values[1];
+ obj.timestamp = values[2];
+ var jsonString = JSON.stringify(obj);
+
+ return jsonString;
+}
+EOF
+```
+
+and copy the file to the GCS bucket:
+
+```bash
+gsutil cp person_udf.js gs://landing-lc01-eu-data-schema
+```
+
+If you want to check the files copied to GCS, you can use the Transformation service account:
+
+```bash
+gcloud auth activate-service-account sa-transformation@transformation-lc01.iam.gserviceaccount.com --key-file=sa-transformation.json --project=transformation-lc01
+```
+
+and list the files in the buckets:
+
+```bash
+gsutil ls gs://landing-lc01-eu-raw-data
+gsutil ls gs://landing-lc01-eu-data-schema
+```
+
+## Dataflow
+
+These steps should be done as the Transformation Service Account:
+
+```bash
+gcloud auth activate-service-account sa-transformation@transformation-lc01.iam.gserviceaccount.com --key-file=sa-transformation.json --project=transformation-lc01
+```
+
+Let's then start a Dataflow batch pipeline from a Google-provided template, using internal IPs only, the network and subnetwork created earlier, the appropriate service account, and the required parameters:
+
+```bash
+gcloud dataflow jobs run test_batch_lcaggio01 \
+ --gcs-location gs://dataflow-templates/latest/GCS_Text_to_BigQuery \
+ --project transformation-lc01 \
+ --region europe-west3 \
+ --disable-public-ips \
+ --network transformation-vpc \
+ --subnetwork regions/europe-west3/subnetworks/transformation-subnet \
+ --staging-location gs://transformation-lc01-eu-temp \
+ --service-account-email sa-transformation@transformation-lc01.iam.gserviceaccount.com \
+ --parameters \
+javascriptTextTransformFunctionName=transform,\
+JSONPath=gs://landing-lc01-eu-data-schema/person_schema.json,\
+javascriptTextTransformGcsPath=gs://landing-lc01-eu-data-schema/person_udf.js,\
+inputFilePattern=gs://landing-lc01-eu-raw-data/person.csv,\
+outputTable=dwh-lc01:bq_raw_dataset.person,\
+bigQueryLoadingTemporaryDirectory=gs://transformation-lc01-eu-temp
+```
diff --git a/data-solutions/data-platform-foundations/03-pipeline/pubsub_to_bigquery.md b/data-solutions/data-platform-foundations/03-pipeline/pubsub_to_bigquery.md
new file mode 100644
index 0000000000..5778258e46
--- /dev/null
+++ b/data-solutions/data-platform-foundations/03-pipeline/pubsub_to_bigquery.md
@@ -0,0 +1,96 @@
+# Manual pipeline Example: PubSub to Bigquery
+
+In this example we will publish person messages in the following format:
+
+```txt
+name: Lorenzo
+surname: Caggioni
+timestamp: 1617898199
+```
+
+A Dataflow pipeline will read those messages and import them into a BigQuery table in the DWH project.
+
+An authorized view will be created in the datamart project to expose the table.
+
+[TODO] Remove the hardcoded 'lcaggio' values and make them configurable via an environment variable.
+[TODO] Further automation is expected in the future.
+
+Create and download keys for the service accounts you created; be sure to have the `iam.serviceAccountKeys.create` permission on the projects or at the folder level.
+
+```bash
+gcloud iam service-accounts keys create sa-landing.json --iam-account=sa-landing@landing-lc01.iam.gserviceaccount.com
+gcloud iam service-accounts keys create sa-transformation.json --iam-account=sa-transformation@transformation-lc01.iam.gserviceaccount.com
+gcloud iam service-accounts keys create sa-dwh.json --iam-account=sa-dwh@dwh-lc01.iam.gserviceaccount.com
+```
+
+## Create BQ table
+
+These steps should be done as the DWH Service Account:
+
+```bash
+gcloud auth activate-service-account sa-dwh@dwh-lc01.iam.gserviceaccount.com --key-file=sa-dwh.json --project=dwh-lc01
+```
+
+and run the following command to create the table:
+
+```bash
+bq mk \
+-t \
+--description "This is a Test Person table" \
+dwh-lc01:bq_raw_dataset.person \
+name:STRING,surname:STRING,timestamp:TIMESTAMP
+```
+
+## Produce PubSub messages
+
+These steps should be done as the Landing Service Account:
+
+```bash
+gcloud auth activate-service-account sa-landing@landing-lc01.iam.gserviceaccount.com --key-file=sa-landing.json --project=landing-lc01
+```
+
+Let's now publish a series of messages we can use to import:
+
+```bash
+for i in {0..10}
+do
+ gcloud pubsub topics publish projects/landing-lc01/topics/landing-1 --message="{\"name\": \"Lorenzo\", \"surname\": \"Caggioni\", \"timestamp\": \"$(date +%s)\"}"
+done
+```
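The inline quoting in the `--message` flag is easy to get wrong, so checking locally that the payload is valid JSON can save a debugging round-trip; a minimal sketch, assuming `python3` is available (the scratch path is illustrative):

```shell
# Build one payload exactly as the loop above does and verify it parses as JSON.
msg="{\"name\": \"Lorenzo\", \"surname\": \"Caggioni\", \"timestamp\": \"$(date +%s)\"}"
echo "$msg" > /tmp/person_msg_check.json
python3 -m json.tool /tmp/person_msg_check.json > /dev/null && echo "payload parses"
```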
+
+If you want to check the published messages, you can use the Transformation service account:
+
+```bash
+gcloud auth activate-service-account sa-transformation@transformation-lc01.iam.gserviceaccount.com --key-file=sa-transformation.json --project=transformation-lc01
+```
+
+and read a message (message won't be acked and will stay in the subscription):
+
+```bash
+gcloud pubsub subscriptions pull projects/landing-lc01/subscriptions/sub1
+```
+
+## Dataflow
+
+These steps should be done as the Transformation Service Account:
+
+```bash
+gcloud auth activate-service-account sa-transformation@transformation-lc01.iam.gserviceaccount.com --key-file=sa-transformation.json --project=transformation-lc01
+```
+
+Let's then start a Dataflow streaming pipeline from a Google-provided template, using internal IPs only, the network and subnetwork created earlier, the appropriate service account, and the required parameters:
+
+```bash
+gcloud dataflow jobs run test_lcaggio01 \
+ --gcs-location gs://dataflow-templates/latest/PubSub_Subscription_to_BigQuery \
+ --project transformation-lc01 \
+ --region europe-west3 \
+ --disable-public-ips \
+ --network transformation-vpc \
+ --subnetwork regions/europe-west3/subnetworks/transformation-subnet \
+ --staging-location gs://transformation-lc01-eu-temp \
+ --service-account-email sa-transformation@transformation-lc01.iam.gserviceaccount.com \
+ --parameters \
+inputSubscription=projects/landing-lc01/subscriptions/sub1,\
+outputTableSpec=dwh-lc01:bq_raw_dataset.person
+```
diff --git a/data-solutions/data-platform-foundations/03-pipeline/resource/raw_data.json b/data-solutions/data-platform-foundations/03-pipeline/resource/raw_data.json
new file mode 100644
index 0000000000..d1b25ea9a1
--- /dev/null
+++ b/data-solutions/data-platform-foundations/03-pipeline/resource/raw_data.json
@@ -0,0 +1,26 @@
+{
+ "schema": {
+ "fields": [
+ {
+ "mode": "NULLABLE",
+ "name": "name",
+ "type": "STRING"
+ },
+ {
+ "mode": "NULLABLE",
+ "name": "surname",
+ "type": "STRING"
+ },
+ {
+ "mode": "NULLABLE",
+ "name": "age",
+ "type": "INTEGER"
+ },
+ {
+ "mode": "NULLABLE",
+ "name": "boolean_val",
+ "type": "BOOLEAN"
+ }
+ ]
+ }
+}
diff --git a/data-solutions/data-platform-foundations/README.md b/data-solutions/data-platform-foundations/README.md
new file mode 100644
index 0000000000..3017786c51
--- /dev/null
+++ b/data-solutions/data-platform-foundations/README.md
@@ -0,0 +1,63 @@
+# Data Foundation Platform
+
+The goal of this example is to build a robust and flexible Data Foundation on GCP, providing opinionated defaults while still allowing customers to quickly and reliably build and scale out additional data pipelines.
+
+The example is composed of three separate provisioning workflows, which are designed to be plugged together to create an end-to-end Data Foundation that supports multiple data pipelines on top.
+
+1. **[Environment Setup](./01-environment/)**
+ *(once per environment)*
+ * projects
+ * VPC configuration
+ * Composer environment and identity
+ * shared buckets and datasets
+1. **[Data Source Setup](./02-resources)**
+ *(once per data source)*
+ * landing and archive bucket
+ * internal and external identities
+ * domain specific datasets
+1. **[Pipeline Setup](./03-pipeline)**
+ *(once per pipeline)*
+ * pipeline-specific tables and views
+ * pipeline code
+ * Composer DAG
+
+The resulting GCP architecture is outlined in this diagram:
+
+
+A demo pipeline is also part of this example: it can be built and run on top of the foundational infrastructure to quickly verify or test the setup.
+
+## Prerequisites
+
+In order to bring up this example, you will need:
+
+- a folder or organization where new projects will be created
+- a billing account that will be associated with the new projects
+- an identity (user or service account) with owner permissions on the folder or org, and billing user permissions on the billing account
+
+## Bringing up the platform
+
+[](https://ssh.cloud.google.com/cloudshell/editor?cloudshell_git_repo=https%3A%2F%2Fgithub.com%2Fterraform-google-modules%2Fcloud-foundation-fabric.git&cloudshell_open_in_editor=README.md&cloudshell_workspace=data-solutions%2Fdata-platform-foundations)
+
+The end-to-end example is composed of two foundational steps and one optional step:
+
+1. [Environment setup](./01-environment/)
+1. [Data source setup](./02-resources/)
+1. (Optional) [Pipeline setup](./03-pipeline/)
+
+The environment setup is designed to manage a single environment. Various strategies like workspaces, branching, or even separate clones can be used to support multiple environments.
+
+## TODO
+
+| Description | Priority (1: High - 5: Low) | Status | Remarks |
+|-------------|----------|:------:|---------|
+| DLP best practices in the pipeline | 2 | Not Started | |
+| KMS support (CMEK) | 2 | Not Started | |
+| VPC-SC | 3 | Not Started | |
+| Add Composer with a static DAG running the example | 3 | Not Started | |
+| Integrate [CI/CD composer data processing workflow framework](https://github.com/jaketf/ci-cd-for-data-processing-workflow) | 3 | Not Started | |
+| Schema changes, how to handle | 4 | Not Started | |
+| Data lineage | 4 | Not Started | |
+| Data quality checks | 4 | Not Started | |
+| Shared-VPC | 5 | Not Started | |
+| Logging & monitoring | TBD | Not Started | |
+| Orchestration for ingestion pipeline (just in the readme) | TBD | Not Started | |
diff --git a/data-solutions/gcs-to-bq-with-dataflow/main.tf b/data-solutions/gcs-to-bq-with-dataflow/main.tf
index 647cf6d3c5..39c13122ca 100644
--- a/data-solutions/gcs-to-bq-with-dataflow/main.tf
+++ b/data-solutions/gcs-to-bq-with-dataflow/main.tf
@@ -134,7 +134,7 @@ module "kms" {
},
key-bq = {
"roles/cloudkms.cryptoKeyEncrypterDecrypter" = [
- "serviceAccount:${module.project-service.service_accounts.default.bq}",
+ "serviceAccount:${module.project-service.service_accounts.robots.bq}",
#"serviceAccount:${data.google_bigquery_default_service_account.bq_sa.email}",
]
},
diff --git a/modules/net-interconnect-attachment-direct/README.md b/modules/net-interconnect-attachment-direct/README.md
new file mode 100644
index 0000000000..e9e4c15c19
--- /dev/null
+++ b/modules/net-interconnect-attachment-direct/README.md
@@ -0,0 +1,131 @@
+# Direct Interconnect VLAN Attachment and router
+
+This module allows the creation of a VLAN attachment for Direct Interconnect, and optionally of its Cloud Router.
+
+## Examples
+
+### Direct Interconnect VLAN attachment using default parameters for BGP session and router
+
+```hcl
+module "vlan-attachment-1" {
+ source = "./modules/net-interconnect-attachment-direct"
+ project_id = "dedicated-ic-5-8492"
+ region = "us-west2"
+ router_network = "myvpc"
+ name = "vlan-604-x"
+ interconnect = "https://www.googleapis.com/compute/v1/projects/mylab/global/interconnects/mylab-interconnect-1"
+ peer = {
+ ip_address = "169.254.63.2"
+ asn = 65418
+ }
+}
+# tftest:modules=1:resources=4
+```
+
+### Direct Interconnect VLAN attachments to achieve a 99.9% SLA setup
+
+```hcl
+module "vlan-attachment-1" {
+ source = "./modules/net-interconnect-attachment-direct"
+ project_id = "dedicated-ic-3-8386"
+ region = "us-west2"
+ router_name = "router-1"
+ router_config = {
+ description = ""
+ asn = 65003
+ advertise_config = {
+ groups = ["ALL_SUBNETS"]
+ ip_ranges = {
+ "199.36.153.8/30" = "custom"
+ }
+ mode = "CUSTOM"
+ }
+ }
+ router_network = "myvpc"
+ name = "vlan-603-1"
+ interconnect = "https://www.googleapis.com/compute/v1/projects/mylab/global/interconnects/mylab-interconnect-1"
+
+ config = {
+ description = ""
+ vlan_id = 603
+ bandwidth = "BPS_10G"
+ admin_enabled = true
+ mtu = 1440
+ }
+ peer = {
+ ip_address = "169.254.63.2"
+ asn = 65418
+ }
+ bgp = {
+ session_range = "169.254.63.1/29"
+ advertised_route_priority = 0
+ candidate_ip_ranges = ["169.254.63.0/29"]
+ }
+}
+
+module "vlan-attachment-2" {
+ source = "./modules/net-interconnect-attachment-direct"
+ project_id = "dedicated-ic-3-8386"
+ region = "us-west2"
+ router_name = "router-2"
+ router_config = {
+ description = ""
+ asn = 65003
+ advertise_config = {
+ groups = ["ALL_SUBNETS"]
+ ip_ranges = {
+ "199.36.153.8/30" = "custom"
+ }
+ mode = "CUSTOM"
+ }
+
+ }
+ router_network = "myvpc"
+ name = "vlan-603-2"
+
+ interconnect = "https://www.googleapis.com/compute/v1/projects/mylab/global/interconnects/mylab-interconnect-2"
+
+ config = {
+ description = ""
+ vlan_id = 603
+ bandwidth = "BPS_10G"
+ admin_enabled = true
+ mtu = 1440
+ }
+ peer = {
+ ip_address = "169.254.63.10"
+ asn = 65418
+ }
+ bgp = {
+ session_range = "169.254.63.9/29"
+ advertised_route_priority = 0
+ candidate_ip_ranges = ["169.254.63.8/29"]
+ }
+}
+# tftest:modules=2:resources=8
+```
+
+
+## Variables
+
+| name | description | type | required | default |
+|---|---|:---: |:---:|:---:|
+| interconnect | URL of the underlying Interconnect object that this attachment's traffic will traverse through. | string | ✓ | |
+| peer | Peer IP address and ASN. Only IPv4 is supported. | object({...}) | ✓ | |
+| project_id | The project containing the resources. | string | ✓ | |
+| *bgp* | BGP session parameters. | object({...}) | | null |
+| *config* | VLAN attachment parameters: description, vlan_id, bandwidth, admin_enabled, mtu. | object({...}) | | ... |
+| *name* | The name of the VLAN attachment. | string | | vlan-attachment |
+| *region* | Region where the router resides. | string | | europe-west1 |
+| *router_config* | Router ASN and custom advertisement configuration; ip_ranges is a map of address ranges and descriptions. | object({...}) | | ... |
+| *router_create* | Create router. | bool | | true |
+| *router_name* | Name used for the auto-created router, or name of an existing router to use when `router_create` is set to `false`. Leave blank to use the VLAN attachment name for the auto-created router. | string | | router-vlan-attachment |
+| *router_network* | A reference to the network to which this router belongs. | string | | null |
+
+## Outputs
+
+| name | description | sensitive |
+|---|---|:---:|
+| bgpsession | bgp session | |
+| interconnect_attachment | interconnect attachment | |
+| router | Router resource (only if auto-created). | |
+
diff --git a/modules/net-interconnect-attachment-direct/main.tf b/modules/net-interconnect-attachment-direct/main.tf
new file mode 100644
index 0000000000..01dd8dbcae
--- /dev/null
+++ b/modules/net-interconnect-attachment-direct/main.tf
@@ -0,0 +1,98 @@
+/**
+ * Copyright 2021 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+locals {
+ router = (
+ var.router_create
+ ? try(google_compute_router.router[0].name, null)
+ : var.router_name
+ )
+  vlan_interconnect = google_compute_interconnect_attachment.interconnect_vlan_attachment.name
+}
+
+resource "google_compute_router" "router" {
+ count = var.router_create ? 1 : 0
+ project = var.project_id
+ region = var.region
+ name = var.router_name == "" ? "router-${var.name}" : var.router_name
+ description = var.router_config.description
+ network = var.router_network
+ bgp {
+ advertise_mode = (
+ var.router_config.advertise_config == null
+ ? null
+ : var.router_config.advertise_config.mode
+ )
+ advertised_groups = (
+ var.router_config.advertise_config == null ? null : (
+ var.router_config.advertise_config.mode != "CUSTOM"
+ ? null
+ : var.router_config.advertise_config.groups
+ )
+ )
+ dynamic "advertised_ip_ranges" {
+ for_each = (
+ var.router_config.advertise_config == null ? {} : (
+ var.router_config.advertise_config.mode != "CUSTOM"
+          ? {}
+ : var.router_config.advertise_config.ip_ranges
+ )
+ )
+ iterator = range
+ content {
+ range = range.key
+ description = range.value
+ }
+ }
+ asn = var.router_config.asn
+ }
+}
+
+resource "google_compute_interconnect_attachment" "interconnect_vlan_attachment" {
+ project = var.project_id
+ region = var.region
+ router = local.router
+ name = var.name
+ description = var.config.description
+ interconnect = var.interconnect
+ bandwidth = var.config.bandwidth
+ mtu = var.config.mtu
+ vlan_tag8021q = var.config.vlan_id
+ candidate_subnets = var.bgp == null ? null : var.bgp.candidate_ip_ranges
+ admin_enabled = var.config.admin_enabled
+ provider = google-beta
+}
+
+resource "google_compute_router_interface" "interface" {
+ project = var.project_id
+ region = var.region
+ name = "interface-${var.name}"
+ router = local.router
+ ip_range = var.bgp == null ? null : var.bgp.session_range
+ interconnect_attachment = local.vlan_interconnect
+}
+
+resource "google_compute_router_peer" "peer" {
+ project = var.project_id
+ region = var.region
+ name = "bgp-session-${var.name}"
+ router = local.router
+ peer_ip_address = var.peer.ip_address
+ peer_asn = var.peer.asn
+ advertised_route_priority = var.bgp == null ? null : var.bgp.advertised_route_priority
+ interface = local.vlan_interconnect
+}
diff --git a/modules/net-interconnect-attachment-direct/outputs.tf b/modules/net-interconnect-attachment-direct/outputs.tf
new file mode 100644
index 0000000000..392f4d122c
--- /dev/null
+++ b/modules/net-interconnect-attachment-direct/outputs.tf
@@ -0,0 +1,31 @@
+/**
+ * Copyright 2021 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+output "bgpsession" {
+ description = "bgp session"
+ value = google_compute_router_peer.peer
+}
+
+output "interconnect_attachment" {
+ description = "interconnect attachment"
+ value = google_compute_interconnect_attachment.interconnect_vlan_attachment
+}
+
+output "router" {
+ description = "Router resource (only if auto-created)."
+ value = google_compute_router.router
+}
diff --git a/modules/net-interconnect-attachment-direct/variables.tf b/modules/net-interconnect-attachment-direct/variables.tf
new file mode 100644
index 0000000000..69d837c201
--- /dev/null
+++ b/modules/net-interconnect-attachment-direct/variables.tf
@@ -0,0 +1,115 @@
+/**
+ * Copyright 2021 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+variable "bgp" {
+  description = "BGP session parameters."
+ type = object({
+ session_range = string
+ candidate_ip_ranges = list(string)
+ advertised_route_priority = number
+
+ })
+ default = null
+}
+
+variable "config" {
+  description = "VLAN attachment parameters: description, vlan_id, bandwidth, admin_enabled, mtu."
+ type = object({
+ description = string
+ vlan_id = number
+ bandwidth = string
+ admin_enabled = bool
+ mtu = number
+ })
+ default = {
+ description = null
+ vlan_id = null
+ bandwidth = "BPS_10G"
+ admin_enabled = true
+ mtu = 1440
+ }
+}
+
+variable "interconnect" {
+ description = "URL of the underlying Interconnect object that this attachment's traffic will traverse through."
+ type = string
+}
+
+variable "name" {
+ description = "The name of the vlan attachment"
+ type = string
+ default = "vlan-attachment"
+}
+
+variable "peer" {
+  description = "Peer IP address and ASN. Only IPv4 is supported."
+ type = object({
+ ip_address = string
+ asn = number
+ })
+}
+
+variable "project_id" {
+ description = "The project containing the resources"
+ type = string
+}
+
+variable "region" {
+ description = "Region where the router resides"
+ type = string
+  default     = "europe-west1"
+}
+
+variable "router_config" {
+  description = "Router ASN and custom advertisement configuration; ip_ranges is a map of address ranges and descriptions."
+ type = object({
+ description = string
+ asn = number
+ advertise_config = object({
+ groups = list(string)
+ ip_ranges = map(string)
+ mode = string
+ })
+ })
+
+ default = {
+ description = null
+ asn = 64514
+ advertise_config = null
+ }
+}
+
+variable "router_create" {
+ description = "Create router."
+ type = bool
+ default = true
+}
+
+variable "router_name" {
+  description = "Name used for the auto-created router, or name of an existing router to use when `router_create` is set to `false`. Leave blank to use the VLAN attachment name for the auto-created router."
+ type = string
+ default = "router-vlan-attachment"
+}
+
+variable "router_network" {
+ description = "A reference to the network to which this router belongs"
+ type = string
+ default = null
+}
+
diff --git a/modules/net-interconnect-attachment-direct/versions.tf b/modules/net-interconnect-attachment-direct/versions.tf
new file mode 100644
index 0000000000..897f817c2a
--- /dev/null
+++ b/modules/net-interconnect-attachment-direct/versions.tf
@@ -0,0 +1,18 @@
+/**
+ * Copyright 2021 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+terraform {
+ required_version = ">= 0.12.6"
+}
\ No newline at end of file
diff --git a/modules/project/README.md b/modules/project/README.md
index 8000082ef5..697f9d25a8 100644
--- a/modules/project/README.md
+++ b/modules/project/README.md
@@ -149,6 +149,29 @@ module "project-host" {
# tftest:modules=5:resources=12
```
+## Cloud KMS encryption keys
+
+```hcl
+module "project" {
+ source = "./modules/project"
+ name = "my-project"
+ billing_account = "123456-123456-123456"
+ prefix = "foo"
+ services = [
+ "compute.googleapis.com",
+ "storage.googleapis.com"
+ ]
+ service_encryption_key_ids = {
+ compute = [
+ "projects/kms-central-prj/locations/europe-west3/keyRings/my-keyring/cryptoKeys/europe3-gce",
+ "projects/kms-central-prj/locations/europe-west4/keyRings/my-keyring/cryptoKeys/europe4-gce"
+ ]
+ storage = [
+ "projects/kms-central-prj/locations/europe/keyRings/my-keyring/cryptoKeys/europe-gcs"
+ ]
+ }
+}
+# tftest:modules=1:resources=7
+```
## Variables
@@ -177,6 +200,7 @@ module "project-host" {
| *prefix* | Prefix used to generate project id and name. | string | | null |
| *project_create* | Create project. When set to false, uses a data source to reference existing project. | bool | | true |
| *service_config* | Configure service API activation. | object({...}) | | ... |
+| *service_encryption_key_ids* | Cloud KMS encryption key in {SERVICE => [KEY_URL]} format. | map(list(string)) | | {} |
| *service_perimeter_bridges* | Name of VPC-SC Bridge perimeters to add project into. Specify the name in the form of 'accessPolicies/ACCESS_POLICY_NAME/servicePerimeters/PERIMETER_NAME'. | list(string) | | null |
| *service_perimeter_standard* | Name of VPC-SC Standard perimeter to add project into. Specify the name in the form of 'accessPolicies/ACCESS_POLICY_NAME/servicePerimeters/PERIMETER_NAME'. | string | | null |
| *services* | Service APIs to enable. | list(string) | | [] |
diff --git a/modules/project/main.tf b/modules/project/main.tf
index 4f07a595a2..08bf0e0011 100644
--- a/modules/project/main.tf
+++ b/modules/project/main.tf
@@ -65,6 +65,14 @@ locals {
if sink.iam && sink.type == type
}
}
+ service_encryption_key_ids = flatten([
+ for service in keys(var.service_encryption_key_ids) : [
+ for key in var.service_encryption_key_ids[service] : {
+ service = service
+ key = key
+ }
+ ]
+ ])
}
data "google_project" "project" {
@@ -356,3 +364,19 @@ resource "google_access_context_manager_service_perimeter_resource" "service-per
perimeter_name = each.value
resource = "projects/${local.project.number}"
}
+
+resource "google_kms_crypto_key_iam_member" "crypto_key" {
+ for_each = {
+ for service_key in local.service_encryption_key_ids : "${service_key.service}.${service_key.key}" => service_key
+ }
+ crypto_key_id = each.value.key
+ role = "roles/cloudkms.cryptoKeyEncrypterDecrypter"
+ member = "serviceAccount:${local.service_accounts_robots[each.value.service]}"
+ depends_on = [
+ google_project.project,
+ google_project_service.project_services,
+ data.google_bigquery_default_service_account.bq_sa,
+ data.google_project.project,
+ data.google_storage_project_service_account.gcs_sa,
+ ]
+}
diff --git a/modules/project/outputs.tf b/modules/project/outputs.tf
index 4f54bc65a1..f7547d90ac 100644
--- a/modules/project/outputs.tf
+++ b/modules/project/outputs.tf
@@ -23,7 +23,8 @@ output "project_id" {
google_project_organization_policy.boolean,
google_project_organization_policy.list,
google_project_service.project_services,
- google_compute_shared_vpc_service_project.service_projects
+ google_compute_shared_vpc_service_project.service_projects,
+ google_kms_crypto_key_iam_member.crypto_key
]
}
@@ -34,7 +35,8 @@ output "name" {
google_project_organization_policy.boolean,
google_project_organization_policy.list,
google_project_service.project_services,
- google_compute_shared_vpc_service_project.service_projects
+ google_compute_shared_vpc_service_project.service_projects,
+ google_kms_crypto_key_iam_member.crypto_key
]
}
@@ -45,7 +47,8 @@ output "number" {
google_project_organization_policy.boolean,
google_project_organization_policy.list,
google_project_service.project_services,
- google_compute_shared_vpc_service_project.service_projects
+ google_compute_shared_vpc_service_project.service_projects,
+ google_kms_crypto_key_iam_member.crypto_key
]
}
@@ -56,7 +59,10 @@ output "service_accounts" {
default = local.service_accounts_default
robots = local.service_accounts_robots
}
- depends_on = [google_project_service.project_services]
+ depends_on = [
+ google_project_service.project_services,
+ google_kms_crypto_key_iam_member.crypto_key
+ ]
}
output "custom_roles" {
diff --git a/modules/project/service_accounts.tf b/modules/project/service_accounts.tf
index b0a64017ba..5c7f12b7b7 100644
--- a/modules/project/service_accounts.tf
+++ b/modules/project/service_accounts.tf
@@ -17,12 +17,11 @@
locals {
service_account_cloud_services = "${local.project.number}@cloudservices.gserviceaccount.com"
service_accounts_default = {
- # TODO: Find a better place to store BQ service account
- bq = "bq-${local.project.number}@bigquery-encryption.iam.gserviceaccount.com"
compute = "${local.project.number}-compute@developer.gserviceaccount.com"
gae = "${local.project.project_id}@appspot.gserviceaccount.com"
}
service_accounts_robot_services = {
+ bq = "bigquery-encryption"
cloudasset = "gcp-sa-cloudasset"
cloudbuild = "gcp-sa-cloudbuild"
compute = "compute-system"
@@ -33,10 +32,32 @@ locals {
gae-flex = "gae-api-prod"
gcf = "gcf-admin-robot"
pubsub = "gcp-sa-pubsub"
+ secretmanager = "gcp-sa-secretmanager"
storage = "gs-project-accounts"
}
service_accounts_robots = {
for service, name in local.service_accounts_robot_services :
- service => "service-${local.project.number}@${name}.iam.gserviceaccount.com"
+ service => "${service == "bq" ? "bq" : "service"}-${local.project.number}@${name}.iam.gserviceaccount.com"
}
}
+
+data "google_storage_project_service_account" "gcs_sa" {
+ count = contains(var.services, "storage.googleapis.com") ? 1 : 0
+ project = local.project.project_id
+ depends_on = [google_project_service.project_services]
+}
+
+data "google_bigquery_default_service_account" "bq_sa" {
+ count = contains(var.services, "bigquery.googleapis.com") ? 1 : 0
+ project = local.project.project_id
+ depends_on = [google_project_service.project_services]
+}
+
+# Secret Manager SA created just in time, we need to trigger the creation.
+resource "google_project_service_identity" "sm_sa" {
+ provider = google-beta
+ count = contains(var.services, "secretmanager.googleapis.com") ? 1 : 0
+ project = local.project.project_id
+ service = "secretmanager.googleapis.com"
+ depends_on = [google_project_service.project_services]
+}
diff --git a/modules/project/variables.tf b/modules/project/variables.tf
index fa4c84da7c..d4f917b33e 100644
--- a/modules/project/variables.tf
+++ b/modules/project/variables.tf
@@ -148,6 +148,12 @@ variable "service_config" {
}
}
+variable "service_encryption_key_ids" {
+ description = "Cloud KMS encryption key in {SERVICE => [KEY_URL]} format."
+ type = map(list(string))
+ default = {}
+}
+
variable "shared_vpc_host_config" {
description = "Configures this project as a Shared VPC host project (mutually exclusive with shared_vpc_service_project)."
type = object({
@@ -192,7 +198,6 @@ variable "logging_exclusions" {
default = {}
}
-
variable "contacts" {
description = "List of essential contacts for this resource. Must be in the form EMAIL -> [NOTIFICATION_TYPES]. Valid notification types are ALL, SUSPENSION, SECURITY, TECHNICAL, BILLING, LEGAL, PRODUCT_UPDATES"
type = map(list(string))
@@ -205,7 +210,6 @@ variable "service_perimeter_standard" {
default = null
}
-
variable "service_perimeter_bridges" {
description = "Name of VPC-SC Bridge perimeters to add project into. Specify the name in the form of 'accessPolicies/ACCESS_POLICY_NAME/servicePerimeters/PERIMETER_NAME'."
type = list(string)
diff --git a/modules/pubsub/README.md b/modules/pubsub/README.md
index 868f693f0f..938f27166a 100644
--- a/modules/pubsub/README.md
+++ b/modules/pubsub/README.md
@@ -96,14 +96,14 @@ module "pubsub" {
| name | PubSub topic name. | string | ✓ | |
| project_id | Project used for resources. | string | ✓ | |
| *dead_letter_configs* | Per-subscription dead letter policy configuration. | map(object({...})) | | {} |
-| *defaults* | Subscription defaults for options. | object({...}) | | ... |
+| *defaults* | Subscription defaults for options. | object({...}) | | ... |
| *iam* | IAM bindings for topic in {ROLE => [MEMBERS]} format. | map(list(string)) | | {} |
| *kms_key* | KMS customer managed encryption key. | string | | null |
| *labels* | Labels. | map(string) | | {} |
| *push_configs* | Push subscription configurations. | map(object({...})) | | {} |
| *regions* | List of regions used to set persistence policy. | list(string) | | [] |
| *subscription_iam* | IAM bindings for subscriptions in {SUBSCRIPTION => {ROLE => [MEMBERS]}} format. | map(map(list(string))) | | {} |
-| *subscriptions* | Topic subscriptions. Also define push configs for push subscriptions. If options is set to null subscription defaults will be used. Labels default to topic labels if set to null. | map(object({...})) | | {} |
+| *subscriptions* | Topic subscriptions. Also define push configs for push subscriptions. If options is set to null subscription defaults will be used. Labels default to topic labels if set to null. | map(object({...})) | | {} |
## Outputs
diff --git a/modules/pubsub/variables.tf b/modules/pubsub/variables.tf
index b76d3c1175..6657d43545 100644
--- a/modules/pubsub/variables.tf
+++ b/modules/pubsub/variables.tf
@@ -27,7 +27,7 @@ variable "defaults" {
description = "Subscription defaults for options."
type = object({
ack_deadline_seconds = number
- message_retention_duration = number
+ message_retention_duration = string
retain_acked_messages = bool
expiration_policy_ttl = string
})
@@ -93,7 +93,7 @@ variable "subscriptions" {
labels = map(string)
options = object({
ack_deadline_seconds = number
- message_retention_duration = number
+ message_retention_duration = string
retain_acked_messages = bool
expiration_policy_ttl = string
})
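
With this change, `message_retention_duration` is a duration string (e.g. `"600s"`), which is the format the underlying Pub/Sub API expects, rather than a number. A minimal sketch of how a caller would set the corrected defaults (project, topic name, and module path are illustrative, not from this diff):

```hcl
module "pubsub" {
  source     = "./modules/pubsub"
  project_id = "my-project"
  name       = "my-topic"
  defaults = {
    ack_deadline_seconds       = 10
    message_retention_duration = "600s" # duration string, not a number
    retain_acked_messages      = false
    expiration_policy_ttl      = "86400s"
  }
}
```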
diff --git a/tests/cloud_operations/scheduled_asset_inventory_export_bq/test_plan.py b/tests/cloud_operations/scheduled_asset_inventory_export_bq/test_plan.py
index 74023fff27..a8766f484e 100644
--- a/tests/cloud_operations/scheduled_asset_inventory_export_bq/test_plan.py
+++ b/tests/cloud_operations/scheduled_asset_inventory_export_bq/test_plan.py
@@ -24,4 +24,4 @@ def test_resources(e2e_plan_runner):
"Test that plan works and the numbers of resources is as expected."
modules, resources = e2e_plan_runner(FIXTURES_DIR)
assert len(modules) == 5
- assert len(resources) == 17
+ assert len(resources) == 18
diff --git a/tests/data_solutions/data_platform_foundations/__init__.py b/tests/data_solutions/data_platform_foundations/__init__.py
new file mode 100644
index 0000000000..d46dbae5eb
--- /dev/null
+++ b/tests/data_solutions/data_platform_foundations/__init__.py
@@ -0,0 +1,13 @@
+# Copyright 2021 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/tests/data_solutions/data_platform_foundations/fixture/main.tf b/tests/data_solutions/data_platform_foundations/fixture/main.tf
new file mode 100644
index 0000000000..66de8aca1f
--- /dev/null
+++ b/tests/data_solutions/data_platform_foundations/fixture/main.tf
@@ -0,0 +1,26 @@
+/**
+ * Copyright 2021 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+module "test-environment" {
+ source = "../../../../data-solutions/data-platform-foundations/01-environment"
+ billing_account_id = var.billing_account
+ root_node = var.root_node
+}
+
+module "test-resources" {
+ source = "../../../../data-solutions/data-platform-foundations/02-resources"
+ project_ids = module.test-environment.project_ids
+}
diff --git a/tests/data_solutions/data_platform_foundations/fixture/variables.tf b/tests/data_solutions/data_platform_foundations/fixture/variables.tf
new file mode 100644
index 0000000000..499e1e4640
--- /dev/null
+++ b/tests/data_solutions/data_platform_foundations/fixture/variables.tf
@@ -0,0 +1,26 @@
+/**
+ * Copyright 2021 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+variable "billing_account" {
+ type = string
+ default = "123456-123456-123456"
+}
+
+variable "root_node" {
+ description = "The resource name of the parent Folder or Organization. Must be of the form folders/folder_id or organizations/org_id."
+ type = string
+ default = "folders/12345678"
+}
diff --git a/tests/data_solutions/data_platform_foundations/test_plan.py b/tests/data_solutions/data_platform_foundations/test_plan.py
new file mode 100644
index 0000000000..80f2973343
--- /dev/null
+++ b/tests/data_solutions/data_platform_foundations/test_plan.py
@@ -0,0 +1,27 @@
+# Copyright 2021 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+import os
+import pytest
+
+
+FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'fixture')
+
+
+def test_resources(e2e_plan_runner):
+ "Test that plan works and the numbers of resources is as expected."
+ modules, resources = e2e_plan_runner(FIXTURES_DIR)
+ assert len(modules) == 6
+ assert len(resources) == 32
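
The `e2e_plan_runner` fixture used above returns the module and resource addresses from a `terraform plan`. As a rough, hypothetical sketch of what such a fixture does internally (the real fixture lives in the repo's `conftest.py` and may differ), counting modules and resources amounts to a recursive walk of the `terraform show -json` plan document:

```python
def count_modules_and_resources(plan):
    """Walk a `terraform show -json` plan dict and collect the addresses
    of all child modules and of the resources they plan to create.

    Hypothetical sketch of the bookkeeping behind `e2e_plan_runner`;
    not the repo's actual fixture implementation.
    """
    modules, resources = [], []

    def walk(module):
        # Each child module carries its own resources and may nest further.
        for child in module.get('child_modules', []):
            modules.append(child['address'])
            resources.extend(r['address'] for r in child.get('resources', []))
            walk(child)

    walk(plan['planned_values']['root_module'])
    return modules, resources


# Minimal fabricated plan document for illustration.
plan = {
    'planned_values': {
        'root_module': {
            'child_modules': [
                {
                    'address': 'module.test-environment',
                    'resources': [
                        {'address': 'module.test-environment.google_project.project'},
                    ],
                },
            ],
        },
    },
}

modules, resources = count_modules_and_resources(plan)
print(len(modules), len(resources))  # 1 1
```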
diff --git a/tests/data_solutions/gcs_to_bq_with_dataflow/test_plan.py b/tests/data_solutions/gcs_to_bq_with_dataflow/test_plan.py
index 7342b018ee..54f186e4d7 100644
--- a/tests/data_solutions/gcs_to_bq_with_dataflow/test_plan.py
+++ b/tests/data_solutions/gcs_to_bq_with_dataflow/test_plan.py
@@ -24,4 +24,4 @@ def test_resources(e2e_plan_runner):
"Test that plan works and the numbers of resources is as expected."
modules, resources = e2e_plan_runner(FIXTURES_DIR)
assert len(modules) == 14
- assert len(resources) == 61
+ assert len(resources) == 62
diff --git a/tests/foundations/business_units/test_plan.py b/tests/foundations/business_units/test_plan.py
index e04e82e55b..97c118cfb2 100644
--- a/tests/foundations/business_units/test_plan.py
+++ b/tests/foundations/business_units/test_plan.py
@@ -24,4 +24,4 @@ def test_resources(e2e_plan_runner):
"Test that plan works and the numbers of resources is as expected."
modules, resources = e2e_plan_runner(FIXTURES_DIR)
assert len(modules) == 8
- assert len(resources) == 82
+ assert len(resources) == 83
diff --git a/third-party-solutions/openshift/README.md b/third-party-solutions/openshift/README.md
index 5665fbb3cf..37e58e885d 100644
--- a/third-party-solutions/openshift/README.md
+++ b/third-party-solutions/openshift/README.md
@@ -116,7 +116,7 @@ gcloud iam service-accounts keys create $OCP_DIR/credentials.json \
--iam-account $OCP_SA
```
-If you need more fine-grained control on the service account's permissions instead, refer to the Mint Mode documentation linked above for the individual roles needed.
+If you need more fine-grained control on the service account's permissions instead, refer to the [OpenShift documentation](https://docs.openshift.com/container-platform/4.7/installing/installing_gcp/installing-restricted-networks-gcp.html#installation-gcp-permissions_installing-restricted-networks-gcp) for the individual roles needed.
## Installation