Update README.md (GoogleCloudPlatform#628)
minor edits, mostly typos
IreneAbezgauz authored Mar 25, 2021
1 parent a16912f commit 4115649
Showing 1 changed file with 15 additions and 15 deletions: tools/vpc-flowlogs-enforcer/README.md

## Description

This sample code shows how a Cloud Function can be used to enforce VPC Flow Logs in all the networks under a particular folder. The Cloud Function will listen on a Pub/Sub topic for notifications about changes in subnets. The notifications can be configured in two ways:

1. By creating a log sink that will filter all the network change events under the chosen folders and send them to a Pub/Sub topic. This can be enabled using the `configure_log_sinks` variable.
2. By creating Cloud Asset Inventory feeds that will send notifications to a Pub/Sub topic when a subnet is modified under a particular folder. This can be enabled using the `configure_asset_feeds` variable. Currently, the Cloud Asset Inventory feeds cannot be configured via Terraform, so the Terraform configuration will create shell scripts to create and delete the feeds.

You should only enable one of the two options to avoid duplicate operations. If you enable both, you will see that the Asset Inventory notifications are received a few seconds before the Cloud Monitoring (Stackdriver) based ones. You will probably want to use only one of them in production, but both options are included here for demonstration purposes.
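
Regardless of which notification channel you choose, the Cloud Function receives a base64-encoded Pub/Sub message whose payload identifies the subnet that changed. The real handler lives in `terraform/templates/cloud_function/main.py`; the snippet below is only a minimal sketch of that general shape, and the payload field names (`protoPayload.resourceName` for log-sink entries, `asset.name` for Asset Inventory feeds) are assumptions based on the standard audit-log and asset-feed formats rather than a copy of this repository's code.

```
import base64
import json
import re

# Matches .../projects/<project>/regions/<region>/subnetworks/<name> in either an
# audit-log resourceName or an Asset Inventory asset name (assumed formats).
_SUBNET_RE = re.compile(
    r"projects/(?P<project>[^/]+)/regions/(?P<region>[^/]+)/subnetworks/(?P<name>[^/]+)"
)


def handle_notification(event, context):
    """Pub/Sub-triggered entry point (first-generation Cloud Functions signature)."""
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))

    # Log-sink notifications wrap an audit LogEntry; CAI feeds wrap an asset record.
    resource = (
        payload.get("protoPayload", {}).get("resourceName")
        or payload.get("asset", {}).get("name", "")
    )
    match = _SUBNET_RE.search(resource)
    if not match:
        print("Notification does not reference a subnetwork, ignoring it")
        return

    print("Subnet changed: {}".format(match.groupdict()))
    # ...here the real function would check the subnet and enable VPC Flow Logs if needed.
```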

The Terraform code included in this example creates the following resources:

* A GCP project where all the resources will be located.
* A Pub/Sub topic where the subnet change events will be published by the log sink or the inventory feed.
* A Cloud Function that listens to subnet change notifications and makes sure VPC Flow Logs are activated (a minimal sketch of that call is shown after this list).
* If configured to do so, it will create the log sinks that will send notifications each time a subnet is modified.
* If configured to do so, it will create two shell scripts for creating and deleting the Asset Inventory feeds (not yet supported by Terraform).
* All the necessary permissions:
* For the Cloud Function to be able to modify the subnets under the folders being monitored.
* For the log sink service accounts to publish notifications in the Pub/Sub topic.
* For the Cloud Asset Inventory service account to publish notifications in the Pub/Sub topic.
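
From the Cloud Function's point of view, enforcing the policy comes down to a `subnetworks.patch` call that sets the subnet's `logConfig`. The sketch below shows the general idea using the discovery-based Python client for the Compute Engine API; it is an illustration under that assumption, not the repository's actual implementation, and the function name is hypothetical.

```
from googleapiclient import discovery


def ensure_flow_logs(project, region, subnet_name):
    """Enable VPC Flow Logs on a single subnet if they are not already enabled."""
    compute = discovery.build("compute", "v1")

    subnet = compute.subnetworks().get(
        project=project, region=region, subnetwork=subnet_name
    ).execute()

    if subnet.get("logConfig", {}).get("enable"):
        return  # already compliant, nothing to do

    # subnetworks.patch requires the current fingerprint to guard against lost updates.
    body = {
        "fingerprint": subnet["fingerprint"],
        "logConfig": {"enable": True},  # real settings come from the Terraform variables
    }
    compute.subnetworks().patch(
        project=project, region=region, subnetwork=subnet_name, body=body
    ).execute()
```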

## Setup instructions

This Terraform code uses service account impersonation to authenticate to the GCP APIs. The reasons for this are:

1. It is recommended to grant high-level permissions to a single service account and to allow only the specific users who need to run the Terraform code to impersonate that SA. This way, individual users do not need excessive permissions of their own.
2. By always using a service account, it is simpler to configure fine-grained permissions for running the Terraform code.

In order to run this Terraform code, you will need to:
1. Create a service account in the main project and grant it the necessary permissions. You will need:
* If using CAI feeds, you will need the Cloud Asset Viewer role at the organization level, or above the folders you want to monitor.
* If using log sinks, you will need the Logs Configuration Writer role at the organization level, or above the folders you want to monitor.
* Project Creator and Billing Account User roles, if you want to create the project using Terraform. If you are using an existing project, you will need to grant the service account the project Editor or Owner role.
2. Identify the team members who will need to run the Terraform code, and grant them the `roles/iam.serviceAccountTokenCreator` role (a quick way to verify this is sketched after this list).
3. Edit the `terraform.tfvars` file and replace the value of the `terraform_service_account` variable with the email of your service account.
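
Before running Terraform, a team member can quickly check that the impersonation set up in step 2 actually works. The snippet below is a minimal sketch using the `google-auth` library; the service account email is a placeholder that should be replaced with the value of your `terraform_service_account` variable.

```
import google.auth
from google.auth import impersonated_credentials
from google.auth.transport.requests import Request

# Placeholder: use the email of the service account created in step 1.
TARGET_SA = "terraform-sa@your-project.iam.gserviceaccount.com"

source_credentials, _ = google.auth.default()
creds = impersonated_credentials.Credentials(
    source_credentials=source_credentials,
    target_principal=TARGET_SA,
    target_scopes=["https://www.googleapis.com/auth/cloud-platform"],
    lifetime=300,
)
creds.refresh(Request())  # raises an exception if impersonation is not allowed
print("Impersonation OK, token expires at", creds.expiry)
```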

The setup is quite straightforward:

1. Decide on which folders in your organization you want to enforce VPC Flow Logs and take note of those folder IDs.
2. Choose a name for your demo project and a folder where you want to place it.
3. Decide on the VPC Flow Logs configuration you want to apply to your networks. See [here](https://cloud.google.com/compute/docs/reference/rest/v1/subnetworks) for the options (an illustrative example follows this list).
4. Decide if you want to use asset inventory feeds or log sinks for the subnet change notifications.
5. Edit the `terraform.tf` file with your configuration options.
6. Apply the terraform configuration.
7. If you chose to use asset inventory feeds, make sure you run terraform using a service account. The Cloud Asset Inventory API requires being invoked using a service account.
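
As an illustration of step 3, the flow-log settings on a subnet are expressed through a `logConfig` block in the subnetworks API referenced above. The values below are only an example; pick the aggregation interval, sampling rate and metadata options that match your own logging and cost requirements.

```
# Illustrative logConfig block for a subnetwork (values are examples, not defaults).
flow_log_config = {
    "enable": True,
    "aggregationInterval": "INTERVAL_5_MIN",  # how often flow records are aggregated
    "flowSampling": 0.5,                      # sample half of the flows
    "metadata": "INCLUDE_ALL_METADATA",       # keep all metadata fields in the logs
}
```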

## Testing locally

You can test the Cloud Function locally using the sample logs provided. To do this, you will first need to configure your Python environment. While in this folder (vpc_log_enforcer), use [virtualenv](https://virtualenv.pypa.io/en/latest/) to set up a Python 3 environment:

```
virtualenv --python python3 env
source env/bin/activate
pip3 install -r terraform/templates/cloud_function/requirements.txt
```

Run the Cloud Function using any of the sample logs provided (or create your own):

```
python3 terraform/templates/cloud_function/main.py sample_logs/insert_subnet_call_last.json
```
