This module covers cleaning up the resources created for the lab.
1. Declare variables
2. Delete buckets
3. Delete the Spark Persistent History Server
4. Delete the BigQuery dataset
5. Delete sessions
6. Delete the managed notebook
7. Delete the Artifact Registry repository
8. Delete the VM
Note the project number and project ID; we will need these for the rest of the lab.
Grant the following permissions
- Viewer
- Dataproc Editor
- Storage Admin
- BigQuery Data Editor
- Composer Administrator
- Service Usage Admin
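If you prefer to grant these roles from the CLI rather than the console, a sketch follows (PROJECT_ID and USER_EMAIL below are hypothetical; the leading echo makes this a dry run that only prints each command):

```shell
PROJECT_ID="my-lab-project"        # hypothetical - use your project ID
USER_EMAIL="student@example.com"   # hypothetical - use your account

# Loop over the role IDs corresponding to the roles listed above,
# printing one add-iam-policy-binding command per role.
for ROLE in roles/viewer roles/dataproc.editor roles/storage.admin \
            roles/bigquery.dataEditor roles/composer.admin \
            roles/serviceusage.serviceUsageAdmin; do
  echo gcloud projects add-iam-policy-binding "$PROJECT_ID" \
       --member="user:${USER_EMAIL}" --role="$ROLE"
done
```

Remove the leading echo to apply the bindings for real.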
Open Cloud Shell or navigate to shell.cloud.google.com
Run the below command to set the project in the cloud shell terminal:
gcloud config set project $PROJECT_ID
Declare the following variables in Cloud Shell, scoped to the project you selected; we will use them throughout the lab:
PROJECT_ID= #Project ID
REGION= #Region to be used
BUCKET_PHS= #Bucket name for the Persistent History Server
BUCKET_CODE= #Bucket name for the code files
BQ_DATASET_NAME= #BigQuery dataset name
PHS_NAME= #Name of your PHS cluster in Dataproc
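For illustration, filled-in declarations might look like this (every value below is hypothetical; substitute your own):

```shell
PROJECT_ID="my-lab-project"
REGION="us-central1"
BUCKET_PHS="${PROJECT_ID}-phs-bucket"    # bucket attached to the history server
BUCKET_CODE="${PROJECT_ID}-code-bucket"  # bucket holding the code files
BQ_DATASET_NAME="lab_dataset"
PHS_NAME="lab-phs-cluster"
```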
Run the following commands to delete these buckets:
- Bucket attached to spark history server
- Bucket with code files
gcloud alpha storage rm --recursive gs://$BUCKET_PHS
gcloud alpha storage rm --recursive gs://$BUCKET_CODE
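The two deletes above can also be driven from one loop. Shown here as a dry run with hypothetical bucket names; the leading echo prints each command instead of executing it:

```shell
BUCKET_PHS="my-lab-project-phs-bucket"    # hypothetical
BUCKET_CODE="my-lab-project-code-bucket"  # hypothetical

# Print the recursive delete command for each lab bucket.
# Remove the leading echo to actually delete.
for BUCKET in "$BUCKET_PHS" "$BUCKET_CODE"; do
  echo gcloud alpha storage rm --recursive "gs://${BUCKET}"
done
```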
Run the below command to delete the Spark Persistent History Server:
gcloud dataproc clusters delete ${PHS_NAME} \
--region=${REGION}
Run the below command to delete the BigQuery dataset and all tables within it:
gcloud alpha bq datasets delete $BQ_DATASET_NAME \
--remove-tables
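If the alpha `gcloud` surface is unavailable in your Cloud Shell, the standalone bq CLI offers an equivalent: `bq rm` with `-r` to remove the tables, `-f` to skip the confirmation prompt, and `-d` to mark the target as a dataset. A dry-run sketch with hypothetical values:

```shell
PROJECT_ID="my-lab-project"    # hypothetical
BQ_DATASET_NAME="lab_dataset"  # hypothetical

# Dry run: remove the leading echo to execute.
echo bq rm -r -f -d "${PROJECT_ID}:${BQ_DATASET_NAME}"
```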
Select the sessions created as part of this lab and click Delete.
Select the notebook created and click Delete.
Select your Artifact Registry repository and click Delete.
Select your VM and click Delete.
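The console clicks for the Artifact Registry repository and the VM also have gcloud equivalents. A dry-run sketch (repository name, VM name, region, and zone below are all hypothetical):

```shell
REGION="us-central1"   # hypothetical
ZONE="us-central1-a"   # hypothetical

# Print the delete commands; remove the leading echo on each
# line to execute for real. --quiet skips confirmation prompts.
echo gcloud artifacts repositories delete my-lab-repo --location="${REGION}" --quiet
echo gcloud compute instances delete my-lab-vm --zone="${ZONE}" --quiet
```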