Merge branch 'master' into jakePR185
Jacob Ferriero authored Apr 8, 2019
2 parents 80aca86 + 76a11ec commit 57d4813
Showing 8 changed files with 531 additions and 0 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -11,6 +11,7 @@ The examples folder contains example solutions across a variety of Google Cloud
* [BigQuery Group Sync For Row Level Access](examples/bigquery-row-access-groups) - Sample code to synchronize group membership from G Suite/Cloud Identity into BigQuery and join that with your data to control access at row level.
* [Bigtable Dataflow Cryptocurrencies Exchange RealTime Example](examples/cryptorealtime) - Apache Beam example that reads from the Crypto Exchanges WebSocket API as a Google Cloud Dataflow pipeline and saves the feed in Google Cloud Bigtable. Real-time visualization and query examples from GCP Bigtable running on a Flask server are included.
* [Cloud Composer Examples](examples/cloud-composer-examples) - Examples of using Cloud Composer, GCP's managed Apache Airflow service.
* [Cloud SQL Custom Metric](examples/cloudsql-custom-metric) - An example of creating a Stackdriver custom metric monitoring Cloud SQL Private Services IP consumption.
* [CloudML Bank Marketing](examples/cloudml-bank-marketing) - Notebook for creating a classification model for marketing using CloudML.
* [CloudML Bee Health Detection](examples/cloudml-bee-health-detection) - Detect if a bee is unhealthy based on an image of it and its subspecies.
* [CloudML Energy Price Forecasting](examples/cloudml-energy-price-forecasting) - Predicting the future energy price based on historical price and weather.
1 change: 1 addition & 0 deletions examples/cloudsql-custom-metric/.gitignore
@@ -0,0 +1 @@
.vscode/
36 changes: 36 additions & 0 deletions examples/cloudsql-custom-metric/Dockerfile
@@ -0,0 +1,36 @@
# Copyright 2019 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# The Google App Engine python runtime is Debian Jessie with Python installed
# and various os-level packages to allow installation of popular Python
# libraries. The source is on github at:
# https://github.com/GoogleCloudPlatform/python-docker

# This code is a prototype and not engineered for production use.
# Error handling is incomplete or inappropriate for usage beyond
# a development sample.

FROM gcr.io/google-appengine/python

RUN virtualenv -p python3 /env

ENV VIRTUAL_ENV /env
ENV PATH /env/bin:$PATH

ADD requirements.txt /app/requirements.txt
RUN pip install -r /app/requirements.txt

ADD . /app

ENTRYPOINT [ "python", "main.py" ]
96 changes: 96 additions & 0 deletions examples/cloudsql-custom-metric/README.md
@@ -0,0 +1,96 @@
# CloudSQL Custom Metric
This example demonstrates how to create a custom metric for Stackdriver Monitoring. It
estimates the number of IPs consumed in a CloudSQL private services subnet.
![Cloud SQL Metric Architecture](images/CloudSQL_Metric.png)
## Component Description
A Stackdriver log sink at the organization level populates BigQuery with logs whenever a CloudSQL
instance is created or deleted. The metric app periodically queries BigQuery to determine
which projects have CloudSQL instances with private networks. This means the app can avoid
polling the organization for a list of projects and then iterating over each project to determine
whether there are relevant CloudSQL instances.
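
For intuition, the project lookup can be reproduced from the command line. The table and column names below are assumptions about how the audit-log export lands in BigQuery, not necessarily the app's actual query:
```shell
# Distinct projects that created a CloudSQL instance, read from the sink's
# audit-log export (table and column names are assumptions).
bq query --use_legacy_sql=false "
  SELECT DISTINCT resource.labels.project_id
  FROM \`<project>.<dataset>.cloudaudit_googleapis_com_activity_*\`
  WHERE protopayload_auditlog.methodName = 'cloudsql.instances.create'"
```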

Once the projects with CloudSQL instances are known, the app queries the CloudSQL Admin
API for more information on each CloudSQL instance. A list of instances is aggregated for the
organization and a count is calculated.
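
From the command line, a comparable per-project check looks like the following; the format projection on the private network field is illustrative rather than the app's exact logic:
```shell
# List CloudSQL instances in one project and show whether each uses a
# private network (the format projection is illustrative).
gcloud sql instances list --project <project-id> \
    --format="table(name, region, settings.ipConfiguration.privateNetwork)"
```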

This information is then converted to a Stackdriver Monitoring TimeSeries, which is fed to the
Stackdriver Monitoring API.
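
The write itself targets the Monitoring API's projects.timeSeries.create method. A minimal curl sketch of an equivalent call follows; the custom metric type and the value are placeholders, not the metric the app actually records:
```shell
# Sketch of a custom-metric TimeSeries write; the metric type and value are
# placeholders, not the metric the app actually records.
cat > timeseries.json <<EOF
{
  "timeSeries": [{
    "metric": {"type": "custom.googleapis.com/cloudsql/private_ip_count"},
    "resource": {"type": "global", "labels": {"project_id": "<project-id>"}},
    "points": [{
      "interval": {"endTime": "$(date -u +%Y-%m-%dT%H:%M:%SZ)"},
      "value": {"int64Value": "42"}
    }]
  }]
}
EOF
curl -s -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d @timeseries.json \
    "https://monitoring.googleapis.com/v3/projects/<project-id>/timeSeries"
```
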
## Caveats
This code is a prototype and not engineered for production use. Error handling
is incomplete or inappropriate for usage beyond a development sample.

## Prerequisites
This example runs on GKE. The following access scopes must be enabled on the nodes in the
cluster's node pool (a sample cluster-creation command follows the list).

* BigQuery - Enabled
* Logging - Write Only
* Monitoring - Full
* CloudSQL - Enabled
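
If you are creating a new cluster for this example, here is a sketch of a cluster with these scopes; the cluster name, zone, and node count are placeholders:
```shell
# Sample cluster with the access scopes listed above; adjust the name, zone,
# and node count to taste.
gcloud container clusters create cloud-sql-monitor \
    --zone us-central1-a \
    --num-nodes 1 \
    --scopes bigquery,logging-write,monitoring,sql-admin
```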

## Project Setup
Log in to gcloud
```shell
gcloud auth login
```
Enable the necessary APIs
```shell
gcloud services enable bigquery-json.googleapis.com \
cloudapis.googleapis.com \
cloudtrace.googleapis.com \
compute.googleapis.com \
container.googleapis.com \
containerregistry.googleapis.com \
logging.googleapis.com \
monitoring.googleapis.com \
sourcerepo.googleapis.com \
stackdriver.googleapis.com \
stackdriverprovisioning.googleapis.com \
sqladmin.googleapis.com
```
## Setup BigQuery
Export the dataset ID
```shell
export DATASET=<your-data-set-id>
```
Create the BigQuery Dataset
```shell
bq --location=US mk -d --description "This is a dataset for the CloudSQL Sink." $DATASET
```
## Create the Sink
Export the organization ID
```shell
export ORG_ID=<your-organization-id>
```
Export the project ID.
```shell
export PROJECT_ID=<your-project-id>
```
Create the Logging Sink at the organization level
```shell
gcloud logging sinks create cloud-sql-sink \
bigquery.googleapis.com/projects/$PROJECT_ID/datasets/$DATASET \
--log-filter='resource.type="cloudsql_database" AND (protoPayload.methodName="cloudsql.instances.create" OR protoPayload.methodName="cloudsql.instances.delete")' \
--include-children \
--organization=$ORG_ID
```
The output of the command will include a service account. Grant this service account the **bigquery.dataOwner** role. However, if you are running this in GKE, this step isn't necessary.
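
One way to grant the role is a project-level binding; the member address below is a placeholder for the service account printed by the sink-creation command:
```shell
# Project-level grant shown for brevity; replace the member with the service
# account printed by the sink-creation command.
gcloud projects add-iam-policy-binding $PROJECT_ID \
    --member="serviceAccount:<sink-service-account>" \
    --role="roles/bigquery.dataOwner"
```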
## Run the Application
The CronJob pulls the container image gcr.io/$PROJECT_ID/cloud-sql-monitor:latest, built from this example's Dockerfile. Change the zonal and regional increments as desired, and feel free to change the periodicity as well.
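
If the image is not yet in your registry, one way to build and push it is with Cloud Build, run from this example's directory (the image name is an assumption chosen to match the deploy command below):
```shell
# Build the image from the provided Dockerfile and push it to Container Registry.
gcloud builds submit --tag gcr.io/$PROJECT_ID/cloud-sql-monitor:latest .
```

With the command below, the job will run every two minutes.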
```shell
kubectl run cloud-sql-monitor \
    --schedule="0/2 * * * ?" \
    --image=gcr.io/$PROJECT_ID/cloud-sql-monitor:latest \
    --restart=Never \
    --env="ZONAL_INCREMENT=1" \
    --env="REGIONAL_INCREMENT=1" \
    -- --project $PROJECT_ID --dataset $DATASET --all
```
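
To confirm that the CronJob was created and is producing runs, the usual kubectl checks apply (names assume the command above):
```shell
kubectl get cronjob cloud-sql-monitor   # confirm the schedule was registered
kubectl get jobs                        # list runs created by the CronJob
kubectl logs job/<job-name>             # inspect the output of one run
```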
## Create the Monitoring Dashboard
### Open Stackdriver Monitoring
1. Open the Google Cloud Console in your browser
2. Open the "hamburger menu" in the upper left
3. Scroll down to Stackdriver
4. Click on Monitoring

This will bring up the Stackdriver Monitoring console. If necessary, create a new Stackdriver workspace.
### Create the Metric Chart
1. Create a new dashboard or open an existing dashboard.
2. In the upper right corner of the dashboard, click "Add Chart".
3. Fill in the information as seen in the following graphic.

<img src="https://github.com/kevensen/professional-services/blob/cloud-sql-custom-metric/examples/cloudsql-custom-metric/images/metric_config.png?raw=true" width="250">

**Note:** It is recommended that you choose an alignment period larger than the CronJob period.