Commit a2efbc9 (1 parent: 19dbd8d)

Update exercises 13 and 14 to use YAML plugin instead of gomatic
File tree: 4 files changed, +132 −192 lines

4 files changed

+132
-192
lines changed

README.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -15,7 +15,7 @@ sharing, DevOps is showing that developers and system engineers have much to
 learn from each other. Through a series of hands-on exercises, Danilo Sato will
 use a sample web application to demonstrate how to automate its build and
 deployment pipeline, using infrastructure and pipeline as code techniques. You
-will learn how to combine tools such as Docker, Terraform, GoCD, Gomatic, and
+will learn how to combine tools such as Docker, Terraform, GoCD, and
 Kubernetes to create deployment pipelines for your infrastructure, your services,
 and applications. But even if your company is not using any of these tools, we
 will discuss alternative technologies and highlight the patterns and principles
```

SETUP.md

Lines changed: 0 additions & 1 deletion

```diff
@@ -111,7 +111,6 @@ can create a project for the workshop and name it "DevOps Workshop".
    `sudo` on Linux):
    * `docker pull openjdk:8-jdk-alpine`
    * `docker pull mysql:5.7`
-   * `docker pull dtsato/gomatic`
 2. **Maven Dependencies**: have Maven download all dependencies by running:
    * `./mvnw clean install`
 3. **Minikube VM**: ensure you have the minikube VirtualBox VM image by running:
```

instructions/13-pipeline-as-code.md

Lines changed: 84 additions & 165 deletions

````diff
@@ -3,193 +3,112 @@
 ## Goals
 
 * Learn about Pipeline as Code
-* Learn about GoMatic
-* Create a new pipeline for updating GoCD pipelines
+* Learn about GoCD YAML Config Plugin
+* Learn about GoCD Environments
+* Move PetClinic pipeline definition to YAML
 
 ## Acceptance Criteria
 
-* Pipeline code should be placed under a new `pipelines` folder at the root of
-  the project
-* Exclude the `pipelines` folder as a trigger to the "PetClinic" pipeline
-* Create a new "Meta" pipeline triggered on changes to the `pipelines` folder
-  only, that executes the python scripts using GoMatic
-* Use GoMatic locally to export the current pipeline configurations and convert
-  them into pipeline scripts
+* Pipeline configuration should be placed under a new `PetClinic.gocd.yaml` file
+  at the root of the project
+* Exclude the `PetClinic.gocd.yaml` file as a trigger to the "PetClinic" pipeline
+* Set up a new Configuration Repo using the YAML Config Plugin
+* Export the current pipeline configuration to YAML and move it to the repository
+* Extract the common environment variables and secret variables to a new `gcp`
+  GoCD environment
+* Display test results and artifacts on the job's "Tests" tab
 
 ## Step by Step Instructions
 
-### Creating placeholder scripts
+### Extracting and configuring the PetClinic Pipeline as Code
 
-First, let's create the folder and a few dummy scripts for each of the
-pipelines, to serve as placeholder test scripts:
+First, let's export the current configuration to the YAML configuration format.
+Click on the "ADMIN" menu and select "Pipelines". Click on "Export using" for
+the "PetClinic" pipeline and select the "YAML Configuration Plugin" option.
+This will download a file called `PetClinic.gocd.yaml`, which you can move
+to the root of the project.
 
-```shell
-$ mkdir pipelines
-$ echo "print \"Updating Meta Pipeline...\"" > pipelines/meta_pipeline.py
-$ echo "print \"Updating PetClinic Pipeline...\"" > pipelines/pet_clinic_pipeline.py
-```
-
-Let's also create a `pipelines/update.sh` script that will be executed by the
-meta pipeline to invoke the dummy scripts inside a Docker container with
-Python and GoMatic:
-
-```bash
-#!/bin/bash
-set -xe
-
-CWD=$(cd $(dirname $0) && pwd)
-for pipeline in $CWD/*.py; do
-  docker run -i --rm -v "$CWD":/usr/src/meta -w /usr/src/meta -e GO_SERVER_URL=$GO_SERVER_URL dtsato/gomatic /bin/bash -c "python $(basename $pipeline)"
-done
-```
-
-Make sure the new script is executable:
-
-```shell
-$ chmod a+x pipelines/update.sh
-```
-
-Now let's update the current "PetClinic" pipeline configuration to not trigger
-when changes occur to the `pipelines` folder. Click on the "ADMIN" menu and select
-"Pipelines". Click on "Edit" for "PetClinc" pipeline and go to the "Materials"
-tab. Opening the Git material, add the following configuration to the Blacklist:
-
-* Paths to be excluded: `pipelines/*`
-
-Now we can commit and push our changes to introduce the placeholder scripts and
-it should not trigger a pipeline execution.
+Fix some of the formatting on the file to make sure all the tasks are defined on
+a single line. Also update the `format_version` to version `4`.
 
-### Creating the meta pipeline
+In order to use this file, we need to create a new Configuration Repository,
+which will poll for changes and update the pipeline definitions when the YAML
+file changes. Click on the "ADMIN" menu and select "Config Repositories". Click
+the "Add" button and create the new configuration repository:
 
-From the "ADMIN" menu, select "Pipelines", and click on "Create a new pipeline
-within this group". The first step will setup the pipeline name to "Meta" and
-click "NEXT". Then on the next step, use the following configuration:
-
-* Material Type: `Git`
+* Plugin ID: `YAML Configuration Plugin`
+* Material type: `Git`
+* Config repository ID: `sample`
 * URL: Same as before, use the Git URL for your repository
 * Branch: `master`
-* Blacklist: `pipelines/*`
-* Invert the file filter: Checked
-
-Once again, you can test by clicking on "CHECK CONNECTION" before proceeding
-with clicking "NEXT". In the final step, use the following configuration:
-
-* Stage Name: `update-pipelines`
-* Job Name: `update-pipelines`
-* Task Type: `More...`
-* Command: `pipelines/update.sh`
 
-We can then click on "FINISH". Going back to the "Job Settings" tab, we can add
-the `docker-jdk` Elastic Profile Id to ensure the agent will run. We can now
-un-pause the pipeline and test that it runs successfully.
+Once again, you can test by clicking on "Test Connection" before proceeding
+with clicking "Save".
 
-### Pipeline as code for Meta pipeline
+Now we can commit and push our changes to introduce the `PetClinic.gocd.yaml`
+pipeline configuration file and it should be picked up by the YAML configuration
+plugin.
 
-Once the build succeeds, let's update the dummy script to setup the pipeline
-based on its current state. GoMatic has a feature to export the current pipeline
-configuration as code, so we can run this command locally, replacing the IP
-address with your GoCD Server public IP from GKE:
+When it first executes, you will see an error because the `PetClinic` pipeline
+already exists. In order for GoCD to use the YAML definition, we need to delete
+the existing pipeline. Don't worry, once the YAML plugin re-scans the repo, it
+will recreate our pipeline. Click on the "ADMIN" menu and select "Pipelines".
+Click the "Delete" link and then the "Proceed" button to confirm the pipeline
+deletion.
 
-```shell
-$ docker run -i --rm dtsato/gomatic python -m gomatic.go_cd_configurator -s 35.190.56.218 -p Meta
-#!/usr/bin/env python
-from gomatic import *
-...
-```
+Now wait for the YAML Config Plugin to re-scan the repository and recreate our
+pipeline.
 
-You can see that at the end of the execution, there is some Python code that we
-can copy and paste and use as the template for our pipeline configuration as code.
-Paste it on the `pipelines/meta_pipeline.py` script and make a few tweaks:
-
-```python
-#!/usr/bin/env python
-from gomatic import *
-import os, re
-
-print "Updating Meta Pipeline..."
-
-go_server_host = re.search('https?://([a-z0-9.\-._~%]+)', os.environ['GO_SERVER_URL']).group(1)
-go_server_url = "%s:%s" % (go_server_host, "8153")
-configurator = GoCdConfigurator(HostRestClient(go_server_url))
-pipeline = configurator\
-  .ensure_pipeline_group("sample")\
-  .ensure_replacement_of_pipeline("Meta")\
-  .set_git_material(GitMaterial("https://github.com/dtsato/devops-in-practice-workshop.git", branch="master", ignore_patterns=set(['pipelines/*']), invert_filter="True"))
-stage = pipeline.ensure_stage("update-pipelines")
-job = stage.ensure_job("update-pipelines").set_elastic_profile_id('docker-jdk')
-job.add_task(ExecTask(['pipelines/update.sh']))
-
-configurator.save_updated_config()
-```
+### Improving the PetClinic pipeline
 
-We are adding some code to parse the GoCD Server URL from the environment
-variable and removed the named arguments from the last line to actually save the
-configuration, and not just do a dry-run.
+Now, to test that the pipeline configuration is really defined in code, in the
+`PetClinic.gocd.yaml` file, let's make a few changes to improve it. First, let's
+configure our `build-and-publish` job to collect and publish the test report
+artifacts from surefire, by adding the `artifacts` section:
 
-Once you commit and push this code, the Meta pipeline should trigger again, and
-this time the above code will invoke GoMatic.
-
-### Pipeline as code for PetClinic pipeline
-
-Finally, let's extract the pipeline configuration for the "PetClinic" pipeline,
-by executing the same GoMatic command locally, changing the pipeline name and
-using your GoCD Server URL:
-
-```shell
-$ docker run -i --rm dtsato/gomatic python -m gomatic.go_cd_configurator -s 35.190.56.218 -p PetClinic
-#!/usr/bin/env python
-from gomatic import *
+```yaml
+...
+    jobs:
+      build-and-publish:
+        timeout: 0
+        environment_variables:
+          MAVEN_OPTS: -Xmx1024m
+          GCLOUD_PROJECT_ID: devops-workshop-123
+        secure_variables:
+          GCLOUD_SERVICE_KEY: AES:kb7KQ/gJ1VTtYGU6SLUJjA==...
+        elastic_profile_id: docker-jdk
+        artifacts:
+        - test:
+            source: target/surefire-reports
+        tasks:
 ...
 ```
 
-At the end of the execution, you can once again copy and paste the Python code
-into the `pipelines/pet_clinic_pipeline.py` script, and make the same tweaks:
-
-```python
-#!/usr/bin/env python
-from gomatic import *
-import os, re
-
-print "Updating PetClinic Pipeline..."
-go_server_host = re.search('https?://([a-z0-9.\-._~%]+)', os.environ['GO_SERVER_URL']).group(1)
-go_server_url = "%s:%s" % (go_server_host, "8153")
-configurator = GoCdConfigurator(HostRestClient(go_server_url))
-secret_variables = {'GCLOUD_SERVICE_KEY': 'lKD+DoKDGtCsaToW...'}
-pipeline = configurator\
-  .ensure_pipeline_group("sample")\
-  .ensure_replacement_of_pipeline("PetClinic")\
-  .set_git_material(GitMaterial("https://github.com/dtsato/devops-in-practice-workshop.git", branch="master", ignore_patterns=set(['pipelines/*'])))
-stage = pipeline.ensure_stage("commit")
-job = stage\
-  .ensure_job("build-and-publish")\
-  .ensure_environment_variables({'MAVEN_OPTS': '-Xmx1024m', 'GCLOUD_PROJECT_ID': 'devops-workshop-123'})\
-  .ensure_encrypted_environment_variables(secret_variables)
-job.set_elastic_profile_id('docker-jdk')
-job.add_task(ExecTask(['./mvnw', 'clean', 'package']))
-job.add_task(ExecTask(['bash', '-c', 'docker build --tag pet-app:$GO_PIPELINE_LABEL --build-arg JAR_FILE=target/spring-petclinic-2.0.0.BUILD-SNAPSHOT.jar .']))
-job.add_task(ExecTask(['bash', '-c', 'docker login -u _json_key -p"$(echo $GCLOUD_SERVICE_KEY | base64 -d)" https://us.gcr.io']))
-job.add_task(ExecTask(['bash', '-c', 'docker tag pet-app:$GO_PIPELINE_LABEL us.gcr.io/$GCLOUD_PROJECT_ID/pet-app:$GO_PIPELINE_LABEL']))
-job.add_task(ExecTask(['bash', '-c', 'docker push us.gcr.io/$GCLOUD_PROJECT_ID/pet-app:$GO_PIPELINE_LABEL']))
-stage = pipeline.ensure_stage("deploy")
-job = stage\
-  .ensure_job("deploy")\
-  .ensure_environment_variables({'GCLOUD_ZONE': 'us-central1-a', 'GCLOUD_PROJECT_ID': 'devops-workshop-123', 'GCLOUD_CLUSTER': 'devops-workshop-gke'})\
-  .ensure_encrypted_environment_variables(secret_variables)
-job.set_elastic_profile_id('kubectl')
-job.add_task(ExecTask(['bash', '-c', 'echo $GCLOUD_SERVICE_KEY | base64 -d > secret.json && chmod 600 secret.json']))
-job.add_task(ExecTask(['bash', '-c', 'gcloud auth activate-service-account --key-file secret.json']))
-job.add_task(ExecTask(['bash', '-c', 'gcloud container clusters get-credentials $GCLOUD_CLUSTER --zone $GCLOUD_ZONE --project $GCLOUD_PROJECT_ID']))
-job.add_task(ExecTask(['bash', '-c', './deploy.sh']))
-job.add_task(ExecTask(['bash', '-c', 'rm secret.json']))
-
-configurator.save_updated_config()
+Also, let's extract the common environment variables and secret variables
+configuration to a new [GoCD Environment](https://docs.gocd.org/current/navigation/environments_page.html)
+by adding the following section to the beginning of the `PetClinic.gocd.yaml`
+file, copying the values of the variables from one of the stages/jobs
+(make sure the `GCLOUD_PROJECT_ID` and `GCLOUD_SERVICE_KEY` are yours):
+
+```yaml
+format_version: 4
+environments:
+  gcp:
+    environment_variables:
+      GCLOUD_CLUSTER: devops-workshop-gke
+      GCLOUD_ZONE: us-central1-a
+      GCLOUD_PROJECT_ID: devops-workshop-123
+    secure_variables:
+      GCLOUD_SERVICE_KEY: AES:kb7KQ/gJ1VTtYGU6SLUJjA==...
+    pipelines:
+    - PetClinic
+...
 ```
 
-Please note that we once again added some code to parse the GoCD Server URL, set
-the Elastic Profile IDs for both jobs, and removed the arguments from the last
-line. We also added an extra line to the `build-and-publish` job to collect the
-test report artifacts from surefire. This allows us to test that the pipeline
-configuration is getting updated after the Meta pipeline executes.
+Also, make sure to remove those common environment variables and secret
+variables from the stages and jobs to avoid the duplication.
 
-Once again, when you commit and push these changes, the "Meta" pipeline should
-trigger and reconfigure the PetClinic pipeline.
+Once again, when you commit and push these changes, the YAML configuration
+plugin should pick up the changes and trigger a new pipeline execution, where
+you can check that the test results are now available after a successful
+pipeline run.
````
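Putting the pieces of this exercise together, the resulting `PetClinic.gocd.yaml` would follow roughly this shape. This is a sketch, not the literal exported file: stage and job task bodies are elided as comments, the encrypted value is a placeholder, and the exact material/stage attribute names should be checked against the file the plugin actually exports:

```yaml
format_version: 4
environments:
  gcp:
    environment_variables:
      GCLOUD_CLUSTER: devops-workshop-gke
      GCLOUD_ZONE: us-central1-a
      GCLOUD_PROJECT_ID: devops-workshop-123
    secure_variables:
      GCLOUD_SERVICE_KEY: AES:kb7KQ/gJ1VTtYGU6SLUJjA==...
    pipelines:
    - PetClinic
pipelines:
  PetClinic:
    group: sample
    materials:
      git:
        git: https://github.com/dtsato/devops-in-practice-workshop.git
        branch: master
        ignore:
        - PetClinic.gocd.yaml   # so config changes don't re-trigger the pipeline
    stages:
    - commit:
        jobs:
          build-and-publish:
            elastic_profile_id: docker-jdk
            # mvnw package, docker build/tag/push tasks go here
    - deploy:
        jobs:
          deploy:
            elastic_profile_id: kubectl
            # gcloud auth + deploy.sh tasks go here
```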

instructions/14-canary-release.md

Lines changed: 47 additions & 25 deletions

````diff
@@ -10,7 +10,7 @@
 * Create a `web-canary.yml` kubernetes definition to implement canary releases
 * Update the `deploy.sh` script to deploy the canary release only
 * Create a `complete-canary.sh` script to complete the canary release rollout
-* Extend the PetClinic pipeline (using GoMatic) to add a new stage with a manual
+* Extend the PetClinic pipeline to add a new stage with a manual
   approval to complete the canary release using the above scripts
 
 ## Step by Step Instructions
@@ -129,32 +129,54 @@ Don't forget to make the script executable:
 $ chmod a+x complete-canary.sh
 ```
 
-Finally, we can update our `pipelines/pet_clinic_pipeline.py` script to add a
-new manual stage and job to execute the `complete_canary.sh` script, after the
-definition of the `deploy` stage:
+Finally, we can update our `PetClinic.gocd.yaml` pipeline configuration file to
+add a new manual stage and job to execute the `complete-canary.sh` script, after
+the definition of the `deploy` stage:
 
-```python
+```yaml
 ...
 
-stage = pipeline.ensure_stage("approve-canary")
-stage.set_has_manual_approval()
-job = stage\
-  .ensure_job("complete-canary")\
-  .ensure_environment_variables({'GCLOUD_ZONE': 'us-central1-a', 'GCLOUD_PROJECT_ID': 'devops-workshop-123', 'GCLOUD_CLUSTER': 'devops-workshop-gke'})\
-  .ensure_encrypted_environment_variables(secret_variables)
-job.set_elastic_profile_id('kubectl')
-job.add_task(ExecTask(['bash', '-c', 'echo $GCLOUD_SERVICE_KEY | base64 -d > secret.json && chmod 600 secret.json']))
-job.add_task(ExecTask(['bash', '-c', 'gcloud auth activate-service-account --key-file secret.json']))
-job.add_task(ExecTask(['bash', '-c', 'gcloud container clusters get-credentials $GCLOUD_CLUSTER --zone $GCLOUD_ZONE --project $GCLOUD_PROJECT_ID']))
-job.add_task(ExecTask(['bash', '-c', './complete-canary.sh']))
-job.add_task(ExecTask(['bash', '-c', 'rm secret.json']))
-
-configurator.save_updated_config()
+- approve-canary:
+    fetch_materials: true
+    keep_artifacts: false
+    clean_workspace: false
+    approval:
+      type: manual
+    jobs:
+      complete-canary:
+        timeout: 0
+        elastic_profile_id: kubectl
+        tasks:
+        - exec:
+            arguments:
+            - -c
+            - echo $GCLOUD_SERVICE_KEY | base64 -d > secret.json && chmod 600 secret.json
+            command: bash
+            run_if: passed
+        - exec:
+            arguments:
+            - -c
+            - gcloud auth activate-service-account --key-file secret.json
+            command: bash
+            run_if: passed
+        - exec:
+            arguments:
+            - -c
+            - gcloud container clusters get-credentials $GCLOUD_CLUSTER --zone $GCLOUD_ZONE --project $GCLOUD_PROJECT_ID
+            command: bash
+            run_if: passed
+        - exec:
+            command: ./complete-canary.sh
+            run_if: passed
+        - exec:
+            arguments:
+            - -c
+            - rm secret.json
+            command: bash
+            run_if: passed
 ```
 
-Commit and push the changes to the pipeline definition and wait until GoCD is
-updated. Once the pipeline is updated with the new stage, go ahead and commit and
-push the other remaining changes to the kubernetes files and deployment scripts.
-This should trigger the "PetClinic" pipeline and you should see it deploy the
-new version as a canary release. Then you can test a manual approval to complete
-the release.
+Commit and push the changes and wait until GoCD is updated. Once the pipeline is
+updated with the new stage, it will trigger a new execution. You should see it
+deploy the new version as a canary release. Then you can test a manual approval
+to complete the release.
````
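The tasks above rely on a small shell pattern worth understanding: the service-account key lives base64-encoded in a secure variable, gets decoded to an owner-only file for `gcloud`, and is removed at the end. A minimal sketch of that round-trip, using a dummy value rather than a real key:

```shell
#!/bin/sh
# Dummy stand-in for the GCLOUD_SERVICE_KEY secure variable (not a real key):
GCLOUD_SERVICE_KEY=$(printf '{"type":"service_account"}' | base64)

# Decode it to a file with owner-only permissions, as the pipeline tasks do:
echo "$GCLOUD_SERVICE_KEY" | base64 -d > secret.json && chmod 600 secret.json

cat secret.json   # → {"type":"service_account"}
rm secret.json    # clean up afterwards, as the last pipeline task does
```

Decoding to a file (rather than passing the key inline) keeps the secret out of process listings and build logs; deleting it in a final task keeps it off the agent's workspace.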
