Description
A good user story should follow the I-N-V-E-S-T principle:
- Independent (of other user stories, so they can be implemented in any order);
- Negotiable (omit details that would freeze the story);
- Valuable (implementation delivers an increment of functionality, observable by and useful to users);
- Estimable (developers should be able to estimate its size relative to other stories);
- Small (implementation fits in one iteration; if it needs several iterations to complete, it is an Epic);
- Testable (user must be able to check the conditions of satisfaction).
As a developer, I'd like a CLI that configures my local test harness, so that I can avoid repetitive manual setup.
Acceptance Criteria
Reference: [Done-Done Checklist](https://github.com/Microsoft/code-with-engineering-playbook/blob/master/Engineering/BestPractices/DoneDone.md)
- Terraform variable and deployment files are auto-generated once the target template name is provided.
- Pre-requisite dependencies are validated via the CLI.
- Artifacts required to run the test harness locally are pulled down via the CLI.
In addition, the following constraints need to be addressed:
- Constraint 1;
- Constraint 2;
- Constraint 3.
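The prerequisite validation in the second criterion above could be sketched as below; note this is a minimal sketch, and the tool names (`terraform`, `az`, `git`) are assumptions for illustration, not confirmed requirements of this harness:

```python
import shutil

def validate_prereqs(tools=("terraform", "az", "git")):
    """Fail fast when a required CLI dependency is not on PATH.

    The default tool list is a placeholder; the real CLI would define
    its own prerequisite set.
    """
    missing = [tool for tool in tools if shutil.which(tool) is None]
    if missing:
        raise RuntimeError("Missing prerequisites: " + ", ".join(missing))
    return True
```

The CLI would run this check before generating any Terraform variable or deployment files, so a missing dependency surfaces immediately rather than mid-deployment.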
Resources
Technical Design Document
Mockups
Tasks
Stories are intended to be completed in a single sprint; if the task breakdown creates additional work, the team should discuss promoting the story to an Epic.
Reference: [Minimal Valuable Slices](https://github.com/Microsoft/code-with-engineering-playbook/blob/master/Engineering/BestPractices/MinimalSlices.md)
Reference: [How to Write Better Tasks](http://agilebutpragmatic.blogspot.com/2012/04/splitting-story-into-tasks-how-to-write.html)
The assignee should break the work down into tasks here.
Command: `run` - setup and test
- bool flag: `--docker`
- download the test runner script to `/tmp/cobalt-testrunner-{date/time}.sh`
- run the script (the script is expected to verify the latest version of the base image)
- output stdout
- clean up downloaded files
- leave the base image in place