[META] Analysis, Results Aggregation, and Reporting #102
Closed
Description
OpenSearch Benchmark currently reports various performance metrics and creates a detailed report which can be published to an OpenSearch instance. The goals here are to:
- Create aggregations across many of these metrics to produce a summary report (see the sketch after this list).
- Publish this report to a data store.
- Analysis: the report should include insights about the quality of the test, e.g., whether the run was anomalous or completed successfully.
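A minimal sketch of what the aggregate-and-publish step could look like, using the opensearch-py client. The index names, field names, and summary shape below are illustrative assumptions, not the actual OpenSearch Benchmark schema.

```python
# Hypothetical sketch: aggregate per-run metrics into a summary document
# and publish it back to an OpenSearch data store. Index and field names
# are illustrative assumptions, not the real OpenSearch Benchmark schema.
from statistics import mean, median

from opensearchpy import OpenSearch

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

def summarize_run(test_execution_id: str) -> dict:
    """Pull raw metric documents for one run and compute summary stats."""
    resp = client.search(
        index="benchmark-metrics",  # assumed raw-metrics index
        body={
            "size": 10000,
            "query": {"term": {"test-execution-id": test_execution_id}},
        },
    )
    latencies = [
        hit["_source"]["value"]
        for hit in resp["hits"]["hits"]
        if hit["_source"].get("name") == "latency"  # assumed metric name
    ]
    return {
        "test-execution-id": test_execution_id,
        "sample-count": len(latencies),
        "latency-mean": mean(latencies) if latencies else None,
        "latency-median": median(latencies) if latencies else None,
    }

def publish_summary(summary: dict) -> None:
    """Index the summary into a dedicated summary index for dashboards."""
    client.index(index="benchmark-summaries", body=summary)

publish_summary(summarize_run("nightly-2024-01-01"))  # hypothetical run ID
```

Keeping summaries in their own index (rather than recomputing aggregations in the dashboard) would let the analysis step attach its quality/anomaly verdict to the same document.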
Acceptance Criteria
- We have a comprehensive dashboard that records performance stats on a regular basis and can be reproduced by anyone (e.g., using CDK to create the entire stack; see the CDK sketch after this list)
- We can view associated PRs and other changes corresponding to a nightly build run
- We can view aggregate stats of performance test runs at any point in time
- We can drill into the raw stats corresponding to a specific performance run for deeper analysis at any point in time
- The dashboard is updated automatically after every performance test run
- The dashboard is accessible by everyone (read-only access enabled for anonymous users)
- Automated alerting is enabled to report on regressions/anomalies (see the alerting sketch after this list)
- Admin users have the ability to create new dashboards/visualizations as needed (integrate with OIDC?)
- The dashboard is maintained on a regular basis (up-to-date patches, upgrades, etc.)
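For the reproducible-stack criterion, a CDK app could provision the OpenSearch domain backing the dashboard. A minimal sketch, assuming aws-cdk-lib v2 in Python; the construct names, node counts, and instance type are placeholder assumptions, and access policies / fine-grained access control are omitted.

```python
# Hypothetical CDK sketch (aws-cdk-lib v2): provision an OpenSearch domain
# to back the benchmark dashboard. Sizing and names are assumptions.
from aws_cdk import App, RemovalPolicy, Stack
from aws_cdk import aws_opensearchservice as opensearch
from constructs import Construct

class BenchmarkDashboardStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        opensearch.Domain(
            self,
            "BenchmarkMetricsDomain",
            version=opensearch.EngineVersion.open_search("2.11"),
            capacity=opensearch.CapacityConfig(
                data_nodes=2,
                data_node_instance_type="r6g.large.search",  # placeholder sizing
            ),
            # Retain the metrics store even if the stack is torn down.
            removal_policy=RemovalPolicy.RETAIN,
        )

app = App()
BenchmarkDashboardStack(app, "BenchmarkDashboardStack")
app.synth()
```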
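For the automated-alerting criterion, the OpenSearch Alerting plugin could watch the summary index and fire on regressions. A hedged sketch of registering a query-level monitor through the plugin's REST API via opensearch-py; the index name, field name, threshold, and schedule are illustrative assumptions.

```python
# Hypothetical sketch: register an Alerting-plugin monitor that fires when
# mean latency in the (assumed) summary index exceeds a threshold.
from opensearchpy import OpenSearch

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

monitor = {
    "type": "monitor",
    "name": "benchmark-latency-regression",
    "enabled": True,
    "schedule": {"period": {"interval": 1, "unit": "HOURS"}},
    "inputs": [{
        "search": {
            "indices": ["benchmark-summaries"],  # assumed summary index
            "query": {
                "size": 0,
                # Placeholder threshold (ms); a real monitor might compare
                # against a rolling baseline instead of a fixed value.
                "query": {"range": {"latency-mean": {"gt": 500}}},
            },
        }
    }],
    "triggers": [{
        "name": "latency-above-threshold",
        "severity": "1",
        "condition": {
            "script": {
                "source": "ctx.results[0].hits.total.value > 0",
                "lang": "painless",
            }
        },
        "actions": [],  # notification destinations omitted in this sketch
    }],
}

# The Alerting plugin exposes its API under _plugins/_alerting.
client.transport.perform_request(
    "POST", "/_plugins/_alerting/monitors", body=monitor
)
```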