Improve benchmark consistency #27728
Labels
area-CI
area-performance (slowness of app, CPU usage, and/or blank screens)
team-dev-ops (DevOps team)
team-tiger (Tiger team: tech debt reduction + performance improvements)
type-enhancement
What is this about?
We have a "benchmark" CircleCI job that is intended to benchmark UI startup performance, but in practice the results vary wildly between runs. We suspect this is due to the shared hosting environment in which CircleCI jobs run.
We would like to improve the consistency of these benchmark runs. Consistent results are crucial if we want to use them as a "quality gate" that prevents performance regressions.
The suggested solution is to set up a dedicated server for running these benchmark jobs, guaranteeing a more consistent environment (consistent RAM, CPU, and disk resources) for each run and therefore more consistent benchmark results.
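One way to make "consistent enough to gate on" concrete is to check the relative spread of repeated benchmark runs before comparing them against a baseline. A minimal sketch, where the 5% threshold and the sample timings are illustrative assumptions, not measured values:

```python
import statistics

def is_stable(samples_ms, max_cv=0.05):
    """Return True if the coefficient of variation (stdev / mean)
    of the benchmark timings is within max_cv (assumed 5% here)."""
    mean = statistics.mean(samples_ms)
    cv = statistics.stdev(samples_ms) / mean
    return cv <= max_cv

# Hypothetical shared-runner timings (ms): spread is too wide to gate on.
print(is_stable([980, 1450, 1010, 1390, 1205]))   # False

# Hypothetical dedicated-hardware timings: tight enough to compare.
print(is_stable([1001, 1013, 998, 1007, 1005]))   # True
```

A gate like this could fail the job (or mark results as inconclusive) when the spread is too wide, rather than comparing a noisy number against the baseline.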
Scenario
No response
Design
No response
Technical Details
We may be able to do this by using a self-hosted GitHub Actions runner.
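If we go the self-hosted runner route, targeting the dedicated machine is done by registering the runner with a custom label and selecting it via runs-on. A sketch of what the workflow job might look like (the "benchmark" label and script path are hypothetical placeholders):

```yaml
name: benchmark
on: [pull_request]
jobs:
  benchmark:
    # Targets a self-hosted runner registered with a "benchmark" label,
    # so the job always lands on the same dedicated machine.
    runs-on: [self-hosted, benchmark]
    steps:
      - uses: actions/checkout@v4
      # Hypothetical benchmark entry point; replace with the real script.
      - run: ./scripts/run-ui-startup-benchmark.sh
```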
Threat Modeling Framework
No response
Acceptance Criteria
Stakeholder review is needed before the work is merged.
References
Related: #6647