Improve benchmark consistency #27728

Open
1 of 9 tasks
Gudahtt opened this issue Oct 9, 2024 · 0 comments
Labels
area-CI · area-performance (Issues relating to slowness of app, cpu usage, and/or blank screens) · team-dev-ops (DevOps team) · team-tiger (Tiger team, for tech debt reduction + performance improvements) · type-enhancement

Comments

@Gudahtt (Member) commented on Oct 9, 2024

What is this about?

We have a "benchmark" CircleCI job that is intended to benchmark UI startup performance, but in practice we've found the results to vary wildly between runs. We suspect this is due to the shared hosting environment in which CircleCI jobs are run.

We would like to improve the consistency of these benchmark runs. Consistent results are crucial if we are to use them as a "quality gate" that prevents performance regressions.

The suggested solution is to set up a dedicated server for running these benchmark jobs, guaranteeing a more consistent environment for each run (consistent RAM, CPU, and disk resources) and therefore more consistent benchmark results.

Scenario

No response

Design

No response

Technical Details

We may be able to do this by using a self-hosted GitHub Actions runner.
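
As a rough sketch (not the actual configuration), the benchmark job could be pinned to the dedicated machine via runner labels, with a concurrency group to serialize runs. The `benchmark` runner label, the Node setup step, and the `yarn benchmark` command below are assumptions for illustration only:

```yaml
name: Benchmark

on:
  pull_request:

# Serialize runs so only one benchmark executes at a time on the dedicated runner
concurrency:
  group: ui-benchmark
  cancel-in-progress: false

jobs:
  benchmark:
    # "benchmark" is an assumed custom label pointing at the dedicated machine
    runs-on: [self-hosted, benchmark]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version-file: .nvmrc
      - run: yarn install --immutable
      # Placeholder command; the real benchmark entry point may differ
      - run: yarn benchmark
```

Serializing jobs on a single dedicated runner trades throughput for the resource isolation described in the acceptance criteria below.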

Threat Modeling Framework

No response

Acceptance Criteria

  • Each benchmark job should run with the same resources, with no contention from other test runs
    • Some operating system functions will not be possible to tightly control, but we should make a best-effort attempt at guaranteeing a consistent environment
  • The results should be "consistent enough to be useful". We can test it and evaluate whether it is good enough once we have results.

Stakeholder review needed before the work gets merged

  • Engineering (needed in most cases)
  • Design
  • Product
  • QA (automation tests are required to pass before merging PRs but not all changes are covered by automation tests - please review if QA is needed beyond automation tests)
  • Security
  • Legal
  • Marketing
  • Management (please specify)
  • Other (please specify)

References

Related: #6647

@Gudahtt added the type-enhancement, area-CI, and area-performance labels on Oct 9, 2024
@gauthierpetetin added the team-tiger and team-dev-ops labels on Oct 9, 2024