Attempt to remedy random CI failures #925

Merged
merged 1 commit into qiskit-community:main on May 16, 2025

Conversation

@woodsp-ibm (Member) commented May 15, 2025

Summary

#903 lists CI failures that occur randomly here. Running the tests locally a few times, I saw a couple of failures. After changing the code I saw no failures, but when I tried again on the unaltered code I could not reproduce any failures either.

As such, I am not sure to what extent, if any, this will remedy the situation. I added the simulator seed, via options, just in case one of the other seeds, which was not set before, is what ends up getting used; hopefully the extra seeds help. The tests still pass with them set, so they seem to do no harm even if they end up not really being used. But let's see how things go. I did not mark this PR to close the random-failure issue, i.e. #903, but rather figured that, if this seems OK and gets merged, we can watch how CI goes over time and close #903 if we see no re-occurrence. If I see any random failure as this goes through CI, then I think I will close this PR, as clearly it would not be remedying things effectively. (Update: it seems to have gone through CI and passed everything OK, with no random failure, so I guess time will tell how effective this update really is.)

Details and comments

Note: EstimatorOptions does have a seed_estimator and I had initially set that too. But I removed it because, when running the tests, it generated a warning that it was not used with the local simulator.
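For context, the general idea behind this fix can be sketched in plain Python. This is a hypothetical illustration, not the actual test code or qiskit-ibm-runtime options touched by this PR: a seedable pseudo-random computation stands in for a simulator-backed test, showing why an unseeded run can disagree between CI runs while a seeded one is deterministic.

```python
import random

def noisy_expectation(seed=None):
    """Hypothetical stand-in for a simulator-backed test quantity.

    A Monte-Carlo-style estimate whose value fluctuates from run to run
    unless the random seed is fixed -- the kind of nondeterminism behind
    the random CI failures described above.
    """
    rng = random.Random(seed)
    samples = [rng.gauss(0.5, 0.1) for _ in range(100)]
    return sum(samples) / len(samples)

# Unseeded: two runs generally differ, so a tight tolerance or exact
# comparison in a test can fail intermittently.
a, b = noisy_expectation(), noisy_expectation()

# Seeded: runs are bit-for-bit reproducible, so the test is deterministic.
c, d = noisy_expectation(seed=42), noisy_expectation(seed=42)
assert c == d
```

Setting a seed that ends up unused (as with the extra seeds in this PR) leaves the result unchanged, which is why the tests still pass with the seeds in place.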

@coveralls

Pull Request Test Coverage Report for Build 15052613295

Details

  • 0 of 0 changed or added relevant lines in 0 files are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage remained the same at 90.828%

Totals:
  • Change from base Build 15049746135: 0.0%
  • Covered Lines: 4486
  • Relevant Lines: 4939

💛 - Coveralls

@edoaltamura (Collaborator) left a comment


Let's see how the CI runs go over the next few days with the seed fixed. I don't remember it making a big difference when I tried in the past, but back then I had fixed the seed in .run() instead.
It'd be great if a new noiseless default for GenericBackendV2 isn't needed (see #903 (comment)).

@edoaltamura edoaltamura merged commit ff55d2c into qiskit-community:main May 16, 2025
16 checks passed
@woodsp-ibm woodsp-ibm deleted the fix_rnd_ci branch May 16, 2025 15:07
@edoaltamura edoaltamura added the stable backport potential The bug might be minimal and/or import enough to be port to stable label May 22, 2025
mergify bot pushed a commit that referenced this pull request May 22, 2025
edoaltamura pushed a commit that referenced this pull request May 22, 2025
(cherry picked from commit ff55d2c)

Co-authored-by: Steve Wood <40241007+woodsp-ibm@users.noreply.github.com>
3 participants