
CI is unpredictably failing #1772

Open
chameco opened this issue Dec 4, 2022 · 1 comment
Labels
tooling: CI Issues involving CI/CD scripts or processes

Comments

@chameco
Contributor

chameco commented Dec 4, 2022

This can be seen clearly on the Nightly builds page, and is almost certainly a consequence of #1746.
It looks like the BLST job is not the only inconsistent one: in some runs the AWSLC job also times out.
In both cases the failure appears to be a Z3 timeout, most likely because we moved from Z3 4.8.10 to 4.8.14.

@RyanGlScott
Contributor

Sigh.

As a short-term fix, I think we should just include both `z3-4.8.10` and `z3-4.8.14` (with `z3` being an alias for `z3-4.8.14`) in a `what4-solvers` bindist, and then we can manually copy `z3-4.8.10` to `z3` before running the AWSLC/BLST jobs.
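
A minimal sketch of what that copy step might look like as a CI shell step, assuming the `what4-solvers` bindist unpacks both binaries into a `bin/` directory that is on the `PATH` (the directory layout here is an assumption, not the actual workflow):

```bash
# Hypothetical fallback step, run just before the AWSLC/BLST jobs.
# Assumes the bindist ships z3-4.8.10 and z3-4.8.14 side by side in ./bin,
# with bin/z3 normally being a copy of 4.8.14.
cp bin/z3-4.8.10 bin/z3
z3 --version   # sanity check: should now report 4.8.10
```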

Alternatively, we could try different, later versions of Z3 on those jobs to find one that doesn't exhibit the nondeterministic timeouts that `z3-4.8.14` does, but that sounds like a tedious way to go about it. Better yet would be to extract the SMT-LIB2 term that is causing the timeout so that we can test it on different Z3 versions without needing to run the full AWSLC/BLST jobs.
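
If the offending query can be captured in a standalone `.smt2` file, a small script along these lines could bisect it across Z3 releases; the file name and the list of versions are assumptions for illustration:

```bash
# Hypothetical bisection loop: replay one extracted SMT-LIB2 query
# against several Z3 binaries and compare wall-clock behavior.
QUERY=timeout-repro.smt2        # assumed name of the extracted query
for ver in 4.8.10 4.8.14 4.9.1 4.11.2; do
  echo "== z3-$ver =="
  time "z3-$ver" -smt2 -T:300 "$QUERY"   # -T: hard timeout in seconds
done
```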

RyanGlScott added a commit to GaloisInc/what4-solvers that referenced this issue Dec 5, 2022
We now include both Z3 4.8.10 and 4.8.14 so that SAW's CI can revert back to
4.8.10 in certain cases, as 4.8.14 has been known to nondeterministically fail
on certain jobs. See GaloisInc/saw-script#1772.
RyanGlScott added a commit that referenced this issue Dec 5, 2022
This updates the `what4-solvers` snapshot to one that includes both Z3 4.8.10
and 4.8.14. While we continue to use 4.8.14 for most CI purposes, we fall back
to 4.8.10 specifically when running the AWSLC or BLST proofs, which have been
known to nondeterministically time out with that version. Hopefully, this
should resolve our CI woes.

This avoids the main issue in #1772, although the underlying cause still has yet
to be investigated.
@RyanGlScott RyanGlScott added the tooling: CI Issues involving CI/CD scripts or processes label Dec 5, 2022
RyanGlScott added a commit that referenced this issue Dec 13, 2022
This updates the `what4-solvers` snapshot to one that includes both Z3 4.8.8
and 4.8.14. While we continue to use 4.8.14 for most CI purposes, we fall back
to 4.8.8 specifically when running the AWSLC or BLST proofs, which have been
known to nondeterministically time out with those versions. Hopefully, this
should resolve our CI woes.

This avoids the main issue in #1772, although the underlying cause still has yet
to be investigated.