Workflows/test result reporting #1347
base: master
Conversation
This PR builds on top of #1312 and adds just one commit that refactors result reporting. @smuppand @lumag please take a look. I also pushed the changes to
echo "DEVICE_TYPE=dragonboard-410c" >> dragonboard-410c.ini
echo "BOOT_IMG_FILE=boot-apq8016-sbc-qcom-armv8a.img" >> dragonboard-410c.ini
cat dragonboard-410c.ini
lava-test-plans --dry-run --variables dragonboard-410c.ini --test-plan "${GITHUB_REPOSITORY#*/}/${{ inputs.distro_name }}/${{ inputs.testplan }}" --device-type "dragonboard-410c" --dry-run-path "${JOBS_OUT_PATH}/dragonboard-410c-${{ inputs.distro_name }}-${{ inputs.testplan }}" || true
With the || true on lava-test-plans --dry-run you intentionally ignore failures. If generation fails, the workflow will still upload something (possibly empty) and downstream steps will fail in confusing ways. Better to fail fast when generation fails, unless you are explicitly trying to allow missing device templates.
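A minimal sketch of the fail-fast variant being asked for, reusing the command from the diff above; the --test-plan and --dry-run-path values are shortened here for readability and ${TEST_PLAN} is a placeholder:

```bash
# A GitHub Actions run step fails when its script exits non-zero, so simply
# dropping "|| true" lets a failed generation stop the job instead of
# letting later steps upload (possibly empty) artifacts.
# Values are shortened here; the full flags are in the diff above.
lava-test-plans --dry-run \
  --variables dragonboard-410c.ini \
  --test-plan "${TEST_PLAN}" \
  --device-type "dragonboard-410c" \
  --dry-run-path "${JOBS_OUT_PATH}/dragonboard-410c"
```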
this is the same commit as in #1312. I don't think it's a good idea to change it before it's merged.
I can add another commit on top. We discussed fixing this in #1312. I'll try that now.
This is a bit more complicated. In general lava-test-plans will only fail if it can't render a template, which means no jobs are generated in case of failure. However, this only works when a single test job file is produced. A test plan can contain more than one test job, so a failure may result in some of the expected test job files but not all of them. This should not happen, as the workflows/tests in the lava-test-plans repository render all possible jobs for all existing machines.
We have two options to address the issue:
- only run the workflow on machines included in lava-test-plans (I prefer this option); in this case we can remove || true and assume all matrix workflows will produce something
- keep || true but restrict all test plans to only a single job (I don't think it's a good idea)
I'll implement the 1st option and we can discuss further.
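For illustration, option 1 could be expressed as a guard step along these lines. The devices/ lookup via find and the skip mechanism are assumptions for the sketch, not taken from this PR or from the lava-test-plans CLI:

```bash
# Hypothetical guard: skip this matrix entry if the checked-out
# lava-test-plans tree has no configuration for the device, so the later
# dry run can drop "|| true" and fail loudly on real errors.
DEVICE_TYPE="dragonboard-410c"
LTP_CHECKOUT="lava-test-plans"   # assumed checkout path of the repository

if ! find "${LTP_CHECKOUT}" -type d -name "${DEVICE_TYPE}" | grep -q .; then
  echo "${DEVICE_TYPE} is not covered by lava-test-plans, skipping"
  echo "skip=true" >> "${GITHUB_OUTPUT}"   # gate later steps on this output
  exit 0
fi
```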
This is what happens if I include a device that doesn't exist: https://github.com/qualcomm-linux/meta-qcom/actions/runs/20713210403/job/59469534291
You're right. Remove || true from lava-test-plans --dry-run, add a pre-check that the device template exists, and add a post-check that at least one job file was generated (and is non-empty).
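A sketch of the suggested post-check, assuming the generated job files land under the --dry-run-path directory used in the diff; the variable names here are illustrative:

```bash
# Fail the step if the dry run produced no job files, or only empty ones,
# instead of silently uploading an empty artifact.
out_dir="${JOBS_OUT_PATH}/dragonboard-410c-${DISTRO_NAME}-${TESTPLAN}"
generated=$(find "${out_dir}" -type f -size +0c 2>/dev/null | wc -l)
if [ "${generated}" -eq 0 ]; then
  echo "No LAVA job files were generated in ${out_dir}" >&2
  exit 1
fi
```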
Test run workflow / Test jobs for commit 1e821e4
Use upload-artifact@v6 in all workflows. v6 was released recently. It features support for Node.js 24 and fixes a few deprecation notices in the v4 and v5 code. Signed-off-by: Milosz Wasilewski <milosz.wasilewski@oss.qualcomm.com>
Move test job summary generation to a separate action and improve the overall summary by adding a table with test results. Note: the results table generation carries the assumption that there is only one test result in each test-definition. This might not always be true; once the assumption is broken, the summary generation action will need to be updated. Signed-off-by: Milosz Wasilewski <milosz.wasilewski@oss.qualcomm.com>
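For illustration, the kind of step summary table this commit describes could be produced along these lines; the results.csv input format is invented for the sketch, and the one-result-per-test-definition assumption from the commit message applies:

```bash
# Append a Markdown results table to the job summary shown in the Actions UI.
{
  echo "| Test definition | Result |"
  echo "|---|---|"
  while IFS=, read -r name result; do
    echo "| ${name} | ${result} |"
  done < results.csv   # hypothetical "name,result" per line
} >> "${GITHUB_STEP_SUMMARY}"
```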
Improve the loop preparing the test job list by removing the dependency on the directory structure. This should make the action more robust. Signed-off-by: Milosz Wasilewski <milosz.wasilewski@oss.qualcomm.com>
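A sketch of a layout-independent way to collect the job files; the glob and the output-directory variable are placeholders rather than the action's actual code:

```bash
# Collect every generated job definition wherever it sits under the output
# directory, instead of hard-coding the directory structure.
mapfile -t job_files < <(find "${JOBS_OUT_PATH}" -type f -name '*.yaml' | sort)
for job in "${job_files[@]}"; do
  echo "Found test job: ${job}"
done
```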
Force-pushed from 1e821e4 to e520e7d (compare)
Test run workflow / Test jobs for commit e520e7d
Improve test result reporting in both push- and PR-generated builds.