
Commit e74a002

[ci] New script to generate test reports as Buildkite Annotations (#113447)
The CI builds now send the results of every lit run to a unique file, which means we can read them all to make a combined report for all tests. This report will be shown as an "annotation" in the build results: https://buildkite.com/docs/agent/v3/cli-annotate#creating-an-annotation

Here is an example: https://buildkite.com/llvm-project/github-pull-requests/builds/112660 (make sure it is showing "All" instead of "Failures").

This is an alternative to using the existing Buildkite plugin (https://github.com/buildkite-plugins/junit-annotate-buildkite-plugin), as the plugin:

* Is specific to Buildkite, and we may move away from Buildkite.
* Requires Docker, unless we were to fork it ourselves.
* Does not let you customise the report format unless, again, we make our own fork.

Annotations use GitHub's flavour of Markdown, so the main code in the script generates that text. An extra "style" argument is generated to make the formatting nicer in Buildkite, and "context" is the name of the annotation that will be created. By using different context names for the Linux and Windows results we get two separate annotations.

The script also handles calling the buildkite-agent itself. This makes passing extra arguments to the agent easier than piping the output of this script into the agent. In the future we can remove the agent part and simply use the report content, either printed to stdout or posted as a comment on the GitHub PR.
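For reference, the CI scripts changed below call the new script roughly like this (adapted from the .ci/monolithic-linux.sh change; MONOREPO_ROOT and BUILD_DIR are variables those scripts already define):

# Arguments: report title, annotation context, then any number of JUnit XML files.
python3 "${MONOREPO_ROOT}"/.ci/generate_test_report.py ":linux: Linux x64 Test Results" \
    "linux-x64-test-results" "${BUILD_DIR}"/test-results.*.xml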
1 parent f539d92 commit e74a002


4 files changed: +345 -4 lines changed


.ci/generate_test_report.py

Lines changed: 328 additions & 0 deletions
@@ -0,0 +1,328 @@
# Script to parse many JUnit XML result files and send a report to the buildkite
# agent as an annotation.
#
# To run the unittests:
# python3 -m unittest discover -p generate_test_report.py

import argparse
import unittest
from io import StringIO
from junitparser import JUnitXml, Failure
from textwrap import dedent
from subprocess import check_call


def junit_from_xml(xml):
    return JUnitXml.fromfile(StringIO(xml))


class TestReports(unittest.TestCase):
    def test_title_only(self):
        self.assertEqual(_generate_report("Foo", []), ("", None))

    def test_no_tests_in_testsuite(self):
        self.assertEqual(
            _generate_report(
                "Foo",
                [
                    junit_from_xml(
                        dedent(
                            """\
          <?xml version="1.0" encoding="UTF-8"?>
          <testsuites time="0.00">
          <testsuite name="Empty" tests="0" failures="0" skipped="0" time="0.00">
          </testsuite>
          </testsuites>"""
                        )
                    )
                ],
            ),
            ("", None),
        )

    def test_no_failures(self):
        self.assertEqual(
            _generate_report(
                "Foo",
                [
                    junit_from_xml(
                        dedent(
                            """\
          <?xml version="1.0" encoding="UTF-8"?>
          <testsuites time="0.00">
          <testsuite name="Passed" tests="1" failures="0" skipped="0" time="0.00">
          <testcase classname="Bar/test_1" name="test_1" time="0.00"/>
          </testsuite>
          </testsuites>"""
                        )
                    )
                ],
            ),
            (
                dedent(
                    """\
          # Foo

          * 1 test passed"""
                ),
                "success",
            ),
        )

    def test_report_single_file_single_testsuite(self):
        self.assertEqual(
            _generate_report(
                "Foo",
                [
                    junit_from_xml(
                        dedent(
                            """\
          <?xml version="1.0" encoding="UTF-8"?>
          <testsuites time="8.89">
          <testsuite name="Bar" tests="4" failures="2" skipped="1" time="410.63">
          <testcase classname="Bar/test_1" name="test_1" time="0.02"/>
          <testcase classname="Bar/test_2" name="test_2" time="0.02">
          <skipped message="Reason"/>
          </testcase>
          <testcase classname="Bar/test_3" name="test_3" time="0.02">
          <failure><![CDATA[Output goes here]]></failure>
          </testcase>
          <testcase classname="Bar/test_4" name="test_4" time="0.02">
          <failure><![CDATA[Other output goes here]]></failure>
          </testcase>
          </testsuite>
          </testsuites>"""
                        )
                    )
                ],
            ),
            (
                dedent(
                    """\
          # Foo

          * 1 test passed
          * 1 test skipped
          * 2 tests failed

          ## Failed tests
          (click to see output)

          ### Bar
          <details>
          <summary>Bar/test_3/test_3</summary>

          ```
          Output goes here
          ```
          </details>
          <details>
          <summary>Bar/test_4/test_4</summary>

          ```
          Other output goes here
          ```
          </details>"""
                ),
                "error",
            ),
        )

    MULTI_SUITE_OUTPUT = (
        dedent(
            """\
          # ABC and DEF

          * 1 test passed
          * 1 test skipped
          * 2 tests failed

          ## Failed tests
          (click to see output)

          ### ABC
          <details>
          <summary>ABC/test_2/test_2</summary>

          ```
          ABC/test_2 output goes here
          ```
          </details>

          ### DEF
          <details>
          <summary>DEF/test_2/test_2</summary>

          ```
          DEF/test_2 output goes here
          ```
          </details>"""
        ),
        "error",
    )

    def test_report_single_file_multiple_testsuites(self):
        self.assertEqual(
            _generate_report(
                "ABC and DEF",
                [
                    junit_from_xml(
                        dedent(
                            """\
          <?xml version="1.0" encoding="UTF-8"?>
          <testsuites time="8.89">
          <testsuite name="ABC" tests="2" failures="1" skipped="0" time="410.63">
          <testcase classname="ABC/test_1" name="test_1" time="0.02"/>
          <testcase classname="ABC/test_2" name="test_2" time="0.02">
          <failure><![CDATA[ABC/test_2 output goes here]]></failure>
          </testcase>
          </testsuite>
          <testsuite name="DEF" tests="2" failures="1" skipped="1" time="410.63">
          <testcase classname="DEF/test_1" name="test_1" time="0.02">
          <skipped message="reason"/>
          </testcase>
          <testcase classname="DEF/test_2" name="test_2" time="0.02">
          <failure><![CDATA[DEF/test_2 output goes here]]></failure>
          </testcase>
          </testsuite>
          </testsuites>"""
                        )
                    )
                ],
            ),
            self.MULTI_SUITE_OUTPUT,
        )

    def test_report_multiple_files_multiple_testsuites(self):
        self.assertEqual(
            _generate_report(
                "ABC and DEF",
                [
                    junit_from_xml(
                        dedent(
                            """\
          <?xml version="1.0" encoding="UTF-8"?>
          <testsuites time="8.89">
          <testsuite name="ABC" tests="2" failures="1" skipped="0" time="410.63">
          <testcase classname="ABC/test_1" name="test_1" time="0.02"/>
          <testcase classname="ABC/test_2" name="test_2" time="0.02">
          <failure><![CDATA[ABC/test_2 output goes here]]></failure>
          </testcase>
          </testsuite>
          </testsuites>"""
                        )
                    ),
                    junit_from_xml(
                        dedent(
                            """\
          <?xml version="1.0" encoding="UTF-8"?>
          <testsuites time="8.89">
          <testsuite name="DEF" tests="2" failures="1" skipped="1" time="410.63">
          <testcase classname="DEF/test_1" name="test_1" time="0.02">
          <skipped message="reason"/>
          </testcase>
          <testcase classname="DEF/test_2" name="test_2" time="0.02">
          <failure><![CDATA[DEF/test_2 output goes here]]></failure>
          </testcase>
          </testsuite>
          </testsuites>"""
                        )
                    ),
                ],
            ),
            self.MULTI_SUITE_OUTPUT,
        )


def _generate_report(title, junit_objects):
    style = None

    if not junit_objects:
        return ("", style)

    failures = {}
    tests_run = 0
    tests_skipped = 0
    tests_failed = 0

    for results in junit_objects:
        for testsuite in results:
            tests_run += testsuite.tests
            tests_skipped += testsuite.skipped
            tests_failed += testsuite.failures

            for test in testsuite:
                if (
                    not test.is_passed
                    and test.result
                    and isinstance(test.result[0], Failure)
                ):
                    if failures.get(testsuite.name) is None:
                        failures[testsuite.name] = []
                    failures[testsuite.name].append(
                        (test.classname + "/" + test.name, test.result[0].text)
                    )

    if not tests_run:
        return ("", style)

    style = "error" if tests_failed else "success"
    report = [f"# {title}", ""]

    tests_passed = tests_run - tests_skipped - tests_failed

    def plural(num_tests):
        return "test" if num_tests == 1 else "tests"

    if tests_passed:
        report.append(f"* {tests_passed} {plural(tests_passed)} passed")
    if tests_skipped:
        report.append(f"* {tests_skipped} {plural(tests_skipped)} skipped")
    if tests_failed:
        report.append(f"* {tests_failed} {plural(tests_failed)} failed")

    if failures:
        report.extend(["", "## Failed tests", "(click to see output)"])
        for testsuite_name, failures in failures.items():
            report.extend(["", f"### {testsuite_name}"])
            for name, output in failures:
                report.extend(
                    [
                        "<details>",
                        f"<summary>{name}</summary>",
                        "",
                        "```",
                        output,
                        "```",
                        "</details>",
                    ]
                )

    return "\n".join(report), style


def generate_report(title, junit_files):
    return _generate_report(title, [JUnitXml.fromfile(p) for p in junit_files])


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "title", help="Title of the test report, without Markdown formatting."
    )
    parser.add_argument("context", help="Annotation context to write to.")
    parser.add_argument("junit_files", help="Paths to JUnit report files.", nargs="*")
    args = parser.parse_args()

    report, style = generate_report(args.title, args.junit_files)
    check_call(
        [
            "buildkite-agent",
            "annotate",
            "--context",
            args.context,
            "--style",
            style,
            report,
        ]
    )
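The check_call at the end of the script amounts to running something like the command below by hand. The context, style, and Markdown body shown here are made-up example values, not output captured from a real build:

# Sketch of the buildkite-agent invocation the script performs (example values).
buildkite-agent annotate \
    --context "linux-x64-test-results" \
    --style "success" \
    "$(printf '# Linux x64 Test Results\n\n* 1 test passed')"

Buildkite takes the final positional argument as the annotation body and uses the style to pick its colouring.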

.ci/monolithic-linux.sh

Lines changed: 8 additions & 2 deletions
@@ -28,11 +28,16 @@ if [[ -n "${CLEAR_CACHE:-}" ]]; then
   ccache --clear
 fi
 
-function show-stats {
+function at-exit {
   mkdir -p artifacts
   ccache --print-stats > artifacts/ccache_stats.txt
+
+  # If building fails there will be no results files.
+  shopt -s nullglob
+  python3 "${MONOREPO_ROOT}"/.ci/generate_test_report.py ":linux: Linux x64 Test Results" \
+    "linux-x64-test-results" "${BUILD_DIR}"/test-results.*.xml
 }
-trap show-stats EXIT
+trap at-exit EXIT
 
 projects="${1}"
 targets="${2}"
@@ -42,6 +47,7 @@ lit_args="-v --xunit-xml-output ${BUILD_DIR}/test-results.xml --use-unique-outpu
 echo "--- cmake"
 pip install -q -r "${MONOREPO_ROOT}"/mlir/python/requirements.txt
 pip install -q -r "${MONOREPO_ROOT}"/lldb/test/requirements.txt
+pip install -q -r "${MONOREPO_ROOT}"/.ci/requirements.txt
 cmake -S "${MONOREPO_ROOT}"/llvm -B "${BUILD_DIR}" \
   -D LLVM_ENABLE_PROJECTS="${projects}" \
   -G Ninja \
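The shopt -s nullglob line matters because, with nullglob unset, an unmatched glob is passed through as a literal string, so after a failed build generate_test_report.py would be handed a test-results.*.xml path that does not exist. With nullglob set, the unmatched glob expands to nothing and the report script receives no files at all, the empty case its tests cover. A standalone sketch (illustrative path, assuming no matching files exist):

shopt -u nullglob
echo build/test-results.*.xml    # prints the pattern itself, unexpanded
shopt -s nullglob
echo build/test-results.*.xml    # expands to nothing, prints an empty line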

.ci/monolithic-windows.sh

Lines changed: 8 additions & 2 deletions
@@ -27,17 +27,23 @@ if [[ -n "${CLEAR_CACHE:-}" ]]; then
 fi
 
 sccache --zero-stats
-function show-stats {
+function at-exit {
   mkdir -p artifacts
   sccache --show-stats >> artifacts/sccache_stats.txt
+
+  # If building fails there will be no results files.
+  shopt -s nullglob
+  python "${MONOREPO_ROOT}"/.ci/generate_test_report.py ":windows: Windows x64 Test Results" \
+    "windows-x64-test-results" "${BUILD_DIR}"/test-results.*.xml
 }
-trap show-stats EXIT
+trap at-exit EXIT
 
 projects="${1}"
 targets="${2}"
 
 echo "--- cmake"
 pip install -q -r "${MONOREPO_ROOT}"/mlir/python/requirements.txt
+pip install -q -r "${MONOREPO_ROOT}"/.ci/requirements.txt
 
 # The CMAKE_*_LINKER_FLAGS to disable the manifest come from research
 # on fixing a build reliability issue on the build server, please

.ci/requirements.txt

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
junitparser==3.2.0
