
4 test failures on OpenBSD x86_64 (chrono-test, gtest-extra-test, xchar-test, posix-mock-test) #3670

Closed
@seanm

Description

I tried building current master (f76603f) on OpenBSD 7.3 on x86_64. The build succeeded, but there were 4 test failures. Verbose output below:

The following tests FAILED:
	  3 - chrono-test (Failed)
	  6 - gtest-extra-test (Failed)
	 17 - xchar-test (Failed)
	 19 - posix-mock-test (Failed)
Errors while running CTest
Output from these tests are in: /home/builder/external/fmt-bin/Testing/Temporary/LastTest.log
Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely.
*** Error 8 in /home/builder/external/fmt-bin (Makefile:91 'test': /usr/local/bin/ctest --force-new-ctest-process --exclude-regex "CMake.Fil...)


kartikeya$ ctest -R chrono-test -V
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
Test project /home/builder/external/fmt-bin
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 3
    Start 3: chrono-test

3: Test command: /home/builder/external/fmt-bin/bin/chrono-test
3: Working Directory: /home/builder/external/fmt-bin/test
3: Test timeout computed to be: 10000000
3: [==========] Running 31 tests from 1 test suite.
3: [----------] Global test environment set-up.
3: [----------] 31 tests from chrono_test
3: [ RUN      ] chrono_test.format_tm
3: [       OK ] chrono_test.format_tm (364 ms)
3: [ RUN      ] chrono_test.format_tm_future
3: [       OK ] chrono_test.format_tm_future (0 ms)
3: [ RUN      ] chrono_test.format_tm_past
3: [       OK ] chrono_test.format_tm_past (0 ms)
3: [ RUN      ] chrono_test.grow_buffer
3: [       OK ] chrono_test.grow_buffer (0 ms)
3: [ RUN      ] chrono_test.format_to_empty_container
3: [       OK ] chrono_test.format_to_empty_container (0 ms)
3: [ RUN      ] chrono_test.empty_result
3: [       OK ] chrono_test.empty_result (0 ms)
3: [ RUN      ] chrono_test.gmtime
3: [       OK ] chrono_test.gmtime (0 ms)
3: [ RUN      ] chrono_test.system_clock_time_point
3: [       OK ] chrono_test.system_clock_time_point (0 ms)
3: [ RUN      ] chrono_test.format_default
3: [       OK ] chrono_test.format_default (0 ms)
3: [ RUN      ] chrono_test.duration_align
3: [       OK ] chrono_test.duration_align (0 ms)
3: [ RUN      ] chrono_test.tm_align
3: [       OK ] chrono_test.tm_align (0 ms)
3: [ RUN      ] chrono_test.tp_align
3: [       OK ] chrono_test.tp_align (0 ms)
3: [ RUN      ] chrono_test.format_specs
3: [       OK ] chrono_test.format_specs (0 ms)
3: [ RUN      ] chrono_test.invalid_specs
3: [       OK ] chrono_test.invalid_specs (3 ms)
3: [ RUN      ] chrono_test.locale
3: ja_JP.utf8 locale is missing.
3: [       OK ] chrono_test.locale (1 ms)
3: [ RUN      ] chrono_test.format_default_fp
3: [       OK ] chrono_test.format_default_fp (0 ms)
3: [ RUN      ] chrono_test.format_precision
3: [       OK ] chrono_test.format_precision (0 ms)
3: [ RUN      ] chrono_test.format_full_specs
3: [       OK ] chrono_test.format_full_specs (0 ms)
3: [ RUN      ] chrono_test.format_simple_q
3: [       OK ] chrono_test.format_simple_q (0 ms)
3: [ RUN      ] chrono_test.format_precision_q
3: [       OK ] chrono_test.format_precision_q (0 ms)
3: [ RUN      ] chrono_test.format_full_specs_q
3: [       OK ] chrono_test.format_full_specs_q (0 ms)
3: [ RUN      ] chrono_test.invalid_width_id
3: [       OK ] chrono_test.invalid_width_id (0 ms)
3: [ RUN      ] chrono_test.invalid_colons
3: [       OK ] chrono_test.invalid_colons (0 ms)
3: [ RUN      ] chrono_test.negative_durations
3: [       OK ] chrono_test.negative_durations (0 ms)
3: [ RUN      ] chrono_test.special_durations
3: [       OK ] chrono_test.special_durations (0 ms)
3: [ RUN      ] chrono_test.unsigned_duration
3: [       OK ] chrono_test.unsigned_duration (0 ms)
3: [ RUN      ] chrono_test.weekday
3: /home/builder/external/fmt/test/chrono-test.cc:755: Failure
3: Value of: (std::vector<std::string>{"пн", "Пн", "пнд", "Пнд"})
3: Expected: contains at least one element that is equal to "Mon"
3:   Actual: { "\xD0\xBF\xD0\xBD"
3:     As Text: "пн", "\xD0\x9F\xD0\xBD"
3:     As Text: "Пн", "\xD0\xBF\xD0\xBD\xD0\xB4"
3:     As Text: "пнд", "\xD0\x9F\xD0\xBD\xD0\xB4"
3:     As Text: "Пнд" }
3: /home/builder/external/fmt/test/chrono-test.cc:757: Failure
3: Value of: (std::vector<std::string>{"пн", "Пн", "пнд", "Пнд"})
3: Expected: contains at least one element that is equal to "Mon"
3:   Actual: { "\xD0\xBF\xD0\xBD"
3:     As Text: "пн", "\xD0\x9F\xD0\xBD"
3:     As Text: "Пн", "\xD0\xBF\xD0\xBD\xD0\xB4"
3:     As Text: "пнд", "\xD0\x9F\xD0\xBD\xD0\xB4"
3:     As Text: "Пнд" }
3: [  FAILED  ] chrono_test.weekday (6 ms)
3: [ RUN      ] chrono_test.cpp20_duration_subsecond_support
3: [       OK ] chrono_test.cpp20_duration_subsecond_support (0 ms)
3: [ RUN      ] chrono_test.timestamps_ratios
3: [       OK ] chrono_test.timestamps_ratios (0 ms)
3: [ RUN      ] chrono_test.timestamps_sub_seconds
3: [       OK ] chrono_test.timestamps_sub_seconds (0 ms)
3: [ RUN      ] chrono_test.glibc_extensions
3: [       OK ] chrono_test.glibc_extensions (0 ms)
3: [----------] 31 tests from chrono_test (379 ms total)
3:
3: [----------] Global test environment tear-down
3: [==========] 31 tests from 1 test suite ran. (381 ms total)
3: [  PASSED  ] 30 tests.
3: [  FAILED  ] 1 test, listed below:
3: [  FAILED  ] chrono_test.weekday
3:
3:  1 FAILED TEST
1/1 Test #3: chrono-test ......................***Failed    0.40 sec

0% tests passed, 1 tests failed out of 1

Total Test time (real) =   0.42 sec

The following tests FAILED:
	  3 - chrono-test (Failed)
Errors while running CTest
Output from these tests are in: /home/builder/external/fmt-bin/Testing/Temporary/LastTest.log
Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely.


kartikeya$ ctest -R gtest-extra-test -V
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
Test project /home/builder/external/fmt-bin
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 6
    Start 6: gtest-extra-test

6: Test command: /home/builder/external/fmt-bin/bin/gtest-extra-test
6: Working Directory: /home/builder/external/fmt-bin/test
6: Test timeout computed to be: 10000000
6: [==========] Running 23 tests from 3 test suites.
6: [----------] Global test environment set-up.
6: [----------] 6 tests from single_evaluation_test
6: [ RUN      ] single_evaluation_test.failed_expect_throw_msg
6: [       OK ] single_evaluation_test.failed_expect_throw_msg (3 ms)
6: [ RUN      ] single_evaluation_test.failed_expect_system_error
6: [       OK ] single_evaluation_test.failed_expect_system_error (1 ms)
6: [ RUN      ] single_evaluation_test.exception_tests
6: [       OK ] single_evaluation_test.exception_tests (0 ms)
6: [ RUN      ] single_evaluation_test.system_error_tests
6: [       OK ] single_evaluation_test.system_error_tests (0 ms)
6: [ RUN      ] single_evaluation_test.failed_expect_write
6: [       OK ] single_evaluation_test.failed_expect_write (1 ms)
6: [ RUN      ] single_evaluation_test.write_tests
6: [       OK ] single_evaluation_test.write_tests (0 ms)
6: [----------] 6 tests from single_evaluation_test (7 ms total)
6:
6: [----------] 11 tests from gtest_extra_test
6: [ RUN      ] gtest_extra_test.expect_write
6: [       OK ] gtest_extra_test.expect_write (1 ms)
6: [ RUN      ] gtest_extra_test.expect_write_streaming
6: [       OK ] gtest_extra_test.expect_write_streaming (0 ms)
6: [ RUN      ] gtest_extra_test.expect_throw_no_unreachable_code_warning
6: [       OK ] gtest_extra_test.expect_throw_no_unreachable_code_warning (0 ms)
6: [ RUN      ] gtest_extra_test.expect_system_error_no_unreachable_code_warning
6: [       OK ] gtest_extra_test.expect_system_error_no_unreachable_code_warning (0 ms)
6: [ RUN      ] gtest_extra_test.expect_throw_behaves_like_single_statement
6: [       OK ] gtest_extra_test.expect_throw_behaves_like_single_statement (0 ms)
6: [ RUN      ] gtest_extra_test.expect_system_error_behaves_like_single_statement
6: [       OK ] gtest_extra_test.expect_system_error_behaves_like_single_statement (0 ms)
6: [ RUN      ] gtest_extra_test.expect_write_behaves_like_single_statement
6: [       OK ] gtest_extra_test.expect_write_behaves_like_single_statement (0 ms)
6: [ RUN      ] gtest_extra_test.expect_throw_msg
6: [       OK ] gtest_extra_test.expect_throw_msg (0 ms)
6: [ RUN      ] gtest_extra_test.expect_system_error
6: [       OK ] gtest_extra_test.expect_system_error (0 ms)
6: [ RUN      ] gtest_extra_test.expect_throw_msg_streaming
6: [       OK ] gtest_extra_test.expect_throw_msg_streaming (0 ms)
6: [ RUN      ] gtest_extra_test.expect_system_error_streaming
6: [       OK ] gtest_extra_test.expect_system_error_streaming (0 ms)
6: [----------] 11 tests from gtest_extra_test (3 ms total)
6:
6: [----------] 6 tests from output_redirect_test
6: [ RUN      ] output_redirect_test.scoped_redirect
6: [       OK ] output_redirect_test.scoped_redirect (1 ms)
6: [ RUN      ] output_redirect_test.flush_error_in_ctor
6: [       OK ] output_redirect_test.flush_error_in_ctor (0 ms)
6: [ RUN      ] output_redirect_test.dup_error_in_ctor
6: /home/builder/external/fmt/test/gtest-extra-test.cc:358: Failure
6: redir.reset(new output_redirect(f.get())) throws an exception with a different message.
6: Expected: cannot duplicate file descriptor 4: Bad file descriptor
6:   Actual: cannot flush stream: Bad file descriptor
6: [  FAILED  ] output_redirect_test.dup_error_in_ctor (0 ms)
6: [ RUN      ] output_redirect_test.restore_and_read
6: [       OK ] output_redirect_test.restore_and_read (0 ms)
6: [ RUN      ] output_redirect_test.flush_error_in_restore_and_read
6: [       OK ] output_redirect_test.flush_error_in_restore_and_read (0 ms)
6: [ RUN      ] output_redirect_test.error_in_dtor
6: [       OK ] output_redirect_test.error_in_dtor (1 ms)
6: [----------] 6 tests from output_redirect_test (5 ms total)
6:
6: [----------] Global test environment tear-down
6: [==========] 23 tests from 3 test suites ran. (18 ms total)
6: [  PASSED  ] 22 tests.
6: [  FAILED  ] 1 test, listed below:
6: [  FAILED  ] output_redirect_test.dup_error_in_ctor
6:
6:  1 FAILED TEST
1/1 Test #6: gtest-extra-test .................***Failed    0.04 sec

0% tests passed, 1 tests failed out of 1

Total Test time (real) =   0.05 sec

The following tests FAILED:
	  6 - gtest-extra-test (Failed)
Errors while running CTest
Output from these tests are in: /home/builder/external/fmt-bin/Testing/Temporary/LastTest.log
Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely.


kartikeya$ ctest -R xchar-test -V
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
Test project /home/builder/external/fmt-bin
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 17
    Start 17: xchar-test

17: Test command: /home/builder/external/fmt-bin/bin/xchar-test
17: Working Directory: /home/builder/external/fmt-bin/test
17: Test timeout computed to be: 10000000
17: [==========] Running 37 tests from 9 test suites.
17: [----------] Global test environment set-up.
17: [----------] 1 test from is_string_test/0, where TypeParam = char
17: [ RUN      ] is_string_test/0.is_string
17: [       OK ] is_string_test/0.is_string (0 ms)
17: [----------] 1 test from is_string_test/0 (0 ms total)
17:
17: [----------] 1 test from is_string_test/1, where TypeParam = wchar_t
17: [ RUN      ] is_string_test/1.is_string
17: [       OK ] is_string_test/1.is_string (0 ms)
17: [----------] 1 test from is_string_test/1 (0 ms total)
17:
17: [----------] 1 test from is_string_test/2, where TypeParam = char16_t
17: [ RUN      ] is_string_test/2.is_string
17: [       OK ] is_string_test/2.is_string (0 ms)
17: [----------] 1 test from is_string_test/2 (0 ms total)
17:
17: [----------] 1 test from is_string_test/3, where TypeParam = char32_t
17: [ RUN      ] is_string_test/3.is_string
17: [       OK ] is_string_test/3.is_string (0 ms)
17: [----------] 1 test from is_string_test/3 (0 ms total)
17:
17: [----------] 21 tests from xchar_test
17: [ RUN      ] xchar_test.format_explicitly_convertible_to_wstring_view
17: [       OK ] xchar_test.format_explicitly_convertible_to_wstring_view (0 ms)
17: [ RUN      ] xchar_test.format
17: [       OK ] xchar_test.format (3 ms)
17: [ RUN      ] xchar_test.is_formattable
17: [       OK ] xchar_test.is_formattable (0 ms)
17: [ RUN      ] xchar_test.compile_time_string
17: [       OK ] xchar_test.compile_time_string (0 ms)
17: [ RUN      ] xchar_test.format_custom_char
17: [       OK ] xchar_test.format_custom_char (0 ms)
17: [ RUN      ] xchar_test.format_utf8_precision
17: [       OK ] xchar_test.format_utf8_precision (0 ms)
17: [ RUN      ] xchar_test.format_to
17: [       OK ] xchar_test.format_to (0 ms)
17: [ RUN      ] xchar_test.vformat_to
17: [       OK ] xchar_test.vformat_to (0 ms)
17: [ RUN      ] xchar_test.format_as
17: [       OK ] xchar_test.format_as (0 ms)
17: [ RUN      ] xchar_test.named_arg_udl
17: [       OK ] xchar_test.named_arg_udl (0 ms)
17: [ RUN      ] xchar_test.print
17: [       OK ] xchar_test.print (0 ms)
17: [ RUN      ] xchar_test.join
17: [       OK ] xchar_test.join (0 ms)
17: [ RUN      ] xchar_test.enum
17: [       OK ] xchar_test.enum (0 ms)
17: [ RUN      ] xchar_test.streamed
17: [       OK ] xchar_test.streamed (0 ms)
17: [ RUN      ] xchar_test.sign_not_truncated
17: [       OK ] xchar_test.sign_not_truncated (0 ms)
17: [ RUN      ] xchar_test.chrono
17: [       OK ] xchar_test.chrono (0 ms)
17: [ RUN      ] xchar_test.color
17: [       OK ] xchar_test.color (0 ms)
17: [ RUN      ] xchar_test.ostream
17: [       OK ] xchar_test.ostream (0 ms)
17: [ RUN      ] xchar_test.format_map
17: [       OK ] xchar_test.format_map (0 ms)
17: [ RUN      ] xchar_test.escape_string
17: [       OK ] xchar_test.escape_string (0 ms)
17: [ RUN      ] xchar_test.to_wstring
17: [       OK ] xchar_test.to_wstring (0 ms)
17: [----------] 21 tests from xchar_test (6 ms total)
17:
17: [----------] 1 test from format_test
17: [ RUN      ] format_test.wide_format_to_n
17: [       OK ] format_test.wide_format_to_n (0 ms)
17: [----------] 1 test from format_test (0 ms total)
17:
17: [----------] 1 test from chrono_test_wchar
17: [ RUN      ] chrono_test_wchar.time_point
17: [       OK ] chrono_test_wchar.time_point (2 ms)
17: [----------] 1 test from chrono_test_wchar (2 ms total)
17:
17: [----------] 9 tests from locale_test
17: [ RUN      ] locale_test.localized_double
17: [       OK ] locale_test.localized_double (0 ms)
17: [ RUN      ] locale_test.format
17: [       OK ] locale_test.format (0 ms)
17: [ RUN      ] locale_test.format_detault_align
17: [       OK ] locale_test.format_detault_align (0 ms)
17: [ RUN      ] locale_test.format_plus
17: [       OK ] locale_test.format_plus (0 ms)
17: [ RUN      ] locale_test.wformat
17: [       OK ] locale_test.wformat (3 ms)
17: [ RUN      ] locale_test.int_formatter
17: [       OK ] locale_test.int_formatter (0 ms)
17: [ RUN      ] locale_test.complex
17: [       OK ] locale_test.complex (0 ms)
17: [ RUN      ] locale_test.chrono_weekday
17: /home/builder/external/fmt/test/xchar-test.cc:627: Failure
17: Value of: (std::vector<std::wstring>{L"\x43F\x43D", L"\x41F\x43D", L"\x43F\x43D\x434", L"\x41F\x43D\x434"})
17: Expected: contains at least one element that is equal to L"Mon"
17:   Actual: { L"\x43F\x43D", L"\x41F\x43D", L"\x43F\x43D\x434", L"\x41F\x43D\x434" }
17: [  FAILED  ] locale_test.chrono_weekday (2 ms)
17: [ RUN      ] locale_test.sign
17: [       OK ] locale_test.sign (0 ms)
17: [----------] 9 tests from locale_test (7 ms total)
17:
17: [----------] 1 test from std_test_xchar
17: [ RUN      ] std_test_xchar.optional
17: [       OK ] std_test_xchar.optional (0 ms)
17: [----------] 1 test from std_test_xchar (0 ms total)
17:
17: [----------] Global test environment tear-down
17: [==========] 37 tests from 9 test suites ran. (18 ms total)
17: [  PASSED  ] 36 tests.
17: [  FAILED  ] 1 test, listed below:
17: [  FAILED  ] locale_test.chrono_weekday
17:
17:  1 FAILED TEST
1/1 Test #17: xchar-test .......................***Failed    0.04 sec

0% tests passed, 1 tests failed out of 1

Total Test time (real) =   0.05 sec

The following tests FAILED:
	 17 - xchar-test (Failed)
Errors while running CTest
Output from these tests are in: /home/builder/external/fmt-bin/Testing/Temporary/LastTest.log
Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely.


kartikeya$ ctest -R posix-mock-test -V
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
Test project /home/builder/external/fmt-bin
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 19
    Start 19: posix-mock-test

19: Test command: /home/builder/external/fmt-bin/bin/posix-mock-test
19: Working Directory: /home/builder/external/fmt-bin/test
19: Test timeout computed to be: 10000000
19: [==========] Running 18 tests from 4 test suites.
19: [----------] Global test environment set-up.
19: [----------] 1 test from os_test
19: [ RUN      ] os_test.getpagesize
19: [       OK ] os_test.getpagesize (4 ms)
19: [----------] 1 test from os_test (4 ms total)
19:
19: [----------] 12 tests from file_test
19: [ RUN      ] file_test.open_retry
19: [       OK ] file_test.open_retry (1 ms)
19: [ RUN      ] file_test.close_no_retry_in_dtor
19: [       OK ] file_test.close_no_retry_in_dtor (1 ms)
19: [ RUN      ] file_test.close_no_retry
19: [       OK ] file_test.close_no_retry (0 ms)
19: [ RUN      ] file_test.size
19: [       OK ] file_test.size (0 ms)
19: [ RUN      ] file_test.max_size
19: [       OK ] file_test.max_size (0 ms)
19: [ RUN      ] file_test.read_retry
19: [       OK ] file_test.read_retry (0 ms)
19: [ RUN      ] file_test.write_retry
19: [       OK ] file_test.write_retry (0 ms)
19: [ RUN      ] file_test.dup_no_retry
19: [       OK ] file_test.dup_no_retry (0 ms)
19: [ RUN      ] file_test.dup2_retry
19: [       OK ] file_test.dup2_retry (0 ms)
19: [ RUN      ] file_test.dup2_no_except_retry
19: [       OK ] file_test.dup2_no_except_retry (0 ms)
19: [ RUN      ] file_test.pipe_no_retry
19: [       OK ] file_test.pipe_no_retry (0 ms)
19: [ RUN      ] file_test.fdopen_no_retry
19: [       OK ] file_test.fdopen_no_retry (0 ms)
19: [----------] 12 tests from file_test (5 ms total)
19:
19: [----------] 4 tests from buffered_file_test
19: [ RUN      ] buffered_file_test.open_retry
19: [       OK ] buffered_file_test.open_retry (0 ms)
19: [ RUN      ] buffered_file_test.close_no_retry_in_dtor
19: [       OK ] buffered_file_test.close_no_retry_in_dtor (0 ms)
19: [ RUN      ] buffered_file_test.close_no_retry
19: [       OK ] buffered_file_test.close_no_retry (0 ms)
19: [ RUN      ] buffered_file_test.fileno_no_retry
19: /home/builder/external/fmt/test/posix-mock-test.cc:435: Failure
19: Expected: (f.descriptor)() throws an exception of type std::system_error.
19:   Actual: it throws nothing.
19: /home/builder/external/fmt/test/posix-mock-test.cc:436: Failure
19: Expected equality of these values:
19:   2
19:   fileno_count
19:     Which is: 1
19: [  FAILED  ] buffered_file_test.fileno_no_retry (1 ms)
19: [----------] 4 tests from buffered_file_test (2 ms total)
19:
19: [----------] 1 test from scoped_mock
19: [ RUN      ] scoped_mock.scope
19: [       OK ] scoped_mock.scope (0 ms)
19: [----------] 1 test from scoped_mock (0 ms total)
19:
19: [----------] Global test environment tear-down
19: [==========] 18 tests from 4 test suites ran. (14 ms total)
19: [  PASSED  ] 17 tests.
19: [  FAILED  ] 1 test, listed below:
19: [  FAILED  ] buffered_file_test.fileno_no_retry
19:
19:  1 FAILED TEST
1/1 Test #19: posix-mock-test ..................***Failed    0.03 sec

0% tests passed, 1 tests failed out of 1

Total Test time (real) =   0.05 sec

The following tests FAILED:
	 19 - posix-mock-test (Failed)
Errors while running CTest
Output from these tests are in: /home/builder/external/fmt-bin/Testing/Temporary/LastTest.log
Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely.
