content/en/continuous_integration/_index.md (+9 −7)

@@ -45,29 +45,31 @@ cascade:
## Overview

-Datadog Continuous Integration (CI) Visibility unifies information about CI test and pipeline results in addition to data about CI performance, trends, and reliability. CI Visibility brings CI metrics and data into Datadog dashboards and notebooks so you can communicate the health of your CI environment and focus your efforts on improving your team's ability to deliver quality code every time.
+Datadog Continuous Integration (CI) Visibility provides a unified view of pipeline results, performance, trends, and reliability across your CI environments. By integrating Datadog with your CI pipelines, you can create monitors, display data within [Datadog dashboards][1] and [notebooks][2], and create visualizations for your organization's CI health.
-CI Visibility enables developers to identify the reasons behind a test or pipeline failure, monitor trends in test suite execution times, and see the effect that a given commit has on the pipeline. It also provides build engineers with visibility into cross-organization CI health and trends in pipeline performance over time.
+<br/>
+
+CI Visibility helps developers understand the causes of pipeline disruptions and monitor trends in pipeline execution times. It also offers build engineers insights into cross-organization CI health and pipeline performance over time.

-## Improve test reliability and create traces
+## Improve pipeline reliability and create traces

-CI Visibility helps you troubleshoot test failures and broken builds by connecting the most important development outages to the commits that caused them. You can instrument your tests and generate traces from your testing frameworks as they execute in CI.
+CI Visibility helps you troubleshoot pipeline failures and broken builds by connecting the most significant development outages to the commits that caused them. You can instrument your pipelines and trace them as they execute, enabling deeper insights into pipeline performance.

## Increase efficiency through seamless integrations

-Datadog integrates with the following CI providers to gather pipeline metrics that track performance and results from the moment a commit enters the pipeline until it is ready to be deployed. Use the data aggregated over time to track trends in the performance of tests and builds, and identify what is most important to fix.
+Datadog integrates with a variety of CI providers to collect metrics that track the performance of your CI pipelines from commit to deployment. These metrics are used to identify performance trends and improvement opportunities.
-You can use the `datadog-ci` CLI to [trace commands][8] in your pipelines, as well as the [custom tags and measures commands][9] to add user-defined text and numerical tags in your pipeline traces.
+You can use the `datadog-ci` CLI to [trace commands][8] and add [custom tags and measures][9], allowing you to add user-defined text and numerical tags to your pipeline traces.
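As a rough illustration of how a pipeline step might compose these invocations, here is a small sketch. The `tag` and `measure` subcommands follow the custom tags and measures documentation referenced above, but treat the exact flag names as assumptions rather than an authoritative reference:

```python
import shlex

# Illustrative sketch: build `datadog-ci` command lines that a CI step could
# run to attach user-defined tags and numerical measures. Flag names are
# assumptions based on the docs linked above, not a definitive reference.
def tag_command(level: str, tags: dict) -> str:
    """Command that adds user-defined text tags at the pipeline or job level."""
    parts = ["datadog-ci", "tag", "--level", level]
    for key, value in sorted(tags.items()):
        parts += ["--tags", f"{key}:{value}"]
    return shlex.join(parts)

def measure_command(level: str, measures: dict) -> str:
    """Command that adds user-defined numerical measures."""
    parts = ["datadog-ci", "measure", "--level", level]
    for key, value in sorted(measures.items()):
        parts += ["--measures", f"{key}:{value}"]
    return shlex.join(parts)

print(tag_command("pipeline", {"team": "backend"}))
print(measure_command("job", {"binary_size": 1024}))
```

Generating the commands from one place keeps tag and measure names consistent across pipeline definitions.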

## Ready to start?

-See [Pipeline Visibility][3] and [Test Visibility][4] for instructions on setting up CI Visibility with your CI providers, information about compatibility requirements, and steps for instrumenting and configuring data collection. Then, start exploring details about your test runs and pipeline executions in the [CI Visibility Explorer][7] and export your search query into a [CI Pipeline or Test Monitor][6].
+Visit [Pipeline Visibility][3] for instructions on setting up CI Visibility with your CI providers, including details on compatibility requirements and steps for configuring data collection. Then, start exploring details about your pipeline executions in the [CI Visibility Explorer][7] and export your search query into a [CI Pipeline Monitor][6].
-<div class="alert alert-warning">CI Visibility is not available in the selected site ({{< region-param key="dd_site_name" >}}) at this time.</div>
+<div class="alert alert-warning">Test Visibility is not available in the selected site ({{< region-param key="dd_site_name" >}}) at this time.</div>
{{< /site-region >}}
## Overview

-[Test Visibility][1] provides a test-first view into your CI health by displaying important metrics and results from your tests. It can help you investigate performance problems and test failures that concern you the most because you work on the related code, not because you maintain the pipelines they run in.
+[Test Visibility][1] provides a test-first view into your CI health by displaying important metrics and results from your tests. It can help you investigate the performance problems and test failures that are most relevant to your work, focusing on the code you are responsible for rather than the pipelines that run your tests.

## Setup

-{{< whatsnext desc="Choose a language to set up Test Visibility in Datadog:" >}}
In addition to tests, Test Visibility provides visibility over the whole testing phase of your project.

@@ -68,14 +66,14 @@ In addition to tests, Test Visibility provides visibility over the whole testing
| {{< ci-details title="Source code start/end" >}}Automatic report of the start and end lines of a test.{{< /ci-details >}} | {{< X >}} | {{< X >}} | {{< X >}} (only start) | {{< X >}} | {{< X >}} (only start) | {{< X >}} | {{< X >}} (only start) |
| {{< ci-details title="CI and git info" >}}Automatic collection of git and CI environment metadata, such as CI provider, git commit SHA, or pipeline URL.{{< /ci-details >}} | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} |
| {{< ci-details title="Git metadata upload" >}}Automatic upload of git tree information used for Intelligent Test Runner.{{< /ci-details >}} | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} || {{< X >}} | {{< X >}} |
-| {{< ci-details title="Intelligent Test Runner *" >}}Capability to enable Intelligent Test Runner, which intelligently skips tests based on code coverage and git metadata.{{< /ci-details >}} | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} || {{< X >}} ||
+| {{< ci-details title="Intelligent Test Runner *" >}}Capability to enable Intelligent Test Runner, which intelligently skips tests based on code coverage and git metadata.{{< /ci-details >}} | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} ||
| {{< ci-details title="Code coverage support" >}}Ability to report total code coverage metrics.{{< /ci-details >}} | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} || {{< X >}} | {{< X >}} (manual) |
| {{< ci-details title="Benchmark tests support" >}}Automatic detection of performance statistics for benchmark tests.{{< /ci-details >}} | {{< X >}} ||| {{< X >}} || {{< X >}} ||
| {{< ci-details title="Parameterized tests" >}}Automatic detection of parameterized tests.{{< /ci-details >}} | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} ||
| {{< ci-details title="Early flake detection *" >}}Automatically retry new tests to detect flakiness.{{< /ci-details >}} || {{< X >}} | {{< X >}} |||||
| {{< ci-details title="Selenium RUM integration" >}}Automatically link browser sessions to test cases when testing RUM-instrumented applications.{{< /ci-details >}} | {{< X >}} | {{< X >}} | {{< X >}} |||||

-\*The feature is opt-in and needs to be enabled in the test service settings UI.
+\*The feature is opt-in and needs to be enabled on the [**Test Service Settings** page][5].

## Default configurations

@@ -109,13 +107,12 @@ The following tags are automatically collected to identify test configurations,

### Parameterized test configurations
-When you run parameterized tests, the library detects and reports information about the parameters used.
-Parameters are a part of test configuration, so the same test case executed with different parameters will be considered as two different tests in Test Visibility.
+When you run parameterized tests, the library detects and reports information about the parameters used. Parameters are part of the test configuration, so the same test case executed with different parameters is considered two different tests in Test Visibility.

-If a test parameter is non-deterministic and has a different value every time a test is run, each test execution will be considered a new test in Test Visibility.
-As a consequence, some product features may not work correctly for such tests: history of executions, flakiness detection, Intelligent Test Runner, and others.
+If a test parameter is non-deterministic and has a different value every time a test is run, each test execution is considered a new test in Test Visibility. As a consequence, some product features may not work correctly for such tests: execution history, flakiness detection, Intelligent Test Runner, and others.

Some examples of non-deterministic test parameters are:

- the current date
- a random value
- a value that depends on the test execution environment (such as an absolute file path or the current username)
@@ -141,19 +138,21 @@ Note: Nested `test.configuration` tags, such as `test.configuration.cpu.memory`,

To filter using these configuration tags, [you must create facets for these tags][2].
-## Use CI tests data
-
-{{% ci-information-collected %}}
-
-### Integrations
+## Enhance your developer workflow

-{{< whatsnext desc="Learn about the following integrations with Test Visibility:" >}}
+{{< whatsnext desc="Integrate Test Visibility with tools that report code coverage data, enhance browser tests with RUM, and surface insights across platforms, streamlining issue identification and resolution in your development cycle." >}}
{{< nextlink href="/tests/developer_workflows/" >}}Enhancing Developer Workflows with Datadog{{< /nextlink >}}
{{< nextlink href="/tests/swift_tests" >}}Instrument Swift Tests with Browser RUM{{< /nextlink >}}
{{< /whatsnext >}}

+## Use CI tests data
+
+{{% ci-information-collected %}}
+
+When creating a [dashboard][6] or a [notebook][7], you can use CI test data in your search query, which updates the visualization widget options. For more information, see the [Dashboards][8] and [Notebooks][9] documentation.
## Alert on test data

When you evaluate failed or flaky tests, or the performance of a CI test on the [**Test Runs** page][3], click **Create Monitor** to create a [CI Test monitor][4].

@@ -166,3 +165,8 @@ When you evaluate failed or flaky tests, or the performance of a CI test on the