
Add segment store data density provider#355

Merged
bhufmann merged 1 commit into eclipse-tracecompass:master from theibhir:segmentstore-provider
Dec 1, 2025

Conversation

@theibhir
Contributor

@theibhir theibhir commented Nov 26, 2025

What it does

  • Add fetchTree and fetchXY
  • Add getXAxisAndYSeriesModels
  • Adapt the response data structure to match the generic XY model
  • Add tests in DataProviderManagerTest
  • Add tests in SegmentStoreDensityDataProviderTest

How to test

  • run the trace server
  • open a trace in vscode-trace-extension
  • open one of the latency density views

Follow-ups

N/A

Review checklist

  • As an author, I have thoroughly tested my changes and carefully followed the instructions in this template

Summary by CodeRabbit

  • New Features

    • Density analysis: hierarchical tree of grouped segments, sampling-based XY generation, and axis metadata exposure
    • Conditional/experiment-aware provider creation and composite-trace support
  • Improvements

    • Renamed label from "Function Density" to "Latency Density" and added descriptive text
    • Improved cancellation handling, progress monitoring, and provider discovery
  • Tests

    • Added/updated latency-density descriptors and tests for tree/XY fetch, success/failure/cancel scenarios


@coderabbitai

coderabbitai bot commented Nov 26, 2025

Walkthrough

Replaces direct XY computation with sampling-based assembly, adds module-aware tree building and a conditional create(...) factory for experiments, exposes X-axis metadata, updates localized title/description to "Latency Density", and extends factory descriptors and tests to include latency-density providers.

Changes

  • Messaging & Externalized Strings (analysis/.../timing/core/segmentstore/Messages.java, analysis/.../timing/core/segmentstore/messages.properties): Added a public SegmentStoreDensityDataProvider_description field and properties entry; changed the SegmentStoreDensityDataProvider_title value from "{0} - Function Density" to "{0} - Latency Density".
  • Core Provider Logic (analysis/.../timing/core/segmentstore/SegmentStoreDensityDataProvider.java): Reworked XY generation into a sampling-based flow (getXAxisAndYSeriesModels(...)); added getXAxisDescription(), sampling/range creation, per-entry binning, and a segment-type ↔ entry-ID BiMap; improved cancellation and module-wait handling in fetchXY/fetchTree; added a create(...) factory for experiment-aware instantiation.
  • Provider Factory & Descriptors (analysis/.../timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java): createProvider now delegates to SegmentStoreDensityDataProvider.create(...) and falls back to scatter/composite providers; added getDescriptors(ITmfTrace) to enumerate ISegmentStoreProvider modules and emit localized DataProviderDescriptor entries.
  • Integration & Unit Tests (releng/.../core.tests/dataproviders/DataProviderManagerTest.java, analysis/.../tests/.../SegmentStoreDensityDataProviderTest.java): Updated tests to expect the new latency-density descriptors; refactored density tests to fetch tree IDs and then call fetchXY with requested items and a time range, added cancelled and null-store scenarios, and validated models/status rather than raw arrays.

Sequence Diagram(s)

sequenceDiagram
    participant View
    participant Factory
    participant DensityProvider as DensityProvider
    participant ScatterProvider as Scatter
    participant CompositeProvider as Composite
    View->>Factory: request createProvider(trace, secondaryId)
    alt Density provider available
        Factory->>DensityProvider: SegmentStoreDensityDataProvider.create(trace, secondaryId)
        DensityProvider-->>Factory: provider instance (non-null)
        Factory-->>View: return DensityProvider
    else single-trace fallback
        Factory->>Scatter: SegmentStoreScatterDataProvider.create(trace, secondaryId)
        Scatter-->>Factory: provider instance
        Factory-->>View: return Scatter
    else multi-trace composite
        Factory->>Composite: TmfTreeXYCompositeDataProvider.create(...)
        Composite-->>Factory: composite instance
        Factory-->>View: return Composite
    end
sequenceDiagram
    participant UI
    participant Provider
    participant Analyzer
    UI->>Provider: fetchTree(...) or fetchXY(request with SelectionTimeQueryFilter)
    Provider->>Analyzer: ensure module analyzed / schedule if needed
    Analyzer-->>Provider: ready or cancelled
    Provider->>Provider: createEvenlyDistributedRanges(start,end,count)
    Provider->>Provider: sample/bin segments into ranges, build X axis + Y series
    alt cancelled
        Provider-->>UI: canceled response
    else success
        Provider-->>UI: return XY model + getXAxisDescription()
    end

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~50 minutes

  • Focus review on SegmentStoreDensityDataProvider.java: sampling/binning correctness, BiMap ID↔name mapping, module wait/scheduling, and cancellation propagation.
  • Review SegmentStoreDensityDataProviderFactory.getDescriptors: descriptor IDs/parent IDs, localization, and deduplication logic.
  • Validate updated tests for correct fetchTree → fetchXY usage and composite ID formatting.

Poem

I nibble samples in moonlit rows, 🐇
I count the hops where latency grows.
Trees sprout IDs in ordered lines,
Bins hum softly with counted times.
Hooray for density—tiny signs.

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Docstring Coverage ⚠️ Warning: Docstring coverage is 37.04%, below the required threshold of 80.00%. Run @coderabbitai generate docstrings to improve coverage.
✅ Passed checks (2 passed)
  • Description Check ✅ Passed: Check skipped because CodeRabbit's high-level summary is enabled.
  • Title check ✅ Passed: The title accurately summarizes the main objective of the pull request: adding a segment store data density provider with new fetchTree and fetchXY capabilities.



@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 4

🧹 Nitpick comments (4)
releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (1)

214-219: Missing new Builder() instantiation before setting descriptor values.

Line 214 reuses the builder from the previous "State System Explorer" descriptor without creating a new instance. While this works due to the previous build() call, it's inconsistent with the pattern used throughout this file. Other descriptors consistently create a new builder before setting values (see lines 219, 225, 231, etc.).

Apply this diff for consistency:

         EXPECTED_KERNEL_DP_DESCRIPTORS.add(builder.build());
+        builder = new DataProviderDescriptor.Builder();
         builder.setName("IRQ Analysis - Latency Density")
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3)

225-227: Add Javadoc for public method.

This public method lacks documentation. Consider adding a Javadoc comment describing its purpose and return value.

+    /**
+     * Get the X-axis description for the density chart.
+     *
+     * @return the X-axis description with duration in nanoseconds
+     */
     public TmfXYAxisDescription getXAxisDescription() {
         return new TmfXYAxisDescription("Duration", "ns", DataType.DURATION);
     }

317-322: Use externalized string for "Total".

The Messages.SegmentStoreDensity_TotalLabel constant exists and should be used here instead of the hardcoded string.

     private static String getSegmentType(ISegment segment) {
         if (segment instanceof INamedSegment) {
             return ((INamedSegment) segment).getName();
         }
-        return "Total";
+        return Objects.requireNonNull(Messages.SegmentStoreDensity_TotalLabel);
     }

330-338: Consider externalizing "Unknown" string.

For i18n consistency, the "Unknown" fallback string should be externalized via the Messages class.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between eb48c52 and 68ca06d.

📒 Files selected for processing (5)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (2 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties (1 hunks)
  • releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (3 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: build / build
  • GitHub Check: build / build
  • GitHub Check: build / build
🔇 Additional comments (9)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1)

155-159: LGTM!

The new SegmentStoreDensityDataProvider_description field follows the established NLS pattern in this class and is consistent with other description fields.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties (1)

31-32: LGTM!

The title and description updates are appropriate for the latency density data provider and align with the naming convention used by other segment store providers.

releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (2)

318-329: LGTM!

The UST density descriptors are correctly added with proper builder instantiation and consistent naming/ID patterns.


351-356: LGTM!

The experiment density descriptor for Event Matching Latency is correctly added.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3)

113-142: LGTM!

The fetchXY implementation correctly handles cancellation, module completion, and creates the XY model with proper sampling support.


145-216: LGTM!

The fetchTree implementation properly builds a hierarchical tree model using grouping aspects, with correct handling of module lifecycle, cancellation, and completion status.


351-377: LGTM!

The createEvenlyDistributedRanges method correctly distributes the remainder across ranges and handles edge cases for the last sample.
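The remainder-spreading behavior described above can be sketched as a standalone Java method. This is a hypothetical simplification: the real provider builds ISampling.Range objects, while here plain long[] pairs stand in, and the method name merely mirrors the one under review.

```java
import java.util.ArrayList;
import java.util.List;

public class EvenRanges {
    /** Split [start, end) into `samples` contiguous ranges whose widths differ by at most 1. */
    static List<long[]> createEvenlyDistributedRanges(long start, long end, int samples) {
        if (samples <= 0 || start >= end) {
            return List.of();
        }
        long totalSpan = end - start;
        long step = totalSpan / samples;
        long remainder = totalSpan % samples;
        List<long[]> ranges = new ArrayList<>(samples);
        long cursor = start;
        for (int i = 0; i < samples; i++) {
            // The first `remainder` ranges each absorb one extra unit,
            // so the whole span is covered with no gaps or overlaps
            long width = step + (i < remainder ? 1 : 0);
            ranges.add(new long[] { cursor, cursor + width });
            cursor += width;
        }
        return ranges;
    }

    public static void main(String[] args) {
        // Splitting 0..10 into 3 ranges yields widths 4, 3, 3
        for (long[] r : createEvenlyDistributedRanges(0, 10, 3)) {
            System.out.println(r[0] + ".." + r[1]);
        }
    }
}
```

Distributing the remainder up front keeps every bin's width within one unit of the others, which is what makes the X axis evenly sampled.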

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (2)

60-62: I'm unable to access the repository directly due to cloning failures, and the file SegmentStoreDensityDataProviderFactory.java doesn't appear in the public Eclipse Trace Compass repository. This file may be:

  1. In a PR branch not yet merged
  2. In a fork or private repository
  3. In development/staging

To complete the verification of this review comment, I need:

Can you provide:

  • The repository URL or link to the PR being reviewed?
  • Direct access to the codebase (or a way to clone it)?
  • Or the content of the SegmentStoreDensityDataProviderFactory.java file and related factory files for comparison?

Alternatively, if you can grant access to the specific repository hosting this PR, I can proceed with a thorough verification of:

  • The full createProvider() method implementation
  • Comparison with SegmentStoreScatterDataProviderFactory to detect copy-paste errors
  • Git history or inline comments explaining the fallback logic
  • Whether returning null would be more semantically correct

Please share the repository context so I can complete the verification.


76-84: Based on my verification, I can now provide the rewritten review comment.

The review comment contains a factually incorrect premise.

According to Eclipse API documentation, NLS.bind() never returns null. It always returns a String; null parameters are converted to the string "null". Therefore, Objects.requireNonNull(NLS.bind(...)) will never throw an NPE due to NLS.bind() returning null.

There is, however, a separate data-quality concern: if analysis.getName() or analysis.getHelpText() returns null, NLS.bind() silently renders it as the string "null", producing a poor user-facing descriptor name. The actual risk is not an exception but invalid placeholder strings stored as descriptor metadata. Validate that the analysis object's methods return meaningful non-null values before binding.

Likely an incorrect or invalid review comment.
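The "never returns null" behavior can be illustrated with java.text.MessageFormat, which renders null arguments the same way NLS.bind does. The bindTitle helper is purely illustrative and is not part of the codebase.

```java
import java.text.MessageFormat;

public class BindNullDemo {
    /** Illustrative stand-in for NLS.bind(Messages.SegmentStoreDensityDataProvider_title, name). */
    static String bindTitle(Object name) {
        // A null argument is formatted as the string "null"; the result is never null
        return MessageFormat.format("{0} - Latency Density", name);
    }

    public static void main(String[] args) {
        System.out.println(bindTitle(null)); // "null - Latency Density"
    }
}
```

This is why the real risk is a descriptor literally named "null - Latency Density", not a NullPointerException.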

Comment on lines +397 to +399
Iterable<ISegmentStoreProvider> modules = TmfTraceUtils.getAnalysisModulesOfClass(trace, ISegmentStoreProvider.class);
Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules, m -> ((IAnalysisModule) m).getId().equals(secondaryId));
Iterator<ISegmentStoreProvider> iterator = filteredModules.iterator();

⚠️ Potential issue | 🟠 Major

Unsafe cast in filter lambda may throw ClassCastException.

The lambda casts m to IAnalysisModule without checking if the ISegmentStoreProvider is actually an IAnalysisModule.

Apply this fix:

         Iterable<ISegmentStoreProvider> modules = TmfTraceUtils.getAnalysisModulesOfClass(trace, ISegmentStoreProvider.class);
-        Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules, m -> ((IAnalysisModule) m).getId().equals(secondaryId));
+        Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules,
+                m -> m instanceof IAnalysisModule && ((IAnalysisModule) m).getId().equals(secondaryId));
🤖 Prompt for AI Agents
In
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java
around lines 397 to 399, the filter lambda unsafely casts m to IAnalysisModule
which can throw ClassCastException; update the predicate to first check m
instanceof IAnalysisModule and only then cast to call getId(), e.g. filter with
m -> (m instanceof IAnalysisModule) && ((IAnalysisModule)
m).getId().equals(secondaryId), so non-IAnalysisModule providers are safely
excluded.
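The guarded-cast pattern recommended above can be shown with stand-in types. SegmentProvider and AnalysisModule here are hypothetical simplifications of ISegmentStoreProvider and IAnalysisModule, used only so the sketch is self-contained.

```java
import java.util.List;
import java.util.stream.Collectors;

public class SafeFilterDemo {
    // Hypothetical stand-ins for ISegmentStoreProvider / IAnalysisModule
    interface SegmentProvider { }
    interface AnalysisModule { String getId(); }

    static class AnalysisBacked implements SegmentProvider, AnalysisModule {
        private final String id;
        AnalysisBacked(String id) { this.id = id; }
        @Override public String getId() { return id; }
    }

    static class PlainProvider implements SegmentProvider { }

    static List<SegmentProvider> filterById(List<SegmentProvider> modules, String secondaryId) {
        // Check instanceof before casting, so providers that are not analysis
        // modules are skipped instead of throwing ClassCastException
        return modules.stream()
                .filter(m -> m instanceof AnalysisModule
                        && ((AnalysisModule) m).getId().equals(secondaryId))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<SegmentProvider> modules = List.of(new AnalysisBacked("irq"), new PlainProvider());
        System.out.println(filterById(modules, "irq").size()); // 1
    }
}
```

Without the instanceof guard, the PlainProvider element would make the same filter throw at runtime.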

@Yibhir0 Yibhir0 force-pushed the segmentstore-provider branch 2 times, most recently from f366077 to b40d78c on November 27, 2025 at 15:15

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

♻️ Duplicate comments (2)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (1)

390-407: Unsafe cast to IAnalysisModule in filter predicate can throw ClassCastException

The filter:

Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules,
        m -> ((IAnalysisModule) m).getId().equals(secondaryId));

assumes every ISegmentStoreProvider is an IAnalysisModule. If that stops being true for any implementation, this lambda will throw at runtime before the instanceof checks later in the method.

You can make this robust (and align with earlier review feedback) by guarding the cast:

-        Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules,
-                m -> ((IAnalysisModule) m).getId().equals(secondaryId));
+        Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules,
+                m -> m instanceof IAnalysisModule
+                        && ((IAnalysisModule) m).getId().equals(secondaryId));
@@
-            ISegmentStoreProvider module = iterator.next();
+            ISegmentStoreProvider module = iterator.next();
@@
-            ((IAnalysisModule) module).schedule();
+            if (module instanceof IAnalysisModule) {
+                ((IAnalysisModule) module).schedule();
+            }
             return new SegmentStoreDensityDataProvider(trace, module, secondaryId);

This avoids surprising runtime failures if a non‑analysis segment store provider is ever introduced.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (1)

68-88: Unconditional cast to IAnalysisModule in descriptor loop is fragile

getDescriptors currently does:

for (ISegmentStoreProvider module : modules) {
    IAnalysisModule analysis = (IAnalysisModule) module;
    if (!existingModules.contains(analysis.getId())) {
        ...
    }
}

This assumes every ISegmentStoreProvider is also an IAnalysisModule. If that invariant is ever broken, descriptor discovery will start throwing ClassCastException at runtime.

You can harden this with an instanceof guard and skip non‑analysis providers:

-        for (ISegmentStoreProvider module : modules) {
-            IAnalysisModule analysis = (IAnalysisModule) module;
-            if (!existingModules.contains(analysis.getId())) {
+        for (ISegmentStoreProvider module : modules) {
+            if (!(module instanceof IAnalysisModule)) {
+                continue;
+            }
+            IAnalysisModule analysis = (IAnalysisModule) module;
+            if (!existingModules.contains(analysis.getId())) {
                 DataProviderDescriptor.Builder builder = new DataProviderDescriptor.Builder();
                 builder.setId(SegmentStoreDensityDataProvider.ID + DataProviderConstants.ID_SEPARATOR + analysis.getId())
                         .setParentId(analysis.getConfiguration() != null ? analysis.getId() : null)
                         .setName(Objects.requireNonNull(NLS.bind(Messages.SegmentStoreDensityDataProvider_title, analysis.getName())))
                         .setDescription(Objects.requireNonNull(NLS.bind(Messages.SegmentStoreDensityDataProvider_description, analysis.getHelpText())))
                         .setProviderType(ProviderType.TREE_GENERIC_XY)
                         .setConfiguration(analysis.getConfiguration());
                 descriptors.add(builder.build());
                 existingModules.add(analysis.getId());
             }
         }

This preserves the current behavior for analysis-backed segment store providers while avoiding unexpected runtime failures.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f366077 and b40d78c.

📒 Files selected for processing (6)
  • analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (4 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (2 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties (1 hunks)
  • releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties
🧰 Additional context used
🧬 Code graph analysis (2)
analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (3)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/filters/TimeQueryFilter.java (1)
  • TimeQueryFilter (35-188)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/tree/TmfTreeDataModel.java (1)
  • TmfTreeDataModel (28-220)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/tree/TmfTreeModel.java (1)
  • TmfTreeModel (33-246)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (5)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/DataProviderDescriptor.java (1)
  • DataProviderDescriptor (30-243)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/xy/TmfTreeXYCompositeDataProvider.java (1)
  • TmfTreeXYCompositeDataProvider (53-197)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (1)
  • SegmentStoreDensityDataProvider (76-408)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreScatterDataProvider.java (1)
  • SegmentStoreScatterDataProvider (82-534)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1)
  • Messages (22-172)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: build / build
  • GitHub Check: build / build
  • GitHub Check: build / build
🔇 Additional comments (2)
releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (1)

214-236: New latency-density descriptors look consistent with factory behavior

The added EXPECTED_* descriptors for kernel, UST, and experiment traces correctly use ProviderType.TREE_GENERIC_XY and IDs of the form SegmentStoreDensityDataProvider.ID + DataProviderConstants.ID_SEPARATOR + analysisId, matching the new factory logic and message format.

Also applies to: 318-329, 351-356

analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (1)

185-188: ID expectation matches new composed provider ID

The expectedId = SegmentStoreDensityDataProvider.ID + ":" + ID assertion correctly reflects the new fID composition in the provider constructor, including the use of DataProviderConstants.ID_SEPARATOR in production code.

Comment on lines 118 to 148
@Test()
public void testDataProviderFetchXY() {
    TimeQueryFilter timeQueryFilter = new TimeQueryFilter(0, 100, 100);
    TmfModelResponse<@NonNull ITmfXyModel> response = fDataProvider.fetchXY(FetchParametersUtils.timeQueryToMap(timeQueryFilter), null);
    // Get tree first to find available entries
    TmfModelResponse<@NonNull TmfTreeModel<@NonNull TmfTreeDataModel>> treeResponse =
            fDataProvider.fetchTree(FetchParametersUtils.timeQueryToMap(new TimeQueryFilter(0, 100, 100)), null);
    assertNotNull(treeResponse);
    TmfTreeModel<@NonNull TmfTreeDataModel> treeModel = treeResponse.getModel();
    assertNotNull(treeModel);
    List<@NonNull TmfTreeDataModel> entries = treeModel.getEntries();

    // Create parameters with selected items
    Map<String, Object> parameters = new HashMap<>();
    parameters.put("start", 0L);
    parameters.put("end", 100L);
    parameters.put("nb_points", 100);
    if (!entries.isEmpty()) {
        parameters.put("requested_items", Collections.singletonList(entries.get(0).getId()));
    } else {
        parameters.put("requested_items", Collections.emptyList());
    }

    TmfModelResponse<@NonNull ITmfXyModel> response = fDataProvider.fetchXY(parameters, null);
    assertNotNull(response);
    ITmfXyModel responseModel = response.getModel();
    assertNotNull(responseModel);
    SeriesModel seriesResponse = (SeriesModel) responseModel.getSeriesData().toArray()[0];
    assertTrue(Arrays.equals(yValues, seriesResponse.getData()));
    assertTrue(Arrays.equals(xValues, seriesResponse.getXAxis()));

    if (!responseModel.getSeriesData().isEmpty()) {
        IYModel yModel = (IYModel) responseModel.getSeriesData().iterator().next();
        assertTrue(Arrays.equals(yValues, yModel.getData()));
    }
}

⚠️ Potential issue | 🟠 Major

requested_items should target a leaf series, not the root node

In testDataProviderFetchXY, requested_items is built from entries.get(0).getId(). Given fetchTree constructs the root node first (parentId = -1) and then grouping/leaf nodes, entries.get(0) is very likely the root. However:

  • Density bins in getXAxisAndYSeriesModels are keyed by series IDs derived from segment types (leaf entries), not the root ID.
  • With the root ID selected, entryToBins gets an array for the root, but getEntryIdForSegmentType returns leaf IDs; no bins are ever incremented for the selected ID, so the test’s yValues comparison will not match the intended series.

To robustly test the real density data, select a non‑root/leaf entry, e.g.:

-        if (!entries.isEmpty()) {
-            parameters.put("requested_items", Collections.singletonList(entries.get(0).getId()));
-        } else {
-            parameters.put("requested_items", Collections.emptyList());
-        }
+        Long requestedId = entries.stream()
+                .filter(e -> e.getParentId() != -1) // skip root
+                .findFirst()
+                .map(TmfTreeDataModel::getId)
+                .orElse(null);
+        if (requestedId != null) {
+            parameters.put("requested_items", Collections.singletonList(requestedId));
+        } else {
+            parameters.put("requested_items", Collections.emptyList());
+        }

(or use a dedicated ROOT_ID constant if available instead of -1).

This ensures the test validates the density for an actual series node.

🤖 Prompt for AI Agents
In
analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java
around lines 118-148, the test builds requested_items using
entries.get(0).getId() which is likely the root node; density bins are keyed by
leaf series IDs so the test never selects a real series and the yValues
comparison can fail. Change the selection to pick a non-root (leaf) entry: scan
entries for the first entry whose parent id != ROOT (or != -1) or otherwise
identify a leaf series id and put that id in requested_items; if none found,
fall back to an empty list. Ensure requested_items contains a leaf series id so
bins are incremented for the selected series.
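The suggested leaf-selection logic can be sketched with a minimal stand-in for TmfTreeDataModel. The Entry record and the ROOT_PARENT constant are assumptions for illustration; the real model uses getId()/getParentId() accessors and may expose a dedicated root-ID constant.

```java
import java.util.List;
import java.util.Optional;

public class LeafSelectionDemo {
    // Minimal stand-in for TmfTreeDataModel: just an id and a parent id
    record Entry(long id, long parentId) { }

    static final long ROOT_PARENT = -1L;

    /** Pick the first non-root entry id, if any, mirroring the suggested test fix. */
    static Optional<Long> firstLeafId(List<Entry> entries) {
        return entries.stream()
                .filter(e -> e.parentId() != ROOT_PARENT) // skip the root node
                .findFirst()
                .map(Entry::id);
    }

    public static void main(String[] args) {
        List<Entry> entries = List.of(
                new Entry(0, ROOT_PARENT), // root
                new Entry(1, 0),           // leaf series
                new Entry(2, 0));
        System.out.println(firstLeafId(entries).orElse(-1L)); // 1
    }
}
```

Selecting a leaf ID ensures requested_items names a series that the density bins are actually keyed on.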

Comment on lines +234 to +273
private @Nullable Pair<@NonNull ISampling, @NonNull Collection<@NonNull IYModel>> getXAxisAndYSeriesModels(
        @NonNull Map<@NonNull String, @NonNull Object> fetchParameters, @Nullable IProgressMonitor monitor) {
    SelectionTimeQueryFilter filter = FetchParametersUtils.createSelectionTimeQueryWithSamples(fetchParameters);
    if (filter == null) {
        return null;
    }
    final ISegmentStore<ISegment> segStore = fProvider.getSegmentStore();
    if (segStore == null) {
        return null;
    }

    // Get segments in time range
    Iterable<ISegment> segments = segStore.getIntersectingElements(filter.getStart(), filter.getEnd());

    // Find max duration
    long maxLength = getMaxDuration();
    if (maxLength <= 0) {
        maxLength = 1;
    }

    int nbSamples = filter.getNumberOfSamples();
    long sampleStart = 0;
    long sampleEnd = maxLength;

    // Create bins based on selected entries
    Map<@NonNull Long, @NonNull Integer> selectedEntries = getSelectedEntries(filter);
    ISampling.Ranges sampling = createEvenlyDistributedRanges(sampleStart, sampleEnd, nbSamples);
    if (sampling == null) {
        return null;
    }

    // Create bins for each selected entry
    Map<Long, double[]> entryToBins = new HashMap<>();
    for (Long entryId : selectedEntries.keySet()) {
        entryToBins.put(entryId, new double[nbSamples]);
    }

    // Process segments and fill bins
    long totalSpan = sampleEnd - sampleStart + 1;
    long step = totalSpan / nbSamples;

⚠️ Potential issue | 🟠 Major

Bin/sampling logic breaks down when nb_points > max segment duration

In getXAxisAndYSeriesModels and createEvenlyDistributedRanges the step size can become 0 when the requested number of samples exceeds the duration span. Consequences:

  • step in getXAxisAndYSeriesModels becomes 0, so the if (step > 0 && …) guard prevents all segments from being binned, yielding an all‑zero density even when segments exist.
  • createEvenlyDistributedRanges with step == 0 can generate ranges where start > end once remainder is exhausted, which is a malformed ISampling.Ranges structure.

Example: maxLength = 50, nb_points = 500 → totalSpan = 51, step = 0; segments are never counted, and many constructed ranges will be invalid.

Consider clamping the effective number of samples to the duration span and ensuring step >= 1 before use, e.g.:

-        int nbSamples = filter.getNumberOfSamples();
+        int nbSamples = filter.getNumberOfSamples();
+        long totalSpan = sampleEnd - sampleStart + 1;
+        if (totalSpan <= 0) {
+            return null;
+        }
+        if (nbSamples > totalSpan) {
+            nbSamples = (int) totalSpan;
+        }
@@
-        long totalSpan = sampleEnd - sampleStart + 1;
-        long step = totalSpan / nbSamples;
+        long step = totalSpan / nbSamples;
         ...

and in createEvenlyDistributedRanges:

-        if (samples <= 0 || start >= end) {
+        if (samples <= 0 || start >= end) {
             return null;
         }
-
-        List<ISampling.@NonNull Range<@NonNull Long>> ranges = new ArrayList<>(samples);
-        long totalSpan = end - start;
-        long step = totalSpan / samples;
-        long remainder = totalSpan % samples;
+        long totalSpan = end - start;
+        int effectiveSamples = (int) Math.min(samples, Math.max(1, totalSpan));
+        List<ISampling.@NonNull Range<@NonNull Long>> ranges = new ArrayList<>(effectiveSamples);
+        long step = totalSpan / effectiveSamples;
+        long remainder = totalSpan % effectiveSamples;
@@
-        for (int i = 0; i < samples; i++) {
+        for (int i = 0; i < effectiveSamples; i++) {

This keeps sampling and bin index computation consistent and avoids zero‑width bins and invalid ranges.

Also applies to: 347-373

🤖 Prompt for AI Agents
In
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java
around lines 234 to 273 (and also apply the same fix to the similar code at
lines ~347-373), the binning logic can produce a zero step and invalid ranges
when the requested number of samples (nbSamples) exceeds the time span; clamp
nbSamples to at most the effective span (totalSpan) so you never request more
bins than time units, and ensure any computed step is at least 1 before using
it; update getXAxisAndYSeriesModels to cap nbSamples = Math.max(1,
Math.min(nbSamples, (int)totalSpan)) (or equivalent) and recompute step =
Math.max(1, totalSpan / nbSamples), and modify createEvenlyDistributedRanges to
guard against step == 0 and to build ranges using the clamped nbSamples so
starts are <= ends and all bins are valid.

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (3)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3)

271-297: Bin/sampling logic breaks when nbSamples exceeds duration span.

This issue was flagged in a previous review but appears unaddressed. When nbSamples > totalSpan:

  • step becomes 0 (Line 273-274)
  • The if (step > 0 && ...) guard at Line 281 prevents all segments from being binned
  • Result: all-zero density chart even when segments exist

Consider clamping nbSamples to the duration span:

         long totalSpan = sampleEnd - sampleStart + 1;
+        if (nbSamples > totalSpan) {
+            nbSamples = (int) totalSpan;
+        }
+        if (nbSamples <= 0) {
+            nbSamples = 1;
+        }
         long step = totalSpan / nbSamples;

This ensures step >= 1 and segments are properly binned.


347-373: Invalid ranges when step is 0.

Continuing from the previous issue: when samples > totalSpan, step becomes 0, causing:

  • rangeEnd = current + step - 1 = current - 1 (Line 360)
  • This creates ranges where rangeStart > rangeEnd

Apply the same fix by clamping effective samples:

         long totalSpan = end - start;
+        int effectiveSamples = (int) Math.min(samples, Math.max(1, totalSpan));
-        long step = totalSpan / samples;
-        long remainder = totalSpan % samples;
+        long step = Math.max(1, totalSpan / effectiveSamples);
+        long remainder = totalSpan % effectiveSamples;

-        for (int i = 0; i < samples; i++) {
+        for (int i = 0; i < effectiveSamples; i++) {

Also update the returned Ranges to reflect the actual number of ranges created.


390-407: Unsafe cast in filter lambda may throw ClassCastException.

Line 394 casts m to IAnalysisModule without checking if the ISegmentStoreProvider implements IAnalysisModule. This was flagged in a previous review.

Apply this fix:

         Iterable<ISegmentStoreProvider> modules = TmfTraceUtils.getAnalysisModulesOfClass(trace, ISegmentStoreProvider.class);
-        Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules, m -> ((IAnalysisModule) m).getId().equals(secondaryId));
+        Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules,
+                m -> m instanceof IAnalysisModule && ((IAnalysisModule) m).getId().equals(secondaryId));

Additionally, the cast at Line 403 should be safe after this fix since the filter ensures only IAnalysisModule instances remain.
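The guarded-cast pattern the review asks for can be shown in isolation. This sketch uses hypothetical stand-in interfaces (`Module` for `ISegmentStoreProvider`, `AnalysisModule` for `IAnalysisModule`) rather than the real Trace Compass types.

```java
import java.util.List;
import java.util.stream.Collectors;

public class SafeFilter {
    public interface Module { }
    public interface AnalysisModule extends Module { String getId(); }

    /** Keeps only modules that are analysis modules with a matching ID. */
    public static List<Module> filterById(List<Module> modules, String secondaryId) {
        return modules.stream()
                // The instanceof guard runs before the cast, so a Module that is
                // not an AnalysisModule is dropped instead of throwing
                // ClassCastException.
                .filter(m -> m instanceof AnalysisModule
                        && ((AnalysisModule) m).getId().equals(secondaryId))
                .collect(Collectors.toList());
    }
}
```

Because `&&` short-circuits, the cast on the right-hand side only executes for instances that already passed the `instanceof` test, which is why the later cast in the method body also becomes safe.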

🧹 Nitpick comments (5)
releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (1)

214-236: Kernel latency density descriptors look consistent with existing provider patterns

The four new kernel latency density descriptors correctly:

  • Use ProviderType.TREE_GENERIC_XY (non-time-based density),
  • Follow the existing SegmentStoreDensityDataProvider:<analysis-id> ID convention,
  • Mirror the existing latency Statistics/Table/XY triplets for IRQ, SWS, Futex, and Syscall.

Reusing the existing builder for the first descriptor (Line 214) is functionally fine, but you may optionally instantiate a fresh DataProviderDescriptor.Builder() there for readability and consistency with neighboring code.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties (1)

31-32: Minor grammatical inconsistency with other descriptions.

The description uses "Shows" while other descriptions in this file use "Show" (e.g., SegmentStoreStatisticsDataProvider_description, SegmentStoreTableDataProvider_description, SegmentStoreScatterGraphDataProvider_description). Consider using "Show" for consistency.

 SegmentStoreDensityDataProvider_title={0} - Latency Density
-SegmentStoreDensityDataProvider_description=Shows latency density distribution by {0}
+SegmentStoreDensityDataProvider_description=Show latency density distribution by {0}
analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (3)

101-128: Test requests all entries including root and grouping nodes.

The test collects all entry IDs from fetchTree including the root trace node and grouping aspect nodes (Lines 107-111). However, getXAxisAndYSeriesModels only populates bins for IDs that map to segment types via fIdToType, which are the leaf entries. The root and grouping nodes won't have corresponding bins incremented.

While the test currently passes because it only checks that some series data exists (Lines 121-127), it doesn't validate that the density values are correct for actual segment types. Consider filtering to request only leaf entries:

-        List<Long> itemIds = treeModel.getEntries().stream().map(TmfTreeDataModel::getId).collect(Collectors.toList());
+        // Skip root (parentId == -1) and grouping nodes to request only leaf series
+        List<Long> itemIds = treeModel.getEntries().stream()
+                .filter(e -> e.getParentId() != -1)
+                .filter(e -> !e.getName().startsWith("group"))
+                .map(TmfTreeDataModel::getId)
+                .collect(Collectors.toList());

Alternatively, add assertions that verify the returned series actually contains non-zero density data for the expected segments.


169-192: Test should explicitly assert expected behavior for null segments.

The conditional at Lines 186-191 means no assertions run if responseModel is null or seriesData is empty. For a null-segment provider, the expected behavior should be explicitly asserted. Consider:

         TmfModelResponse<@NonNull ITmfXyModel> response = fDataProviderNullSegments.fetchXY(parameters, null);
         assertNotNull(response);
+        // For null segments, we expect either no model or empty series data
+        ITmfXyModel responseModel = response.getModel();
+        if (responseModel != null) {
+            // Assert that null segment provider produces empty series
+            assertTrue("Expected empty series for null segment provider",
+                    responseModel.getSeriesData().isEmpty());
+        }
-
-        ITmfXyModel responseModel = response.getModel();
-        if (responseModel != null && !responseModel.getSeriesData().isEmpty()) {
-            ISeriesModel seriesModel = responseModel.getSeriesData().iterator().next();
-            assertNotNull(seriesModel);
-            assertNotNull(seriesModel.getData());
-            assertEquals(100, seriesModel.getData().length);
-        }

This makes the test's expectations explicit rather than silently passing when no data is returned.


197-203: Consider adding assertions for tree structure.

The test only checks that fetchTree returns a non-null response with COMPLETED status. Consider also validating that the tree model contains expected entries:

TmfTreeModel<@NonNull TmfTreeDataModel> treeModel = response.getModel();
assertNotNull(treeModel);
assertFalse("Tree should contain entries", treeModel.getEntries().isEmpty());
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between b40d78c and 36cf0bc.

📒 Files selected for processing (6)
  • analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (3 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (2 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties (1 hunks)
  • releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java
🧰 Additional context used
🧬 Code graph analysis (3)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (7)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/internal/tmf/core/model/TmfXyResponseFactory.java (1)
  • TmfXyResponseFactory (41-251)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/CommonStatusMessage.java (1)
  • CommonStatusMessage (27-93)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/filters/SelectionTimeQueryFilter.java (1)
  • SelectionTimeQueryFilter (29-86)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/tree/TmfTreeDataModel.java (1)
  • TmfTreeDataModel (28-220)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/tree/TmfTreeModel.java (1)
  • TmfTreeModel (33-246)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/xy/TmfXYAxisDescription.java (1)
  • TmfXYAxisDescription (26-143)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/response/TmfModelResponse.java (1)
  • TmfModelResponse (29-79)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (3)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (1)
  • SegmentStoreDensityDataProvider (76-408)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreScatterDataProvider.java (1)
  • SegmentStoreScatterDataProvider (82-534)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1)
  • Messages (22-172)
analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (6)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (1)
  • SegmentStoreDensityDataProvider (76-408)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/internal/tmf/core/model/filters/FetchParametersUtils.java (1)
  • FetchParametersUtils (33-184)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/exceptions/TmfAnalysisException.java (1)
  • TmfAnalysisException (27-70)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/filters/TimeQueryFilter.java (1)
  • TimeQueryFilter (35-188)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/tree/TmfTreeDataModel.java (1)
  • TmfTreeDataModel (28-220)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/tree/TmfTreeModel.java (1)
  • TmfTreeModel (33-246)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: build / build
  • GitHub Check: build / build
  • GitHub Check: build / build
🔇 Additional comments (12)
releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (2)

318-329: UST callstack latency density descriptors correctly extend the UST provider set

The two new UST latency density descriptors:

  • Match naming of the existing “Latency Statistics / Table / vs Time” views,
  • Use ProviderType.TREE_GENERIC_XY and the same SegmentStoreDensityDataProvider prefix,
  • Reference the appropriate legacy and “(new)” callstack analysis IDs.

This should keep the UST expectations in sync with the new density providers without affecting existing tests.


351-356: Event Matching latency density descriptor properly added to experiment expectations

The new “Event Matching Latency - Latency Density” descriptor:

  • Uses the same density naming/description pattern as the other views,
  • Has ProviderType.TREE_GENERIC_XY,
  • Uses the expected SegmentStoreDensityDataProvider:<event-matching-id> ID,
  • Is added only to EXPECTED_EXPERIMENT_SET, which is appropriate since Event Matching is experiment-specific.

No issues from a test-coverage or consistency perspective.

analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (3)

133-140: LGTM!

The test correctly validates that an empty parameter map results in a FAILED status response.


145-164: LGTM!

The cancellation test properly validates that a pre-cancelled monitor results in a CANCELLED status response.


208-212: LGTM!

The test correctly validates the composite ID format matches the provider's ID construction.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (2)

44-47: LGTM!

The createProvider(ITmfTrace) method correctly returns null to indicate that a secondary ID is required for this factory.


68-91: LGTM! The instanceof check at Lines 74-76 properly guards against the ClassCastException that was flagged in a previous review.

One minor observation: the parentId logic at Line 81 sets the parent to analysis.getId() only when getConfiguration() is non-null. Verify this is the intended behavior, as it means most descriptors will have null parent IDs unless they have an associated configuration.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (5)

76-107: LGTM!

The constructor properly initializes the provider with the trace, segment store provider, grouping aspects, and composite ID.


109-139: LGTM!

The fetchXY method properly handles:

  • Filter validation with appropriate error response
  • Module completion waiting with cancellation support
  • Delegation to getXAxisAndYSeriesModels for XY data construction

166-179: Segment filtering may exclude valid segments.

Line 168 filters segments with s.getStart() >= start, which excludes segments that started before the query window but extend into it. This may be intentional for the density view, but it differs from how getXAxisAndYSeriesModels (Line 246) uses getIntersectingElements without this additional filter.

If intentional, consider adding a comment explaining why tree building uses stricter filtering than XY data generation.


222-224: LGTM!

The axis description correctly specifies "Duration" in nanoseconds with DataType.DURATION.


313-345: LGTM!

Helper methods for segment type resolution and ID mapping are correctly implemented with appropriate synchronization on fIdToType.

Comment on lines +63 to +65
return TmfTreeXYCompositeDataProvider.create(traces,
Objects.requireNonNull(Messages.SegmentStoreDensityDataProvider_title),
SegmentStoreDensityDataProvider.ID, secondaryId);

@coderabbitai coderabbitai bot Nov 27, 2025


⚠️ Potential issue | 🟡 Minor

Title format string {0} won't be bound.

Messages.SegmentStoreDensityDataProvider_title contains a format placeholder {0} (e.g., "{0} - Latency Density"), but it's passed directly to TmfTreeXYCompositeDataProvider.create without NLS binding. This will result in a literal "{0} - Latency Density" title for the composite provider.

Consider binding the title with an appropriate name:

         return TmfTreeXYCompositeDataProvider.create(traces,
-                Objects.requireNonNull(Messages.SegmentStoreDensityDataProvider_title),
+                Objects.requireNonNull(NLS.bind(Messages.SegmentStoreDensityDataProvider_title, trace.getName())),
                 SegmentStoreDensityDataProvider.ID, secondaryId);

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java
around lines 63 to 65, the title string
Messages.SegmentStoreDensityDataProvider_title contains a "{0}" placeholder but
is passed raw to TmfTreeXYCompositeDataProvider.create; format the title first
(e.g., with NLS.bind(...) or MessageFormat.format(...)) supplying an appropriate
name such as the trace or secondaryId display name, then pass the formatted,
non-null title to TmfTreeXYCompositeDataProvider.create so the placeholder is
bound correctly.
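For illustration, the placeholder behaviour can be reproduced with `java.text.MessageFormat`, which uses the same `{0}` syntax as Eclipse's `NLS.bind`. The template string mirrors the `messages.properties` entry quoted above; the trace name is a hypothetical example.

```java
import java.text.MessageFormat;

public class TitleBinding {
    // Mirrors SegmentStoreDensityDataProvider_title in messages.properties.
    static final String TITLE_TEMPLATE = "{0} - Latency Density";

    /** Binds the {0} placeholder; passing TITLE_TEMPLATE raw would leave it literal. */
    public static String bindTitle(String traceName) {
        return MessageFormat.format(TITLE_TEMPLATE, traceName);
    }
}
```

Passing the unformatted constant straight to a consumer would display the literal text `{0} - Latency Density`, which is the bug the review describes.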

Contributor


Not needed.


@MatthewKhouzam, understood. I'll note that this title format is acceptable as-is.

Update fetchTree and fetchXY
Add getXAxisAndYSeriesModels
Response data is similar to the genericXY
unit tests generated by Q

Signed-off-by: Yassine Ibhir <yassine.ibhir.ibhir@ericsson.com>
@Yibhir0 Yibhir0 force-pushed the segmentstore-provider branch from 36cf0bc to 145bde3 Compare November 27, 2025 18:57

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (5)
analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (1)

107-111: Test may select root node instead of leaf series.

The test collects all entry IDs including the root node (parentId = -1). The density provider builds bins keyed by leaf series IDs (segment types), so including the root ID in requested_items may not produce meaningful data for that entry.

Consider filtering to select only leaf entries (non-root nodes):

-        List<Long> itemIds = treeModel.getEntries().stream().map(TmfTreeDataModel::getId).collect(Collectors.toList());
+        List<Long> itemIds = treeModel.getEntries().stream()
+                .filter(e -> e.getParentId() != -1) // skip root
+                .map(TmfTreeDataModel::getId)
+                .collect(Collectors.toList());

This ensures the test validates actual density data for series nodes rather than including the root which has no bins.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (1)

63-65: Title format string {0} is not bound.

Messages.SegmentStoreDensityDataProvider_title likely contains a format placeholder (e.g., "{0} - Latency Density"), but it's passed directly without NLS binding. This results in a literal {0} in the composite provider title.

         return TmfTreeXYCompositeDataProvider.create(traces,
-                Objects.requireNonNull(Messages.SegmentStoreDensityDataProvider_title),
+                Objects.requireNonNull(NLS.bind(Messages.SegmentStoreDensityDataProvider_title, trace.getName())),
                 SegmentStoreDensityDataProvider.ID, secondaryId);
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3)

393-395: Unsafe cast in filter lambda may throw ClassCastException.

The lambda casts m to IAnalysisModule without verifying the type. While TmfTraceUtils.getAnalysisModulesOfClass typically returns analysis modules, it's safer to add the check.

         Iterable<ISegmentStoreProvider> modules = TmfTraceUtils.getAnalysisModulesOfClass(trace, ISegmentStoreProvider.class);
-        Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules, m -> ((IAnalysisModule) m).getId().equals(secondaryId));
+        Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules,
+                m -> m instanceof IAnalysisModule && ((IAnalysisModule) m).getId().equals(secondaryId));

270-297: Binning breaks when nbSamples exceeds duration span.

When the requested number of samples exceeds totalSpan (e.g., maxLength=50, nbSamples=500), step becomes 0, and the if (step > 0 && ...) guard prevents all segments from being binned—resulting in all-zero density data.

Consider clamping nbSamples to the duration span:

+        // Clamp samples to avoid zero-width bins
+        if (nbSamples > totalSpan) {
+            nbSamples = (int) Math.max(1, totalSpan);
+        }
+
         long step = totalSpan / nbSamples;
+        if (step <= 0) {
+            step = 1;
+        }

This ensures meaningful binning regardless of the requested sample count.


347-373: Same zero-step issue in createEvenlyDistributedRanges.

When samples > totalSpan, step becomes 0, causing rangeEnd = current + step - 1 = current - 1, which creates invalid ranges where start > end.

Apply the same clamping fix here:

         long totalSpan = end - start;
+        int effectiveSamples = (int) Math.min(samples, Math.max(1, totalSpan));
-        long step = totalSpan / samples;
-        long remainder = totalSpan % samples;
+        long step = totalSpan / effectiveSamples;
+        long remainder = totalSpan % effectiveSamples;
         
         long current = start;
-        for (int i = 0; i < samples; i++) {
+        for (int i = 0; i < effectiveSamples; i++) {
🧹 Nitpick comments (3)
analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (2)

121-127: Conditional assertion weakens test coverage.

The if (!responseModel.getSeriesData().isEmpty()) guard means the test passes silently when no series data is returned. If the intent is to verify actual density data, consider asserting that series data is not empty for valid segments.

-        if (!responseModel.getSeriesData().isEmpty()) {
-            ISeriesModel seriesModel = responseModel.getSeriesData().iterator().next();
+        assertFalse("Expected series data for valid segments", responseModel.getSeriesData().isEmpty());
+        ISeriesModel seriesModel = responseModel.getSeriesData().iterator().next();

This ensures the test fails if the provider unexpectedly returns no data.


186-191: Make null-segment test expectations explicit.

The conditional check makes it unclear what the expected behavior is for null segments. If null segments should produce no series data, assert that explicitly:

         ITmfXyModel responseModel = response.getModel();
-        if (responseModel != null && !responseModel.getSeriesData().isEmpty()) {
-            ISeriesModel seriesModel = responseModel.getSeriesData().iterator().next();
-            assertNotNull(seriesModel);
-            assertNotNull(seriesModel.getData());
-            assertEquals(100, seriesModel.getData().length);
-        }
+        // Null segments provider should return completed response with empty or minimal data
+        assertNotNull(responseModel);
+        // If expecting empty series for null segments:
+        // assertTrue("Expected empty series for null segments", responseModel.getSeriesData().isEmpty());

This makes the test's intent clear and prevents silent passes.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (1)

375-383: Consider caching max duration for large segment stores.

getMaxDuration() iterates through the entire segment store on each call to getXAxisAndYSeriesModels. For large traces, this could be a performance concern.

Consider caching the result or computing it once during initialization if the segment store doesn't change during provider lifetime.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 36cf0bc and 145bde3.

📒 Files selected for processing (6)
  • analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (3 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (2 hunks)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties (1 hunks)
  • releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties
  • analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java
🧰 Additional context used
🧬 Code graph analysis (1)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (5)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/DataProviderDescriptor.java (1)
  • DataProviderDescriptor (30-243)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/xy/TmfTreeXYCompositeDataProvider.java (1)
  • TmfTreeXYCompositeDataProvider (53-197)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (1)
  • SegmentStoreDensityDataProvider (76-408)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreScatterDataProvider.java (1)
  • SegmentStoreScatterDataProvider (82-534)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1)
  • Messages (22-172)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: build / build
  • GitHub Check: build / build
  • GitHub Check: build / build
🔇 Additional comments (11)
releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (3)

214-236: Kernel latency density descriptors are consistent and well-integrated

The four new kernel entries (IRQ, SWS latency, Futex, Syscall) follow the existing naming/description scheme and re-use the same analysis IDs as the corresponding statistics/table/scatter providers, with a clear SegmentStoreDensityDataProvider: prefix and TREE_GENERIC_XY type. This should plug the new density providers cleanly into the manager expectations.


318-329: UST callstack latency density descriptors align with existing UST providers

Both “LTTng-UST CallStack - Latency Density” and “(new) - Latency Density” mirror the existing UST callstack descriptors: same analysis IDs, appropriate TREE_GENERIC_XY provider type, and clear density-specific descriptions. They should appear correctly in the UST and experiment expected sets via the existing addAll logic.


351-356: Event Matching Latency density descriptor correctly added for experiments

The new “Event Matching Latency - Latency Density” descriptor uses the same analysis ID as the existing Event Matching latency providers, with the new density provider prefix and TREE_GENERIC_XY type, and is added only to EXPECTED_EXPERIMENT_SET, which is appropriate for this analysis.

analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (3)

133-140: LGTM!

The test correctly validates that fetchXY returns a FAILED status when required filter parameters are missing.


145-164: LGTM!

The test correctly validates that fetchXY returns a CANCELLED status when passed a pre-cancelled progress monitor.


197-212: LGTM!

Both testFetchTree and testID correctly validate the tree response status and composite ID format respectively.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (2)

59-62: Inconsistent fallback to scatter provider.

When SegmentStoreDensityDataProvider.create() returns null for a single-trace scenario, the factory falls back to SegmentStoreScatterDataProvider.create(). This seems inconsistent since this is the Density data provider factory.

Is this intentional (e.g., scatter is a valid fallback for density), or should this return null to indicate no provider is available?
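
The fallback pattern under discussion can be sketched as follows. All class and method names here are illustrative stand-ins, not the real Trace Compass API; the point is only the shape of the question the reviewer raises: when the density provider cannot be created, the factory silently hands back a scatter provider instead of null.

```java
// Hypothetical sketch of the fallback pattern flagged in the review:
// the density factory first tries its own provider and, when that
// returns null, falls back to the scatter provider rather than null.
interface DataProvider {
    String id();
}

class DensityProvider implements DataProvider {
    public String id() { return "density"; }

    // Returns null when no segment store module is available (simulated here).
    static DataProvider create(boolean moduleAvailable) {
        return moduleAvailable ? new DensityProvider() : null;
    }
}

class ScatterProvider implements DataProvider {
    public String id() { return "scatter"; }

    static DataProvider create() { return new ScatterProvider(); }
}

class DensityFactory {
    static DataProvider createProvider(boolean moduleAvailable) {
        DataProvider provider = DensityProvider.create(moduleAvailable);
        // The behavior questioned above: scatter instead of null.
        return provider != null ? provider : ScatterProvider.create();
    }
}
```

Returning null instead would let the manager report "no provider available" rather than silently serving a different view type.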


68-91: LGTM!

The getDescriptors method correctly:

  • Filters out modules that are not IAnalysisModule instances
  • Deduplicates by module ID
  • Uses NLS.bind for name and description
  • Sets appropriate provider type and configuration
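
The filter-then-deduplicate step can be sketched with stdlib collections. The names below are illustrative, not the real descriptor API; a LinkedHashMap keyed by module ID preserves encounter order while dropping duplicates, which is the behavior the review describes.

```java
import java.util.*;

// Hypothetical sketch of the getDescriptors enumeration pattern:
// keep only objects of the expected module type, then deduplicate
// by module ID while preserving the order modules were seen in.
class DescriptorSketch {
    record Module(String id, String name) {}

    static List<String> descriptorIds(List<Object> candidates) {
        Map<String, Module> byId = new LinkedHashMap<>();
        for (Object candidate : candidates) {
            if (!(candidate instanceof Module module)) {
                continue; // skip anything that is not an analysis module
            }
            byId.putIfAbsent(module.id(), module); // dedupe by module ID
        }
        List<String> ids = new ArrayList<>();
        for (Module module : byId.values()) {
            // The real code would bind a localized name/description here (NLS.bind).
            ids.add(module.id());
        }
        return ids;
    }
}
```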
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3)

109-139: LGTM!

The fetchXY method has proper:

  • Parameter validation with failed response for invalid filters
  • Module completion waiting with cancellation support
  • Clean delegation to getXAxisAndYSeriesModels
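
The guard ordering praised above can be sketched in miniature. Everything here is an assumed simplification (the real method takes a parameter map and a Trace Compass IProgressMonitor): validate the time-range filter first, then honor a cancelled monitor, and only then compute the series.

```java
// Minimal sketch (assumed names) of the fetchXY guard logic:
// FAILED on an invalid filter, CANCELLED on a cancelled monitor,
// COMPLETED otherwise.
class FetchXySketch {
    enum Status { COMPLETED, FAILED, CANCELLED }

    record Response(Status status, double[] series) {}

    interface Monitor {
        boolean isCanceled();
    }

    static Response fetchXY(long[] requestedTimes, Monitor monitor) {
        if (requestedTimes == null || requestedTimes.length < 2) {
            return new Response(Status.FAILED, null); // invalid filter parameters
        }
        if (monitor != null && monitor.isCanceled()) {
            return new Response(Status.CANCELLED, null); // pre-cancelled request
        }
        // Placeholder for the real sampling-based XY computation.
        return new Response(Status.COMPLETED, new double[requestedTimes.length]);
    }
}
```

This is exactly the contract the failure and cancellation tests in SegmentStoreDensityDataProviderTest exercise.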

141-213: LGTM!

The fetchTree method properly:

  • Handles module completion with cancellation support
  • Returns appropriate status codes for null segment store
  • Builds hierarchical tree structure using grouping aspects
  • Includes cancellation checks during segment iteration
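
The grouping-with-cancellation loop can be sketched as below. The real provider groups by segment aspects and builds TmfTreeDataModel entries; this hedged stand-in uses a plain map from group key to segments and a cancellation flag checked on every iteration, which is the structure the review describes.

```java
import java.util.*;
import java.util.function.BooleanSupplier;

// Hypothetical sketch of the fetchTree grouping step: bucket segments
// under parent entries by a grouping key, aborting if cancelled mid-scan.
class TreeSketch {
    record Segment(String group, long start, long end) {}

    static Map<String, List<Segment>> buildTree(List<Segment> segments,
            BooleanSupplier cancelled) {
        Map<String, List<Segment>> tree = new LinkedHashMap<>();
        for (Segment segment : segments) {
            if (cancelled.getAsBoolean()) {
                return Collections.emptyMap(); // abort on cancellation
            }
            tree.computeIfAbsent(segment.group(), g -> new ArrayList<>())
                .add(segment);
        }
        return tree;
    }
}
```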

222-224: LGTM!

The getXAxisDescription method correctly provides axis metadata for the XY model.

Comment on lines +215 to +220
// ISegmentStoreProvider provider = fProvider;
private long getUniqueId(String name) {
synchronized (fIdToType) {
return fIdToType.inverse().computeIfAbsent(name, n -> TRACE_IDS.getAndIncrement());
}
}

🛠️ Refactor suggestion | 🟠 Major

Remove commented-out code.

Line 215 contains a commented-out variable declaration that appears to be dead code.

-    // ISegmentStoreProvider provider = fProvider;
     private long getUniqueId(String name) {
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
// ISegmentStoreProvider provider = fProvider;
private long getUniqueId(String name) {
synchronized (fIdToType) {
return fIdToType.inverse().computeIfAbsent(name, n -> TRACE_IDS.getAndIncrement());
}
}
private long getUniqueId(String name) {
synchronized (fIdToType) {
return fIdToType.inverse().computeIfAbsent(name, n -> TRACE_IDS.getAndIncrement());
}
}
🤖 Prompt for AI Agents
In
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java
around lines 215 to 220, remove the commented-out dead declaration "//
ISegmentStoreProvider provider = fProvider;"; delete the commented line so only
the getUniqueId method remains, and run a quick compile/format to ensure no
whitespace or style violations remain.
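
The getUniqueId pattern itself is worth a closer look. The real code synchronizes on a Guava BiMap and calls computeIfAbsent on its inverse (name-to-id) view; the stdlib approximation below (an assumed simplification, not the Trace Compass code) shows the same idea with a ConcurrentHashMap and an AtomicLong counter.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

// Stdlib sketch of the id-assignment pattern: the same name always
// maps to the same id, and unseen names get the next counter value.
class UniqueIdSketch {
    private static final AtomicLong TRACE_IDS = new AtomicLong();

    private final Map<String, Long> fNameToId = new ConcurrentHashMap<>();

    long getUniqueId(String name) {
        // computeIfAbsent makes the check-then-insert atomic per key.
        return fNameToId.computeIfAbsent(name, n -> TRACE_IDS.getAndIncrement());
    }
}
```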

Contributor

@bhufmann bhufmann left a comment

Looks good to me. I tested it with InAndOut configuration, and it works as well. Thanks for the contribution.

@bhufmann bhufmann merged commit cac9137 into eclipse-tracecompass:master Dec 1, 2025
5 checks passed
@theibhir
Contributor Author

theibhir commented Dec 1, 2025

> Looks good to me. I tested it with InAndOut configuration, and it works as well. Thanks for the contribution.

Thank you for the review and help!

