Add segment store data density provider #355
bhufmann merged 1 commit into eclipse-tracecompass:master
Conversation
Walkthrough
Replaces direct XY computation with sampling-based assembly, adds module-aware tree building and a conditional create(...) factory for experiments, exposes X-axis metadata, updates the localized title/description to "Latency Density", and extends factory descriptors and tests to include latency-density providers.
Sequence Diagram(s)

sequenceDiagram
participant View
participant Factory
participant DensityProvider as DensityProvider
participant ScatterProvider as Scatter
participant CompositeProvider as Composite
View->>Factory: request createProvider(trace, secondaryId)
alt Density provider available
Factory->>DensityProvider: SegmentStoreDensityDataProvider.create(trace, secondaryId)
DensityProvider-->>Factory: provider instance (non-null)
Factory-->>View: return DensityProvider
else single-trace fallback
Factory->>Scatter: SegmentStoreScatterDataProvider.create(trace, secondaryId)
Scatter-->>Factory: provider instance
Factory-->>View: return Scatter
else multi-trace composite
Factory->>Composite: TmfTreeXYCompositeDataProvider.create(...)
Composite-->>Factory: composite instance
Factory-->>View: return Composite
end
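The fallback chain in the diagram above (density, then scatter, then composite) can be sketched in plain Java. The types here (Provider, the supplier list) are simplified stand-ins for the Trace Compass factory classes, not the real API:

```java
import java.util.List;
import java.util.function.Supplier;

/** Simplified sketch of the factory fallback chain shown in the diagram above. */
public class FactoryChainSketch {
    /** Hypothetical stand-in for an ITmfTreeXYDataProvider instance. */
    public static final class Provider {
        private final String kind;
        public Provider(String kind) { this.kind = kind; }
        public String kind() { return kind; }
    }

    /**
     * Try each candidate factory in order and return the first non-null
     * provider, mirroring density -> scatter -> composite resolution.
     */
    public static Provider createProvider(List<Supplier<Provider>> candidates) {
        for (Supplier<Provider> candidate : candidates) {
            Provider p = candidate.get();
            if (p != null) {
                return p; // first factory that can serve this trace wins
            }
        }
        return null; // no factory could create a provider
    }

    public static void main(String[] args) {
        Provider p = createProvider(List.of(
                () -> null,                        // density unavailable
                () -> new Provider("scatter"),     // single-trace fallback
                () -> new Provider("composite"))); // not reached in this run
        System.out.println(p.kind()); // prints "scatter"
    }
}
```

The key property the review later probes is exactly this ordering: a create(...) that returns null hands control to the next candidate instead of failing.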
sequenceDiagram
participant UI
participant Provider
participant Analyzer
UI->>Provider: fetchTree(...) or fetchXY(request with SelectionTimeQueryFilter)
Provider->>Analyzer: ensure module analyzed / schedule if needed
Analyzer-->>Provider: ready or cancelled
Provider->>Provider: createEvenlyDistributedRanges(start,end,count)
Provider->>Provider: sample/bin segments into ranges, build X axis + Y series
alt cancelled
Provider-->>UI: canceled response
else success
Provider-->>UI: return XY model + getXAxisDescription()
end
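The "sample/bin segments into ranges" step in the diagram amounts to a duration histogram. A minimal standalone sketch (hypothetical names; the provider's actual code works over ISegment and ISampling.Ranges):

```java
/** Minimal duration-histogram sketch of the binning step in the diagram above. */
public class DensityBinSketch {
    /**
     * Count how many durations fall into each of {@code nbSamples} equal-width
     * bins over [0, maxDuration]. Durations past the range land in the last bin.
     */
    public static long[] binDurations(long[] durations, long maxDuration, int nbSamples) {
        long[] bins = new long[nbSamples];
        // Width of each bin; at least 1 so the index computation stays valid.
        long step = Math.max(1, (maxDuration + 1) / nbSamples);
        for (long d : durations) {
            int index = (int) Math.min(d / step, nbSamples - 1L);
            bins[index]++;
        }
        return bins;
    }
}
```

Note the Math.max(1, ...) clamp on the bin width: the reviews below flag exactly the case where an unclamped step truncates to 0.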
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes
Pre-merge checks and finishing touches: ❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 4
🧹 Nitpick comments (4)
releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (1)
214-219: Missing `new Builder()` instantiation before setting descriptor values. Line 214 reuses the builder from the previous "State System Explorer" descriptor without creating a new instance. While this works due to the previous `build()` call, it's inconsistent with the pattern used throughout this file. Other descriptors consistently create a new builder before setting values (see lines 219, 225, 231, etc.). Apply this diff for consistency:

  EXPECTED_KERNEL_DP_DESCRIPTORS.add(builder.build());
+ builder = new DataProviderDescriptor.Builder();
  builder.setName("IRQ Analysis - Latency Density")

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3)
225-227: Add Javadoc for public method. This public method lacks documentation. Consider adding a Javadoc comment describing its purpose and return value.

+ /**
+  * Get the X-axis description for the density chart.
+  *
+  * @return the X-axis description with duration in nanoseconds
+  */
  public TmfXYAxisDescription getXAxisDescription() {
      return new TmfXYAxisDescription("Duration", "ns", DataType.DURATION);
  }
317-322: Use externalized string for "Total". The `Messages.SegmentStoreDensity_TotalLabel` constant exists and should be used here instead of the hardcoded string.

  private static String getSegmentType(ISegment segment) {
      if (segment instanceof INamedSegment) {
          return ((INamedSegment) segment).getName();
      }
-     return "Total";
+     return Objects.requireNonNull(Messages.SegmentStoreDensity_TotalLabel);
  }
330-338: Consider externalizing "Unknown" string.For i18n consistency, the "Unknown" fallback string should be externalized via the Messages class.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (2 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties (1 hunks)
- releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (3 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: build / build
- GitHub Check: build / build
- GitHub Check: build / build
🔇 Additional comments (9)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1)
155-159: LGTM! The new `SegmentStoreDensityDataProvider_description` field follows the established NLS pattern in this class and is consistent with other description fields.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties (1)
31-32: LGTM! The title and description updates are appropriate for the latency density data provider and align with the naming convention used by other segment store providers.
releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (2)
318-329: LGTM! The UST density descriptors are correctly added with proper builder instantiation and consistent naming/ID patterns.

351-356: LGTM! The experiment density descriptor for Event Matching Latency is correctly added.
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3)
113-142: LGTM! The `fetchXY` implementation correctly handles cancellation, module completion, and creates the XY model with proper sampling support.
145-216: LGTM! The `fetchTree` implementation properly builds a hierarchical tree model using grouping aspects, with correct handling of module lifecycle, cancellation, and completion status.
351-377: LGTM! The `createEvenlyDistributedRanges` method correctly distributes the remainder across ranges and handles edge cases for the last sample.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (2)
60-62: I'm unable to access the repository directly due to cloning failures, and the file SegmentStoreDensityDataProviderFactory.java doesn't appear in the public Eclipse Trace Compass repository. This file may be:
- In a PR branch not yet merged
- In a fork or private repository
- In development/staging
To complete the verification of this review comment, I need:
Can you provide:
- The repository URL or link to the PR being reviewed?
- Direct access to the codebase (or a way to clone it)?
- Or the content of the SegmentStoreDensityDataProviderFactory.java file and related factory files for comparison?

Alternatively, if you can grant access to the specific repository hosting this PR, I can proceed with a thorough verification of:
- The full createProvider() method implementation
- Comparison with SegmentStoreScatterDataProviderFactory to detect copy-paste errors
- Git history or inline comments explaining the fallback logic
- Whether returning null would be more semantically correct

Please share the repository context so I can complete the verification.
76-84: Based on my verification, I can now provide the rewritten review comment. The review comment contains a factually incorrect premise.
According to Eclipse API documentation, NLS.bind() never returns null. It always returns a String value: if null parameters are passed, it converts them to the string "null". Therefore, Objects.requireNonNull(NLS.bind(...)) will never throw an NPE due to NLS.bind() returning null.

However, there is a separate data quality concern: if analysis.getName() or analysis.getHelpText() return null, NLS.bind() will silently convert them to the string "null", resulting in poor user-facing descriptor names. The actual risk is not an exception, but storing invalid placeholder strings as descriptor metadata.
Remove the null-safety concern as stated; verify that analysis methods return valid non-null values.
The claim that Objects.requireNonNull() will throw an NPE because NLS.bind() returns null is incorrect: NLS.bind() always returns a String. However, if analysis.getName() or analysis.getHelpText() return null, they will be converted to the string "null" by NLS.bind(), creating poor descriptor names. Validate that the analysis object's methods return meaningful non-null values before binding.

Likely an incorrect or invalid review comment.
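The null-to-"null" substitution is easy to reproduce outside Eclipse: java.text.MessageFormat, which NLS-style binding resembles, behaves the same way. This is only an illustration of the failure mode, not the actual NLS.bind implementation:

```java
import java.text.MessageFormat;

/** Demonstrates how placeholder binding turns a null argument into the string "null". */
public class NullBindSketch {
    public static String bindTitle(String analysisName) {
        // Analogous to NLS.bind(Messages.SegmentStoreDensityDataProvider_title, name):
        // the result is always a non-null String, even for a null argument.
        return MessageFormat.format("{0} - Latency Density", analysisName);
    }

    public static void main(String[] args) {
        System.out.println(bindTitle(null)); // a poor descriptor name, not an NPE
    }
}
```

This is why the concern above is about descriptor quality ("null - Latency Density") rather than a runtime exception.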
    Iterable<ISegmentStoreProvider> modules = TmfTraceUtils.getAnalysisModulesOfClass(trace, ISegmentStoreProvider.class);
    Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules, m -> ((IAnalysisModule) m).getId().equals(secondaryId));
    Iterator<ISegmentStoreProvider> iterator = filteredModules.iterator();
Unsafe cast in filter lambda may throw ClassCastException.
The lambda casts m to IAnalysisModule without checking if the ISegmentStoreProvider is actually an IAnalysisModule.
Apply this fix:
Iterable<ISegmentStoreProvider> modules = TmfTraceUtils.getAnalysisModulesOfClass(trace, ISegmentStoreProvider.class);
- Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules, m -> ((IAnalysisModule) m).getId().equals(secondaryId));
+ Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules,
+ m -> m instanceof IAnalysisModule && ((IAnalysisModule) m).getId().equals(secondaryId));

🤖 Prompt for AI Agents
In
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java
around lines 397 to 399, the filter lambda unsafely casts m to IAnalysisModule
which can throw ClassCastException; update the predicate to first check m
instanceof IAnalysisModule and only then cast to call getId(), e.g. filter with
m -> (m instanceof IAnalysisModule) && ((IAnalysisModule)
m).getId().equals(secondaryId), so non-IAnalysisModule providers are safely
excluded.
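The guarded-cast pattern from this fix can be shown with plain java.util.stream instead of Guava's Iterables.filter. The interfaces here are illustrative stand-ins for ISegmentStoreProvider and IAnalysisModule:

```java
import java.util.List;
import java.util.stream.Collectors;

/** Sketch of filtering with an instanceof guard before a cast. */
public class GuardedFilterSketch {
    interface SegmentProvider { }                 // stand-in for ISegmentStoreProvider
    interface AnalysisModule { String getId(); }  // stand-in for IAnalysisModule

    /** A provider that is also an analysis module, like most real implementations. */
    static final class ModuleProvider implements SegmentProvider, AnalysisModule {
        private final String id;
        ModuleProvider(String id) { this.id = id; }
        @Override public String getId() { return id; }
    }

    /** A provider that is NOT a module; an unguarded cast would throw on this one. */
    static final class PlainProvider implements SegmentProvider { }

    public static List<SegmentProvider> filterById(List<SegmentProvider> providers, String secondaryId) {
        return providers.stream()
                // Guard first: non-module providers are skipped, never cast.
                .filter(p -> p instanceof AnalysisModule
                        && ((AnalysisModule) p).getId().equals(secondaryId))
                .collect(Collectors.toList());
    }
}
```

With the guard, a PlainProvider in the stream is simply excluded; without it, the same input throws ClassCastException at the first non-module element.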
f366077 to b40d78c
Actionable comments posted: 3
♻️ Duplicate comments (2)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (1)
390-407: Unsafe cast to IAnalysisModule in filter predicate can throw ClassCastException. The filter:
    Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules, m -> ((IAnalysisModule) m).getId().equals(secondaryId));

assumes every ISegmentStoreProvider is an IAnalysisModule. If that stops being true for any implementation, this lambda will throw at runtime before the instanceof checks later in the method. You can make this robust (and align with earlier review feedback) by guarding the cast:
- Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules,
-         m -> ((IAnalysisModule) m).getId().equals(secondaryId));
+ Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules,
+         m -> m instanceof IAnalysisModule
+                 && ((IAnalysisModule) m).getId().equals(secondaryId));
@@
  ISegmentStoreProvider module = iterator.next();
@@
- ((IAnalysisModule) module).schedule();
+ if (module instanceof IAnalysisModule) {
+     ((IAnalysisModule) module).schedule();
+ }
  return new SegmentStoreDensityDataProvider(trace, module, secondaryId);

This avoids surprising runtime failures if a non-analysis segment store provider is ever introduced.
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (1)
68-88: Unconditional cast to IAnalysisModule in descriptor loop is fragile. getDescriptors currently does:

    for (ISegmentStoreProvider module : modules) {
        IAnalysisModule analysis = (IAnalysisModule) module;
        if (!existingModules.contains(analysis.getId())) { ... }
    }

This assumes every ISegmentStoreProvider is also an IAnalysisModule. If that invariant is ever broken, descriptor discovery will start throwing ClassCastException at runtime. You can harden this with an instanceof guard and skip non-analysis providers:

- for (ISegmentStoreProvider module : modules) {
-     IAnalysisModule analysis = (IAnalysisModule) module;
-     if (!existingModules.contains(analysis.getId())) {
+ for (ISegmentStoreProvider module : modules) {
+     if (!(module instanceof IAnalysisModule)) {
+         continue;
+     }
+     IAnalysisModule analysis = (IAnalysisModule) module;
+     if (!existingModules.contains(analysis.getId())) {
          DataProviderDescriptor.Builder builder = new DataProviderDescriptor.Builder();
          builder.setId(SegmentStoreDensityDataProvider.ID + DataProviderConstants.ID_SEPARATOR + analysis.getId())
                  .setParentId(analysis.getConfiguration() != null ? analysis.getId() : null)
                  .setName(Objects.requireNonNull(NLS.bind(Messages.SegmentStoreDensityDataProvider_title, analysis.getName())))
                  .setDescription(Objects.requireNonNull(NLS.bind(Messages.SegmentStoreDensityDataProvider_description, analysis.getHelpText())))
                  .setProviderType(ProviderType.TREE_GENERIC_XY)
                  .setConfiguration(analysis.getConfiguration());
          descriptors.add(builder.build());
          existingModules.add(analysis.getId());
      }
  }

This preserves the current behavior for analysis-backed segment store providers while avoiding unexpected runtime failures.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (6)
- analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (4 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (2 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties (1 hunks)
- releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties
🧰 Additional context used
🧬 Code graph analysis (2)
analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (3)
- tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/filters/TimeQueryFilter.java (1): TimeQueryFilter (35-188)
- tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/tree/TmfTreeDataModel.java (1): TmfTreeDataModel (28-220)
- tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/tree/TmfTreeModel.java (1): TmfTreeModel (33-246)

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (5)
- tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/DataProviderDescriptor.java (1): DataProviderDescriptor (30-243)
- tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/xy/TmfTreeXYCompositeDataProvider.java (1): TmfTreeXYCompositeDataProvider (53-197)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (1): SegmentStoreDensityDataProvider (76-408)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreScatterDataProvider.java (1): SegmentStoreScatterDataProvider (82-534)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1): Messages (22-172)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: build / build
- GitHub Check: build / build
- GitHub Check: build / build
🔇 Additional comments (2)
releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (1)
214-236: New latency-density descriptors look consistent with factory behavior. The added EXPECTED_* descriptors for kernel, UST, and experiment traces correctly use ProviderType.TREE_GENERIC_XY and IDs of the form SegmentStoreDensityDataProvider.ID + DataProviderConstants.ID_SEPARATOR + analysisId, matching the new factory logic and message format.

Also applies to: 318-329, 351-356
analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (1)
185-188: ID expectation matches the new composed provider ID. The expectedId = SegmentStoreDensityDataProvider.ID + ":" + ID assertion correctly reflects the new fID composition in the provider constructor, including the use of DataProviderConstants.ID_SEPARATOR in production code.
    @Test()
    public void testDataProviderFetchXY() {
        TimeQueryFilter timeQueryFilter = new TimeQueryFilter(0, 100, 100);
        TmfModelResponse<@NonNull ITmfXyModel> response = fDataProvider.fetchXY(FetchParametersUtils.timeQueryToMap(timeQueryFilter), null);
        // Get tree first to find available entries
        TmfModelResponse<@NonNull TmfTreeModel<@NonNull TmfTreeDataModel>> treeResponse =
                fDataProvider.fetchTree(FetchParametersUtils.timeQueryToMap(new TimeQueryFilter(0, 100, 100)), null);
        assertNotNull(treeResponse);
        TmfTreeModel<@NonNull TmfTreeDataModel> treeModel = treeResponse.getModel();
        assertNotNull(treeModel);
        List<@NonNull TmfTreeDataModel> entries = treeModel.getEntries();

        // Create parameters with selected items
        Map<String, Object> parameters = new HashMap<>();
        parameters.put("start", 0L);
        parameters.put("end", 100L);
        parameters.put("nb_points", 100);
        if (!entries.isEmpty()) {
            parameters.put("requested_items", Collections.singletonList(entries.get(0).getId()));
        } else {
            parameters.put("requested_items", Collections.emptyList());
        }

        TmfModelResponse<@NonNull ITmfXyModel> response = fDataProvider.fetchXY(parameters, null);
        assertNotNull(response);
        ITmfXyModel responseModel = response.getModel();
        assertNotNull(responseModel);
        SeriesModel seriesResponse = (SeriesModel) responseModel.getSeriesData().toArray()[0];
        assertTrue(Arrays.equals(yValues, seriesResponse.getData()));
        assertTrue(Arrays.equals(xValues, seriesResponse.getXAxis()));

        if (!responseModel.getSeriesData().isEmpty()) {
            IYModel yModel = (IYModel) responseModel.getSeriesData().iterator().next();
            assertTrue(Arrays.equals(yValues, yModel.getData()));
        }
    }
requested_items should target a leaf series, not the root node
In testDataProviderFetchXY, requested_items is built from entries.get(0).getId(). Given fetchTree constructs the root node first (parentId = -1) and then grouping/leaf nodes, entries.get(0) is very likely the root. However:
- Density bins in getXAxisAndYSeriesModels are keyed by series IDs derived from segment types (leaf entries), not the root ID.
- With the root ID selected, entryToBins gets an array for the root, but getEntryIdForSegmentType returns leaf IDs; no bins are ever incremented for the selected ID, so the test's yValues comparison will not match the intended series.
To robustly test the real density data, select a non‑root/leaf entry, e.g.:
- if (!entries.isEmpty()) {
- parameters.put("requested_items", Collections.singletonList(entries.get(0).getId()));
- } else {
- parameters.put("requested_items", Collections.emptyList());
- }
+ Long requestedId = entries.stream()
+ .filter(e -> e.getParentId() != -1) // skip root
+ .findFirst()
+ .map(TmfTreeDataModel::getId)
+ .orElse(null);
+ if (requestedId != null) {
+ parameters.put("requested_items", Collections.singletonList(requestedId));
+ } else {
+ parameters.put("requested_items", Collections.emptyList());
+ }(or use a dedicated ROOT_ID constant if available instead of -1).
This ensures the test validates the density for an actual series node.
🤖 Prompt for AI Agents
In
analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java
around lines 118-148, the test builds requested_items using
entries.get(0).getId() which is likely the root node; density bins are keyed by
leaf series IDs so the test never selects a real series and the yValues
comparison can fail. Change the selection to pick a non-root (leaf) entry: scan
entries for the first entry whose parent id != ROOT (or != -1) or otherwise
identify a leaf series id and put that id in requested_items; if none found,
fall back to an empty list. Ensure requested_items contains a leaf series id so
bins are incremented for the selected series.
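The suggested leaf-selection logic can be sketched over plain (id, parentId) pairs. EntryStub is hypothetical, standing in for TmfTreeDataModel:

```java
import java.util.List;
import java.util.Optional;

/** Sketch of picking the first non-root tree entry to use as requested_items. */
public class LeafSelectSketch {
    /** Hypothetical stand-in for TmfTreeDataModel: just an id and a parent id. */
    public record EntryStub(long id, long parentId) { }

    /** Return the id of the first entry whose parent is not the root marker -1. */
    public static Optional<Long> firstNonRootId(List<EntryStub> entries) {
        return entries.stream()
                .filter(e -> e.parentId() != -1) // skip the root node
                .findFirst()
                .map(EntryStub::id);
    }
}
```

An empty Optional corresponds to the review's fallback case, where requested_items should be an empty list.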
    private @Nullable Pair<@NonNull ISampling, @NonNull Collection<@NonNull IYModel>> getXAxisAndYSeriesModels(
            @NonNull Map<@NonNull String, @NonNull Object> fetchParameters, @Nullable IProgressMonitor monitor) {
        SelectionTimeQueryFilter filter = FetchParametersUtils.createSelectionTimeQueryWithSamples(fetchParameters);
        if (filter == null) {
            return null;
        }
        final ISegmentStore<ISegment> segStore = fProvider.getSegmentStore();
        if (segStore == null) {
            return null;
        }

        // Get segments in time range
        Iterable<ISegment> segments = segStore.getIntersectingElements(filter.getStart(), filter.getEnd());

        // Find max duration
        long maxLength = getMaxDuration();
        if (maxLength <= 0) {
            maxLength = 1;
        }

        int nbSamples = filter.getNumberOfSamples();
        long sampleStart = 0;
        long sampleEnd = maxLength;

        // Create bins based on selected entries
        Map<@NonNull Long, @NonNull Integer> selectedEntries = getSelectedEntries(filter);
        ISampling.Ranges sampling = createEvenlyDistributedRanges(sampleStart, sampleEnd, nbSamples);
        if (sampling == null) {
            return null;
        }

        // Create bins for each selected entry
        Map<Long, double[]> entryToBins = new HashMap<>();
        for (Long entryId : selectedEntries.keySet()) {
            entryToBins.put(entryId, new double[nbSamples]);
        }

        // Process segments and fill bins
        long totalSpan = sampleEnd - sampleStart + 1;
        long step = totalSpan / nbSamples;
Bin/sampling logic breaks down when nb_points > max segment duration
In getXAxisAndYSeriesModels and createEvenlyDistributedRanges the step size can become 0 when the requested number of samples exceeds the duration span. Consequences:
- step in getXAxisAndYSeriesModels becomes 0, so the if (step > 0 && …) guard prevents all segments from being binned, yielding an all-zero density even when segments exist.
- createEvenlyDistributedRanges with step == 0 can generate ranges where start > end once remainder is exhausted, which is a malformed ISampling.Ranges structure.
Example: maxLength = 50, nb_points = 500 → totalSpan = 51, step = 0; segments are never counted, and many constructed ranges will be invalid.
Consider clamping the effective number of samples to the duration span and ensuring step >= 1 before use, e.g.:
- int nbSamples = filter.getNumberOfSamples();
+ int nbSamples = filter.getNumberOfSamples();
+ long totalSpan = sampleEnd - sampleStart + 1;
+ if (totalSpan <= 0) {
+ return null;
+ }
+ if (nbSamples > totalSpan) {
+ nbSamples = (int) totalSpan;
+ }
@@
- long totalSpan = sampleEnd - sampleStart + 1;
- long step = totalSpan / nbSamples;
+ long step = totalSpan / nbSamples;
...and in createEvenlyDistributedRanges:
- if (samples <= 0 || start >= end) {
+ if (samples <= 0 || start >= end) {
return null;
}
-
- List<ISampling.@NonNull Range<@NonNull Long>> ranges = new ArrayList<>(samples);
- long totalSpan = end - start;
- long step = totalSpan / samples;
- long remainder = totalSpan % samples;
+ long totalSpan = end - start;
+ int effectiveSamples = (int) Math.min(samples, Math.max(1, totalSpan));
+ List<ISampling.@NonNull Range<@NonNull Long>> ranges = new ArrayList<>(effectiveSamples);
+ long step = totalSpan / effectiveSamples;
+ long remainder = totalSpan % effectiveSamples;
@@
- for (int i = 0; i < samples; i++) {
+ for (int i = 0; i < effectiveSamples; i++) {This keeps sampling and bin index computation consistent and avoids zero‑width bins and invalid ranges.
Also applies to: 347-373
🤖 Prompt for AI Agents
In
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java
around lines 234 to 273 (and also apply the same fix to the similar code at
lines ~347-373), the binning logic can produce a zero step and invalid ranges
when the requested number of samples (nbSamples) exceeds the time span; clamp
nbSamples to at most the effective span (totalSpan) so you never request more
bins than time units, and ensure any computed step is at least 1 before using
it; update getXAxisAndYSeriesModels to cap nbSamples = Math.max(1,
Math.min(nbSamples, (int)totalSpan)) (or equivalent) and recompute step =
Math.max(1, totalSpan / nbSamples), and modify createEvenlyDistributedRanges to
guard against step == 0 and to build ranges using the clamped nbSamples so
starts are <= ends and all bins are valid.
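Putting the two suggestions together, here is a standalone sketch of createEvenlyDistributedRanges with the clamp applied. Ranges are represented as {start, end} long pairs; the real method builds ISampling.Ranges:

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of evenly distributed ranges with samples clamped to the span. */
public class RangeClampSketch {
    /**
     * Split [start, end) into contiguous inclusive {start, end} ranges,
     * spreading the remainder one unit at a time over the first ranges.
     * The number of ranges is clamped so each range has width >= 1.
     */
    public static List<long[]> createEvenlyDistributedRanges(long start, long end, int samples) {
        if (samples <= 0 || start >= end) {
            return List.of();
        }
        long totalSpan = end - start;
        // Clamp: never more ranges than there are units of span, so step >= 1.
        int effectiveSamples = (int) Math.min(samples, totalSpan);
        long step = totalSpan / effectiveSamples;
        long remainder = totalSpan % effectiveSamples;
        List<long[]> ranges = new ArrayList<>(effectiveSamples);
        long current = start;
        for (int i = 0; i < effectiveSamples; i++) {
            long width = step + (i < remainder ? 1 : 0); // spread remainder over first ranges
            ranges.add(new long[] { current, current + width - 1 });
            current += width;
        }
        return ranges;
    }
}
```

With the review's failing example (span 51, 500 requested samples) this yields 51 unit-width ranges instead of hundreds of malformed ones.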
b40d78c to 36cf0bc
Actionable comments posted: 1
♻️ Duplicate comments (3)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3)
271-297: Bin/sampling logic breaks when nbSamples exceeds the duration span. This issue was flagged in a previous review but appears unaddressed. When nbSamples > totalSpan:

- step becomes 0 (Lines 273-274)
- The if (step > 0 && ...) guard at Line 281 prevents all segments from being binned
- Result: all-zero density chart even when segments exist

Consider clamping nbSamples to the duration span:

  long totalSpan = sampleEnd - sampleStart + 1;
+ if (nbSamples > totalSpan) {
+     nbSamples = (int) totalSpan;
+ }
+ if (nbSamples <= 0) {
+     nbSamples = 1;
+ }
  long step = totalSpan / nbSamples;

This ensures step >= 1 and segments are properly binned.
347-373: Invalid ranges when step is 0. Continuing from the previous issue: when samples > totalSpan, step becomes 0, causing:

- rangeEnd = current + step - 1 = current - 1 (Line 360)
- This creates ranges where rangeStart > rangeEnd

Apply the same fix by clamping effective samples:

  long totalSpan = end - start;
+ int effectiveSamples = (int) Math.min(samples, Math.max(1, totalSpan));
- long step = totalSpan / samples;
- long remainder = totalSpan % samples;
+ long step = Math.max(1, totalSpan / effectiveSamples);
+ long remainder = totalSpan % effectiveSamples;
- for (int i = 0; i < samples; i++) {
+ for (int i = 0; i < effectiveSamples; i++) {

Also update the returned Ranges to reflect the actual number of ranges created.
390-407: Unsafe cast in filter lambda may throw ClassCastException. Line 394 casts m to IAnalysisModule without checking if the ISegmentStoreProvider implements IAnalysisModule. This was flagged in a previous review. Apply this fix:

  Iterable<ISegmentStoreProvider> modules = TmfTraceUtils.getAnalysisModulesOfClass(trace, ISegmentStoreProvider.class);
- Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules, m -> ((IAnalysisModule) m).getId().equals(secondaryId));
+ Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules,
+         m -> m instanceof IAnalysisModule && ((IAnalysisModule) m).getId().equals(secondaryId));

Additionally, the cast at Line 403 should be safe after this fix since the filter ensures only IAnalysisModule instances remain.
🧹 Nitpick comments (5)
releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (1)
214-236: Kernel latency density descriptors look consistent with existing provider patterns. The four new kernel latency density descriptors correctly:

- Use ProviderType.TREE_GENERIC_XY (non-time-based density),
- Follow the existing SegmentStoreDensityDataProvider:<analysis-id> ID convention,
- Mirror the existing latency Statistics/Table/XY triplets for IRQ, SWS, Futex, and Syscall.

Reusing the existing builder for the first descriptor (Line 214) is functionally fine, but you may optionally instantiate a fresh DataProviderDescriptor.Builder() there for readability and consistency with neighboring code.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties (1)
31-32: Minor grammatical inconsistency with other descriptions. The description uses "Shows" while other descriptions in this file use "Show" (e.g., SegmentStoreStatisticsDataProvider_description, SegmentStoreTableDataProvider_description, SegmentStoreScatterGraphDataProvider_description). Consider using "Show" for consistency.

  SegmentStoreDensityDataProvider_title={0} - Latency Density
- SegmentStoreDensityDataProvider_description=Shows latency density distribution by {0}
+ SegmentStoreDensityDataProvider_description=Show latency density distribution by {0}

analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (3)
101-128: Test requests all entries including root and grouping nodes.

The test collects all entry IDs from `fetchTree`, including the root trace node and grouping aspect nodes (Lines 107-111). However, `getXAxisAndYSeriesModels` only populates bins for IDs that map to segment types via `fIdToType`, which are the leaf entries. The root and grouping nodes won't have corresponding bins incremented.

While the test currently passes because it only checks that some series data exists (Lines 121-127), it doesn't validate that the density values are correct for actual segment types. Consider filtering to request only leaf entries:

```diff
-List<Long> itemIds = treeModel.getEntries().stream().map(TmfTreeDataModel::getId).collect(Collectors.toList());
+// Skip root (parentId == -1) and grouping nodes to request only leaf series
+List<Long> itemIds = treeModel.getEntries().stream()
+        .filter(e -> e.getParentId() != -1)
+        .filter(e -> !e.getName().startsWith("group"))
+        .map(TmfTreeDataModel::getId)
+        .collect(Collectors.toList());
```

Alternatively, add assertions that verify the returned series actually contains non-zero density data for the expected segments.
169-192: Test should explicitly assert expected behavior for null segments.

The conditional at Lines 186-191 means no assertions run if `responseModel` is null or `seriesData` is empty. For a null-segment provider, the expected behavior should be explicitly asserted. Consider:

```diff
 TmfModelResponse<@NonNull ITmfXyModel> response = fDataProviderNullSegments.fetchXY(parameters, null);
 assertNotNull(response);
+// For null segments, we expect either no model or empty series data
+ITmfXyModel responseModel = response.getModel();
+if (responseModel != null) {
+    // Assert that null segment provider produces empty series
+    assertTrue("Expected empty series for null segment provider",
+            responseModel.getSeriesData().isEmpty());
+}
-
-ITmfXyModel responseModel = response.getModel();
-if (responseModel != null && !responseModel.getSeriesData().isEmpty()) {
-    ISeriesModel seriesModel = responseModel.getSeriesData().iterator().next();
-    assertNotNull(seriesModel);
-    assertNotNull(seriesModel.getData());
-    assertEquals(100, seriesModel.getData().length);
-}
```

This makes the test's expectations explicit rather than silently passing when no data is returned.
197-203: Consider adding assertions for tree structure.

The test only checks that `fetchTree` returns a non-null response with `COMPLETED` status. Consider also validating that the tree model contains expected entries:

```java
TmfTreeModel<@NonNull TmfTreeDataModel> treeModel = response.getModel();
assertNotNull(treeModel);
assertFalse("Tree should contain entries", treeModel.getEntries().isEmpty());
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (6)
- analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (3 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (2 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties (1 hunks)
- releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java
🧰 Additional context used
🧬 Code graph analysis (3)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (7)
tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/internal/tmf/core/model/TmfXyResponseFactory.java (1)
TmfXyResponseFactory(41-251)tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/CommonStatusMessage.java (1)
CommonStatusMessage(27-93)tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/filters/SelectionTimeQueryFilter.java (1)
SelectionTimeQueryFilter(29-86)tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/tree/TmfTreeDataModel.java (1)
TmfTreeDataModel(28-220)tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/tree/TmfTreeModel.java (1)
TmfTreeModel(33-246)tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/xy/TmfXYAxisDescription.java (1)
TmfXYAxisDescription(26-143)tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/response/TmfModelResponse.java (1)
TmfModelResponse(29-79)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (3)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (1)
SegmentStoreDensityDataProvider(76-408)analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreScatterDataProvider.java (1)
SegmentStoreScatterDataProvider(82-534)analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1)
Messages(22-172)
analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (6)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (1): SegmentStoreDensityDataProvider (76-408)
- tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/internal/tmf/core/model/filters/FetchParametersUtils.java (1): FetchParametersUtils (33-184)
- tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/exceptions/TmfAnalysisException.java (1): TmfAnalysisException (27-70)
- tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/filters/TimeQueryFilter.java (1): TimeQueryFilter (35-188)
- tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/tree/TmfTreeDataModel.java (1): TmfTreeDataModel (28-220)
- tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/tree/TmfTreeModel.java (1): TmfTreeModel (33-246)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: build / build
- GitHub Check: build / build
- GitHub Check: build / build
🔇 Additional comments (12)
releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (2)
318-329: UST callstack latency density descriptors correctly extend the UST provider set

The two new UST latency density descriptors:

- Match naming of the existing "Latency Statistics / Table / vs Time" views,
- Use `ProviderType.TREE_GENERIC_XY` and the same `SegmentStoreDensityDataProvider` prefix,
- Reference the appropriate legacy and "(new)" callstack analysis IDs.

This should keep the UST expectations in sync with the new density providers without affecting existing tests.
351-356: Event Matching latency density descriptor properly added to experiment expectations

The new "Event Matching Latency - Latency Density" descriptor:

- Uses the same density naming/description pattern as the other views,
- Has `ProviderType.TREE_GENERIC_XY`,
- Uses the expected `SegmentStoreDensityDataProvider:<event-matching-id>` ID,
- Is added only to `EXPECTED_EXPERIMENT_SET`, which is appropriate since Event Matching is experiment-specific.

No issues from a test-coverage or consistency perspective.
analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (3)
133-140: LGTM!

The test correctly validates that an empty parameter map results in a `FAILED` status response.

145-164: LGTM!

The cancellation test properly validates that a pre-cancelled monitor results in a `CANCELLED` status response.

208-212: LGTM!

The test correctly validates that the composite ID format matches the provider's ID construction.
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (2)
44-47: LGTM!

The `createProvider(ITmfTrace)` method correctly returns null to indicate that a secondary ID is required for this factory.

68-91: LGTM! The `instanceof` check at Lines 74-76 properly guards against the `ClassCastException` that was flagged in a previous review.

One minor observation: the `parentId` logic at Line 81 sets the parent to `analysis.getId()` only when `getConfiguration()` is non-null. Verify this is the intended behavior, as it means most descriptors will have null parent IDs unless they have an associated configuration.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (5)
76-107: LGTM!

The constructor properly initializes the provider with the trace, segment store provider, grouping aspects, and composite ID.
109-139: LGTM!

The `fetchXY` method properly handles:

- Filter validation with appropriate error response
- Module completion waiting with cancellation support
- Delegation to `getXAxisAndYSeriesModels` for XY data construction
166-179: Segment filtering may exclude valid segments.

Line 168 filters segments with `s.getStart() >= start`, which excludes segments that started before the query window but extend into it. This may be intentional for the density view, but it differs from how `getXAxisAndYSeriesModels` (Line 246) uses `getIntersectingElements` without this additional filter.

If intentional, consider adding a comment explaining why tree building uses stricter filtering than XY data generation.
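The distinction drawn above can be made concrete. The sketch below (class and method names are hypothetical, not the provider's code) contrasts the plain "intersects the window" predicate with the stricter "starts within the window" variant used for tree building:

```java
public class SegmentFilters {
    // A segment [s, e] intersects a window [ws, we] if it overlaps it at all
    public static boolean intersects(long s, long e, long ws, long we) {
        return s <= we && e >= ws;
    }

    // The stricter filter additionally requires the segment to START in-window,
    // dropping segments that began before the window but extend into it
    public static boolean startsWithin(long s, long e, long ws, long we) {
        return intersects(s, e, ws, we) && s >= ws;
    }

    public static void main(String[] args) {
        // Segment [5, 15] against window [10, 20]: overlaps, but starts before it
        System.out.println(intersects(5, 15, 10, 20));   // true
        System.out.println(startsWithin(5, 15, 10, 20)); // false
    }
}
```

This is exactly the class of segment that the stricter filter silently drops, which is why the two code paths can disagree.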
222-224: LGTM!

The axis description correctly specifies "Duration" in nanoseconds with `DataType.DURATION`.
313-345: LGTM!

Helper methods for segment type resolution and ID mapping are correctly implemented with appropriate synchronization on `fIdToType`.
```java
return TmfTreeXYCompositeDataProvider.create(traces,
        Objects.requireNonNull(Messages.SegmentStoreDensityDataProvider_title),
        SegmentStoreDensityDataProvider.ID, secondaryId);
```
Title format string {0} won't be bound.
Messages.SegmentStoreDensityDataProvider_title contains a format placeholder {0} (e.g., "{0} - Latency Density"), but it's passed directly to TmfTreeXYCompositeDataProvider.create without NLS binding. This will result in a literal "{0} - Latency Density" title for the composite provider.
Consider binding the title with an appropriate name:
```diff
 return TmfTreeXYCompositeDataProvider.create(traces,
-        Objects.requireNonNull(Messages.SegmentStoreDensityDataProvider_title),
+        Objects.requireNonNull(NLS.bind(Messages.SegmentStoreDensityDataProvider_title, trace.getName())),
         SegmentStoreDensityDataProvider.ID, secondaryId);
```

Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java
around lines 63 to 65, the title string
Messages.SegmentStoreDensityDataProvider_title contains a "{0}" placeholder but
is passed raw to TmfTreeXYCompositeDataProvider.create; format the title first
(e.g., with NLS.bind(...) or MessageFormat.format(...)) supplying an appropriate
name such as the trace or secondaryId display name, then pass the formatted,
non-null title to TmfTreeXYCompositeDataProvider.create so the placeholder is
bound correctly.
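For illustration, `java.text.MessageFormat` reproduces the `{0}` substitution that Eclipse's `NLS.bind` performs on such message templates. The class name and trace name below are hypothetical, chosen only to show the unbound vs. bound behavior:

```java
import java.text.MessageFormat;

public class TitleBinding {
    // Template mirroring the messages.properties entry
    static final String TITLE = "{0} - Latency Density";

    // Bind the trace name into the title, as NLS.bind would
    public static String bind(String traceName) {
        return MessageFormat.format(TITLE, traceName);
    }

    public static void main(String[] args) {
        // Unbound: the literal placeholder leaks into the UI
        System.out.println(TITLE);          // {0} - Latency Density
        // Bound: the trace name replaces the placeholder
        System.out.println(bind("kernel")); // kernel - Latency Density
    }
}
```

Passing the raw template string to the composite provider is the first case: the user sees the literal `{0}` in the view title.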
@MatthewKhouzam, understood. I'll note that this title format is acceptable as-is.
Update fetchTree and fetchXY
Add getXAxisAndYSeriesModels
Response data is similar to the genericXY unit tests generated by Q

Signed-off-by: Yassine Ibhir <yassine.ibhir.ibhir@ericsson.com>

Force-pushed from 36cf0bc to 145bde3 (Compare)
Actionable comments posted: 1
♻️ Duplicate comments (5)
analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (1)
107-111: Test may select root node instead of leaf series.

The test collects all entry IDs including the root node (parentId = -1). The density provider builds bins keyed by leaf series IDs (segment types), so including the root ID in `requested_items` may not produce meaningful data for that entry.

Consider filtering to select only leaf entries (non-root nodes):

```diff
-List<Long> itemIds = treeModel.getEntries().stream().map(TmfTreeDataModel::getId).collect(Collectors.toList());
+List<Long> itemIds = treeModel.getEntries().stream()
+        .filter(e -> e.getParentId() != -1) // skip root
+        .map(TmfTreeDataModel::getId)
+        .collect(Collectors.toList());
```

This ensures the test validates actual density data for series nodes rather than including the root, which has no bins.
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (1)
63-65: Title format string `{0}` is not bound.

`Messages.SegmentStoreDensityDataProvider_title` likely contains a format placeholder (e.g., `"{0} - Latency Density"`), but it's passed directly without NLS binding. This results in a literal `{0}` in the composite provider title.

```diff
 return TmfTreeXYCompositeDataProvider.create(traces,
-        Objects.requireNonNull(Messages.SegmentStoreDensityDataProvider_title),
+        Objects.requireNonNull(NLS.bind(Messages.SegmentStoreDensityDataProvider_title, trace.getName())),
         SegmentStoreDensityDataProvider.ID, secondaryId);
```

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3)
393-395: Unsafe cast in filter lambda may throw `ClassCastException`.

The lambda casts `m` to `IAnalysisModule` without verifying the type. While `TmfTraceUtils.getAnalysisModulesOfClass` typically returns analysis modules, it's safer to add the check.

```diff
 Iterable<ISegmentStoreProvider> modules = TmfTraceUtils.getAnalysisModulesOfClass(trace, ISegmentStoreProvider.class);
-Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules, m -> ((IAnalysisModule) m).getId().equals(secondaryId));
+Iterable<ISegmentStoreProvider> filteredModules = Iterables.filter(modules,
+        m -> m instanceof IAnalysisModule && ((IAnalysisModule) m).getId().equals(secondaryId));
```
270-297: Binning breaks when `nbSamples` exceeds duration span.

When the requested number of samples exceeds `totalSpan` (e.g., `maxLength=50`, `nbSamples=500`), `step` becomes 0, and the `if (step > 0 && ...)` guard prevents all segments from being binned, resulting in all-zero density data.

Consider clamping `nbSamples` to the duration span:

```diff
+// Clamp samples to avoid zero-width bins
+if (nbSamples > totalSpan) {
+    nbSamples = (int) Math.max(1, totalSpan);
+}
+long step = totalSpan / nbSamples;
+if (step <= 0) {
+    step = 1;
+}
```

This ensures meaningful binning regardless of the requested sample count.
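The failure mode described above is plain integer division. The following standalone sketch (a hypothetical helper, not the provider's code) shows the zero step and the effect of the clamp:

```java
public class BinStep {
    // Compute a safe bin step: clamp the sample count to the span so the
    // integer division can never produce a step of 0
    public static long safeStep(long totalSpan, int nbSamples) {
        int effective = (int) Math.min(nbSamples, Math.max(1, totalSpan));
        return Math.max(1, totalSpan / effective);
    }

    public static void main(String[] args) {
        // Naive integer division: 50 / 500 == 0, so no segment is ever binned
        System.out.println(50L / 500);           // 0
        System.out.println(safeStep(50, 500));   // 1
        System.out.println(safeStep(1000, 100)); // 10
    }
}
```

With the clamp, a request for more bins than there are duration units simply degrades to one-unit-wide bins instead of producing an all-zero chart.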
347-373: Same zero-step issue in `createEvenlyDistributedRanges`.

When `samples > totalSpan`, `step` becomes 0, causing `rangeEnd = current + step - 1 = current - 1`, which creates invalid ranges where `start > end`.

Apply the same clamping fix here:

```diff
 long totalSpan = end - start;
+int effectiveSamples = (int) Math.min(samples, Math.max(1, totalSpan));
-long step = totalSpan / samples;
-long remainder = totalSpan % samples;
+long step = totalSpan / effectiveSamples;
+long remainder = totalSpan % effectiveSamples;
 long current = start;
-for (int i = 0; i < samples; i++) {
+for (int i = 0; i < effectiveSamples; i++) {
```
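Under the same assumptions, the clamped range construction can be sketched end to end. Spreading the remainder over the first ranges keeps widths within one unit of each other; the class below is a hypothetical illustration, not the patched method:

```java
import java.util.ArrayList;
import java.util.List;

public class EvenRanges {
    // Split [start, end) into `samples` contiguous [lo, hi] ranges, clamping
    // the count to the span and spreading the remainder over the first ranges
    public static List<long[]> ranges(long start, long end, int samples) {
        long totalSpan = end - start;
        int effective = (int) Math.min(samples, Math.max(1, totalSpan));
        long step = totalSpan / effective;
        long remainder = totalSpan % effective;
        List<long[]> result = new ArrayList<>();
        long current = start;
        for (int i = 0; i < effective; i++) {
            long width = step + (i < remainder ? 1 : 0);
            result.add(new long[] { current, current + width - 1 });
            current += width;
        }
        return result;
    }

    public static void main(String[] args) {
        // 10 units over 3 ranges -> widths 4, 3, 3
        for (long[] r : ranges(0, 10, 3)) {
            System.out.println(r[0] + ".." + r[1]); // 0..3, 4..6, 7..9
        }
    }
}
```

Because the count is clamped first, `step` is never 0, so the invalid `start > end` ranges flagged in the review cannot occur.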
🧹 Nitpick comments (3)
analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (2)
121-127: Conditional assertion weakens test coverage.

The `if (!responseModel.getSeriesData().isEmpty())` guard means the test passes silently when no series data is returned. If the intent is to verify actual density data, consider asserting that series data is not empty for valid segments.

```diff
-if (!responseModel.getSeriesData().isEmpty()) {
-    ISeriesModel seriesModel = responseModel.getSeriesData().iterator().next();
+assertFalse("Expected series data for valid segments", responseModel.getSeriesData().isEmpty());
+ISeriesModel seriesModel = responseModel.getSeriesData().iterator().next();
```

This ensures the test fails if the provider unexpectedly returns no data.
186-191: Make null-segment test expectations explicit.

The conditional check makes it unclear what the expected behavior is for null segments. If null segments should produce no series data, assert that explicitly:

```diff
 ITmfXyModel responseModel = response.getModel();
-if (responseModel != null && !responseModel.getSeriesData().isEmpty()) {
-    ISeriesModel seriesModel = responseModel.getSeriesData().iterator().next();
-    assertNotNull(seriesModel);
-    assertNotNull(seriesModel.getData());
-    assertEquals(100, seriesModel.getData().length);
-}
+// Null segments provider should return completed response with empty or minimal data
+assertNotNull(responseModel);
+// If expecting empty series for null segments:
+// assertTrue("Expected empty series for null segments", responseModel.getSeriesData().isEmpty());
```

This makes the test's intent clear and prevents silent passes.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (1)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (1)
375-383: Consider caching max duration for large segment stores.

`getMaxDuration()` iterates through the entire segment store on each call to `getXAxisAndYSeriesModels`. For large traces, this could be a performance concern.

Consider caching the result or computing it once during initialization if the segment store doesn't change during the provider's lifetime.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (6)
- analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (3 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (2 hunks)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties (1 hunks)
- releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/messages.properties
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java
🧰 Additional context used
🧬 Code graph analysis (1)
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (5)
- tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/DataProviderDescriptor.java (1): DataProviderDescriptor (30-243)
- tmf/org.eclipse.tracecompass.tmf.core/src/org/eclipse/tracecompass/tmf/core/model/xy/TmfTreeXYCompositeDataProvider.java (1): TmfTreeXYCompositeDataProvider (53-197)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (1): SegmentStoreDensityDataProvider (76-408)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreScatterDataProvider.java (1): SegmentStoreScatterDataProvider (82-534)
- analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/Messages.java (1): Messages (22-172)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: build / build
- GitHub Check: build / build
- GitHub Check: build / build
🔇 Additional comments (11)
releng/org.eclipse.tracecompass.integration.core.tests/src/org/eclipse/tracecompass/integration/core/tests/dataproviders/DataProviderManagerTest.java (3)
214-236: Kernel latency density descriptors are consistent and well-integrated

The four new kernel entries (IRQ, SWS latency, Futex, Syscall) follow the existing naming/description scheme and re-use the same analysis IDs as the corresponding statistics/table/scatter providers, with a clear `SegmentStoreDensityDataProvider:` prefix and `TREE_GENERIC_XY` type. This should plug the new density providers cleanly into the manager expectations.
318-329: UST callstack latency density descriptors align with existing UST providers

Both "LTTng-UST CallStack - Latency Density" and "(new) - Latency Density" mirror the existing UST callstack descriptors: same analysis IDs, appropriate `TREE_GENERIC_XY` provider type, and clear density-specific descriptions. They should appear correctly in the UST and experiment expected sets via the existing `addAll` logic.
351-356: Event Matching Latency density descriptor correctly added for experiments

The new "Event Matching Latency - Latency Density" descriptor uses the same analysis ID as the existing Event Matching latency providers, with the new density provider prefix and `TREE_GENERIC_XY` type, and is added only to `EXPECTED_EXPERIMENT_SET`, which is appropriate for this analysis.

analysis/org.eclipse.tracecompass.analysis.timing.core.tests/src/org/eclipse/tracecompass/analysis/timing/core/tests/segmentstore/SegmentStoreDensityDataProviderTest.java (3)
133-140: LGTM!

The test correctly validates that `fetchXY` returns a `FAILED` status when required filter parameters are missing.

145-164: LGTM!

The test correctly validates that `fetchXY` returns a `CANCELLED` status when passed a pre-cancelled progress monitor.

197-212: LGTM!

Both `testFetchTree` and `testID` correctly validate the tree response status and composite ID format, respectively.

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProviderFactory.java (2)
59-62: Inconsistent fallback to scatter provider.

When `SegmentStoreDensityDataProvider.create()` returns null for a single-trace scenario, the factory falls back to `SegmentStoreScatterDataProvider.create()`. This seems inconsistent, since this is the Density data provider factory.

Is this intentional (e.g., scatter is a valid fallback for density), or should this return null to indicate no provider is available?
68-91: LGTM!

The `getDescriptors` method correctly:

- Filters modules that are not `IAnalysisModule` instances
- Deduplicates by module ID
- Uses `NLS.bind` for name and description
- Sets appropriate provider type and configuration

analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java (3)
109-139: LGTM!

The `fetchXY` method has proper:

- Parameter validation with failed response for invalid filters
- Module completion waiting with cancellation support
- Clean delegation to `getXAxisAndYSeriesModels`
141-213: LGTM!

The `fetchTree` method properly:

- Handles module completion with cancellation support
- Returns appropriate status codes for a null segment store
- Builds a hierarchical tree structure using grouping aspects
- Includes cancellation checks during segment iteration
222-224: LGTM!

The `getXAxisDescription` method correctly provides axis metadata for the XY model.
```java
// ISegmentStoreProvider provider = fProvider;
private long getUniqueId(String name) {
    synchronized (fIdToType) {
        return fIdToType.inverse().computeIfAbsent(name, n -> TRACE_IDS.getAndIncrement());
    }
}
```
🛠️ Refactor suggestion | 🟠 Major
Remove commented-out code.
Line 215 contains a commented-out variable declaration that appears to be dead code.
```diff
-// ISegmentStoreProvider provider = fProvider;
 private long getUniqueId(String name) {
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```java
private long getUniqueId(String name) {
    synchronized (fIdToType) {
        return fIdToType.inverse().computeIfAbsent(name, n -> TRACE_IDS.getAndIncrement());
    }
}
```
🤖 Prompt for AI Agents
In
analysis/org.eclipse.tracecompass.analysis.timing.core/src/org/eclipse/tracecompass/internal/analysis/timing/core/segmentstore/SegmentStoreDensityDataProvider.java
around lines 215 to 220, remove the commented-out dead declaration "//
ISegmentStoreProvider provider = fProvider;"; delete the commented line so only
the getUniqueId method remains, and run a quick compile/format to ensure no
whitespace or style violations remain.
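The `getUniqueId` pattern above (a Guava `BiMap`'s `inverse()` plus an atomic counter) can be sketched with a plain map; `SeriesIds` below is a hypothetical stand-in for the provider's class, shown only to illustrate the stable name-to-ID minting:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.atomic.AtomicLong;

public class SeriesIds {
    private static final AtomicLong COUNTER = new AtomicLong();

    // name -> id; the provider gets the same effect via a BiMap's inverse view,
    // which also allows looking the name back up from the id
    private final Map<String, Long> fNameToId = new HashMap<>();

    // Return a stable unique id per series name, minting one on first use
    public synchronized long getUniqueId(String name) {
        return fNameToId.computeIfAbsent(name, n -> COUNTER.getAndIncrement());
    }
}
```

Repeated calls with the same name return the same ID, while distinct names get distinct IDs, which is what lets tree entries and XY series stay correlated across requests.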
bhufmann left a comment
Looks good to me. I tested it with InAndOut configuration, and it works as well. Thanks for the contribution.
Thank you for the review and help!
What it does
How to test
Follow-ups
N/A
Review checklist
Summary by CodeRabbit
New Features
Improvements
Tests