
Commit 010523e

sstrickl authored and commit-bot@chromium.org committed
[vm] Fix handling of WeakSerializationReferences in v8 profile.
Instead of trying to lazily create artificial nodes when needed for WeakSerializationReference (WSR) targets in all cases, create them eagerly for the targets of reachable WSRs, since those are guaranteed to be needed. Those for unreachable WSRs are still lazily created as needed, since a WSR won't even be accessed by the clustered snapshot writer unless the unreachable WSR is part of another unreachable object that has an artificial node created. This rework avoids some issues seen on upcoming CLs where the artificial nodes for WSR targets weren't getting correctly generated.

Also extend the v8 snapshot profile writer tests to check the sizes of the text and data sections in ELF snapshots. The v8 snapshot profile writer tests now check up to three different measures, from most precise to least precise, depending on the output mode:

* If writing an ELF snapshot directly: the sum of the sizes attributed to the text and data section symbols is checked to be exactly the same as the sum of the sizes of the objects in the profile.

* If writing an ELF snapshot either directly or via assembly: the sum of the sizes of the text and data sections is checked to be the same as the sum of the sizes of the objects in the profile. If using an assembler that merges text and data sections, the padding between the sections is accounted for with an approximate check.

* For all output modes: the size of the snapshot as a whole is checked to be approximately equal to the sum of the sizes of the objects in the profile, accounting both for possible padding and for the global header information and non-data/non-text sections like the dynamic symbol section.
TEST=vm/data{,_2}/v8_snapshot_profile_writer_test

Cq-Include-Trybots: luci.dart.try:vm-kernel-precomp-linux-debug-x64-try,vm-kernel-precomp-linux-debug-x64c-try,vm-kernel-precomp-nnbd-linux-debug-x64-try,vm-kernel-precomp-mac-release-simarm64-try
Change-Id: I66e0e7fdb5bb98045621bf516f150a4723e08147
Reviewed-on: https://dart-review.googlesource.com/c/sdk/+/198942
Commit-Queue: Tess Strickland <sstrickl@google.com>
Reviewed-by: Clement Skau <cskau@google.com>
Reviewed-by: Daco Harkes <dacoharkes@google.com>
1 parent 34add23 commit 010523e
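The approximate checks described above boil down to simple tolerance arithmetic. As a rough sketch of the least precise check (whole-file size), in C++ with made-up sizes — only the 16 KB alignment value mirrors `Elf::kPages` in runtime/vm/elf.h; the function and its name are illustrative, not the VM's or the test's actual code:

```cpp
#include <cassert>
#include <cstdint>
#include <cstdlib>

// Segment alignment used for padding; mirrors Elf::kPages (16 KB) in
// runtime/vm/elf.h.
constexpr int64_t kSegmentAlignment = 16 * 1024;

// The snapshot file contains global headers and non-text/non-data sections
// that the profile does not account for, so allow ~3% of the file size plus
// alignment padding for the two loadable (text and data) segments.
bool TotalSizeApproxEquals(int64_t expected_size, int64_t actual_file_size) {
  const double tolerance = 0.03 * actual_file_size + 2 * kSegmentAlignment;
  return std::llabs(actual_file_size - expected_size) <= tolerance;
}
```

For instance, a profile total of 1,000,000 bytes against a 1,020,000-byte file passes (the slack works out to roughly 63 KB), while a 2,000,000-byte file does not.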

File tree

6 files changed: +171 −72 lines

pkg/native_stack_traces/CHANGELOG.md

Lines changed: 4 additions & 0 deletions
@@ -1,5 +1,9 @@
 # Changelog

+## 0.4.3
+
+- Exported some more of the ELF utilities for use in Dart tests.
+
 ## 0.4.2

 - When decoding a stack trace, frames corresponding to the functions

pkg/native_stack_traces/lib/elf.dart

Lines changed: 1 addition & 1 deletion
@@ -2,7 +2,7 @@
 // for details. All rights reserved. Use of this source code is governed by a
 // BSD-style license that can be found in the LICENSE file.

-export 'src/elf.dart' show Elf, Symbol;
+export 'src/elf.dart' show Elf, Section, Symbol;
 export 'src/constants.dart'
     show
         isolateDataSymbolName,

pkg/native_stack_traces/pubspec.yaml

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 name: native_stack_traces
 description: Utilities for working with non-symbolic stack traces.
-version: 0.4.2
+version: 0.4.3

 homepage: https://github.com/dart-lang/sdk/tree/master/pkg/native_stack_traces

runtime/tests/vm/dart/v8_snapshot_profile_writer_test.dart

Lines changed: 59 additions & 12 deletions
@@ -189,12 +189,17 @@ Future<void> testAOT(String dillPath,
   final expectedSize =
       profile.nodes.fold<int>(0, (size, n) => size + n.selfSize);

+  // May not be ELF, but another format.
+  final elf = Elf.fromFile(snapshotPath);
+
   var checkedSize = false;
   if (!useAsm) {
-    // Verify that the total size of the snapshot text and data sections is
-    // the same as the sum of the shallow sizes of all objects in the profile.
-    // This ensures that all bytes are accounted for in some way.
-    final elf = Elf.fromFile(snapshotPath);
+    // Verify that the total size of the snapshot text and data section
+    // symbols is the same as the sum of the shallow sizes of all objects in
+    // the profile. This ensures that all bytes are accounted for in some way.
+    //
+    // We only check this when generating ELF directly because that's when
+    // we're guaranteed the symbols will have non-zero size.
     Expect.isNotNull(elf);
     elf!; // To refine type to non-nullable version.

@@ -215,10 +220,54 @@ Future<void> testAOT(String dillPath,

     Expect.equals(
         expectedSize, actualSize, "failed on $description snapshot");
+    Expect.equals(expectedSize, actualSize,
+        "symbol size check failed on $description snapshot");
+    checkedSize = true;
+  }
+
+  // See Elf::kPages in runtime/vm/elf.h, which is also used for assembly
+  // padding.
+  final segmentAlignment = 16 * 1024;
+
+  if (elf != null) {
+    // Verify that the total size of the snapshot text and data sections is
+    // approximately the sum of the shallow sizes of all objects in the
+    // profile. As sections might be merged by the assembler when useAsm is
+    // true, we need to account for possible padding.
+    final textSections = elf.namedSections(".text");
+    Expect.isNotEmpty(textSections);
+    Expect.isTrue(
+        textSections.length <= 2, "More text sections than expected");
+    final dataSections = elf.namedSections(".rodata");
+    Expect.isNotEmpty(dataSections);
+    Expect.isTrue(
+        dataSections.length <= 2, "More data sections than expected");
+
+    var actualSize = 0;
+    for (final section in textSections) {
+      actualSize += section.length;
+    }
+    for (final section in dataSections) {
+      actualSize += section.length;
+    }
+
+    final mergedCount = (2 - textSections.length) + (2 - dataSections.length);
+    final possiblePadding = mergedCount * segmentAlignment;
+
+    Expect.approxEquals(
+        expectedSize,
+        actualSize,
+        possiblePadding,
+        "section size failed on $description snapshot" +
+            (!useAsm ? ", but symbol size test passed" : ""));
     checkedSize = true;
   }

   if (stripUtil || stripFlag) {
+    // Verify that the actual size of the stripped snapshot is close to the
+    // sum of the shallow sizes of all objects in the profile. They will not
+    // be exactly equal because of global headers, padding, and non-text/data
+    // sections.
     var strippedSnapshotPath = snapshotPath;
     if (stripUtil) {
       strippedSnapshotPath = snapshotPath + '.stripped';
@@ -227,20 +276,18 @@ Future<void> testAOT(String dillPath,
       print("Stripped snapshot generated at $strippedSnapshotPath.");
     }

-    // Verify that the actual size of the stripped snapshot is close to the
-    // sum of the shallow sizes of all objects in the profile. They will not
-    // be exactly equal because of global headers and padding.
     final actualSize = await File(strippedSnapshotPath).length();

-    // See Elf::kPages in runtime/vm/elf.h, which is also used for assembly
-    // padding.
-    final segmentAlignment = 16 * 1024;
     // Not every byte is accounted for by the snapshot profile, and data and
     // instruction segments are padded to an alignment boundary.
     final tolerance = 0.03 * actualSize + 2 * segmentAlignment;

-    Expect.approxEquals(expectedSize, actualSize, tolerance,
-        "failed on $description snapshot");
+    Expect.approxEquals(
+        expectedSize,
+        actualSize,
+        tolerance,
+        "total size check failed on $description snapshot" +
+            (elf != null ? ", but section size checks passed" : ""));
     checkedSize = true;
   }

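The merged-section accounting in the test above can be sketched in a few lines. The following is an illustrative C++ rendering only — the real check is the Dart code shown in the diff, which reads section sizes from the ELF file; the function name and inputs here are made up:

```cpp
#include <cassert>
#include <cstdint>
#include <cstdlib>
#include <vector>

// The test expects up to two .text and two .rodata sections (VM and isolate
// snapshots). If the assembler merged a pair, the alignment padding between
// them is folded into the surviving section, so each merge adds one segment
// alignment (16 KB, mirroring Elf::kPages in runtime/vm/elf.h) of slack.
constexpr int64_t kSegmentAlignment = 16 * 1024;

bool SectionSizesApproxEqual(int64_t expected_size,
                             const std::vector<int64_t>& text_sizes,
                             const std::vector<int64_t>& data_sizes) {
  int64_t actual_size = 0;
  for (int64_t size : text_sizes) actual_size += size;
  for (int64_t size : data_sizes) actual_size += size;
  // One alignment gap of slack per merged section pair; zero slack means the
  // sums must match exactly.
  const int64_t merged_count = (2 - static_cast<int64_t>(text_sizes.size())) +
                               (2 - static_cast<int64_t>(data_sizes.size()));
  const int64_t possible_padding = merged_count * kSegmentAlignment;
  return std::llabs(actual_size - expected_size) <= possible_padding;
}
```

With one merged text section, a 6,000-byte overshoot falls inside the single 16 KB allowance; with all four sections present, any mismatch fails.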
runtime/tests/vm/dart_2/v8_snapshot_profile_writer_test.dart

Lines changed: 59 additions & 14 deletions
@@ -197,12 +197,17 @@ Future<void> testAOT(String dillPath,
   final expectedSize =
       profile.nodes.fold<int>(0, (size, n) => size + n.selfSize);

+  // May not be ELF, but another format.
+  final elf = Elf.fromFile(snapshotPath);
+
   var checkedSize = false;
   if (!useAsm) {
-    // Verify that the total size of the snapshot text and data sections is
-    // the same as the sum of the shallow sizes of all objects in the profile.
-    // This ensures that all bytes are accounted for in some way.
-    final elf = Elf.fromFile(snapshotPath);
+    // Verify that the total size of the snapshot text and data section
+    // symbols is the same as the sum of the shallow sizes of all objects in
+    // the profile. This ensures that all bytes are accounted for in some way.
+    //
+    // We only check this when generating ELF directly because that's when
+    // we're guaranteed the symbols will have non-zero size.
     Expect.isNotNull(elf);

     final vmTextSectionSymbol = elf.dynamicSymbolFor(vmSymbolName);
@@ -220,12 +225,54 @@ Future<void> testAOT(String dillPath,
         isolateTextSectionSymbol.size +
         isolateDataSectionSymbol.size;

-    Expect.equals(
-        expectedSize, actualSize, "failed on $description snapshot");
+    Expect.equals(expectedSize, actualSize,
+        "symbol size check failed on $description snapshot");
+    checkedSize = true;
+  }
+
+  // See Elf::kPages in runtime/vm/elf.h, which is also used for assembly
+  // padding.
+  final segmentAlignment = 16 * 1024;
+
+  if (elf != null) {
+    // Verify that the total size of the snapshot text and data sections is
+    // approximately the sum of the shallow sizes of all objects in the
+    // profile. As sections might be merged by the assembler when useAsm is
+    // true, we need to account for possible padding.
+    final textSections = elf.namedSections(".text");
+    Expect.isNotEmpty(textSections);
+    Expect.isTrue(
+        textSections.length <= 2, "More text sections than expected");
+    final dataSections = elf.namedSections(".rodata");
+    Expect.isNotEmpty(dataSections);
+    Expect.isTrue(
+        dataSections.length <= 2, "More data sections than expected");
+
+    var actualSize = 0;
+    for (final section in textSections) {
+      actualSize += section.length;
+    }
+    for (final section in dataSections) {
+      actualSize += section.length;
+    }
+
+    final mergedCount = (2 - textSections.length) + (2 - dataSections.length);
+    final possiblePadding = mergedCount * segmentAlignment;
+
+    Expect.approxEquals(
+        expectedSize,
+        actualSize,
+        possiblePadding,
+        "section size failed on $description snapshot" +
+            (!useAsm ? ", but symbol size test passed" : ""));
     checkedSize = true;
   }

   if (stripUtil || stripFlag) {
+    // Verify that the actual size of the stripped snapshot is close to the
+    // sum of the shallow sizes of all objects in the profile. They will not
+    // be exactly equal because of global headers, padding, and non-text/data
+    // sections.
     var strippedSnapshotPath = snapshotPath;
     if (stripUtil) {
       strippedSnapshotPath = snapshotPath + '.stripped';
@@ -234,20 +281,18 @@ Future<void> testAOT(String dillPath,
       print("Stripped snapshot generated at $strippedSnapshotPath.");
     }

-    // Verify that the actual size of the stripped snapshot is close to the
-    // sum of the shallow sizes of all objects in the profile. They will not
-    // be exactly equal because of global headers and padding.
    final actualSize = await File(strippedSnapshotPath).length();

-    // See Elf::kPages in runtime/vm/elf.h, which is also used for assembly
-    // padding.
-    final segmentAlignment = 16 * 1024;
     // Not every byte is accounted for by the snapshot profile, and data and
     // instruction segments are padded to an alignment boundary.
     final tolerance = 0.03 * actualSize + 2 * segmentAlignment;

-    Expect.approxEquals(expectedSize, actualSize, tolerance,
-        "failed on $description snapshot");
+    Expect.approxEquals(
+        expectedSize,
+        actualSize,
+        tolerance,
+        "total size check failed on $description snapshot" +
+            (elf != null ? ", but section size checks passed" : ""));
     checkedSize = true;
   }

runtime/vm/clustered_snapshot.cc

Lines changed: 47 additions & 44 deletions
@@ -2481,6 +2481,13 @@ class WeakSerializationReferenceSerializationCluster

   intptr_t Count(Serializer* s) { return objects_.length(); }

+  void CreateArtificialTargetNodesIfNeeded(Serializer* s) {
+    for (intptr_t i = 0; i < objects_.length(); i++) {
+      WeakSerializationReferencePtr weak = objects_[i];
+      s->CreateArtificialNodeIfNeeded(weak->untag()->target());
+    }
+  }
+
   void WriteAlloc(Serializer* s) {
     UNREACHABLE();  // No WSRs are serialized, and so this cluster is not added.
   }
@@ -6268,34 +6275,29 @@ void Serializer::AttributeReference(
     ObjectPtr object,
     const V8SnapshotProfileWriter::Reference& reference) {
   if (profile_writer_ == nullptr) return;
-
+  const auto& object_id = GetProfileId(object);
 #if defined(DART_PRECOMPILER)
-  // Make artificial nodes for dropped targets in WSRs.
   if (object->IsHeapObject() && object->IsWeakSerializationReference()) {
-    const auto& wsr = WeakSerializationReference::RawCast(object);
-    const auto& target = wsr->untag()->target();
-    const bool wsr_reachable = !CreateArtificialNodeIfNeeded(wsr);
-    if (wsr_reachable && HasArtificialRef(target)) {
-      // The target has artificial information used for snapshot analysis and
-      // the replacement is part of the snapshot, so write information for both.
-      const auto& replacement = wsr->untag()->replacement();
+    auto const wsr = WeakSerializationReference::RawCast(object);
+    auto const target = wsr->untag()->target();
+    const auto& target_id = GetProfileId(target);
+    if (object_id != target_id) {
+      const auto& replacement_id = GetProfileId(wsr->untag()->replacement());
+      ASSERT(object_id == replacement_id);
+      // The target of the WSR will be replaced in the snapshot, so write
+      // attributions for both the dropped target and for the replacement.
       profile_writer_->AttributeDroppedReferenceTo(
-          object_currently_writing_.id_, reference, GetProfileId(target),
-          GetProfileId(replacement));
+          object_currently_writing_.id_, reference, target_id, replacement_id);
       return;
     }
-    // The replacement isn't used, as either the target is strongly referenced
-    // or the WSR itself is unreachable, so fall through to attributing a
-    // reference to the WSR (which shares a profile ID with the target).
-    ASSERT(GetProfileId(wsr) == GetProfileId(target));
-  } else if (object_currently_writing_.id_.IsArtificial()) {
-    // We may need to recur when writing members of artificial nodes in
-    // CreateArtificialNodeIfNeeded.
-    CreateArtificialNodeIfNeeded(object);
+    // The replacement isn't used for this WSR in the snapshot, as either the
+    // target is strongly referenced or the WSR itself is unreachable, so fall
+    // through to attributing a reference to the WSR (which shares the profile
+    // ID of the target).
   }
 #endif
   profile_writer_->AttributeReferenceTo(object_currently_writing_.id_,
-                                        reference, GetProfileId(object));
+                                        reference, object_id);
 }

 Serializer::WritingObjectScope::WritingObjectScope(
@@ -6364,33 +6366,19 @@ bool Serializer::CreateArtificialNodeIfNeeded(ObjectPtr obj) {
   // UnsafeRefId will do lazy reference allocation for WSRs.
   intptr_t id = UnsafeRefId(obj);
   ASSERT(id != kUnallocatedReference);
-  if (IsArtificialReference(id)) {
-    return true;
+  if (id != kUnreachableReference) {
+    return IsArtificialReference(id);
   }
   if (obj->IsHeapObject() && obj->IsWeakSerializationReference()) {
-    // The object ID for the WSR may need lazy resolution.
-    if (id == kUnallocatedReference) {
-      id = UnsafeRefId(obj);
-    }
-    ASSERT(id != kUnallocatedReference);
-    // Create an artificial node for an unreachable target at this point,
-    // whether or not the WSR itself is reachable.
-    const auto& target =
+    auto const target =
         WeakSerializationReference::RawCast(obj)->untag()->target();
     CreateArtificialNodeIfNeeded(target);
-    if (id == kUnreachableReference) {
-      ASSERT(HasArtificialRef(target));
-      // We can safely set the WSR's object ID to the target's artificial one,
-      // as that won't make it look reachable.
-      heap_->SetObjectId(obj, heap_->GetObjectId(target));
-      return true;
-    }
-    // The WSR is reachable, so continue to the IsAllocatedReference behavior.
+    // Since the WSR is unreachable, we can replace its id with whatever the
+    // ID of the target is, whether real or artificial.
+    id = heap_->GetObjectId(target);
+    heap_->SetObjectId(obj, id);
+    return IsArtificialReference(id);
   }
-  if (IsAllocatedReference(id)) {
-    return false;
-  }
-  ASSERT_EQUAL(id, kUnreachableReference);

   const char* type = nullptr;
   const char* name = nullptr;
@@ -6490,6 +6478,7 @@ bool Serializer::CreateArtificialNodeIfNeeded(ObjectPtr obj) {
   id = AssignArtificialRef(obj);
   Serializer::WritingObjectScope scope(this, type, obj, name);
   for (const auto& link : links) {
+    CreateArtificialNodeIfNeeded(link.first);
     AttributeReference(link.first, link.second);
   }
   return true;
@@ -7045,10 +7034,11 @@ ZoneGrowableArray<Object*>* Serializer::Serialize(SerializationRoots* roots) {
   }

 #if defined(DART_PRECOMPILER)
-  if (auto const cluster = CID_CLUSTER(WeakSerializationReference)) {
+  auto const wsr_cluster = CID_CLUSTER(WeakSerializationReference);
+  if (wsr_cluster != nullptr) {
     // Now that we have computed the reachability fixpoint, we remove the
     // count of now-reachable WSRs as they are not actually serialized.
-    num_written_objects_ -= cluster->Count(this);
+    num_written_objects_ -= wsr_cluster->Count(this);
     // We don't need to write this cluster, so remove it from consideration.
     clusters_by_cid_[kWeakSerializationReferenceCid] = nullptr;
   }
@@ -7137,6 +7127,19 @@ ZoneGrowableArray<Object*>* Serializer::Serialize(SerializationRoots* roots) {
   // And recorded them all in [objects_].
   ASSERT(objects_->length() == num_objects);

+#if defined(DART_PRECOMPILER)
+  if (profile_writer_ != nullptr && wsr_cluster != nullptr) {
+    // Post-WriteAlloc, we eagerly create artificial nodes for any unreachable
+    // targets in reachable WSRs if writing a v8 snapshot profile, since they
+    // will be used in AttributeReference().
+    //
+    // Unreachable WSRs may also need artificial nodes, as they may be members
+    // of other unreachable objects that have artificial nodes in the profile,
+    // but they are instead lazily handled in CreateArtificialNodeIfNeeded().
+    wsr_cluster->CreateArtificialTargetNodesIfNeeded(this);
+  }
+#endif
+
   for (SerializationCluster* cluster : clusters) {
     cluster->WriteAndMeasureFill(this);
 #if defined(DEBUG)

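The lazy path in CreateArtificialNodeIfNeeded hinges on one move: an unreachable WSR simply adopts its target's id, real or artificial, so later references attribute to the target's profile node. Below is a toy C++ model of that forwarding — the types, int handles, and negative-id convention are simplified stand-ins for illustration, not the VM's actual data structures:

```cpp
#include <cassert>
#include <unordered_map>

// Toy model: artificial refs are ids reserved for profile-only nodes, modeled
// here as negative ints; objects are plain int handles; an unmapped object is
// considered unreachable.
constexpr int kUnreachableReference = 0;
inline bool IsArtificialReference(int id) { return id < 0; }

struct ToyHeap {
  std::unordered_map<int, int> object_ids;  // object handle -> assigned id
  int GetObjectId(int obj) const {
    auto it = object_ids.find(obj);
    return it == object_ids.end() ? kUnreachableReference : it->second;
  }
  void SetObjectId(int obj, int id) { object_ids[obj] = id; }
};

// For an unreachable WSR, adopt whatever id the target has (real or
// artificial), and report whether that id is artificial, echoing the
// return-value contract of CreateArtificialNodeIfNeeded.
bool ForwardUnreachableWsrToTarget(ToyHeap* heap, int wsr, int target) {
  const int id = heap->GetObjectId(target);
  heap->SetObjectId(wsr, id);
  return IsArtificialReference(id);
}
```

A WSR whose target only has an artificial (negative) id ends up sharing that id and reports true; one whose target has a real id shares the real id and reports false.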