
module.exports = {} approach and objects with fast / slow properties #11430

Closed
@vsemozhetbyt

Description

  • Version: maybe all
  • Platform: maybe all
  • Subsystem: module

There have been some PRs recently that refactor the filling of module.exports from incremental (adding properties to the exports object one by one) to atomic (assigning a single object literal).

There have also been some concerns about the actual gain from this approach.
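For context, here is a minimal sketch of the two filling styles, using a made-up module body (the parse/stringify names are hypothetical, not from any real core module):

```javascript
// Incremental filling: properties are added to an existing object
// one by one, as in `exports.foo = ...` in a module body.
const incremental = {};
incremental.parse = (s) => JSON.parse(s);
incremental.stringify = (o) => JSON.stringify(o);

// Atomic filling: one object literal is assigned in a single step,
// as in `module.exports = { ... }`.
const atomic = {
  parse: (s) => JSON.parse(s),
  stringify: (o) => JSON.stringify(o),
};

console.log(atomic.stringify(incremental.parse('{"a":1}'))); // {"a":1}
```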

I've recently read some material about V8 object optimization. V8 distinguishes between objects with fast and slow (dictionary-mode) properties, and one way to make an object slow is exactly incremental property definition.

So I've written a simple test to find out how many added properties make an object slow:

// Keep adding properties until V8 demotes the object to slow (dictionary)
// properties, then report how many it took. Requires --allow-natives-syntax.
for (var i = 1, o = {}; %HasFastProperties(o); i++) o['p' + i] = 1;
console.log(Object.keys(o).length);

Then I've run this code with several Node.js versions:

$ node --allow-natives-syntax test.js
// till at least V8 4.5.103.43 (Node.js 4.7.2)
16
// in later versions
20

You can also check this code in some recent Node.js versions:

const slowObj = {};
for (let i = 1; i <= 20; i++) slowObj[`p${i}`] = 1;

const fastObj = JSON.parse(JSON.stringify(slowObj));

console.log(%HasFastProperties(slowObj), %HasFastProperties(fastObj));
// → false true
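The round-trip above can be wrapped in a small helper (toFastClone is a hypothetical name, not anything from Node.js core). Note that it only works for JSON-serializable objects and silently drops functions, symbols, undefined values, and the prototype:

```javascript
// Hypothetical helper: rebuild a JSON-serializable object in one atomic step.
// In the test above, the rebuilt copy came back with fast properties.
function toFastClone(obj) {
  return JSON.parse(JSON.stringify(obj));
}

// Same setup as before: 20 incrementally added properties.
const slow = {};
for (let i = 1; i <= 20; i++) slow[`p${i}`] = 1;

const fast = toFastClone(slow);
console.log(fast.p20, Object.keys(fast).length); // 1 20
```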

Here is also a double benchmark, with benchmark.js and with a naive approach:

/******************************************************************************/
'use strict';
/******************************************************************************/
const slowObj = {};
for (let i = 1; i <= 20; i++) slowObj[`p${i}`] = 1;

const fastObj = JSON.parse(JSON.stringify(slowObj));

const testCases = [
  function slow() { return slowObj.p20; },
  function fast() { return fastObj.p20; },
];
/******************************************************************************/
console.log(`
  1. Balanced benchmark
`);
/******************************************************************************/
const suite = require('benchmark').Suite();

testCases.forEach((testCase) => { suite.add(testCase.name, testCase); });
suite.on('cycle', (evt) => { console.log(String(evt.target)); }).run({ async: false });
/******************************************************************************/
console.log(`
  2. Simplified benchmark
`);
/******************************************************************************/
const warmup = () => {};
[warmup, ...testCases].forEach((testCase) => {
  const MS_IN_S = 1000;
  const numberOfCycles = 1e8;
  const start = Date.now();

  for (let i = 0; i < numberOfCycles; i++) testCase();

  if (testCase.name !== 'warmup') {
    console.log(`${testCase.name} x ${
      Math.round(numberOfCycles / ((Date.now() - start) / MS_IN_S)).toLocaleString()
    } ops/sec`);
  }
});
The output:

  1. Balanced benchmark

slow x 36,283,960 ops/sec ±1.32% (86 runs sampled)
fast x 46,327,002 ops/sec ±1.39% (85 runs sampled)

  2. Simplified benchmark

slow x 38,284,839 ops/sec
fast x 61,881,188 ops/sec

I am not completely sure this should be considered a reason for the mentioned refactoring, but I feel obliged to share this data.

cc @nodejs/v8

Labels: performance (Issues and PRs related to the performance of Node.js)