feat(jest-core): Add perfStats to surface test setup overhead #14622

Merged

6 commits merged into jestjs:main on Nov 6, 2023

Conversation

@mattkubej
Contributor

mattkubej commented Oct 16, 2023

Summary

Introduces additional performance metadata to surface the overhead of test setup, specifically the cost of testEnvironment, setupFiles, and setupFilesAfterEnv. The metadata currently available does not show where test files spend their time outside of test case duration; these changes surface that context.

The change primarily extends the perfStats object of the TestResult type, as these lifecycle events occur at the test file level. Additionally, the change introduces startAt on AssertionResult: duration was already exposed, but the start time was not, even though a startedAt attribute was already available on TestEntry. Exposing the start time provides additional context on when the test ran relative to the start and end times of the file itself.
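
For reference, a rough TypeScript sketch of the extended shape (the type name is hypothetical; the field names are taken from the sample output in the test plan below, not from the merged source):

// Hypothetical name; mirrors the perfStats sample output below.
type TestFilePerfStats = {
  start: number; // epoch ms when the test file began executing
  end: number; // epoch ms when the test file finished
  runtime: number; // end - start, in ms
  slow: boolean; // whether the file exceeded the slow-test threshold
  // New fields surfacing setup overhead:
  loadTestEnvironmentStart: number;
  loadTestEnvironmentEnd: number;
  setupFilesStart: number;
  setupFilesEnd: number;
  setupAfterEnvStart: number;
  setupAfterEnvEnd: number;
};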

Test plan

Run a simple test with a custom reporter for surfacing the new metadata:

Simple test

describe('identity benchmark', () => {
  it('needs to be fast', () => {
    expect(1).toBe(1);
  });
});

Custom reporter

const util = require('util');

class JestReporter {
  onRunComplete = async (_contexts, results) => {
    console.log(util.inspect(results, {depth: null}));
  };
}

module.exports = JestReporter;
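
To surface the new metadata, the reporter is registered via Jest's reporters config option; a minimal example, assuming the file above is saved as reporter.js (the filename is arbitrary):

// jest.config.js
module.exports = {
  reporters: ['default', '<rootDir>/reporter.js'],
};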

New perfStats

{
  end: 1697500415497,
  loadTestEnvironmentEnd: 1697500415131,
  loadTestEnvironmentStart: 1697500414890,
  runtime: 366,
  setupAfterEnvEnd: 1697500415484,
  setupAfterEnvStart: 1697500415278,
  setupFilesEnd: 1697500415199,
  setupFilesStart: 1697500415131,
  slow: false,
  start: 1697500415131
}
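
From these timestamps, the per-phase overhead can be derived by simple subtraction (the helper below is hypothetical; the commented values come from the sample above). Note that loadTestEnvironmentStart precedes start, i.e. the environment is loaded before the file's own clock begins:

// Hypothetical helper deriving phase durations from a perfStats object.
const setupOverhead = (p) => ({
  loadTestEnvironmentMs: p.loadTestEnvironmentEnd - p.loadTestEnvironmentStart, // 241 ms
  setupFilesMs: p.setupFilesEnd - p.setupFilesStart, // 68 ms
  setupAfterEnvMs: p.setupAfterEnvEnd - p.setupAfterEnvStart, // 206 ms
  runtimeMs: p.end - p.start, // 366 ms, matching the reported runtime
});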

New testResult

{
  ancestorTitles: [ 'identity benchmark' ],
  duration: 7,
  failing: false,
  failureDetails: [],
  failureMessages: [],
  fullName: 'identity benchmark needs to be fast',
  invocations: 1,
  location: null,
  numPassingAsserts: 1,
  retryReasons: [],
  startAt: 1697500415487,
  status: 'passed',
  title: 'needs to be fast'
}
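
Read together with the perfStats above, startAt places the test inside the file's timeline; a quick derivation from the sample numbers:

// Values copied from the sample output above.
const perfStatsStart = 1697500415131;
const testStartAt = 1697500415487;
// 356 ms of setup elapsed before this test body began; of the 366 ms
// file runtime, only 7 ms was the test itself.
const timeToFirstTest = testStartAt - perfStatsStart;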



@SimenB
Member

SimenB commented Oct 27, 2023

Oooh, nice, thanks for the PR!

I wonder if we should add these via performance.mark instead of to the test results (like #13859)? I know we have the perfStats object in our results already, but still. It might even make sense to do both.

@robhogan thoughts?

@mattkubej
Contributor Author

mattkubej commented Nov 2, 2023

> Oooh, nice, thanks for the PR!
>
> I wonder if we should add these via performance.mark instead of to the test results (like #13859)? I know we have the perfStats object in our results already, but still. It might even make sense to do both.

No problem! Thanks for taking a look!

I originally considered the same about the performance marks, as I'm already leaning on those to observe the runtime of Jest, but I was concerned about the performance mark entries getting noisy, since the perfStats metadata I introduced is per test file. So I opted to localize that metadata to a results object associated with the test file. With that said, I don't really love the attribute names, but they do make it clear what the metadata is.
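
For illustration, the performance.mark alternative discussed here might look roughly like this (a sketch only; the mark names and test path are invented, and one set of entries would accrue per test file, which is where the noise concern comes from):

const {performance} = require('perf_hooks');

const testPath = '/repo/src/identity.test.js'; // example path
performance.mark(`jest:setupFiles:start:${testPath}`);
// ...setupFiles modules would execute here...
performance.mark(`jest:setupFiles:end:${testPath}`);
performance.measure(
  `jest:setupFiles:${testPath}`,
  `jest:setupFiles:start:${testPath}`,
  `jest:setupFiles:end:${testPath}`,
);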

@robhogan
Contributor

robhogan commented Nov 2, 2023

> I wonder if we should add these via performance.mark instead of to the test results (like #13859)? I know we have the perfStats object in our results already, but still. It might even make sense to do both.
>
> @robhogan thoughts?

I think it's fine either way. I can see an argument to justify the difference: perfStats is part of Jest's API for measuring userland code, so it's actionable for users, and it has runtime significance in scheduling. It's a bit more concrete, and it's handy to keep close to results. OTOH, performance.mark is (so far) for internal operations, useful for Jest maintainers and integrators rather than users.

@@ -494,6 +494,11 @@ export type ProjectConfig = {
   workerIdleMemoryLimit?: number;
 };
 
+export type ProjectConfigPerfStats = {
@SimenB
Member
I don't think this belongs in Config. Seems more like a TestResult thing

@mattkubej
Contributor Author
Ah yeah, good call!

Considering this just captures setupAfterEnv perfStats, what are your thoughts on SetupAfterEnvPerfStats? I think the reason I avoided TestResult here was that this is only a subset of the TestResult perfStats, due to where I can capture this metadata during the lifecycle of test execution.
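
(Presumably the narrower type would look something like this; a sketch only, covering just the fields capturable at that point in the lifecycle:)

export type SetupAfterEnvPerfStats = {
  setupAfterEnvStart: number;
  setupAfterEnvEnd: number;
};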

@SimenB
Member

Yeah, I buy that argument. Thanks!

Should we document these props anywhere? I guess we don't currently, and this PR probably isn't the place to add it.

@mattkubej
Contributor Author

> Should we document these props anywhere? I guess we don't currently, and this PR probably isn't the place to add it.

Happy to add documentation or tackle that in a subsequent pull request. Do you have thoughts on where might be most appropriate?

I'm seeing these areas as possible contenders for more context:

@SimenB
Member

SimenB commented Nov 3, 2023

Ah, the first link has

   "perfStats": {
      "start": epoch,
      "end": epoch
    },

so I guess we should add the new props there. Maybe that's enough?
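
(Presumably the updated entry would grow to something like the following, mirroring the fields from the sample output above; a sketch, not the exact docs change:)

    "perfStats": {
      "start": epoch,
      "end": epoch,
      "runtime": milliseconds,
      "slow": boolean,
      "loadTestEnvironmentStart": epoch,
      "loadTestEnvironmentEnd": epoch,
      "setupFilesStart": epoch,
      "setupFilesEnd": epoch,
      "setupAfterEnvStart": epoch,
      "setupAfterEnvEnd": epoch
    },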

@mattkubej
Contributor Author

mattkubej commented Nov 4, 2023

> so I guess we should add the new props there. Maybe that's enough?

That sounds good to me! I'll get the changes up shortly.

Edit: updated!

@SimenB
Member

SimenB commented Nov 6, 2023

👍

SimenB merged commit 4fedfbd into jestjs:main on Nov 6, 2023. 8 of 9 checks passed.