feat(jest-core): Add perfStats to surface test setup overhead #14622
Conversation
Force-pushed from 05f5d7e to 0291093.
No problem, thanks for taking a look! I originally considered the same about the performance marks, since I am already leaning on those to observe the runtime of Jest, but I was concerned about the performance mark entries getting noisy because the `perfStats` metadata I introduced is per test file. So I thought of localizing that metadata to a results object associated with the test file. With that said, I do not really love the names of the attributes, but they do make it clear what the metadata is.
I think it's fine either way. I can see an argument to justify the difference that
packages/jest-types/src/Config.ts (Outdated)

```diff
@@ -494,6 +494,11 @@ export type ProjectConfig = {
   workerIdleMemoryLimit?: number;
 };

+export type ProjectConfigPerfStats = {
```
I don't think this belongs in `Config`. Seems more like a `TestResult` thing.
Ah yeah, good call! Considering this just captures the `setupAfterEnv` `perfStats`, what are your thoughts on `SetupAfterEnvPerfStats`? I think the reason I avoided using `TestResult` here was that this is only a subset of the `TestResult` `perfStats`, due to where I can capture this metadata during the lifecycle of test execution.
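For illustration, a rough sketch of the subset relationship being discussed — the names below are hypothetical placeholders, not the final API:

```ts
// Hypothetical sketch: a narrow type carrying only the setupFilesAfterEnv
// timing that can be captured during environment setup...
type SetupAfterEnvPerfStats = {
  setupAfterEnvStart: number;
  setupAfterEnvEnd: number;
};

// ...which would be a subset of the fuller file-level perfStats that
// TestResult ultimately reports.
type FilePerfStats = SetupAfterEnvPerfStats & {
  start: number;
  end: number;
  runtime: number;
  slow: boolean;
};
```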
Yeah, I buy that argument. Thanks!
Should we document these props anywhere? I guess we don't currently, and this PR probably isn't the place to add it.
Happy to add documentation here, or to tackle that in a subsequent pull request. Do you have thoughts on where might be most appropriate? I'm seeing these areas as possible contenders for more context:
Ah, the first link has them, so I guess we should add the new props there. Maybe that's enough?
That sounds good to me! I'll get the changes up shortly. Edit: updated!
Force-pushed from b6fe351 to 6d59478.
Force-pushed from b8ed500 to 2a70b2b.
👍
This pull request has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Summary
Introduces additional performance metadata to surface the overhead of test setup. Specifically, this aims to highlight the expense of `testEnvironment`, `setupFiles`, and `setupFilesAfterEnv`. The metadata currently available does not surface where test files spend their time outside of test case `duration`, so these changes aim to provide that context.

The change primarily extends the `perfStats` object of the `TestResult` type, as these lifecycle events occur at the test file level. Additionally, the change introduces `startedAt` on `AssertionResult`. `duration` is already exposed, but `startedAt` was not, even though that attribute was already available on `TestEntry`. Exposing `startedAt` provides additional context on where the test ran relative to the start and end time of the file itself.
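As a rough sketch of the shape this adds (the setup-related field names below are illustrative placeholders, not necessarily the names merged in this PR):

```ts
// Illustrative sketch only: Jest's existing TestResult.perfStats carries the
// file-level start/end/runtime/slow values; the setup-related timestamps
// below are placeholder names for the kind of metadata this change surfaces.
type ExtendedPerfStats = {
  start: number;
  end: number;
  runtime: number;
  slow: boolean;
  // hypothetical additions covering test setup overhead
  loadTestEnvironmentStart: number;
  loadTestEnvironmentEnd: number;
  setupFilesStart: number;
  setupFilesEnd: number;
  setupAfterEnvStart: number;
  setupAfterEnvEnd: number;
};

// On AssertionResult, `duration` already existed; `startedAt` is the newly
// exposed per-test start timestamp, comparable against the file-level
// start/end above.
type AssertionResultTiming = {
  duration?: number | null;
  startedAt?: number | null;
};
```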
Test plan
Run a simple test with a custom reporter that surfaces the new metadata (a sketch of such a reporter follows the snippets below):
Simple test
Custom reporter
New perfStats
New testResult
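The snippets above are collapsed in this view. As a rough stand-in, a minimal reporter along these lines could print the new metadata; the import path assumes a recent Jest version, and the setup-related `perfStats` fields are the same placeholders used earlier:

```ts
// perf-reporter.ts – minimal sketch of a custom reporter that logs the new
// timing metadata per test file. Only onTestResult is implemented; Jest
// accepts reporters that implement a subset of the Reporter interface.
import type {Test, TestResult} from '@jest/test-result';

export default class PerfReporter {
  onTestResult(test: Test, testResult: TestResult): void {
    const {start, end, ...rest} = testResult.perfStats;
    console.log(test.path, {
      totalMs: end - start,
      // remaining fields: runtime, slow, plus the new setup-related stats
      ...rest,
    });

    for (const assertion of testResult.testResults) {
      // startedAt is the per-test start timestamp this PR exposes; the cast
      // covers installed @jest/test-result typings that predate the change.
      const {duration, startedAt} = assertion as typeof assertion & {
        startedAt?: number | null;
      };
      console.log(`  ${assertion.fullName}`, {startedAt, duration});
    }
  }
}
```

Such a reporter would be registered through the `reporters` option in the Jest config, e.g. `reporters: ['default', '<rootDir>/perf-reporter.js']` after compiling the TypeScript file.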