
core(page-functions): expose simulated throttling requestIdleCallback shim #11032

Merged
merged 14 commits on Jun 29, 2020

Conversation

LukasAuerMstage
Contributor

Summary
This PR introduces a requestIdleCallback shim that is applied to Lighthouse runs using simulated throttling.

The shim is needed because the requestIdleCallback deadline defeats the simulated throttling logic: tasks can keep running inside the callback until the deadline reaches 0, which adds up to a maximum of 50ms. With simulated throttling active, Lighthouse then multiplies those 50ms by the CPU slowdown factor of 4, which registers as a long task.

The rIC shim calculates the remaining deadline with this in mind and reports a value of 12ms or less, so that once the simulated throttling multiplier is applied, the overall time consumed does not exceed 50ms and no long task is registered.
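
For context, a simplified sketch of what the shim does (close to, but not identical to, the code reviewed below):

// Simplified sketch of the shim; names mirror the diff discussed in this PR.
function wrapRequestIdleCallback(cpuSlowdownMultiplier) {
  const safetyAllowanceMs = 10;
  // Budget per callback so that budget * multiplier stays under the 50ms long-task threshold.
  const maxExecutionTimeMs = Math.floor((50 - safetyAllowanceMs) / cpuSlowdownMultiplier);
  const nativeRequestIdleCallback = window.requestIdleCallback.bind(window);

  window.requestIdleCallback = (cb) => {
    const cbWrap = (deadline) => {
      const start = Date.now();
      const nativeTimeRemaining = deadline.timeRemaining.bind(deadline);
      // Report whichever is smaller: the real deadline or the throttling-aware budget.
      deadline.timeRemaining = () =>
        Math.min(nativeTimeRemaining(), Math.max(0, maxExecutionTimeMs - (Date.now() - start)));
      return cb(deadline);
    };
    return nativeRequestIdleCallback(cbWrap);
  };
}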

Related Issues/PRs
#10693

@LukasAuerMstage LukasAuerMstage requested a review from a team as a code owner June 28, 2020 19:31
@LukasAuerMstage LukasAuerMstage requested review from paulirish and removed request for a team June 28, 2020 19:31

@patrickhulce patrickhulce self-assigned this Jun 28, 2020
@patrickhulce patrickhulce self-requested a review June 28, 2020 19:57
@LukasAuerMstage LukasAuerMstage changed the title Ric shim core(page-functions): Expose simulated throttling specific requestIdleCallback shim Jun 29, 2020
@LukasAuerMstage LukasAuerMstage changed the title core(page-functions): Expose simulated throttling specific requestIdleCallback shim core(page-functions): Expose simulated throttling requestIdleCallback shim Jun 29, 2020
@LukasAuerMstage LukasAuerMstage changed the title core(page-functions): Expose simulated throttling requestIdleCallback shim core(page-functions): expose simulated throttling requestIdleCallback shim Jun 29, 2020
 - disable-storage-reset
 - clear origin storage
Collaborator

@patrickhulce patrickhulce left a comment

thank you very much @LukasAuerMstage this looks great! 🎉

*/
async registerRequestIdleCallbackWrap(settings) {
if (settings.throttlingMethod === 'simulate') {
const scriptStr = `(${pageFunctions.wrapRequestIdleCallbackString})()`;
Collaborator

let's pass through the multiplier to the function so it doesn't have to hardcode the deadline

Suggested change
const scriptStr = `(${pageFunctions.wrapRequestIdleCallbackString})()`;
const scriptStr = `(${pageFunctions.wrapRequestIdleCallbackString})(${settings.throttling.cpuSlowdownMultiplier})`;
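
With that change the injected script becomes an immediately-invoked call that passes the multiplier along, roughly (illustrative sketch, assuming the default multiplier of 4; not the literal injected string):

// What the page would end up evaluating:
(function wrapRequestIdleCallback(cpuSlowdownMultiplier) {
  // ...shim body...
})(4);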

Comment on lines 307 to 313
* RequestIdleCallback shim that calculates the remaining deadline time in order to avoid a potential lighthouse
* penalty for tests run with simulated throttling. As the throttling factor is x4 the maximum remaining time is
* reduced to approximately 50/4 => ~12
* @return {null}
*/
/* istanbul ignore next */
function wrapRequestIdleCallback() {
Collaborator

Suggested change
* RequestIdleCallback shim that calculates the remaining deadline time in order to avoid a potential lighthouse
* penalty for tests run with simulated throttling. As the throttling factor is x4 the maximum remaining time is
* reduced to approximately 50/4 => ~12
* @return {null}
*/
/* istanbul ignore next */
function wrapRequestIdleCallback() {
* RequestIdleCallback shim that calculates the remaining deadline time in order to avoid a potential lighthouse
* penalty for tests run with simulated throttling. Reduces the deadline time to (50 - safetyAllowance) / cpuSlowdownMultiplier to
* ensure a long task is very unlikely if using the API correctly.
* @param {number} cpuSlowdownMultiplier
* @return {null}
*/
/* istanbul ignore next */
function wrapRequestIdleCallback(cpuSlowdownMultiplier) {
const safetyAllowanceMs = 10;
const maxExecutionTimeMs = Math.floor((50 - safetyAllowanceMs) / cpuSlowdownMultiplier);

const start = Date.now();
deadline.__timeRemaining = deadline.timeRemaining;
deadline.timeRemaining = () => {
return Math.min(deadline.__timeRemaining(), Math.max(0, 12 - (Date.now() - start)));
Collaborator

Suggested change
return Math.min(deadline.__timeRemaining(), Math.max(0, 12 - (Date.now() - start)));
return Math.min(deadline.__timeRemaining(), Math.max(0, maxExecutionTimeMs - (Date.now() - start)));
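
For a feel of the numbers, the budget this yields for a few multipliers (using safetyAllowanceMs = 10 from the suggestion above):

// maxExecutionTimeMs = Math.floor((50 - 10) / cpuSlowdownMultiplier)
// multiplier 1 -> 40ms budget (40ms after throttling)
// multiplier 2 -> 20ms budget (40ms after throttling)
// multiplier 4 -> 10ms budget (40ms after throttling)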

Collaborator

might need to reflow with line length :/

requestedUrl: 'http://localhost:10200/ric-shim.html?short',
finalUrl: 'http://localhost:10200/ric-shim.html?short',
audits: {
'interactive': {
Collaborator

TTI can be super flaky on test machines depending on scheduling of network too, let's use total blocking time instead which is strictly time spent in tasks beyond 50ms :)

const timeStart = Date.now();
while (true) {
let elapsedTime = Date.now() - timeStart;
if (elapsedTime > milliseconds) {
Collaborator

Suggested change
if (elapsedTime > milliseconds) {
if (elapsedTime >= milliseconds) {

Collaborator

otherwise sleep(1) will always sleep for at least 2 ms, right?

Contributor Author

yeah that's right, I missed that
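
For reference, a self-contained sketch of the busy-wait helper with the comparison fixed (reconstructed from the snippets above, not the exact fixture code):

// Busy-wait for roughly `milliseconds`; using `>=` keeps sleep(1) from always taking ~2ms.
function sleep(milliseconds) {
  const timeStart = Date.now();
  while (true) {
    const elapsedTime = Date.now() - timeStart;
    if (elapsedTime >= milliseconds) {
      break;
    }
  }
}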

const iterations = 40;

if (window.location.search.includes('short')) {
window.ricTaskDuration = 1;
Collaborator

nit: seems like this window should just be a let ricTaskDuration = 0

Contributor Author

If I remember correctly, let/const declarations don't get hoisted to the global scope when they sit inside curly braces, but I could do something like this, wdyt? Or just use var if you prefer, since that would be hoisted.

let ricTaskDuration;
if (window.location.search.includes('short')) {
  ricTaskDuration = 1;
}
if (window.location.search.includes('long')) {
  ricTaskDuration = 50;
}

Collaborator

yeah sorry that's what I meant, declaring let ricTaskDuration = 0 up with iterations and then overriding it if the query param matches
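
i.e. something along these lines (a sketch of the fixture setup, not the exact merged code):

const iterations = 40;
let ricTaskDuration = 0;

if (window.location.search.includes('short')) {
  ricTaskDuration = 1;
}
if (window.location.search.includes('long')) {
  ricTaskDuration = 50;
}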

lighthouse-cli/test/fixtures/ric-shim.html (resolved)
 - pass cpu slowdown multiplier to rIC shim wrapper and calculate the maximum execution time dynamically
 - adjust rIC shim function comment
 - refactor smoke tests so that they use tbt instead of tti
 - fix bug in sleep function (was previously 1ms off)
 - add copyright header to smoke test html
Collaborator

@patrickhulce patrickhulce left a comment

Beautiful work! Just a few last nits, but LGTM :)

@@ -0,0 +1,47 @@
<!DOCTYPE html>
<!--
* Copyright 2018 The Lighthouse Authors. All Rights Reserved.
Collaborator

Suggested change
* Copyright 2018 The Lighthouse Authors. All Rights Reserved.
* Copyright 2020 The Lighthouse Authors. All Rights Reserved.

Contributor Author

great :) done

'total-blocking-time': {
// With the requestIdleCallback shim in place 1ms tasks should not block at all and should max add up to
// 12.5 ms each, which would result in 50ms under a 4x simulated throttling multiplier and therefore in 0 tbt
numericValue: '<=0',
Collaborator

I think we can give it a little bit of slack to avoid flakiness, maybe <=100? going from 5500 -> 100 is still great :)

Contributor Author

makes sense 👍 done

 - change copyright header date to 2020
 - change tbt smoke test value to <=100 (from previously 0) to add some room for a little variance
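
With those two commits, the smoke expectation presumably ends up roughly like this (a sketch assembled from the snippets above, not the exact merged file):

{
  requestedUrl: 'http://localhost:10200/ric-shim.html?short',
  finalUrl: 'http://localhost:10200/ric-shim.html?short',
  audits: {
    // With the shim in place, 1ms rIC tasks stay well under the 50ms long-task
    // threshold even at 4x slowdown; <=100 leaves room for machine variance.
    'total-blocking-time': {
      numericValue: '<=100',
    },
  },
},
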
@patrickhulce
Collaborator

Thanks @LukasAuerMstage looks all good! Unfortunately you'll need to rebase after #11035 lands to fix the required status checks that are failing atm.

@brendankenny
Member

@patrickhulce as a follow up at some point, could we document this somewhere? It'll be easy to forget this is happening at runtime (since most lantern stuff happens after the fact).

Would lantern.md be best? Maybe we should make a general simulated throttling explainer doc (or section in throttling.md) at some point.

@patrickhulce
Collaborator

Would lantern.md be best? Maybe we should make a general simulated throttling explainer doc

Yeah a little "Lantern Considerations" type deal might be good to get a running list of things that need to happen for lantern to work well.

@@ -303,6 +303,39 @@ function getNodeLabel(node) {
return tagName;
}

/**
* RequestIdleCallback shim that calculates the remaining deadline time in order to avoid a potential lighthouse
Collaborator

Suggested change
* RequestIdleCallback shim that calculates the remaining deadline time in order to avoid a potential lighthouse
* RequestIdleCallback shim that calculates the remaining deadline time in order to avoid a potential

const nativeRequestIdleCallback = window.requestIdleCallback;
window.requestIdleCallback = (cb) => {
const cbWrap = (deadline, timeout) => {
const start = Date.now();
Collaborator

@connorjclark connorjclark Jun 29, 2020

will performance.now() work too?

Collaborator

it will, we're like split 50/50 throughout the smokes on Date.now vs. performance.now

Contributor Author

I think Date.now() is faster than performance.now(), which is why I chose it, but I don't know if it matters all that much in this case. Do you want me to refactor to performance.now()?

Collaborator

I don't think it's necessary @LukasAuerMstage but @connorjclark were you hoping to slowly standardize on performance.now?
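
For what it's worth, the swap would only touch the timing bookkeeping in the shim; a sketch of what it would look like with performance.now():

// Same deadline bookkeeping as above, just with performance.now() instead of Date.now().
const start = performance.now();
deadline.__timeRemaining = deadline.timeRemaining;
deadline.timeRemaining = () => {
  return Math.min(deadline.__timeRemaining(), Math.max(0, maxExecutionTimeMs - (performance.now() - start)));
};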
