
Commit b6978bc

experimental_use(promise) (#25084)
* Internal `act`: Unwrapping resolved promises

This updates our internal implementation of `act` to support React's new behavior for unwrapping promises. Like we did with Scheduler, when something suspends, it will yield to the main thread so the microtasks can run, then continue in a new task. I need to implement the same behavior in the public version of `act`, but there are some additional considerations, so I'll do that in a separate commit.

* Move throwException to after work loop resumes

throwException is the function that finds the nearest boundary and schedules it for a second render pass. We should only call it right before we unwind the stack, not if we receive an immediate ping and render the fiber again. This was an oversight in 8ef3a7c that I didn't notice because it happens to mostly work anyway. What made me notice the mistake is that throwException also marks the entire render phase as suspended (RootDidSuspend or RootDidSuspendWithDelay), which is only supposed to happen if we show a fallback. One consequence was that, in the RootDidSuspendWithDelay case, the entire commit phase was blocked, because that's the exit status we use to block a bad fallback from appearing.

* Use expando to check whether promise has resolved

Add a `status` expando to a thrown thenable to track when its value has resolved. In a later step, we'll also use `value` and `reason` expandos to track the resolved value. This is not part of the official JavaScript spec; think of it as an extension of the Promise API, or a custom interface that is a superset of Thenable. However, it's inspired by the terminology used by `Promise.allSettled`. The intent is that this will be a public API: Suspense implementations can set these expandos to allow React to unwrap the value synchronously without waiting for a microtask.

* Scaffolding for `experimental_use` hook

Sets up a new experimental hook behind a feature flag, but does not implement it yet.

* use(promise)

Adds experimental support to Fiber for unwrapping the value of a promise inside a component. It is not yet implemented for Server Components, but that is planned.

If the promise has already resolved, the value can be unwrapped "immediately" without showing a fallback. The trick we use to implement this is to yield to the main thread (literally suspending the work loop), wait for the microtask queue to drain, then check whether the promise resolved in the meantime. If so, we can resume the last attempted fiber without unwinding the stack. This functionality was implemented in previous commits.

Another feature is that the promises do not need to be cached between attempts. Because we assume idempotent execution of components, React will track the promises that were used during the previous attempt and reuse the result. You shouldn't rely on this property, but during initial render it mostly just works. Updates are trickier, though, because if you used an uncached promise, we have no way of knowing whether the underlying data has changed, so we have to unwrap the promise every time. It will still work, but it's inefficient and can lead to unnecessary fallbacks if it happens during a discrete update. When we implement this for Server Components, this will be less of an issue because there are no updates in that environment. However, it's still better for performance to cache data requests, so the same principles largely apply.

The intention is that this will eventually be the only supported way to suspend on arbitrary promises. Throwing a promise directly will be deprecated.
1 parent 11ed701 commit b6978bc

33 files changed: +1398 / -439 lines
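
To make the `status`/`value`/`reason` expandos concrete, here is a minimal sketch (not part of this commit) of how a data layer might settle those fields on its own promises so that `use` can unwrap an already-loaded value without suspending. The `instrumentThenable` helper is hypothetical:

    // Hypothetical helper, shown for illustration only. It mirrors the
    // Promise.allSettled-style terminology the commit message describes:
    // 'pending' -> 'fulfilled' (with `value`) or 'rejected' (with `reason`).
    function instrumentThenable(promise) {
      if (typeof promise.status === 'string') {
        // Someone already added the expandos; leave the promise alone.
        return promise;
      }
      promise.status = 'pending';
      promise.then(
        value => {
          promise.status = 'fulfilled';
          promise.value = value;
        },
        reason => {
          promise.status = 'rejected';
          promise.reason = reason;
        },
      );
      return promise;
    }

With those fields in place, the `use` implementation in the second file below returns `thenable.value` synchronously when `status` is 'fulfilled', rethrows `thenable.reason` when it is 'rejected', and only suspends while the promise is still pending.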

packages/jest-react/src/internalAct.js (+24 -12)

@@ -23,7 +23,7 @@ import enqueueTask from 'shared/enqueueTask';
 let actingUpdatesScopeDepth = 0;

 export function act<T>(scope: () => Thenable<T> | T): Thenable<T> {
-  if (Scheduler.unstable_flushAllWithoutAsserting === undefined) {
+  if (Scheduler.unstable_flushUntilNextPaint === undefined) {
     throw Error(
       'This version of `act` requires a special mock build of Scheduler.',
     );
@@ -120,19 +120,31 @@ export function act<T>(scope: () => Thenable<T> | T): Thenable<T> {
 }

 function flushActWork(resolve, reject) {
-  // Flush suspended fallbacks
-  // $FlowFixMe: Flow doesn't know about global Jest object
-  jest.runOnlyPendingTimers();
-  enqueueTask(() => {
+  if (Scheduler.unstable_hasPendingWork()) {
     try {
-      const didFlushWork = Scheduler.unstable_flushAllWithoutAsserting();
-      if (didFlushWork) {
-        flushActWork(resolve, reject);
-      } else {
-        resolve();
-      }
+      Scheduler.unstable_flushUntilNextPaint();
     } catch (error) {
       reject(error);
     }
-  });
+
+    // If Scheduler yields while there's still work, it's so that we can
+    // unblock the main thread (e.g. for paint or for microtasks). Yield to
+    // the main thread and continue in a new task.
+    enqueueTask(() => flushActWork(resolve, reject));
+    return;
+  }
+
+  // Once the scheduler queue is empty, run all the timers. The purpose of this
+  // is to force any pending fallbacks to commit. The public version of act does
+  // this with dev-only React runtime logic, but since our internal act needs to
+  // work with production builds of React, we have to cheat.
+  // $FlowFixMe: Flow doesn't know about global Jest object
+  jest.runOnlyPendingTimers();
+  if (Scheduler.unstable_hasPendingWork()) {
+    // Committing a fallback scheduled additional work. Continue flushing.
+    flushActWork(resolve, reject);
+    return;
+  }
+
+  resolve();
 }
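
As a usage sketch (not part of this diff), a test built on this internal `act` could exercise the promise-unwrapping path roughly like this. It assumes the mock Scheduler build referenced in the error message above; `ReactNoop`, `Text`, `use`, and `toMatchRenderedOutput` are stand-ins borrowed from React's test conventions, and `use` here is the `experimental_use` export.

    // Hedged sketch: the helpers below are assumptions, not part of this commit.
    test('unwraps an already-resolved promise without committing the fallback', async () => {
      const promise = Promise.resolve('Hi');
      function App() {
        return <Text text={use(promise)} />;
      }
      const root = ReactNoop.createRoot();
      await act(async () => {
        root.render(
          <Suspense fallback={<Text text="Loading..." />}>
            <App />
          </Suspense>,
        );
      });
      // Between attempts, `act` yielded to the main thread and let the microtask
      // queue drain, so the resolved value was unwrapped and no fallback committed.
      expect(root).toMatchRenderedOutput('Hi');
    });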

packages/react-reconciler/src/ReactFiberHooks.new.js (+129)

@@ -13,6 +13,8 @@ import type {
   MutableSourceSubscribeFn,
   ReactContext,
   StartTransitionOptions,
+  Usable,
+  Thenable,
 } from 'shared/ReactTypes';
 import type {Fiber, Dispatcher, HookType} from './ReactInternalTypes';
 import type {Lanes, Lane} from './ReactFiberLane.new';
@@ -32,6 +34,7 @@ import {
   enableLazyContextPropagation,
   enableUseMutableSource,
   enableTransitionTracing,
+  enableUseHook,
   enableUseMemoCacheHook,
 } from 'shared/ReactFeatureFlags';

@@ -120,6 +123,10 @@ import {
 } from './ReactFiberConcurrentUpdates.new';
 import {getTreeId} from './ReactFiberTreeContext.new';
 import {now} from './Scheduler';
+import {
+  trackUsedThenable,
+  getPreviouslyUsedThenableAtIndex,
+} from './ReactFiberWakeable.new';

 const {ReactCurrentDispatcher, ReactCurrentBatchConfig} = ReactSharedInternals;

@@ -205,6 +212,9 @@ let didScheduleRenderPhaseUpdate: boolean = false;
 let didScheduleRenderPhaseUpdateDuringThisPass: boolean = false;
 // Counts the number of useId hooks in this component.
 let localIdCounter: number = 0;
+// Counts number of `use`-d thenables
+let thenableIndexCounter: number = 0;
+
 // Used for ids that are generated completely client-side (i.e. not during
 // hydration). This counter is global, so client ids are not stable across
 // render attempts.
@@ -403,6 +413,7 @@ export function renderWithHooks<Props, SecondArg>(

   // didScheduleRenderPhaseUpdate = false;
   // localIdCounter = 0;
+  // thenableIndexCounter = 0;

   // TODO Warn if no hooks are used at all during mount, then some are used during update.
   // Currently we will identify the update render as a mount because memoizedState === null.
@@ -441,6 +452,7 @@ export function renderWithHooks<Props, SecondArg>(
     do {
       didScheduleRenderPhaseUpdateDuringThisPass = false;
       localIdCounter = 0;
+      thenableIndexCounter = 0;

       if (numberOfReRenders >= RE_RENDER_LIMIT) {
         throw new Error(
@@ -524,6 +536,7 @@ export function renderWithHooks<Props, SecondArg>(
   didScheduleRenderPhaseUpdate = false;
   // This is reset by checkDidRenderIdHook
   // localIdCounter = 0;
+  thenableIndexCounter = 0;

   if (didRenderTooFewHooks) {
     throw new Error(
@@ -631,6 +644,7 @@ export function resetHooksAfterThrow(): void {

   didScheduleRenderPhaseUpdateDuringThisPass = false;
   localIdCounter = 0;
+  thenableIndexCounter = 0;
 }

 function mountWorkInProgressHook(): Hook {
@@ -722,6 +736,73 @@ function createFunctionComponentUpdateQueue(): FunctionComponentUpdateQueue {
   };
 }

+function use<T>(usable: Usable<T>): T {
+  if (
+    usable !== null &&
+    typeof usable === 'object' &&
+    typeof usable.then === 'function'
+  ) {
+    // This is a thenable.
+    const thenable: Thenable<T> = (usable: any);
+
+    // Track the position of the thenable within this fiber.
+    const index = thenableIndexCounter;
+    thenableIndexCounter += 1;
+
+    switch (thenable.status) {
+      case 'fulfilled': {
+        const fulfilledValue: T = thenable.value;
+        return fulfilledValue;
+      }
+      case 'rejected': {
+        const rejectedError = thenable.reason;
+        throw rejectedError;
+      }
+      default: {
+        const prevThenableAtIndex: Thenable<T> | null = getPreviouslyUsedThenableAtIndex(
+          index,
+        );
+        if (prevThenableAtIndex !== null) {
+          switch (prevThenableAtIndex.status) {
+            case 'fulfilled': {
+              const fulfilledValue: T = prevThenableAtIndex.value;
+              return fulfilledValue;
+            }
+            case 'rejected': {
+              const rejectedError: mixed = prevThenableAtIndex.reason;
+              throw rejectedError;
+            }
+            default: {
+              // The thenable still hasn't resolved. Suspend with the same
+              // thenable as last time to avoid redundant listeners.
+              throw prevThenableAtIndex;
+            }
+          }
+        } else {
+          // This is the first time something has been used at this index.
+          // Stash the thenable at the current index so we can reuse it during
+          // the next attempt.
+          trackUsedThenable(thenable, index);
+
+          // Suspend.
+          // TODO: Throwing here is an implementation detail that allows us to
+          // unwind the call stack. But we shouldn't allow it to leak into
+          // userspace. Throw an opaque placeholder value instead of the
+          // actual thenable. If it doesn't get captured by the work loop, log
+          // a warning, because that means something in userspace must have
+          // caught it.
+          throw thenable;
+        }
+      }
+    }
+  }
+
+  // TODO: Add support for Context
+
+  // eslint-disable-next-line react-internal/safe-string-coercion
+  throw new Error('An unsupported type was passed to use(): ' + String(usable));
+}
+
 function useMemoCache(size: number): Array<any> {
   throw new Error('Not implemented.');
 }
@@ -2421,6 +2502,9 @@ if (enableCache) {
   (ContextOnlyDispatcher: Dispatcher).getCacheForType = getCacheForType;
   (ContextOnlyDispatcher: Dispatcher).useCacheRefresh = throwInvalidHookError;
 }
+if (enableUseHook) {
+  (ContextOnlyDispatcher: Dispatcher).use = throwInvalidHookError;
+}
 if (enableUseMemoCacheHook) {
   (ContextOnlyDispatcher: Dispatcher).useMemoCache = throwInvalidHookError;
 }
@@ -2452,6 +2536,9 @@ if (enableCache) {
   (HooksDispatcherOnMount: Dispatcher).getCacheForType = getCacheForType;
   (HooksDispatcherOnMount: Dispatcher).useCacheRefresh = mountRefresh;
 }
+if (enableUseHook) {
+  (HooksDispatcherOnMount: Dispatcher).use = use;
+}
 if (enableUseMemoCacheHook) {
   (HooksDispatcherOnMount: Dispatcher).useMemoCache = useMemoCache;
 }
@@ -2485,6 +2572,9 @@ if (enableCache) {
 if (enableUseMemoCacheHook) {
   (HooksDispatcherOnUpdate: Dispatcher).useMemoCache = useMemoCache;
 }
+if (enableUseHook) {
+  (HooksDispatcherOnUpdate: Dispatcher).use = use;
+}

 const HooksDispatcherOnRerender: Dispatcher = {
   readContext,
@@ -2513,6 +2603,9 @@ if (enableCache) {
   (HooksDispatcherOnRerender: Dispatcher).getCacheForType = getCacheForType;
   (HooksDispatcherOnRerender: Dispatcher).useCacheRefresh = updateRefresh;
 }
+if (enableUseHook) {
+  (HooksDispatcherOnRerender: Dispatcher).use = use;
+}
 if (enableUseMemoCacheHook) {
   (HooksDispatcherOnRerender: Dispatcher).useMemoCache = useMemoCache;
 }
@@ -2691,6 +2784,9 @@ if (__DEV__) {
       return mountRefresh();
     };
   }
+  if (enableUseHook) {
+    (HooksDispatcherOnMountInDEV: Dispatcher).use = use;
+  }
   if (enableUseMemoCacheHook) {
     (HooksDispatcherOnMountInDEV: Dispatcher).useMemoCache = useMemoCache;
   }
@@ -2836,6 +2932,9 @@ if (__DEV__) {
       return mountRefresh();
     };
   }
+  if (enableUseHook) {
+    (HooksDispatcherOnMountWithHookTypesInDEV: Dispatcher).use = use;
+  }
   if (enableUseMemoCacheHook) {
     (HooksDispatcherOnMountWithHookTypesInDEV: Dispatcher).useMemoCache = useMemoCache;
   }
@@ -2981,6 +3080,9 @@ if (__DEV__) {
      return updateRefresh();
    };
  }
+  if (enableUseHook) {
+    (HooksDispatcherOnUpdateInDEV: Dispatcher).use = use;
+  }
   if (enableUseMemoCacheHook) {
     (HooksDispatcherOnUpdateInDEV: Dispatcher).useMemoCache = useMemoCache;
   }
@@ -3127,6 +3229,9 @@ if (__DEV__) {
      return updateRefresh();
    };
  }
+  if (enableUseHook) {
+    (HooksDispatcherOnRerenderInDEV: Dispatcher).use = use;
+  }
   if (enableUseMemoCacheHook) {
     (HooksDispatcherOnRerenderInDEV: Dispatcher).useMemoCache = useMemoCache;
   }
@@ -3289,6 +3394,14 @@ if (__DEV__) {
      return mountRefresh();
    };
  }
+  if (enableUseHook) {
+    (InvalidNestedHooksDispatcherOnMountInDEV: Dispatcher).use = function<T>(
+      usable: Usable<T>,
+    ): T {
+      warnInvalidHookAccess();
+      return use(usable);
+    };
+  }
   if (enableUseMemoCacheHook) {
     (InvalidNestedHooksDispatcherOnMountInDEV: Dispatcher).useMemoCache = function(
       size: number,
@@ -3456,6 +3569,14 @@ if (__DEV__) {
      return updateRefresh();
    };
  }
+  if (enableUseHook) {
+    (InvalidNestedHooksDispatcherOnUpdateInDEV: Dispatcher).use = function<T>(
+      usable: Usable<T>,
+    ): T {
+      warnInvalidHookAccess();
+      return use(usable);
+    };
+  }
   if (enableUseMemoCacheHook) {
     (InvalidNestedHooksDispatcherOnUpdateInDEV: Dispatcher).useMemoCache = function(
       size: number,
@@ -3624,6 +3745,14 @@ if (__DEV__) {
      return updateRefresh();
    };
  }
+  if (enableUseHook) {
+    (InvalidNestedHooksDispatcherOnRerenderInDEV: Dispatcher).use = function<T>(
+      usable: Usable<T>,
+    ): T {
+      warnInvalidHookAccess();
+      return use(usable);
+    };
+  }
   if (enableUseMemoCacheHook) {
     (InvalidNestedHooksDispatcherOnRerenderInDEV: Dispatcher).useMemoCache = function(
       size: number,
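
Finally, a hedged illustration of the caching caveat from the commit message. `fetchTodo` and `cache` are hypothetical stand-ins, and `use` refers to the dispatcher method wired up above (exposed publicly as `experimental_use`):

    // During initial render, an inline promise mostly works: on the retry after
    // suspending, getPreviouslyUsedThenableAtIndex returns the thenable tracked
    // at this index, so the earlier result is reused.
    function TodoUncached({id}) {
      const todo = use(fetchTodo(id)); // creates a new promise on every attempt
      return <li>{todo.title}</li>;
    }

    // On updates, React cannot tell whether the underlying data changed, so it
    // must unwrap a fresh promise each time, which is inefficient and can cause
    // unnecessary fallbacks during discrete updates. Caching the request keeps
    // the same (ideally already-settled) thenable across renders.
    const fetchTodoCached = cache(fetchTodo); // `cache` is a hypothetical request cache
    function TodoCached({id}) {
      const todo = use(fetchTodoCached(id));
      return <li>{todo.title}</li>;
    }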
