Description
Hey!
I am encountering undesirable memoization behaviour that doesn't match the expectations set by the documentation, in particular the behaviour of weakMapMemoize.
I read the issue at #635 carefully and this seems like a different problem/scenario.
We were happy to adopt weakMapMemoize so that we don't need to create private instances of a selector when using it with props, like so:

```js
useSelector(state => selectSomeData(state, id))
```
However, assume that selectSomeData is defined like this:

```js
const selectSomeData = createSelector(
  [
    heavySelectorThatDependsOnDeepChainOfSelectors,
    (state, id) => id
  ],
  (heavyResult, id) => /* combiner code */
)
```
Then the first input selector and its own chain of input selectors get invalidated every time selectSomeData is called with a new value of id, even though heavySelectorThatDependsOnDeepChainOfSelectors only accepts one argument, so its result is obviously the same.
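Here is a minimal sketch of what we are seeing; the state shape, selectHeavy and the heavyInputRuns counter are made up for illustration, standing in for our real heavy selector chain:

```js
import { createSelector } from 'reselect'

let heavyInputRuns = 0

// Stand-in for heavySelectorThatDependsOnDeepChainOfSelectors: it only ever reads `state`.
const selectHeavy = createSelector(
  [(state) => {
    heavyInputRuns += 1
    return state.items
  }],
  (items) => items.map((item) => ({ ...item }))
)

const selectSomeData = createSelector(
  [selectHeavy, (state, id) => id],
  (items, id) => items.find((item) => item.id === id)
)

const state = { items: [{ id: 1 }, { id: 2 }] }
selectSomeData(state, 1)
selectSomeData(state, 2)

// heavyInputRuns === 2: selectHeavy was called with (state, 1) and then (state, 2),
// so its weakMapMemoize argument cache missed both times and its own input
// selectors re-ran, even though `id` is irrelevant to its result.
console.log(heavyInputRuns)
```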
Quoting the documentation from https://reselect.js.org/api/weakmapmemoize/:

> Use Cases: This memoizer is likely best used for cases where you need to call the same selector instance with many different arguments, such as a single selector instance that is used in a list item component and called with item IDs…
This might be obvious to some, or our use case may be too niche, but I would have loved to know this beforehand, since we pass a wide range of values for the second argument, which breaks the memoization.
We have applied a local patch to our own code like so:
```js
import { createSelector } from 'reselect';

export const createCustomSelector = (inputs, combiner) => {
  return createSelector(
    inputs.map((input) => {
      // An input is treated as unary if it is a reselect output selector whose
      // own input selectors all take at most one argument, or a plain function
      // that takes at most one argument.
      const isUnary =
        input.dependencies?.every((dependency) => dependency.length <= 1) ||
        (!('dependencies' in input) && input.length <= 1);
      // Unary inputs receive only `state`, so their weakMapMemoize cache is
      // keyed on `state` alone and new prop values never invalidate them.
      return isUnary ? (state) => input(state) : (state, props) => input(state, props);
    }),
    combiner
  );
};
```
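Applied to the illustrative sketch from above (reusing selectHeavy, state and heavyInputRuns from there), new ids no longer reach the heavy selector at all:

```js
const selectSomeDataPatched = createCustomSelector(
  [selectHeavy, (state, id) => id],
  (items, id) => items.find((item) => item.id === id)
);

selectSomeDataPatched(state, 1);
selectSomeDataPatched(state, 2);
selectSomeDataPatched(state, 3);

// selectHeavy is now always invoked as selectHeavy(state): after the first
// (state)-only call its cache hits for every further id, and heavyInputRuns
// stops growing no matter how many different ids we pass.
```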
Mass-replacing createSelector with this helper across our application leads to great speed-ups.
Could you consider introducing a similar optimization into the core library? Or perhaps just clarify the docs.
Thank you