Improve stringify_string performance #67

Open
wants to merge 3 commits into main

Conversation

sapphi-red

This PR improves stringify_string performance.

Description

The ECMAScript spec specifies that JSON.stringify escapes ", \, \n, \r, \t, \b, \f, and any character below U+0020 (space).
https://tc39.es/ecma262/multipage/structured-data.html#sec-quotejsonstring
This behaviour is also specified in ES5.1, so compatibility should be fine.
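A minimal sketch of the idea (the helper and regex names below are illustrative, not necessarily the ones used in the diff): skip escaping entirely when nothing needs it, and otherwise let JSON.stringify handle the spec-defined escapes while patching up <, \u2028 and \u2029, which it leaves untouched.

// sketch only: regex covers ", \, control characters, plus the three
// characters devalue must escape but JSON.stringify doesn't
const escape_chars = /["\\\u0000-\u001f<\u2028\u2029]/;

function stringify_string_sketch(str) {
	if (!escape_chars.test(str)) {
		// fast path: nothing needs escaping
		return `"${str}"`;
	}
	// delegate spec-defined escaping to JSON.stringify, then fix up the rest
	return JSON.stringify(str).replace(/[<\u2028\u2029]/g, (c) =>
		c === '<' ? '\\u003C' : `\\u${c.charCodeAt(0).toString(16)}`
	);
}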

benchmark numbers

test case | This PR | Alternative (1) | Alternative (2) | Original | Diff (This PR / Original)
--- | --- | --- | --- | --- | ---
'long'.repeat(1000) | 290k | 100k | 290k | 68k | 4.26x
'long2\n'.repeat(1000) | 59k | 59k | 25k | 33k | 1.79x
'"\n"'.repeat(1000) | 68k | 68k | 12k | 56k | 1.21x
'<'.repeat(1000) | 33k | 33k | 36k | 170k | 0.19x
'long'.repeat(1000) + '\n' | 150k | 100k | 140k | 68k | 2.21x

(all numbers in ops/s)

https://jsbench.me/25lm0vndix/1

This PR degrades performance in the '<'.repeat(1000) case, but I assume strings like that won't appear frequently.
I'd expect this PR to be slow only when a string contains <, \u2028, or \u2029 many times.

Comment on lines +56 to 58
if (!escape_chars.test(str)){
return `"${str}"`;
}
Author


If I remove this part, some tests fail. But that's only because JSON.stringify escapes more characters and uses lowercase escapes (e.g. \udc00 instead of \uDC00); there's no essential problem.
The generated strings would be slightly longer, but I think that's fine: the previous version of the tests expected the escaped forms anyway (changed by 5530e84#diff-a561630bb56b82342bc66697aee2ad96efddcbc9d150665abd6fb7ecb7c0ab2f).
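For example (an illustrative snippet, not part of the PR), JSON.stringify escapes lone surrogates with lowercase hex digits:

// returns the 8-character string "\udc00" (lowercase escape)
JSON.stringify('\uDC00');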

@Rich-Harris
Owner

Thank you! The performance is very dependent on the inputs. If I try shorter strings, this PR makes things much slower: https://jsbench.me/9zlv6vfa6c/2. I'm not sure if there's a way to get the best of both worlds (without introducing a ton of complexity) but I'm wary of accidentally making things slower for a lot of people.

On the flip side, of course, we probably need to care more about making existing slow cases faster than about regressions in existing fast cases. But it would be good to have a more concrete grasp of the trade-offs as they relate to typical workloads.

@sapphi-red
Author

That's a good point. I've taken some benchmarks with different input lengths. If the input is longer than 100 characters, the new algorithm is faster than the original one (except for input consisting only of <).

benchmark results
┌─────────┬────────────────────────┬──────────────┬───────────────┬────────────────┐
│ (index) │       Task Name        │   Original   │      New      │ New / Original │
├─────────┼────────────────────────┼──────────────┼───────────────┼────────────────┤
│    0    │  '[0] 10 characters'   │ '9535319.54' │ '12299074.83' │     '1.29'     │
│    1    │  '[0] 100 characters'  │ '2799097.47' │ '6668595.40'  │     '2.38'     │
│    2    │ '[0] 1000 characters'  │ '341226.91'  │ '1259122.50'  │     '3.69'     │
│    3    │ '[0] 10000 characters' │  '32842.57'  │  '136339.86'  │     '4.15'     │
│    4    │  '[1] 10 characters'   │ '7287771.03' │ '4057035.29'  │     '0.56'     │
│    5    │  '[1] 100 characters'  │ '854148.41'  │ '1775371.66'  │     '2.08'     │
│    6    │ '[1] 1000 characters'  │ '105528.82'  │  '433440.77'  │     '4.11'     │
│    7    │ '[1] 10000 characters' │  '10819.11'  │  '46874.87'   │     '4.33'     │
│    8    │  '[2] 10 characters'   │ '6486978.52' │ '3936226.66'  │     '0.61'     │
│    9    │  '[2] 100 characters'  │ '907248.40'  │ '1639740.08'  │     '1.81'     │
│   10    │ '[2] 1000 characters'  │  '97387.63'  │  '379195.26'  │     '3.89'     │
│   11    │ '[2] 10000 characters' │  '9879.71'   │  '40203.89'   │     '4.07'     │
│   12    │  '[3] 10 characters'   │ '8355023.75' │ '2538670.52'  │     '0.30'     │
│   13    │  '[3] 100 characters'  │ '1021730.43' │  '631688.51'  │     '0.62'     │
│   14    │ '[3] 1000 characters'  │ '103162.00'  │  '91531.07'   │     '0.89'     │
│   15    │ '[3] 10000 characters' │  '10626.92'  │   '9776.62'   │     '0.92'     │
│   16    │  '[4] 10 characters'   │ '7364233.08' │ '1620777.40'  │     '0.22'     │
│   17    │  '[4] 100 characters'  │ '1617177.35' │  '310815.69'  │     '0.19'     │
│   18    │ '[4] 1000 characters'  │ '171385.70'  │  '36995.96'   │     '0.22'     │
│   19    │ '[4] 10000 characters' │  '16868.77'  │   '3176.59'   │     '0.19'     │
└─────────┴────────────────────────┴──────────────┴───────────────┴────────────────┘

In my workload, the input string doesn't contain <, so I'm fine with using the original algorithm even if the input string only contains a single <.

const re = /[<\u2028\u2029]/

export function stringify_string(str) {
  // short strings, and strings containing </\u2028/\u2029, are faster with the
  // original algorithm, so only use the new one for long strings without them
  if (str.length < 100 || re.test(str)) {
    return original_stringify_string(str)
  }
  return new_stringify_string(str)
}

@gtm-nayan
Contributor

@sapphi-red
Could you check the benchmarks again with

	return reg.test(str) ? JSON.stringify(str)
		.replaceAll('<', '\\u003C')
		.replaceAll('\u2028', '\\u2028')
		.replaceAll('\u2029', '\\u2029')
	 : `"${str}"`;

I think someone on the V8 team said that callback-based replace is much slower than replacing with a plain string.
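(For contrast, the callback-based form referred to here would look roughly like the following sketch; the exact replacements are illustrative, not copied from the PR diff.)

// single pass, but the per-match callback is reportedly harder for V8 to optimise
const escape_with_callback = (str) =>
	JSON.stringify(str).replace(/[<\u2028\u2029]/g, (c) =>
		c === '<' ? '\\u003C' : c === '\u2028' ? '\\u2028' : '\\u2029'
	);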

@sapphi-red
Author

It seems the non-callback-based replace is faster for long strings but slower for short strings.
Here are the results.

benchmark result

using callback based replace

┌─────────┬────────────────────────┬──────────────┬───────────────┬────────────────┐
│ (index) │ Task Name              │ Original     │ New           │ New / Original │
├─────────┼────────────────────────┼──────────────┼───────────────┼────────────────┤
│ 0       │ '[0] 10 characters'    │ '9900970.00' │ '12068058.00' │ '1.22'         │
│ 1       │ '[0] 100 characters'   │ '2053952.77' │ '6364939.45'  │ '3.10'         │
│ 2       │ '[0] 1000 characters'  │ '231055.17'  │ '1090166.26'  │ '4.72'         │
│ 3       │ '[0] 10000 characters' │ '23660.64'   │ '118467.03'   │ '5.01'         │
│ 4       │ '[1] 10 characters'    │ '7977366.40' │ '3561900.58'  │ '0.45'         │
│ 5       │ '[1] 100 characters'   │ '1298935.74' │ '1570970.43'  │ '1.21'         │
│ 6       │ '[1] 1000 characters'  │ '143290.17'  │ '312150.37'   │ '2.18'         │
│ 7       │ '[1] 10000 characters' │ '14370.28'   │ '36080.72'    │ '2.51'         │
│ 8       │ '[2] 10 characters'    │ '7000970.60' │ '3497431.90'  │ '0.50'         │
│ 9       │ '[2] 100 characters'   │ '1161107.54' │ '1252507.25'  │ '1.08'         │
│ 10      │ '[2] 1000 characters'  │ '127390.78'  │ '211038.31'   │ '1.66'         │
│ 11      │ '[2] 10000 characters' │ '12625.94'   │ '23048.89'    │ '1.83'         │
│ 12      │ '[3] 10 characters'    │ '8238422.00' │ '2153693.85'  │ '0.26'         │
│ 13      │ '[3] 100 characters'   │ '1286866.20' │ '623993.63'   │ '0.48'         │
│ 14      │ '[3] 1000 characters'  │ '141529.18'  │ '88389.73'    │ '0.62'         │
│ 15      │ '[3] 10000 characters' │ '14239.27'   │ '9470.17'     │ '0.67'         │
│ 16      │ '[4] 10 characters'    │ '7224634.56' │ '1498251.70'  │ '0.21'         │
│ 17      │ '[4] 100 characters'   │ '1183382.00' │ '266826.99'   │ '0.23'         │
│ 18      │ '[4] 1000 characters'  │ '127803.41'  │ '30773.74'    │ '0.24'         │
│ 19      │ '[4] 10000 characters' │ '12712.61'   │ '2742.67'     │ '0.22'         │
└─────────┴────────────────────────┴──────────────┴───────────────┴────────────────┘

not using callback based replace

┌─────────┬────────────────────────┬──────────────┬───────────────┬────────────────┐
│ (index) │ Task Name              │ Original     │ New           │ New / Original │
├─────────┼────────────────────────┼──────────────┼───────────────┼────────────────┤
│ 0       │ '[0] 10 characters'    │ '9574234.00' │ '12113514.00' │ '1.27'         │
│ 1       │ '[0] 100 characters'   │ '2108624.73' │ '6430426.71'  │ '3.05'         │
│ 2       │ '[0] 1000 characters'  │ '238645.52'  │ '1103242.68'  │ '4.62'         │
│ 3       │ '[0] 10000 characters' │ '24123.45'   │ '118660.65'   │ '4.92'         │
│ 4       │ '[1] 10 characters'    │ '7604074.00' │ '2697034.38'  │ '0.35'         │
│ 5       │ '[1] 100 characters'   │ '1314656.95' │ '1480950.22'  │ '1.13'         │
│ 6       │ '[1] 1000 characters'  │ '146561.68'  │ '351241.51'   │ '2.40'         │
│ 7       │ '[1] 10000 characters' │ '14582.63'   │ '42713.72'    │ '2.93'         │
│ 8       │ '[2] 10 characters'    │ '6912714.62' │ '2683552.00'  │ '0.39'         │
│ 9       │ '[2] 100 characters'   │ '1166056.60' │ '1243143.50'  │ '1.07'         │
│ 10      │ '[2] 1000 characters'  │ '127105.47'  │ '245306.72'   │ '1.93'         │
│ 11      │ '[2] 10000 characters' │ '12668.19'   │ '27722.55'    │ '2.19'         │
│ 12      │ '[3] 10 characters'    │ '7968718.00' │ '2092723.58'  │ '0.26'         │
│ 13      │ '[3] 100 characters'   │ '1301275.48' │ '697634.33'   │ '0.54'         │
│ 14      │ '[3] 1000 characters'  │ '141805.01'  │ '104800.72'   │ '0.74'         │
│ 15      │ '[3] 10000 characters' │ '14251.72'   │ '11033.58'    │ '0.77'         │
│ 16      │ '[4] 10 characters'    │ '7480761.01' │ '1552689.69'  │ '0.21'         │
│ 17      │ '[4] 100 characters'   │ '1185071.05' │ '321182.59'   │ '0.27'         │
│ 18      │ '[4] 1000 characters'  │ '127198.70'  │ '37784.60'    │ '0.30'         │
│ 19      │ '[4] 10000 characters' │ '12696.27'   │ '3834.24'     │ '0.30'         │
└─────────┴────────────────────────┴──────────────┴───────────────┴────────────────┘

@gtm-nayan
Contributor

Ahh, sorry, replacing the '<' first increases the length of the string 5x in the pathological case, so that compounds the slowdown.

\u2028 and \u2029 are less common in the wild, I think, so let's replace those before replacing <. With that, the [3] case actually performs better than the original.
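In other words, something like this reordering of the earlier snippet (sketch only):

// replace the rarer \u2028/\u2029 first so the earlier passes scan the shorter,
// unexpanded string; the potentially large '<' expansion then happens last
const escape_rare_first = (str) =>
	JSON.stringify(str)
		.replaceAll('\u2028', '\\u2028')
		.replaceAll('\u2029', '\\u2029')
		.replaceAll('<', '\\u003C');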

@gtm-nayan
Contributor

gtm-nayan commented Jun 29, 2024

We probably need to keep both the original function and this one around, because stringify_string is also used to escape object keys, which I expect would mostly fall into the 10-character category, which performs worse with the new impl.

Edit: or not, because object keys usually don't contain characters that need this escaping.

@sapphi-red
Author

> Ahh, sorry, replacing the '<' first increases the length of the string 5x in the pathological case, so that compounds the slowdown.
>
> \u2028 and \u2029 are less common in the wild, I think, so let's replace those before replacing <. With that, the [3] case actually performs better than the original.

Ah, I didn't think about that. Yeah, that improves the perf.

benchmark result
┌─────────┬────────────────────────┬───────────────┬───────────────┬────────────────┐
│ (index) │ Task Name              │ Original      │ New           │ New / Original │
├─────────┼────────────────────────┼───────────────┼───────────────┼────────────────┤
│ 0       │ '[0] 10 characters'    │ '10045823.99' │ '12289300.00' │ '1.22'         │
│ 1       │ '[0] 100 characters'   │ '2042596.00'  │ '6397283.44'  │ '3.13'         │
│ 2       │ '[0] 1000 characters'  │ '231196.29'   │ '1078311.14'  │ '4.66'         │
│ 3       │ '[0] 10000 characters' │ '23445.75'    │ '117611.91'   │ '5.02'         │
│ 4       │ '[1] 10 characters'    │ '8065342.39'  │ '2650562.94'  │ '0.33'         │
│ 5       │ '[1] 100 characters'   │ '1290377.74'  │ '1461026.25'  │ '1.13'         │
│ 6       │ '[1] 1000 characters'  │ '141711.57'   │ '341316.91'   │ '2.41'         │
│ 7       │ '[1] 10000 characters' │ '14363.25'    │ '41908.35'    │ '2.92'         │
│ 8       │ '[2] 10 characters'    │ '6922254.00'  │ '2668344.40'  │ '0.39'         │
│ 9       │ '[2] 100 characters'   │ '1168301.07'  │ '1228844.53'  │ '1.05'         │
│ 10      │ '[2] 1000 characters'  │ '128408.25'   │ '241498.79'   │ '1.88'         │
│ 11      │ '[2] 10000 characters' │ '12687.60'    │ '26932.30'    │ '2.12'         │
│ 12      │ '[3] 10 characters'    │ '8189258.36'  │ '2192068.68'  │ '0.27'         │
│ 13      │ '[3] 100 characters'   │ '1246113.00'  │ '773853.69'   │ '0.62'         │
│ 14      │ '[3] 1000 characters'  │ '136255.86'   │ '116258.54'   │ '0.85'         │
│ 15      │ '[3] 10000 characters' │ '13566.91'    │ '11997.17'    │ '0.88'         │
│ 16      │ '[4] 10 characters'    │ '7151321.71'  │ '1706394.00'  │ '0.24'         │
│ 17      │ '[4] 100 characters'   │ '1115232.66'  │ '342196.02'   │ '0.31'         │
│ 18      │ '[4] 1000 characters'  │ '116189.79'   │ '36811.93'    │ '0.32'         │
│ 19      │ '[4] 10000 characters' │ '12453.18'    │ '3995.10'     │ '0.32'         │
└─────────┴────────────────────────┴───────────────┴───────────────┴────────────────┘

> because object keys usually don't contain characters that need this escaping

I assume so.
