Bug 1887997 (Open) · Opened 1 year ago · Updated 11 months ago

URL.searchParams performance issue

Categories

(Core :: DOM: Networking, enhancement, P3)

Performance Impact low

People

(Reporter: MattIPv4, Unassigned)

References

(Blocks 1 open bug)

Details

(Whiteboard: [necko-triaged])

Attachments

(1 file)

Steps to reproduce:

👋 It seems that Firefox suffers from the same performance issue that I fixed in Node.js (https://github.com/nodejs/node/pull/51520), where repeated writes to URL.searchParams trigger URL to re-parse the params on every write, leading to a performance bottleneck.

// Will lock up the browser
const urlParams = new URL('https://mozilla.org').searchParams;
for (let i = 0; i < 100_000; i++) urlParams.append('test', i.toString());

// Will run without issue
const plainParams = new URLSearchParams();
for (let i = 0; i < 100_000; i++) plainParams.append('test', i.toString());

I suspect a patch similar to what I landed in Node.js, where URL is lazily updated if searchParams has changed the next time a getter is called, rather than immediately updating URL when searchParams changes, would fix it.

Actual results:

Browser locks up when writing to URL.searchParams repeatedly in large volumes.

Expected results:

There should be no substantial performance overhead for using URL.searchParams compared to URLSearchParams.

Can repro the slowdown: https://share.firefox.dev/3xbZ5pz
But even Chrome is slow here.

Status: UNCONFIRMED → NEW
Ever confirmed: true
Attached file: OP's testcase.html

Yeah, all major JS runtimes seem to suffer from the same issue, as this implementation is what most folks land on based on the wording of the spec. The fix I landed in Node.js seems to be spec-compliant still, just a bit more convoluted in how the updates propagate to avoid the performance penalty. I've filed https://issues.chromium.org/issues/331406951 + https://bugs.webkit.org/show_bug.cgi?id=271735 for the same.

This has nothing to do with JS runtimes. All the time is spent inside the native implementation.

Now, a question is, why would anyone add so many params? The testcase feels like a rather artificial micro-benchmark.
Are there use cases when thousands of entries are needed?

Severity: -- → S3

I ran into the slowdown in the wild while investigating a bug report on one of our sites, which was generating a URL with a couple thousand query params. Investigating further, I realised there was this substantial performance difference between URL.searchParams and URLSearchParams. I agree there isn't a "proper" test case for this -- when I submitted the patch to Node.js, I just added a benchmark comparing the performance of URL.searchParams and URLSearchParams at 1k and 1m query params. This is definitely more a performance enhancement than a "bug" per se.
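For reference, a comparison along the lines of the one described above can be sketched as a small micro-benchmark. This is an illustrative sketch, not the actual benchmark file landed in Node.js; the function name and param count are made up here.

```javascript
// Illustrative micro-benchmark: time N appends via URL.searchParams
// versus a standalone URLSearchParams. Uses the global `performance`
// available in browsers and modern Node.js.
function timeAppends(makeParams, n) {
  const params = makeParams();
  const start = performance.now();
  for (let i = 0; i < n; i++) params.append('test', String(i));
  return performance.now() - start;
}

const N = 1_000; // bump to 100_000 to see the lock-up described above
const viaUrl = timeAppends(() => new URL('https://mozilla.org').searchParams, N);
const standalone = timeAppends(() => new URLSearchParams(), N);
console.log(`URL.searchParams: ${viaUrl.toFixed(1)} ms, ` +
            `URLSearchParams: ${standalone.toFixed(1)} ms`);
```

On affected engines the first number grows roughly quadratically with N, since each append re-serializes the whole query string, while the second stays near-linear.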

Component: DOM: Core & HTML → DOM: Networking
Flags: needinfo?(smayya)
Severity: S3 → N/A
Type: defect → enhancement
Flags: needinfo?(smayya)
Priority: -- → P3
Whiteboard: [necko-triaged][necko-priority-review]

I think the problem is that for URLSearchParams we just append a new entry in the array, but for URL('https://mozilla.org').searchParams we also need to update the URL href - which tends to reallocate a large string every time.

Valentin, yes indeed, every time URL.searchParams is updated, a stringified version of the params is sent back to the URL, which is then re-parsed, as I understand it. The fix I made in Node.js was to set a flag on URL indicating that URL.searchParams had updated, and then on the next read/write to URL itself, the stringified params from URL.searchParams would be fetched and parsed. That means the stringification/parsing only happens once for a set of URL.searchParams changes, rather than for each change.
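The dirty-flag approach described above can be sketched in plain JavaScript. This is a simplified model with assumed names (LazyURL, _dirty), not the actual Node.js or Gecko implementation: mutations to the params only set a flag, and the href is re-serialized at most once, on the next read.

```javascript
// Sketch of lazy propagation from searchParams back to the URL:
// mutating methods mark the URL dirty instead of writing the
// serialized query string back on every call.
class LazyURL {
  constructor(href) {
    this._url = new URL(href);
    this._params = new URLSearchParams(this._url.search);
    this._dirty = false;
    const self = this;
    // Wrap the params so mutating methods set the dirty flag.
    this._proxy = new Proxy(this._params, {
      get(target, prop) {
        const value = target[prop];
        if (typeof value !== 'function') return value;
        return (...args) => {
          const result = value.apply(target, args);
          if (prop === 'append' || prop === 'set' ||
              prop === 'delete' || prop === 'sort') {
            self._dirty = true;
          }
          return result;
        };
      },
    });
  }

  get searchParams() { return this._proxy; }

  get href() {
    if (this._dirty) {
      // One serialization for the whole batch of mutations.
      this._url.search = this._params.toString();
      this._dirty = false;
    }
    return this._url.href;
  }
}

// Usage: many appends now touch the URL string only once, on read.
const u = new LazyURL('https://mozilla.org/?a=1');
for (let i = 0; i < 1_000; i++) u.searchParams.append('test', String(i));
console.log(u.href.length);
```

The key invariant is that href is always correct when observed, so the lazy flush stays invisible to callers; the real fix also has to cover reads of search, toString(), and the other URL getters.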

Whiteboard: [necko-triaged][necko-priority-review] → [necko-triaged]

Perf impact is low, since this isn't hit often (probably very rarely).

Performance Impact: --- → low