Open
Bug 1381492
Opened 7 years ago
Updated 2 years ago
Event.timeStamp resolution should have a minimum of 5 microseconds
Categories
(Core :: DOM: Events, defect, P3)
Tracking
UNCONFIRMED
People
(Reporter: majidvp, Unassigned)
Details
User Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/59.0.3071.115 Safari/537.36

Steps to reproduce:

Run this test on Firefox 54: https://w3c-test.org/dom/events/Event-timestamp-safe-resolution.html

Actual results:

The test fails.

Expected results:

The test should pass. The recent change to the DOM spec for high-resolution event timestamps [1] strongly recommends a minimum resolution of 5 microseconds, which matches the requirement for performance.now(). The test verifies this requirement.

[1] https://dom.spec.whatwg.org/#dom-event-timestamp
Updated•7 years ago
Component: Untriaged → DOM: Events
Product: Firefox → Core
Comment 1•7 years ago
My guess is our MouseEvent ctor takes more than 5μs. I'm not sure that's a reasonable thing for the test to assume (and it's certainly never going to hold in our automation).
Comment 2•7 years ago
I get ~925 mouse events per 1ms.
Comment 3•7 years ago
I mean when the events are created using JS.
Comment 4•7 years ago
Interesting. What do you get for:

```
let e1 = new MouseEvent('test1');
let e2 = new MouseEvent('test2');
console.log((e2.timeStamp - e1.timeStamp) * 1000);
```

On Windows I get > 7μs.
Comment 5•7 years ago
If I run that just once in the console, it may give for example 17, but running it in a loop gives something like 1:

```
var min = 1000;
for (var i = 0; i < 10000; ++i) {
  var e1 = new MouseEvent('test1');
  var e2 = new MouseEvent('test2');
  min = Math.min((e2.timeStamp - e1.timeStamp) * 1000, min);
}
console.log(min);
```
Comment 6•7 years ago
If I stick those results in an array, I get a pattern something like:

```
11.061732759117149
1.5802475390955806
2.370371308643371
1.1851856543216854
1.5802475245436653
1.5802475390955806
1.975309438421391
1.5802475390955806
1.5802475390955806
18.96297045459505
6.320990156382322
1.975309438421391
1.5802475390955806
1.5802475245436653
1.1851856543216854
...
```

That is, the first iteration is always considerably longer. If I try timing the individual calls using performance.now() like so:

```
let a = performance.now();
let e1 = new MouseEvent('test1');
let b = performance.now();
let e2 = new MouseEvent('test2');
let c = performance.now();
let e3 = new MouseEvent('test3');
let d = performance.now();
console.log((b - a) * 1000);
console.log((c - b) * 1000);
console.log((d - c) * 1000);
```

I see:

* 75~160 (usually ~80)
* 0~20 (usually 5)
* 0~50 (usually 10)

I don't really know how to understand that except that the first call appears to take significantly longer. If this test wants to check the resolution of Event.timeStamp, then for Gecko it should probably discard the first call to MouseEvent and measure the delta between the subsequent events.
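The "discard the first call" measurement described above could be sketched as follows. This is illustrative only, not code from this bug: the `makeEvent` parameter is an assumption introduced here so the logic can be exercised outside a browser (in Firefox you would pass `() => new MouseEvent('test')`).

```javascript
// Sketch: find the smallest positive timeStamp delta between two
// script-created events, after discarding the first (measurably slower)
// constructor call. `makeEvent` is injected for testability.
function minTimestampDeltaUs(makeEvent, iterations = 1000) {
  makeEvent(); // warm-up: the first constructor call takes longer
  let min = Infinity;
  for (let i = 0; i < iterations; i++) {
    const a = makeEvent().timeStamp;
    const b = makeEvent().timeStamp;
    const deltaUs = (b - a) * 1000; // timeStamp is in ms; convert to μs
    if (deltaUs > 0 && deltaUs < min) min = deltaUs; // ignore zero deltas
  }
  return min;
}

// In a browser console:
//   console.log(minTimestampDeltaUs(() => new MouseEvent('test')));
```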
Comment 7•7 years ago
On Windows Event.timeStamp is currently limited by the system timer resolution which is about 16ms.
Comment 9•7 years ago
(In reply to Kan-Ru Chen [:kanru] (UTC+8) from comment #7)
> On Windows Event.timeStamp is currently limited by the system timer
> resolution which is about 16ms.

This only applies to native events.

(In reply to Brian Birtles (:birtles) from comment #8)
> The numbers in comment 6 are from Windows.

I guess for events created by script it's more accurate.
Reporter
Comment 10•7 years ago
The test can definitely be improved. Here are a few things that we can improve:

- Don't assume that the minimum resolution is exactly a multiple of 5μs. Instead, verify that it is larger than that, which is what the spec recommends.
- The above means that we have to estimate the minimum resolution. Rather than taking one sample (which is subject to variation in the Event constructor cost), take many more and use an estimator to compute the minimum resolution from the samples.
  * I suspect Olli's suggestion in comment #5, i.e., taking the minimum diff (ignoring zero samples), can be an effective estimator.
  * Another option is to take the GCD of the samples, which I think is better, especially if our sampling has a fixed cost that is larger than the minimum resolution.

Here is some sample code that I put together: https://gist.github.com/majido/9b01aa248551072dc4c132930b245e9a

Running this code in FF, I get an accurate estimate of the minimum resolution (1 microsecond) using the GCD estimator. I think this is because the cost of calling the Event constructor is actually pretty high, so using the minimum is not that useful. If this makes sense, we can update the test to use the GCD estimator.
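A GCD-based estimator in the spirit of the gist linked above could look like the sketch below. This is not the gist's actual code; the function names and the conversion of millisecond timestamps to whole microseconds are illustrative assumptions.

```javascript
// Sketch of a GCD-based clock-resolution estimator (illustrative, not the
// gist's exact code). Deltas are rounded to whole microseconds so an
// integer GCD applies.
function gcd(a, b) {
  while (b !== 0) {
    [a, b] = [b, a % b];
  }
  return a;
}

// Estimate the resolution (in μs) from a list of millisecond timestamps,
// taking the GCD of the non-zero deltas between consecutive samples.
function estimateResolutionUs(timestampsMs) {
  const deltasUs = [];
  for (let i = 1; i < timestampsMs.length; i++) {
    const d = Math.round((timestampsMs[i] - timestampsMs[i - 1]) * 1000);
    if (d > 0) deltasUs.push(d); // ignore zero deltas
  }
  return deltasUs.reduce((acc, d) => gcd(acc, d));
}

// In a browser, samples could be collected from script-created events:
//   const ts = [];
//   for (let i = 0; i < 1000; i++) ts.push(new MouseEvent('t').timeStamp);
//   console.log(estimateResolutionUs(ts));
```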
Comment 11•7 years ago
Olli, does the test from comment 10 make more sense to you?
Flags: needinfo?(bugs)
Comment 12•7 years ago
Not sure what there is to make sense of :) To fix this bug, I assume we might want to limit the accuracy for JS callers only. So the .webidl could have a BinaryName for timeStamp and we would limit the accuracy there. One should take a look at how we limit performance.now() accuracy.
Flags: needinfo?(bugs)
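The clamping comment 12 alludes to, limiting accuracy in the spirit of performance.now(), could be sketched like this. The `RESOLUTION_US` constant and the floor-based rounding are assumptions for illustration; this is not Gecko's actual implementation.

```javascript
// Sketch: clamp a millisecond timestamp to a coarser resolution, similar
// in spirit to limiting performance.now() accuracy. Illustrative only.
const RESOLUTION_US = 5; // assumed minimum resolution in microseconds

function clampTimestampMs(rawMs) {
  const resolutionMs = RESOLUTION_US / 1000;
  // Round down to the nearest multiple of the resolution.
  return Math.floor(rawMs / resolutionMs) * resolutionMs;
}
```

With this scheme, any two timestamps closer together than the resolution clamp to the same value, so script can no longer observe deltas finer than 5μs.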
Updated•7 years ago
Priority: -- → P3
Reporter
Comment 13•7 years ago
FYI, I sent a WPT PR [1] to improve the test based on the ideas listed in comment #10. birtles@, smaug@, I would appreciate it if either of you could take a look and review.

[1] https://github.com/w3c/web-platform-tests/pull/7449
Updated•2 years ago
Severity: normal → S3