Duration values of event timing entries should have 8-millisecond granularity
Categories
(Core :: DOM: Performance APIs, defect)
Tracking
Release | Tracking | Status
---|---|---
firefox142 | --- | fixed
People
(Reporter: canova, Assigned: canova)
References
(Blocks 1 open bug)
Details
Attachments
(1 file)
In the Event Timing API spec, we have this:

> The duration has an 8 millisecond granularity (it is computed as such by performing rounding). Thus, a high resolution timer cannot be produced from these timestamps. However, it does introduce new information that is not readily available to web developers: the time pixels draw after an event has been processed. We do not find security or privacy concerns on exposing the timestamp, especially given its granularity. In an effort to expose the minimal amount of new information that is useful, we decided to pick 8 milliseconds as the granularity. This allows relatively precise timing even for 120Hz displays.

So the duration has to be rounded to the nearest 8 ms. But currently we have a different mechanism: we call nsRFPService::ReduceTimePrecisionAsMSecs when we return the duration attribute. It looks like we are already reducing the precision, but I'm not 100% sure what that function does.

On the other hand, there is a WPT test that's failing because of this:

/event-timing/duration-with-target-low.html

Currently it's failing because the event takes 300 ms and we return 301, while the test expects at least 304, which is the value rounded to the nearest multiple of 8 ms. Fixing that rounding issue would fix this test.
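To make the intended behavior concrete, here is a minimal sketch of the rounding the spec describes (a hypothetical helper, not the actual Gecko patch); with round-half-away-from-zero, both 300 and 301 land on 304:

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical helper, not the actual Gecko code: round a duration in
// milliseconds to the nearest multiple of 8 ms, as the spec text describes.
static double RoundTo8msGranularity(double aDurationMs) {
  // std::round rounds halfway cases away from zero, so 300 / 8 = 37.5
  // becomes 38, i.e. 300 ms -> 304 ms, matching the test's expectation.
  return std::round(aDurationMs / 8.0) * 8.0;
}

int main() {
  std::printf("%.0f\n", RoundTo8msGranularity(300.0)); // 304
  std::printf("%.0f\n", RoundTo8msGranularity(301.0)); // 304
  std::printf("%.0f\n", RoundTo8msGranularity(295.0)); // 296
}
```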
Assignee
Comment 1•3 months ago
@sefeng, do you remember what the nsRFPService::ReduceTimePrecisionAsMSecs function does? And what do you think about adding this rounding to the duration overall? I think it would help us match Chrome's behavior, since Chrome is already doing this.
Comment 2•3 months ago
Hmm, I don't think we should change this. We use ReduceTimePrecisionAsMSecs to guard all our timestamps for Performance APIs; it provides less granularity than what the spec specifies, and I think this is better for security/privacy. We should adjust the test to make it accept less granularity.

However, for this test... I don't think I understand why an event takes 300ms but the test expects 304ms? 301 sounds more correct to me.
Assignee
Comment 3•3 months ago
I guess we can use both ReduceTimePrecisionAsMSecs and rounding at the same time, no?

FWIW, I still don't know how ReduceTimePrecisionAsMSecs works; I should just put some printfs before and after so I can see how it affects the numbers.

Rounding is pretty simple compared to that one, and it sounds like an okay change: it essentially reduces the duration to a single refresh driver tick (8 ms for 120 Hz and 16 ms for 60 Hz).

> However, for this test... I don't think I understand why an event takes 300ms but the test expects 304ms? 301 sounds more correct to me.

Per spec, every duration should be rounded to the nearest 8 ms, i.e. to a multiple of 8. So if an event duration is 300 or 301, it rolls up to 304, which is the closest number divisible by 8. Without checking it fully, I don't 100% know whether ReduceTimePrecisionAsMSecs gives less granularity. From a quick first look, I think it provides more granularity. Do you know if this is true?
Comment 4•3 months ago
> I don't 100% know if ReduceTimePrecisionAsMSecs gives less granularity. From a quick first look, I think it provides more granularity. Do you know if this is true?

Hmm... you are right, I was thinking about some other timestamps... sorry!

I am a bit confused here. The spec says "Set timingEntry’s duration to a DOMHighResTimeStamp resulting from renderingTimestamp - start, with granularity of 8ms or less.", so it's a granularity of 8 ms or less; I think what we have is allowed here?

https://github.com/w3c/event-timing/issues/68 looks to be where this was discussed, and it sounds like more granularity (a.k.a. less than 8 ms) is better? Am I not understanding the English here...
Assignee
Comment 5•3 months ago
Hmm, after reading that issue I think "less granularity" here means "a larger value than 8ms", so the minimum granularity is 8 ms. That's why I think some people suggested 16 ms for 60 Hz. I didn't reach the "it sounds like more granularity (a.k.a. less than 8ms) is better?" conclusion. But I agree that it's difficult to understand.
Comment 6•3 months ago
Yeah, okay... I think ReduceTimePrecisionAsMSecs plus the rounding sounds good to me.

I think what ReduceTimePrecisionAsMSecs would do is: given a timestamp like 23.1234567, it would truncate it to 23 or 24 or whatever value, based on some factors.
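A minimal sketch of that two-step pipeline, assuming the truncation behavior described above; ReducePrecision here is an illustrative stand-in, not the real nsRFPService::ReduceTimePrecisionAsMSecs signature or clamping logic:

```cpp
#include <cmath>

// Illustrative stand-in for nsRFPService::ReduceTimePrecisionAsMSecs; the
// real function's signature and rules differ, but per the comment above it
// truncates a timestamp to some resolution (1 ms assumed here).
static double ReducePrecision(double aMs, double aResolutionMs = 1.0) {
  return std::floor(aMs / aResolutionMs) * aResolutionMs;
}

// Hypothetical combined pipeline: reduce precision first, then round to the
// spec's 8 ms granularity so the exposed duration is a multiple of 8.
static double ExposedDuration(double aRawDurationMs) {
  double reduced = ReducePrecision(aRawDurationMs); // e.g. 23.1234567 -> 23.0
  return std::round(reduced / 8.0) * 8.0;           // 23.0 -> 24.0
}
```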
Comment 7•2 months ago
The severity field is not set for this bug.
:bas.schouten, could you have a look please?
For more information, please visit BugBot documentation.
Updated•2 months ago
Assignee
Comment 8•1 month ago
It looks like even after this change I get some intermittent failures on the previously perma-failing test. From my testing, this is because our ReduceTimePrecisionAsMSecs function sometimes returns lower values.
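For illustration (hypothetical numbers, not from an actual test run, assuming the nearest-multiple-of-8 rounding above): a reduced duration that slips just below 300 ms rounds a full tick lower than the true duration would.

```cpp
// Hypothetical numbers:
//   true duration    300.4 ms -> round(300.4 / 8) * 8 = 38 * 8 = 304  (passes)
//   reduced duration 299.6 ms -> round(299.6 / 8) * 8 = 37 * 8 = 296  (< 304, fails)
```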
Updated•1 month ago
Comment 10•1 month ago
Comment 11•1 month ago
bugherder
Comment 12•1 month ago
bugherder
Updated•21 days ago