Closed
Bug 1366803
Opened 7 years ago
Closed 7 years ago
Many cleanup tasks can run before a setTimeout(0) handler that is called when a refresh driver tick is about to run
Categories: Core :: DOM: Core & HTML, enhancement, P1
RESOLVED
FIXED
Performance Impact: high
People
(Reporter: ehsan.akhgari, Unassigned)
References
(Blocks 1 open bug)
Details
This is copied from a private email thread with smaug and jandem. Jan asked us to look into the following:

<quote>
you can try this: http://sm.duct.io/InteractiveRunner.html?suite=Vanilla
Look at the "Async:" numbers it reports. If v2 matches v1, this is time between a setTimeout call and executing the timeout code. In general Chrome seems to get better numbers for this. It would be interesting to understand better what we're doing in this time (restyle? layout? painting?), and if we could delay it somehow.
</quote>

And here is the result of my investigation. First, the terminology: the benchmark runs a workload and measures how long that takes (the "sync" time), then calls setTimeout(0) and, in its callback, measures how long the handler took to run (the "async" time). It is trying to measure the cost of the async work the browser does on behalf of the script it runs. In this window we typically run one refresh driver tick, which I think is the main intention behind the measurement. But here is the overhead we currently run before the setTimeout(0) handler. (In the profile views, the black box markers labeled UserTiming correspond to the measured async time in the benchmark.) Note that the timings are from my fast Linux development laptop.

* CCTimerFired can run in this period, see https://perfht.ml/2qFMy9d (7.9ms), https://perfht.ml/2qFXBPF (8ms)
* AsyncFreeSnowWhite::Run can run in this period, see https://perfht.ml/2qFSC1q (3.7ms), https://perfht.ml/2qFBJno (1.7ms), 60ms overall https://perfht.ml/2qFSY8g
* Small ContentUnbinder::Run waste spread around, https://perfht.ml/2qFYzLN, 21ms overall

This is all work that we should be deprioritizing when there is an active timeout pending, IMO. However, I don't know whether we can defer the refresh driver tick itself, since the scripts are causing DOM modifications...
What do you think, Olli?

Olli's response: In general we have tried to deprioritize CC stuff several times over the years and that has led to OOMs. ContentUnbinder isn't any waste; it moves otherwise synchronous DOM tree destruction to happen asynchronously. The plan is to move this stuff to use idle dispatch with a timeout. Unfortunately that may not help when running tests, since there aren't necessarily any idle periods. So I think we also need to ensure more of this stuff happens right after a tick when vsync is active.
Reporter
Comment 1•7 years ago
(In reply to :Ehsan Akhgari (super long backlog, slow to respond) from comment #0)
> The plan is to move this stuff to use idle dispatch with timeout.
> Unfortunately that may not help when running tests, since there aren't
> necessarily any idle periods.
> So, I think we need to also ensure more of this stuff happens right after a
> tick when vsync is active.

Looking at the profiles I have collected, I would say the current issue is *exactly* that these things run after a refresh driver tick (and sometimes before it, in the case of AsyncFreeSnowWhite::Run at least) when there is a pending timeout handler, so I don't think doing that more intentionally is going to improve anything...
Comment 2•7 years ago
Can we make AsyncFreeSnowWhite run for a fixed time per refresh driver tick? Say, spend up to 500us or 1ms deleting things before we force it to yield. If we detect that the number of CC'able things to be deleted exceeds a threshold, then kick off a full AsyncFreeSnowWhite runnable.
Comment 3•7 years ago
Another idea: I could make TimeoutManager hold on to the first N Timeout objects it allocates and recycle them for later setTimeout calls. This would avoid the AsyncFreeSnowWhite costs since we wouldn't actually be deleting anything. Of course, keeping these CC'able objects around could raise general CC load? How many Timeout objects are getting collected per AsyncFreeSnowWhite here?
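The recycling idea above is a classic object pool. A minimal sketch, assuming a hypothetical `TimeoutPool` (Gecko's actual Timeout is a C++ class; this only illustrates the proposed keep-first-N-and-reuse scheme):

```javascript
class TimeoutPool {
  constructor(max) {
    this.max = max;   // keep at most this many recycled objects
    this.free = [];
  }
  acquire(callback, delay) {
    // Reuse a recycled object if one is available, else allocate.
    const t = this.free.pop() || { callback: null, delay: 0 };
    t.callback = callback;
    t.delay = delay;
    return t;
  }
  release(t) {
    t.callback = null; // drop references so the pooled object holds nothing alive
    if (this.free.length < this.max) this.free.push(t);
  }
}

const pool = new TimeoutPool(8);
const t = pool.acquire(() => {}, 0);
pool.release(t); // t will be handed out again by the next acquire()
```

As the follow-up comments note, this only helps if Timeout objects dominate the deferred-deletion queue, which turned out not to be the case.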
Updated•7 years ago
Whiteboard: [qf]
Comment 4•7 years ago
(In reply to Ben Kelly [reviewing, but slowly][:bkelly] from comment #3)
> Another idea:
>
> I could make TimeoutManager hold on to the first N Timeout objects it
> allocates and recycle them for later setTimeout calls. This would avoid the
> AsyncFreeSnowWhite costs since we wouldn't actually be deleting anything.
> Of course, keeping these CC'able objects around could raise general CC load?

Sorry, this probably doesn't make sense. It's likely that there are many different kinds of CC objects in the AsyncFreeSnowWhite runnable, so doing a Timeout pool would not help.
Reporter
Comment 5•7 years ago
(In reply to Ben Kelly [reviewing, but slowly][:bkelly] from comment #3)
> Another idea:
>
> I could make TimeoutManager hold on to the first N Timeout objects it
> allocates and recycle them for later setTimeout calls. This would avoid the
> AsyncFreeSnowWhite costs since we wouldn't actually be deleting anything.
> Of course, keeping these CC'able objects around could raise general CC load?
>
> How many Timeout objects are getting collected per AsyncFreeSnowWhite here?

Note that TimeoutManager isn't necessarily relevant here. The task we may be about to run may just as well be an event handler, a postMessage handler, etc., so the solution shouldn't need to be specific to timeouts, IMO. It just happens that Speedometer tests this using a setTimeout(0). Also, AsyncFreeSnowWhite isn't just freeing Timeout objects here; as you can see in the profile links in comment 0, it is responsible for freeing many different kinds of DOM objects and various things freed by the cycle collector.

(In reply to Ben Kelly [reviewing, but slowly][:bkelly] from comment #2)
> Can we make AsyncFreeSnowWhite run for a fixed time per refresh driver?
> Say, spend up to 500us or 1ms deleting things before we force it to yield.
> If we detect CC'able things to be deleted exceeds a threshold, then kick up
> a full AsyncFreeSnowWhite runnable.

That's one way to reduce this problem, yes, but see Olli's other idea about using idle dispatch with a timeout. Neither of these ideas is going to help that much with Speedometer, though, because I think the ideal solution there would be to not schedule any cleanup work at all while there is pending work in our event queue, unless we are running the risk of OOMing. Olli's concern that deferring this work indefinitely causes OOMs is valid, but our scheduling right now isn't very aware of the amount of available memory or how likely we actually are to OOM, so we tend to treat all cases the same...
Comment 6•7 years ago
I was thinking about this a bit more. If we make AsyncFreeSnowWhite yield to the main thread after a time threshold it would be effective during the test and also complete ASAP when idle. Free objects for up to 500us and then dispatch to current thread to continue. Yielding seems like the best de-prioritization mechanism here to me.
Reporter
Comment 7•7 years ago
But this comes directly from nsCycleCollector_doDeferredDeletion(), so we could just as easily do something better from the get-go; otherwise we'd always be 500us slower here. The point to note is that after significant DOM modifications this work takes a few milliseconds even on a super fast machine, so yielding every few hundred microseconds means a guaranteed slowdown of that amount for any significant DOM manipulation that causes a lot of DOM nodes to be freed through the cycle collector. I think it is better to aim for a solution that allows us to run setTimeout(0) handlers without having to run any cleanup code at all, if we can afford the memory cost on the machine.
Comment 8•7 years ago
> yielding for any few microseconds

What do you mean by that? I meant making the code do `NS_DispatchToCurrentThread(this)` to let other work execute if it exceeds a threshold. If we are idle this should add negligible time to completion. If we are busy then this gracefully de-prioritizes the cleanup without wasting CPU.

(In reply to :Ehsan Akhgari (super long backlog, slow to respond) from comment #7)
> that allows us to run setTimeout(0) handlers without having to run any
> cleanup code if we can afford the memory cost on the machine, if possible.

I guess I'd have to see more details of what you mean, but I'm suspicious of doing anything that increases their priority. I've done a lot of work to make timeouts yield to other things that need to do work. (Exactly as I have described above.)
Comment 9•7 years ago
It already has the continuation logic there, AFAICT. It just needs a time budget threshold.
Comment 10•7 years ago
To clarify some of my concern here: I don't want to special-case setTimeout(f, 0), since that ignores the other async mechanisms sites use. Some sites use self postMessage() to avoid setTimeout(f, 0)'s 4ms clamping. Other work like IDB callbacks (which are IPC runnables), etc., should also be considered higher priority than background cleanup. I'd really like to see a solution that lowers the cleanup priority without special-casing what is considered higher priority. Just my 2 cents.
Comment 11•7 years ago
I'm getting a bit lost with this bug, but anyhow... the main issue here is knowing when we can postpone cleanups. Speedometer runs for a long time and creates tons of garbage. If we have some idle periods there, we should definitely use those to do cleanups. Since Speedometer happens to load new pages all the time, in practice there seem to be idle periods while a new page is loading.
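The "idle dispatch with a timeout" scheme discussed throughout this bug has the same shape as requestIdleCallback(cb, { timeout }): run cleanup when idle, but no later than the deadline so memory can't pile up forever. A rough Node sketch under stated assumptions -- `dispatchWhenIdleOrAfter` is a hypothetical name, and setImmediate is only a crude proxy for a real idle period, which Gecko's idle scheduler detects properly:

```javascript
function dispatchWhenIdleOrAfter(cleanup, timeoutMs) {
  let done = false;
  const run = (reason) => {
    if (done) return; // whichever path fires first wins; run exactly once
    done = true;
    cleanup(reason);
  };
  // Crude "idle" proxy: fires after the current turn of the event loop.
  setImmediate(() => run("idle"));
  // Hard deadline: if no idle period shows up (as when running benchmarks
  // back to back), force the cleanup anyway.
  setTimeout(() => run("timeout"), timeoutMs).unref();
}

dispatchWhenIdleOrAfter((reason) => console.log(`cleanup ran (${reason})`), 1000);
```

The timeout arm is exactly why cleanup still appears inside async periods in comment 16: when the test never goes idle, the deadline fires during measured time.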
Comment 12•7 years ago
Using idle dispatch definitely sounds like the best place to start to me. Even if it doesn't move the needle on Speedometer it would probably be a great improvement for the real world.
Reporter
Comment 13•7 years ago
(In reply to Ben Kelly [reviewing, but slowly][:bkelly] from comment #12)
> Using idle dispatch definitely sounds like the best place to start to me.
> Even if it doesn't move the needle on Speedometer it would probably be a
> great improvement for the real world.

Good! That would give the semantics I was describing in comment 7 for the most part, I think, without needing to special-case anything for setTimeout(0) (which wasn't what I was suggesting).
Updated•7 years ago
Priority: -- → P1
Updated•7 years ago
Whiteboard: [qf] → [qf][QRC_Analyzed]
Updated•7 years ago
Whiteboard: [qf][QRC_Analyzed] → [qf:p1][QRC_Analyzed]
Comment 14•7 years ago
This needs to be re-profiled
Comment 15•7 years ago
(In reply to Olli Pettay [:smaug] from comment #14)
> This needs to be re-profiled

Moving from [qf:p1] to [QRC_NeedAnalysis] per comment 14.
Whiteboard: [qf:p1][QRC_Analyzed] → [QRC][QRC_NeedAnalysis]
Reporter
Comment 16•7 years ago
I remeasured the VanillaJS suite after the recent improvements that have landed. Here is a profile: http://bit.ly/2sIliIY

First things first, the async times are a lot more stable now; see these 20 runs for example (all times in ms, rounded to two decimals):

Run  Adding100Items    CompletingAllItems  DeletingAllItems   Total
     sync / async      sync / async        sync / async
 1   117.17 / 14.56    49.94 /  4.63       42.01 /  4.20      232.51
 2   123.01 / 13.27    48.93 /  5.11       40.43 /  4.18      234.94
 3   123.37 / 15.88    50.63 /  5.04       41.91 /  4.52      241.35
 4   122.76 / 14.02    49.08 /  5.38       41.60 /  5.62      238.46
 5   118.81 / 11.56    47.29 /  5.10       44.45 /  4.53      231.74
 6   125.99 / 17.90    49.14 /  4.70       41.34 /  4.76      243.83
 7   127.22 / 18.37    49.65 /  5.65       43.22 /  4.20      248.31
 8   121.61 / 12.36    52.73 /  4.12       41.11 /  4.43      236.36
 9   124.61 / 15.60    47.87 /  5.87       51.50 /  4.23      249.69
10   118.45 / 15.82    45.40 /  4.83       39.66 /  4.21      228.36
11   140.05 / 16.12    47.94 /  4.97       41.64 /  4.90      255.61
12   125.06 / 12.03    50.86 /  5.07       43.23 /  6.24      242.48
13   122.08 / 16.72    48.63 /  5.17       39.71 /  3.97      236.28
14   118.40 / 13.46    45.76 /  5.30       40.53 /  4.59      228.03
15   120.02 / 12.59    46.26 /  5.44       38.15 /  4.17      226.64
16   121.08 / 12.22    46.03 /  5.09       38.97 /  4.17      227.55
17   122.68 / 15.65    48.34 /  5.23       41.63 /  4.18      237.71
18   126.28 / 11.91    48.65 /  5.14       40.40 /  4.57      236.94
19   150.63 / 12.63    55.92 /  5.64       47.19 /  4.38      276.38
20   118.01 / 16.69    48.64 /  6.53       40.94 /  4.31      235.12

In the majority of the runs, the async times now purely measure a refresh driver tick, as we were hoping to achieve once the dependency list gets fixed.
If you look at the profile, at the times corresponding to the longer-than-usual runs of the Adding100Items async test (for example this one <http://bit.ly/2sI20mW>, which corresponds to the 6th run of the subtest in this profile), we get unlucky and a CC slice is forcefully run through <https://searchfox.org/mozilla-central/rev/b2d072aa5847996b8276bd8d7b150e0ea6bf6283/dom/base/nsJSEnvironment.cpp#278>. Next I will look at some other subtests.
Reporter
Comment 17•7 years ago
(In reply to Chris Peterson [:cpeterson] from comment #15)
> (In reply to Olli Pettay [:smaug] from comment #14)
> > This needs to be re-profiled
>
> Moving from [qf:p1] to [QRC_NeedAnalysis] as per Ehsan's comment 14.

I think we've got this one! :-)
Whiteboard: [QRC][QRC_NeedAnalysis] → [qf:p1]
Comment 18•7 years ago
(In reply to :Ehsan Akhgari (needinfo please, extremely long backlog) from comment #16)
> I remeasured the VanillaJS suite after the recent improvements that have
> landed.
>
> First things first, the async times are a lot more stable now [...]
[...]
> I think we've got this one! :-)

It sounds like this ^^ assessment was based on local results for the VanillaJS suite -- but sadly, the AWFY graph for "VanillaJS-TodoMVC-DeletingAllItems-async" is still pretty noisy (and I'm guessing cycle collection might be to blame):
https://arewefastyet.com/#machine=36&view=single&suite=speedometer-misc&subtest=VanillaJS-TodoMVC-DeletingAllItems-async

Should we leave this bug open until that's ^^ been investigated/addressed?
Flags: needinfo?(ehsan)
Reporter
Comment 19•7 years ago
(In reply to Daniel Holbert [:dholbert] from comment #18)
> It sounds like this ^^ assessment was based on local results for the
> VanillaJS suite -- but sadly, the AWFY graph for
> "VanillaJS-TodoMVC-DeletingAllItems-async" is still pretty noisy (and I'm
> guessing cycle collection might be to blame):
> https://arewefastyet.com/#machine=36&view=single&suite=speedometer-misc&subtest=VanillaJS-TodoMVC-DeletingAllItems-async

Not quite sure I understand your question, but there has been lots of profiling on Speedometer and a lot of investigation into the async test times, and not all of it has been reflected in this one bug. Here is a summary of what we know currently runs inside the async period:

* The refresh driver tick. This is by far the biggest chunk of the async test times at the moment, and I believe this is generally true for all of the subtests now (although some subtests may show slightly different behavior than others -- but please don't assume the investigations have been specific to VanillaJS; that's just where I started!)
* Some GC/CC runnables, including forgetSkippable. These run off of the idle queue with a timeout, so if we don't get to an idle period before the timeout, we may run them inside an async period.
* Some timers scheduled by the tests themselves (for example, bug 1373723 comment 7, or bug 1376698 that you noticed).
* A bunch of tiny time slices (typically 1 or 2 samples when profiling at the 0.1ms frequency) in various misc runnables that I can't remember off the top of my head and have been ignoring for now, since the previous categories dominate our times.

Every now and then I come across other things (example: bug 1373857), but so far it seems like there really isn't much, if anything, happening in the async times other than these main categories.

> Should we leave this bug open until that's ^^ been investigated/addressed?

Well, it is unclear whether we can improve the GC/CC runnable issue further. We only have so much idle time when running the full Speedometer suite, and I guess leaving this bug open isn't going to mean we'll do something new in that domain. About the refresh driver tick, we've asked for help from the Layout and Graphics teams, as you know. About the timeouts that do run in those times: that gets partly looked at as part of the "usual" Speedometer profiling of things invoked under JS, and perhaps it deserves some specific attention, although I'm not sure why we would treat JS code running in async tests any more specially than JS code running under sync tests.

At any rate, since it seems like all of the benefit this bug was filed to reap has been reaped, I'm going to close it. (BTW, another point about the test times reported on AWFY: they can be *extremely* noisy, so don't read anything whatsoever into the variance of the individual numbers compared to each other. See bug 1369587 for an example of how this has confused me in the past, and those were the "reliable" sync times!)
Flags: needinfo?(ehsan)
Reporter
Updated•7 years ago
Status: NEW → RESOLVED
Closed: 7 years ago
Resolution: --- → FIXED
Comment 20•7 years ago
OK, thanks for the detailed response!

> Not quite sure if I understand your question.

Basically, I was skimming AWFY graphs for the Speedometer "async" subtests, and I noticed that the VanillaJS one I linked was especially noisy (with the noise not involving Layout/Graphics, AFAICT), and I was wondering if I should file a new bug for it. I ran across this bug when searching for existing VanillaJS suite bugs, and noticed that my noisiness observation seemed to superficially contradict your "times... a lot more stable now" assessment in comment 16.

Anyway -- the graph has actually gotten a bit less noisy since I posted comment 18 here, and if that's a fluke and it remains/returns to being noisy, I'll file a new bug for that.
Reporter
Comment 21•7 years ago
(In reply to Daniel Holbert [:dholbert] from comment #20)
> Basically, I was skimming AWFY graphs for the Speedometer "async" subtests,
> and I noticed that the VanillaJS one I linked was especially noisy (with
> the noise not involving Layout/Graphics, AFAICT), and I was wondering if I
> should file a new bug for it. I ran across this bug when searching for
> existing VanillaJS suite bugs, and noticed that my noisiness observation
> seemed to superficially contradict your "times... a lot more stable now"
> assessment in comment 16.

I have done zero investigation into what's happening on AWFY, FWIW. Mine has all happened locally (and last time, a few weeks ago, FWIW.)

> Anyway -- the graph has actually gotten a bit less noisy since I posted
> comment 18 here, and if that's a fluke and it remains/returns to being
> noisy, I'll file a new bug for that.

Sure. One thing to note is that in any single async run, all it takes is one GC or CC run for it to become "noisy". Another thing to note is that AWFY machines produce noisy numbers even in sync tests, so the amount of noise observed on AWFY deserves only so much attention IMO (YMMV); see bug 1369587. So far I just haven't gained any value out of investigating noise seen on AWFY that I haven't been able to reproduce locally. But of course don't let me stop you there. :-)
Assignee
Updated•6 years ago
Component: DOM → DOM: Core & HTML
Updated•3 years ago
Performance Impact: --- → P1
Whiteboard: [qf:p1]