Closed Bug 970177 Opened 10 years ago Closed 10 years ago

Share pool of mark stacks between runtimes


(Core :: JavaScript: GC, defect)






(Reporter: mccr8, Unassigned)



(Whiteboard: [MemShrink:P2])

If we turn on incremental GC for DOM workers, then we're going to take a hit of about 220 KB of memory per worker for the GC mark stack.  There aren't a huge number of workers right now, but if ongoing work to reduce DOM worker memory usage is successful, this hit will grow, because people will use more DOM workers.

Maybe this is a terrible idea, but it seems to me that the various runtimes could share a pool of GC mark stacks: hand one out when a runtime decides to start a GC, and release it back to the pool when that GC finishes.

This trades worst-case pauses for improved memory usage: in the worst case, all k runtimes want to run an IGC at the same time, which can't be done without k IGC-sized mark stacks.

Some ideas for how you could save some memory:
- Split a pool of size k between IGC and non-IGC stacks.  This way, the first so many runtimes that start a GC get an IGC stack, and if things start piling up, then later runtimes end up with non-IGC stacks.  They get more jank, but can still run.
- Make the pool smaller than k.  If a runtime wants to start a GC, you have to wait until something is back in the pool.  Timer-based GCs are fairly arbitrary, so they could be pushed back a bit, or in the worst case, you could suspend the worker until an extant GC finishes marking, which would (loosely speaking) double the pause time.

You'd also need some kind of priority or reservation system so that, e.g., the main thread is guaranteed to always have an IGC-sized stack available.

Anyways, this sounds a bit complex, but it might be worth thinking about if we end up with a ton of DOM workers, or want to run IGC on workers.
Whiteboard: [MemShrink]
Whiteboard: [MemShrink] → [MemShrink:P2]
This is probably more trouble than it is worth.
Closed: 10 years ago
Resolution: --- → WONTFIX