Closed Bug 877557 Opened 11 years ago Closed 6 years ago

[B2G] Limit the number of layers that content process can allocate

Categories: Core :: Graphics: Layers (defect)
Platform: ARM Gonk (Firefox OS)
Priority: Not set
Severity: normal
Status: RESOLVED WONTFIX

People: Reporter: sotaro, Assigned: mattwoodrow

References

Details

+++ This bug was initially created as a clone of Bug #877495 +++

As in Bug 877495, there is no mechanism to limit how many of the b2g process's file descriptors a content process can allocate, so a content process can consume an arbitrary number of file descriptors in the b2g process. Any app or any web page can therefore exhaust all the file descriptors the b2g process can allocate and stall the whole system.

Some mechanism to limit the number of file descriptors that a content process can allocate is necessary.

The problem is caused by layers, as in Bug 877495. Each layer allocates buffers for rendering, and each buffer holds a file descriptor, so increasing the number of layers also increases the number of file descriptors.

So limiting the number of layers per content process seems like an adequate control point.
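
For illustration only, here is a minimal sketch of what such a per-content-process cap could look like on the compositor side. The class and constant names (LayerAllocationTracker, kMaxLayersPerContentProcess) are invented for this sketch; nothing like this exists in the tree.

// Hypothetical sketch of the proposed per-content-process layer cap.
#include <cstddef>

static const size_t kMaxLayersPerContentProcess = 64;  // assumed budget, not a real value

class LayerAllocationTracker {
public:
  explicit LayerAllocationTracker(size_t aMaxLayers)
    : mMaxLayers(aMaxLayers), mLiveLayers(0) {}

  // Call before allocating the buffers (and file descriptors) for a new
  // layer. Returns false once the content process has hit its cap, so the
  // caller can fall back to an inactive layer instead of allocating.
  bool TryAllocateLayer() {
    if (mLiveLayers >= mMaxLayers) {
      return false;
    }
    ++mLiveLayers;
    return true;
  }

  // Call when a layer and its buffers are destroyed.
  void ReleaseLayer() {
    if (mLiveLayers > 0) {
      --mLiveLayers;
    }
  }

private:
  const size_t mMaxLayers;
  size_t mLiveLayers;
};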
That sounds like a good extension to have.  If the limit doesn't exist, we'd continue using the current heuristics to decide what gets a separate layer, but if one does exist, we can modify the heuristics to deal with it.  Sotaro, is the total number of FD available to a b2g process available to query at run time?
Matt, does this sound like a feature you can take?
Flags: needinfo?(matt.woodrow)
Yep, I can probably get this done.

How high of a priority is this?
Assignee: nobody → matt.woodrow
Flags: needinfo?(matt.woodrow)
(In reply to Milan Sreckovic [:milan] from comment #1)
> Sotaro, is the total number of FD available to a b2g process available to
> query at run time?

Yes, it can be obtained at run time by using getrlimit().
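
For reference, a minimal standalone example of querying the file descriptor limit with getrlimit() (plain POSIX, not Gecko code):

#include <sys/resource.h>
#include <cstdio>

int main() {
  struct rlimit limit;
  // RLIMIT_NOFILE is the per-process limit on open file descriptors.
  if (getrlimit(RLIMIT_NOFILE, &limit) == 0) {
    std::printf("soft fd limit: %llu, hard fd limit: %llu\n",
                (unsigned long long)limit.rlim_cur,
                (unsigned long long)limit.rlim_max);
  }
  return 0;
}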
(In reply to Matt Woodrow (:mattwoodrow) from comment #2)
> Yep, I can probably get this done.
> 
> How high of a priority is this?

It hasn't been identified as a blocker, but it seems that we keep running into issues that stem from us not having this.  I would say end of July is fine, probably into August as well, but it gets tight after that if it's to make 1.2.
What is the goal here? To get correct rendering, at some performance cost? Or just to avoid crashing, even if rendering falls apart?

Currently we require active layers to handle preserve-3d content, which is what bug 877495 is about.

From what I've seen, this is the most common reason to have a large number of layers and we can't easily draw it correctly without layers.

I don't think it's ever going to be possible to get halfway through a preserve-3d tree, run out of layers, and draw the second half as inactive layers.

We could probably determine the number of nodes in a preserve-3d tree in advance, and just skip rendering all or part of it.

Alternatively, we could shift the entire tree into being inactive, and I can try to figure out how to make that render something resembling the correct output.
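
A rough sketch of the "count the preserve-3d tree in advance" idea; Frame here is a stand-in struct for illustration, not Gecko's real frame type or API.

#include <cstddef>

// Stand-in for a frame tree node; a real implementation would walk
// Gecko's frame tree instead.
struct Frame {
  Frame* firstChild = nullptr;
  Frame* nextSibling = nullptr;
  bool extends3DContext = false;  // does this frame continue the parent's 3D context?
};

// Count how many frames participate in a preserve-3d context so the caller
// can decide up front whether rendering the whole tree would blow the layer
// budget, and skip (or flatten) it if so.
static size_t CountPreserve3DNodes(const Frame* aRoot) {
  if (!aRoot) {
    return 0;
  }
  size_t count = 1;
  for (const Frame* child = aRoot->firstChild; child; child = child->nextSibling) {
    if (child->extends3DContext) {
      count += CountPreserve3DNodes(child);
    }
  }
  return count;
}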
That helps, thanks.  But, yes, the goal was to get the correct rendering at some performance cost.
Blocks: 889957
Depends on: 873378
Closing as we are not working on Firefox OS anymore.
Status: NEW → RESOLVED
Closed: 6 years ago
Resolution: --- → WONTFIX