At the moment, if you're using a HiDPI device, most images you see are scaled up. That's bad especially on Mac, where we don't have content acceleration, so we scale the images on the CPU. For example, when I scroll up and down as fast as I can on http://imgur.com/, we spend about 20% of the time in argb32_image_mark_argb32.

If you set the pref layout.gpu-image-scaling.enabled to true, we attempt to layerize as many scaled images as we can. This reduces the CPU cost of scaling images, but it makes composition more expensive and causes other things to layerize, which increases memory usage and makes us invalidate more when things shift between layers.

Ideally we'd be able to tell, at the point where we decide whether to layerize a scaled image, whether using an image layer will force layerization of other content, or more generally how much impact layerization will have. But that seems hard and brittle. We could start by tweaking the heuristics of the "GPU image scaling" mode, for example by only layerizing images that are larger than a certain size.
(In reply to Markus Stange [:mstange] from comment #0)
> We could start by tweaking the heuristics of the "GPU image scaling" mode,
> for example only layerize images that are larger than a certain size.

Implementing this bit could make a good mentored bug.
Assignee: nobody → mchang
Status: NEW → ASSIGNED
Closing as WONTFIX; this is an old bug.
Status: ASSIGNED → RESOLVED
Last Resolved: 11 months ago
Resolution: --- → WONTFIX