Bug 564695: Store scaled images at their scaled size
Core :: Graphics: ImageLib, defect
Opened 15 years ago; closed 9 years ago
Status: RESOLVED DUPLICATE of bug 1045926
Tracking flags: blocking-basecamp: -
Reporter: animalfriend; Assignee: Unassigned
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (.NET CLR 3.5.30729)
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (.NET CLR 3.5.30729)
The above-mentioned URL contains several hundred large (>1MB) JPEG images that are scaled down to thumbnail size. Scrolling the page to the bottom is very sluggish (Intel Core2Duo, 2GB memory) the first time; after that, scrolling speed (up and down) is good.
Upon switching to another application or tab, waiting a few seconds, then coming back, any action (scrolling or changing tabs) takes several seconds. The application appears to hang.
I also see the memory usage rising and dropping in between these actions.
I know the HTML of this site is **** beyond belief, but IE6 has no problems displaying the page (of course, using IE6 is not an option).
Reproducible: Always
Steps to Reproduce:
1. Go to the URL (see bug details)
2. Scroll to the bottom (scrolling gets slower as the line of images gets longer)
3. Go to Task Manager. The memory consumption of firefox.exe is very high (700MB on my system)
4. Wait a bit. The memory consumption drops (to about 150MB in my case)
5. Go back to Firefox: scroll the page or switch to another tab and back. Scrolling is slow, and tab changing takes several seconds.
Actual Results:
Sluggish scrolling, application appears to hang.
Expected Results:
Smooth operation (like IE6)
I suppose Firefox is decoding and scaling the images every time. IE6 seems to decode and scale just once and cache the result.
Comment 1 • 15 years ago
Decoded images time out after 30 seconds or so. As you can see in your example, there's a very good reason why decoded images aren't kept around: memory usage. I believe the scaling is done on the fly; the scaled image isn't the one that is stored in memory.
IE, by comparison, uses up to 500MB, which I think is because it uses 3 bytes per pixel rather than Firefox's 4. But its memory usage doesn't go down.
Memory usage has always been a big complaint about Firefox; a lot was cut down in Firefox 3.5 and 3.6, and removing decoded images from memory was a big part of that. I don't think anybody wants to change that just for this one badly programmed website.
Note that storing a scaled image might be a benefit in this case (and improve performance too), but don't forget that even a scaled-down image can still be a large memory consumer: some 'graphic websites' send 2048x2048 images that are scaled down to your screen size. And for a moment, both images would need to be present in memory. There are also websites that upscale images, so storing the result wouldn't be a big advantage there either. I suspect memory usage could even be higher if a scaling cache were used.
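The per-pixel figures in this comment translate into concrete numbers with some back-of-the-envelope arithmetic. The image dimensions below are illustrative assumptions, not measurements from the bug:

```python
# Rough memory cost of keeping decoded images around, using the
# 4-bytes-per-pixel (RGBA, Firefox) and 3-bytes-per-pixel (RGB, IE)
# figures from this comment.

def decoded_bytes(width, height, bytes_per_pixel):
    """Uncompressed size of one decoded image in bytes."""
    return width * height * bytes_per_pixel

# A typical large digital-camera JPEG: 3000x2000 pixels (assumed).
full_rgba  = decoded_bytes(3000, 2000, 4)  # ~24 MB per image in Firefox
full_rgb   = decoded_bytes(3000, 2000, 3)  # ~18 MB per image in IE
thumb_rgba = decoded_bytes(150, 100, 4)    # 60 KB as a cached thumbnail

print(full_rgba, full_rgb, thumb_rgba)     # 24000000 18000000 60000
print(full_rgba // thumb_rgba)             # 400x smaller when stored scaled
```

This is why keeping hundreds of full-size decodes alive (as on the reported page) runs into hundreds of megabytes, while caching at thumbnail size would be cheap.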
Comment 2 • 15 years ago
Actually, I wonder how retained layers will affect this...
But yes, we don't want to be keeping the fully-decoded image data in memory forever here.
Comment 3 • 15 years ago
Retained layers will not affect this, unless we get more aggressive at retaining scrolled-out-of-view content. Even then, they won't speed up the initial load and scroll down.
I could imagine combining decode-on-draw with some kind of special "scaled-down JPEG decoder" that decodes and scales in one pass. There are probably cool tricks you can play in the guts of JPEG to make that fast. But that seems far too much work to cater to a few crappy sites like this.
Off-main-thread image decoding sounds like the best bet for a solution that would be actually useful and would make this page perform better.
Comment 4 • 13 years ago
We should store images only drawn at a certain scale factor at that scale size. If we subsequently get a request for a different scale, we can then throw away that scaled cache and store it full size.
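The policy described here can be sketched minimally, assuming a hypothetical `decode_scaled(scale)` decoder callback (names are illustrative, not Gecko's API):

```python
class ScaledImageCache:
    """Sketch of the policy in comment 4: cache the decoded image at
    the single scale it is drawn at; if a draw at a different scale
    arrives, throw the scaled copy away and cache full size instead."""

    def __init__(self, decode_scaled):
        self._decode_scaled = decode_scaled  # decode_scaled(scale) -> pixels
        self._cache = None                   # (scale, pixels) or None
        self._full_only = False              # scales diverged: keep full size

    def draw(self, scale):
        if self._full_only:
            scale = 1.0                      # serve every draw from full size
        if (self._cache is not None and self._cache[0] != scale
                and not self._full_only):
            # Second distinct scale seen: drop the scaled copy and
            # store the full-size decode from now on.
            self._full_only = True
            scale = 1.0
        if self._cache is None or self._cache[0] != scale:
            self._cache = (scale, self._decode_scaled(scale))
        return self._cache[1]
```

For example, repeated draws at one scale hit the small scaled cache; the first draw at a second scale triggers one full-size decode, after which all scales are served from the full-size copy.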
Status: UNCONFIRMED → NEW
Ever confirmed: true
Summary: Slow scrolling on page with many scaled-down images → Store scaled images at their scaled size
Comment 5 • 13 years ago
We need this on B2G (and Fennec, I'd imagine) too.
OS: Windows XP → All
Hardware: x86 → All
Comment 6 • 13 years ago
Oops, forgot this flag. We want this for B2G v2; it's too late for v1.
blocking-basecamp: --- → ?
Comment 7 • 13 years ago
One of the tricky bits about this will be deciding when it's safe to throw away the full-size image. The cost of getting this wrong is a full image decode when the scale of an image changes.
Good suggestions for heuristics would be welcome.
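One possible heuristic, purely illustrative and not what Gecko ended up implementing: only treat the full-size decode as droppable once the requested scale has been stable for several consecutive draws, betting that the scale has stopped changing (e.g. a pinch-zoom has finished). Getting it wrong costs exactly the one redecode comment 7 mentions:

```python
def make_drop_full_size_heuristic(stable_draws=8):
    """Return a predicate that answers: is it safe to discard the
    full-size decode now? It says yes only after the same scale
    has been requested `stable_draws` times in a row."""
    state = {"scale": None, "run": 0}

    def should_drop(scale):
        if scale == state["scale"]:
            state["run"] += 1        # same scale again: streak grows
        else:
            state["scale"], state["run"] = scale, 1  # streak resets
        return state["run"] >= stable_draws
    return should_drop

h = make_drop_full_size_heuristic(stable_draws=3)
print([h(0.5), h(0.5), h(0.5), h(0.25)])  # [False, False, True, False]
```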
Comment 8 • 13 years ago
I think this is too big to block on for B2G. We'll have to find another solution unless this turns out to be very safe and easy.
blocking-basecamp: ? → -
Comment 9 • 9 years ago
This is known nowadays as downscale-during-decode; it's implemented and working.
Status: NEW → RESOLVED
Closed: 9 years ago
Resolution: --- → DUPLICATE