Closed
Bug 105000
Opened 23 years ago
Closed 23 years ago
investigate img decompression on the fly
Categories: Core :: Graphics: ImageLib, defect, P3
Status: RESOLVED WONTFIX
Target Milestone: mozilla1.0
People: Reporter: cathleennscp; Assigned: pavlov
Keywords: perf
We need to investigate storing compressed images into memory, then decompress
them on the fly, to see if it will give us any performance gain.
Chris Saari wrote:
>
> Chris Waterson wrote:
>
> > pavlov and I spent some time discussing 24 bit images yesterday. He is
> > going to explore image decoding ``on the fly'' (i.e., at paint time) to
> > see if the performance is reasonable.
>
> Remember that the old image lib did "on the fly" decoding for animated GIFs
> and it sucked notably when scrolling a page, that is one of the bugs we fixed.
>
> That said, it should be possible to speed up image decoding enough that this
> would be quite reasonable.
>
> There are also cases in the GIF spec where animations require sequential
> decodes of frames from the beginning of the sequence if you don't have each
> frame handy for compositing.
>
> Just like anything else, this is a tradeoff, but definitely one worth
> investigating. I just dread slowing down mouseover image response time for
> example.
Assignee
Updated•23 years ago
Status: NEW → ASSIGNED
Priority: -- → P3
Target Milestone: mozilla0.9.7 → mozilla0.9.8
Comment 1•23 years ago
Cathleen, I'm not sure how image decompression on the fly will give us a *performance* win. A footprint win makes more sense. Did you mean that?
Comment 2•23 years ago
We got this from Intel. They suggested that we're blowing the processor's L1 and L2 caches slinging uncompressed image data back and forth from main memory. They suggested that modern CPUs are fast enough that we ought to store images in the cache _encoded_ (e.g., as JPEG, GIF, etc.) rather than as 32-bit RGBA data. For painting, we should decode to device-dependent bitmaps that can be kept on the video card. (Mumble, mumble.) I think that we already do this to some extent, on some platforms.
Comment 3•23 years ago
yeah, the platform support for on-card caching is the issue. Does Intel have GIF and JPEG decompressors that have been MMX optimized that they want to contribute?
Comment 4•23 years ago
We know that our graphics pipeline (cache, I/O worker threads, main thread, latency, roundtrips, you name it) and/or the decompressors are too slow for on-the-fly decompression of animated images. We saw the impact when scrolling a page with animated GIFs. So this is interesting, but I suspect we'll need low-level, maybe assembly-level, love to make the decoders fast enough, and probably an understanding of the graphics pipeline and how the Mozilla subsystems interact to make it work fast. For example, decoding images on a separate thread might help, but only if they can talk to the cache without hitting the main thread, and only if the cache doesn't hit disk for those requests at all. And then you have thread latency and scheduling to contend with. I think it is interesting, but it will have a domino effect on the system as a whole. Not that that is a bad thing, I'm just say'n.
Assignee
Updated•23 years ago
Target Milestone: mozilla0.9.8 → mozilla0.9.9
Assignee
Comment 5•23 years ago
I don't really understand this bug. Decompressing images on the fly is going to require: a) more processing, because we have to decompress; and b) more data through memory, because we have to 1) stream the compressed data in to decompress it and 2) stream the decompressed data out to some place; be it VRAM or main memory, it is still going across the bus. None of this seems like a win to me. What we really want is to store the image data, decompressed, on the video card, so that drawing it requires nothing more than a message to the video card to copy the image buffer onto the frame buffer. I think we should mark this bug invalid and concentrate on the gains that fixing bug 104999 will give us.
Depends on: 104999
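To make the footprint side of this tradeoff concrete, here is a rough back-of-the-envelope sketch. The image dimensions and compressed size are made-up illustrative numbers, not measurements from this bug:

```python
# Rough memory arithmetic for the decode-at-paint-time tradeoff.
# All concrete numbers below are hypothetical examples.

def decoded_size_bytes(width, height, bytes_per_pixel=4):
    """Size of a fully decoded 32-bit RGBA buffer."""
    return width * height * bytes_per_pixel

# A hypothetical 800x600 image:
decoded = decoded_size_bytes(800, 600)   # 800 * 600 * 4 = 1,920,000 bytes (~1.9 MB)
compressed = 90_000                      # e.g. a ~90 KB JPEG of the same image

print(f"decoded:    {decoded} bytes")
print(f"compressed: {compressed} bytes")
print(f"ratio:      {decoded / compressed:.1f}x")

# Keeping only the compressed bytes saves memory, but every repaint must then
# stream the compressed bytes in AND the decoded pixels out, so per-paint bus
# traffic becomes (compressed + decoded) instead of zero-copy from VRAM.
```

The memory savings are real, but as the comment above argues, the cost is paid again on every paint rather than once at decode time.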
Assignee
Comment 6•23 years ago
shifting off to 1.0 before marking it wontfix/invalid
Target Milestone: mozilla0.9.9 → mozilla1.0
Comment 7•23 years ago
What about using a simple compression/decompression algorithm to reduce memory usage? For instance, if a large area of an image has the same colour, isn't there some way to compress it?
Comment 8•23 years ago
Sure, that is called run-length encoding, which is the most basic form of compression. The same idea is exploited more thoroughly by dictionary coders such as LZW, which is what GIF images use. GIF decompression can be very, very fast... but it needs the whole image pipeline to be properly set up for it. It isn't just a question of choosing a compression algorithm to make the end result fast enough to use for everything. LZW is pretty much perfectly suited from a speed perspective, but our implementation would need a little more work, and the cache/main thread/image decompression pipeline would need more work to make this fast enough.
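As a toy illustration of the run-length encoding idea mentioned above (a minimal sketch of the general technique, not anything from the old image library or GIF's actual LZW coder):

```python
def rle_encode(data):
    """Run-length encode a sequence into (value, count) pairs."""
    runs = []
    for value in data:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([value, 1])   # start a new run
    return [(v, c) for v, c in runs]

def rle_decode(runs):
    """Expand (value, count) pairs back into the original sequence."""
    out = []
    for value, count in runs:
        out.extend([value] * count)
    return out

# A scanline where a large area has the same colour compresses well:
scanline = [255] * 100 + [0] * 3 + [255] * 97
encoded = rle_encode(scanline)
assert rle_decode(encoded) == scanline
print(encoded)  # [(255, 100), (0, 3), (255, 97)]
```

A 200-entry scanline collapses to three pairs here; LZW goes further by also exploiting repeated multi-pixel patterns, not just runs of a single value.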
Assignee
Comment 9•23 years ago
Intel says they tried this and it didn't yield positive results, so I'm going to mark this wontfix.
Status: ASSIGNED → RESOLVED
Closed: 23 years ago
Resolution: --- → WONTFIX