Closed
Bug 36002
Opened 25 years ago
Closed 14 years ago
composite alpha transparency only once on static pages
Categories
(Core Graveyard :: GFX, defect, P4)
Tracking
(Not tracked)
RESOLVED
INVALID
Future
People
(Reporter: newt, Unassigned)
Details
(Keywords: helpwanted, perf)
The new Unix alpha-compositing code contributed by Tim Rowley (see bug 3013)
works fine, but it appears that the layout engine doesn't make a distinction
between dynamic pages and completely static ones. That is, opaque images scroll
quickly enough once drawn (which is why this is a separate issue from bug
26502), but alpha images appear to be recomposited continuously while scrolling,
even in the absence of dynamic content on the page.
At a minimum, it should be possible to mark pages with a "static" keyword of
some sort if they contain none of the following:
- animated background image
- JavaScript
- (other kinds of DHTML?)
In those cases, the image can be composited against the background color or
image just once, thereafter behaving just like any other opaque image.
A longer-term enhancement might be to localize dynamic and static content even
within a single page, but I suspect the per-page optimization would suffice in
95% of the cases.
Greg Roelofs
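[A minimal sketch of the one-time flatten proposed above, assuming a
non-premultiplied RGBA8 buffer and a solid background color; the type and
function names are invented for illustration and are not Mozilla's actual
GFX types:]

    #include <cstdint>
    #include <vector>

    // Hypothetical RGBA8 pixel; Mozilla's nsImage stores its data
    // differently -- this only illustrates the blend itself.
    struct Rgba { uint8_t r, g, b, a; };

    // Blend every pixel over a solid background color once. The result
    // is fully opaque and can thereafter be scrolled and redrawn like
    // any other opaque image, with no per-paint compositing.
    std::vector<Rgba> flattenOnce(const std::vector<Rgba>& src, Rgba bg)
    {
        std::vector<Rgba> out(src.size());
        for (size_t i = 0; i < src.size(); ++i) {
            const Rgba& p = src[i];
            // Standard src-over: out = src*a + bg*(1 - a)
            out[i].r = uint8_t((p.r * p.a + bg.r * (255 - p.a)) / 255);
            out[i].g = uint8_t((p.g * p.a + bg.g * (255 - p.a)) / 255);
            out[i].b = uint8_t((p.b * p.a + bg.b * (255 - p.a)) / 255);
            out[i].a = 255; // opaque from now on
        }
        return out;
    }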
Changing component to Compositor
Assignee: troy → kmcclusk
Component: Layout → Compositor
I agree that we should only composite once if possible, but this is what
currently makes things slow:
* Reading back the framebuffer and compositing isn't cheap. I've checked
in some changes this weekend that will speed things up on common
framebuffer formats (32bpp, 24bpp, 16bpp).
* Underlying drawing surface does not get resized when the window is
resized. This means that often the surface we're clipping the drawing
region to is much larger than needed.
* nsImage::Draw isn't told about the dirty area when scrolling. If we
had this information (and knew the underlying surface hadn't changed),
we would only need to recomposite/redraw the dirty region. This also
hurts opaque image performance.
If a layout person can tell us how to determine whether the page/surface
is static, and possibly the dirty areas, this could help performance a lot.
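[To make the third point concrete: a rough sketch of dirty-rect clipping.
The Rect type and helper are invented stand-ins (the real fix would use
nsRect inside nsImage::Draw), but they show how a scroll would recomposite
only the uncovered strip:]

    #include <algorithm>

    // Invented stand-in for nsRect, for illustration only.
    struct Rect { int x, y, w, h; };

    // Intersect the image bounds with the dirty area so we only read
    // back the framebuffer and recomposite the pixels that the scroll
    // actually uncovered. An empty intersection means we can skip the
    // readback entirely.
    static bool dirtyPart(const Rect& image, const Rect& dirty, Rect& out)
    {
        int x0 = std::max(image.x, dirty.x);
        int y0 = std::max(image.y, dirty.y);
        int x1 = std::min(image.x + image.w, dirty.x + dirty.w);
        int y1 = std::min(image.y + image.h, dirty.y + dirty.h);
        if (x1 <= x0 || y1 <= y0)
            return false;                 // nothing to recomposite
        out = { x0, y0, x1 - x0, y1 - y0 };
        return true;
    }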
Updated•25 years ago
Status: NEW → ASSIGNED
Target Milestone: --- → M18
Comment 4•25 years ago
This bug has been marked "future" because the original Netscape engineer working
on this is over-burdened. If you feel this is an error, that you or another
known resource will be working on this bug, or if it blocks your work in some way
-- please attach your concern to the bug for reconsideration.
Target Milestone: M18 → Future
Updated•25 years ago
Keywords: helpwanted
Comment 5•25 years ago
Marking 'helpwanted' since this looks like it would improve performance on this
type of page.
I've got a patch that fixes part 3 of the problems tor describes. It is attached
to bug 37779.
Comment 7•25 years ago
Based on the comments here, adding dependency on bug 37779. While it may not
exactly be a dependency, at least that would reduce the problem from "composite
the entire image every time anything scrolls" to "composite only the dirty
region every time part of the image is uncovered".
To completely close this bug would require further reducing to "composite once
when the page loads and never again" - which may not even be a worthwhile
optimization, considering how difficult it is in the general case to figure out
whether a portion of a page is "static".
Depends on: 37779
It's impossible to predict in advance with 100% accuracy whether or not we'll
need to re-composite an alpha image. Some chrome script could always decide to
fiddle with the document's background color, for example. The best we could do
is to cache composited images. But that has all sorts of problems --- how do we
know when we can safely use the cache, when we can put images into the cache
(e.g. what if part of the image is off-screen), and so on.
My tree has a fix for bug 37779 (modulo some issues), and that basically takes
care of scrolling speed.
I don't think this bug should be fixed unless someone comes up with a clever
scheme, a patch, and evidence that it makes a noticeable difference.
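[For concreteness, the caching idea might look something like a
generation-counter scheme -- entirely hypothetical, and it deliberately
ignores the hard cases raised above, such as partially off-screen images:]

    #include <cstdint>
    #include <vector>

    // Hypothetical cache entry: the flattened result, tagged with the
    // generation of the background it was composited against. Anything
    // that mutates the background (e.g. chrome script changing the
    // document's background color) must bump the global generation.
    struct CompositeCache {
        std::vector<uint8_t> pixels;    // flattened, opaque RGB
        uint64_t backgroundGen = 0;     // generation it was built for
        bool     valid = false;
    };

    uint64_t gBackgroundGen = 0;        // bumped on any background change

    bool canUseCache(const CompositeCache& cache)
    {
        return cache.valid && cache.backgroundGen == gBackgroundGen;
    }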
Comment 9•25 years ago
IOW, 37779's fix should be landed, and then this should be resolved WONTFIX?
Okay - I'll move my vote to 37779 :)
Comment 10•24 years ago
Mozilla is unusably slow when I display a large (1143 by 1530 pixels, 70K bytes)
GIF file (http://htmlhelp.inet.tele.dk/reference/charset/latin1.gif).
I'm using Mozilla 0.9.3 on Windows NT on a 600 MHz Pentium III PC with 128 MB of
memory. My display adapter is a GeForce 256 with 24 MB display memory. I'm using
a display mode of true colour 1024 by 768.
I downloaded the file and removed transparency from the image. The new file is
73K and Mozilla can scroll about this image with no noticeable delay.
Is this problem covered by this bug (n.b. this bug's OS is set to Linux)?
Comment 11•24 years ago
I don't think this is doable as described, but what would be easier, faster,
and more effective is to keep the composited image, and either update that
image when content behind it changes, mark a portion of it as dirty, or
discard it.
When content behind an image with transparency changes, it is already
recomposited with the image and the screen is updated. It should just be a
matter of updating the composited image then, and refreshing the screen with
that composited image.
Another solution is to keep a single composited image for the entire page. In
that case, none of the normal page drawing code would even have to be touched to
scroll! Of course, fixed-position content would have to be kept on a separate
layer or layers.
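[A sketch of that retained-composite update path, with invented names and
pixel layout (not Mozilla's actual compositor interfaces): the flattened
image is kept between paints, only the region whose background changed is
re-blended, and scrolling copies straight out of the stored opaque buffer:]

    #include <cstdint>
    #include <vector>

    // Illustrative only; names and layout are invented.
    struct Rect { int x, y, w, h; };

    class RetainedComposite {
    public:
        RetainedComposite(int w, int h)
            : mWidth(w), mHeight(h), mPixels(size_t(w) * h * 4, 0) {}

        // Background under 'dirty' changed: re-run the alpha blend for
        // just that region of the stored composite (the blend itself is
        // elided here; it is the same src-over as usual) and leave the
        // rest of the buffer alone.
        void updateBehind(const Rect& dirty) {
            for (int y = dirty.y; y < dirty.y + dirty.h; ++y)
                for (int x = dirty.x; x < dirty.x + dirty.w; ++x)
                    blendPixel(x, y);
        }

        // Scrolling and plain repaints copy rows straight out of the
        // stored, already-opaque composite: no readback, no blending.
        const uint8_t* row(int y) const {
            return mPixels.data() + size_t(y) * mWidth * 4;
        }

    private:
        void blendPixel(int /*x*/, int /*y*/) { /* src-over blend */ }
        int mWidth, mHeight;
        std::vector<uint8_t> mPixels;   // RGBA, kept between paints
    };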
Reporter
Comment 13•24 years ago
fixing broken URL
Assignee
Updated•17 years ago
Product: Core → Core Graveyard
Updated•16 years ago
Assignee: kmcclusk → nobody
Status: ASSIGNED → NEW
QA Contact: chrispetersen → general
Comment 14•14 years ago
Nothing works like this anymore.
Status: NEW → RESOLVED
Closed: 14 years ago
Resolution: --- → INVALID