Closed Bug 649924 Opened 14 years ago Closed 13 years ago

HTML5 Canvas slow/out of memory on Win7/64Bit

Categories

(Core :: Graphics: Canvas2D, defect)

2.0 Branch
x86_64
Windows 7
defect
Not set
normal

Tracking


RESOLVED FIXED
mozilla8

People

(Reporter: pk9560660, Assigned: roc)

References

Details

(Keywords: regression)

Attachments

(3 files, 1 obsolete file)

User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:2.0) Gecko/20100101 Firefox/4.0
Build Identifier: 4.0

I use a very large HTML5 canvas (640*10000) for painting. Running this on WinXP/32-bit is no problem. On Win7/64-bit the canvas size is limited to 640x8192; a larger canvas results in an "0x8007000e (NS_ERROR_OUT_OF_MEMORY) [nsIDOMHTMLCanvasElement.getContext]" error. Another problem is that painting on the large canvas is much slower on Win7/64 than on WinXP/32.

Reproducible: Always

Steps to Reproduce:
<canvas width="640" height="10000" id="Console"></canvas>
...
<script>
var objCanvas = document.getElementById("Console");
var context = objCanvas.getContext("2d");
</script>

Actual Results:
Fehler: uncaught exception: [Exception... "Component returned failure code: 0x8007000e (NS_ERROR_OUT_OF_MEMORY) [nsIDOMHTMLCanvasElement.getContext]" nsresult: "0x8007000e (NS_ERROR_OUT_OF_MEMORY)" location: "JS frame :: file:///C:/Users/Windows%207/Desktop/test.html :: <TOP_LEVEL> :: line 10" data: no]

Expected Results:
A context object.
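Since getContext("2d") can fail with NS_ERROR_OUT_OF_MEMORY for oversized canvases, a page can clamp the requested dimensions before creating the context. This is a minimal sketch; clampCanvasSize is a hypothetical helper, and 8192 is the limit observed in this report, not a guaranteed constant.

```javascript
// Hypothetical helper: clamp requested canvas dimensions to a safe
// maximum so oversized requests degrade gracefully instead of the
// getContext call throwing NS_ERROR_OUT_OF_MEMORY.
function clampCanvasSize(width, height, maxDim) {
  return {
    width: Math.min(width, maxDim),
    height: Math.min(height, maxDim),
  };
}
```

In a page, one would apply the clamped size to the canvas element before calling getContext, accepting that content beyond the clamped area is lost.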
Component: General → Canvas: 2D
QA Contact: general → canvas.2d
Do you have a testcase that could be attached to the bug?
Version: unspecified → 2.0 Branch
Attached file HTML Sample File (obsolete) —
Attached image Error Screenshot
The attached sample page does not work on FF4, Windows 7, 64-bit. (Chrome and IE have no problems.)
This is more likely to be related to hardware acceleration (see bug 633936) than to 32 vs 64 bit. WinXP doesn't have Direct2D; Win7 does. Does turning off Direct2D fix your issue with sizes larger than 8192? If so, then we should focus this bug on the performance problem.
I have disabled "Options/...Use hardware acceleration when available" -> No change, same error.
OS: Windows 7 → Windows XP
You restarted the browser after doing that?
Yes, I have restarted FF. Same error.
That's really odd. Bas, Benoit, are we limiting canvas sizes on platforms that support hardware accel even when hardware accel is off?
Status: UNCONFIRMED → NEW
Ever confirmed: true
Keywords: regression
(In reply to comment #9)
> That's really odd. Bas, Benoit, are we limiting canvas sizes on platforms
> that support hardware accel even when hardware accel is off?

bz, Bas, Benoit -- any progress on this issue? I suspect that some users of our Page Saver extension are running into this bug (Page Saver uses drawWindow() with a potentially large canvas). Unfortunately, I don't have access to a Win7 computer where hardware acceleration is possible, so I can't be sure.
CC'd a few people who might know. Not my area :)
Not mine either (on at least two counts!)...
Assignee: nobody → bas.schouten
We limit the canvas size when using hardware acceleration; I don't see why it would be limited when not using hardware acceleration.
I cannot reproduce any slow painting fwiw, with HW acceleration at least.
Attached file fixed testcase
The provided testcase is a bit misleading; Gecko doesn't support innerText, so "Hurray It Works!" is never displayed. Here is a fixed testcase that uses textContent instead.
Attachment #526305 - Attachment is obsolete: true
With that testcase, on Windows 7, I don't get this error with acceleration disabled, but I do get it with acceleration enabled. So can we do anything about this? Could we revert to the cairo image backend when the canvas is too large to work with D2D?
OS: Windows XP → Windows 7
Sounds like this is a duplicate of bug 633936, then.
(In reply to comment #16)
> With that testcase, on Windows 7, I don't get this error with acceleration
> disabled, but I do get it with acceleration enabled.
>
> So can we do anything about this? Could we revert to the cairo image backend
> when the canvas is too large to work with D2D?

It's a possibility we debated. But back when I discussed this with Vlad, he was more in favor of limiting canvas sizes (possibly adding something to the spec to do it more sensibly). The problem is that having massive differences in performance between large and small canvases is also a bit annoying.

Another thing we could do is draw everything to the canvas 'scaled down': make the canvas the max texture size and then scale up when blending the layer. Obviously this causes resampling artifacts. The other option is what IE9 does: display the first 'maxTextureWidth' pixels and leave out the rest.

In general, I'm not sure that someone making a canvas significantly larger than the screen is using Canvas the way they should. These things grow very big in memory and are usually used for static content (after all, a user will have to scroll them), where in many cases SVG or some other markup-based approach is possibly a better option. I don't know.
The way someone "should" use canvas is however one would use an image... There are plenty of use cases for images that are significantly larger than the screen size. Now repeatedly redrawing a huge canvas doesn't seem like it has much in the way of use cases. But the numerous bug reports we've had about this problem aren't doing that. Furthermore, having canvas behave completely differently for different platforms (e.g. the large canvases work on Mac and Linux, right?) seems like a bad thing for the web. So I don't think we should do either what we do now or what IE9 does.
Depends on: 633936
(In reply to comment #19)
> The way someone "should" use canvas is however one would use an image...
> There are plenty of use cases for images that are significantly larger than
> the screen size.

The use cases, afaics, are generally very high resolution images, or poorly implemented tile maps (i.e. ones that tile in one dimension). Neither of those seems to apply very well to canvas.

> Now repeatedly redrawing a huge canvas doesn't seem like it has much in the
> way of use cases. But the numerous bug reports we've had about this problem
> aren't doing that.

Well, not repeatedly drawing it sounds bad too, in the software case for example. Having a 10000x2000 pixel canvas will consume 80 MB of RAM just to maintain the surface (or 80 MB of VRAM in the HW accelerated case, plus roughly another 80 MB of VRAM for support surfaces). And unlike in the case of an image, you can't discard that memory once you've drawn to the screen. That's a bad practice for a bunch of pages to adopt, especially considering people love to leave a bunch of tabs open.

> Furthermore, having canvas behave completely differently for different
> platforms (e.g. the large canvases work on Mac and Linux, right?) seems like
> a bad thing for the web. So I don't think we should do either what we do
> now or what IE9 does.

Yes, consistency is important. I'm personally in favor of either capping canvas sizes completely, or adding something to the API to report max sizes.
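The memory figure quoted above follows from 4 bytes per pixel (RGBA). A quick back-of-envelope helper, purely for illustration:

```javascript
// Backing-store cost of a canvas at 4 bytes per pixel (RGBA).
// A 10000x2000 canvas works out to 80,000,000 bytes, i.e. the
// 80 MB figure cited in the comment above.
function canvasBytes(width, height) {
  return width * height * 4;
}
```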
(In reply to comment #18)
> (In reply to comment #16)
> > So can we do anything about this? Could we revert to the cairo image backend
> > when the canvas is too large to work with D2D?
>
> It's a possibility we debated. But back when I discussed this with Vlad, he
> was more in favor of limiting canvas sizes (possibly adding something to the
> spec to do it more sensibly). The problem is having massive differences in
> performance for large and small canvases is also a bit annoying.

A massive performance difference would be annoying, but surely it's better for huge canvases to at least work?
(In reply to comment #21)
> (In reply to comment #18)
> > (In reply to comment #16)
> > > So can we do anything about this? Could we revert to the cairo image backend
> > > when the canvas is too large to work with D2D?
> >
> > It's a possibility we debated. But back when I discussed this with Vlad, he
> > was more in favor of limiting canvas sizes (possibly adding something to the
> > spec to do it more sensibly). The problem is having massive differences in
> > performance for large and small canvases is also a bit annoying.
>
> A massive performance difference would be annoying, but surely it's better
> for huge canvases to at least work?

It's a valid question. I don't know; this way we're at least not allocating huge amounts of memory. I certainly wouldn't object to a max size for canvases. Tiled software-backed canvases wouldn't make me happy, but I also wouldn't object strongly to that. If we want to make them work, I actually favour the sampling version, although that's going to look bad for text. And text is what some people use these huge canvases for (to draw timelines or graphs... where I still think Canvas is a poor choice).
Other than memory allocation, using the cairo image backend for oversized canvases seems fine to me. It will look good, it won't impact performance of normal canvases, and it should be very easy to implement.
(In reply to comment #23)
> Other than memory allocation, using the cairo image backend for oversized
> canvases seems fine to me. It will look good, it won't impact performance of
> normal canvases, and it should be very easy to implement.

I agree. It will further encourage what I think is a bad thing, though, and it'll make it harder to take away in the future, or to get it accepted that canvases are of limited size. FWIW, IE10 Preview also still does not support larger than 8192 in either dimension.
Apparently, this bug causes an issue with Google+'s "Submit feedback" system. Clicking the button is supposed to open a dialog box.

Summary of my findings, using FF5 on the beta channel:

1) In normal conditions (addons + HW accel), the dialog does not show and a "(NS_ERROR_OUT_OF_MEMORY) [nsIDOMHTMLCanvasElement.getContext]" error appears on the web console
2) In safe mode => OK
3) With all addons/plugins disabled manually => NS_ERROR_OUT_OF_MEMORY
4) Normal with HW disabled => OK

Just my 2 cents...
(In reply to comment #25)
> Apparently, this bug causes an issue with Google+, with their "Submit
> feedback" system. Clicking on the button, a dialog box is supposed to appear.
>
> Summary of my findings, using FF5 on beta channel:
>
> 1) In normal conditions (addons + HW accel), the dialog do not show and a
> "(NS_ERROR_OUT_OF_MEMORY) [nsIDOMHTMLCanvasElement.getContext]" appear on
> the web console
> 2) in safe mode => OK
> 3) with all addons/plugins disabled manually => NS_ERROR_OUT_OF_MEMORY
> 4) Normal with HW disabled => OK
>
> Just my 2 cents...

That's really strange; I wonder what the underlying cause is.
Would anyone with Google+ access confirm my issue, please? (I can send invites if needed, provided G+ invites are open.)

Might be driver / GPU related. I have an ATI Radeon HD 46xx.
(In reply to comment #27)
> Would anyone with Google+ access confirm my issue, please (I can send
> invites if needed (providing G+ invites are open)).
>
> Might be driver / GPU related. I have an ATI Radeon HD 46xx.

I can take a look but need an invite. (I also have a Radeon at work, so I can test that if need be.)
I can't invite anyone currently. The Feedback button works for me, so this problem doesn't affect everyone.
Indeed, it is working at work, too. Maybe a driver issue...
I'm the lead for Google Feedback and we've been getting a large number of reports of Feedback on Firefox 5 failing. The issue is as described by Chris Browet. We create a canvas that covers the entire page in a glass effect and use composite operations to cut sections out to make the annotator UI work. You can see the UI on http://www.youtube.com/ (at the bottom, click "Report a bug").

To reproduce on Google+:
1. Go to http://plus.google.com/
2. Make sure you're on the Stream and not the welcome.
3. Scroll to the bottom of the page repeatedly and click "More" to load more posts into the stream.
4. Click "Send Feedback" in the bottom right corner.

To reproduce on YouTube:
1. Go to http://www.youtube.com/
2. Open the console and execute: document.body.style.height = '51721px';
3. Click "Report a bug"

Note: the 51721 number is a real value taken from a stream page that had More clicked a few times.

Chrome and Safari are both fine with creating these massive canvases. We saw issues in Firefox 4 where the context would be created without throwing this exception, but the entire canvas was black. Making the canvas height less than 8192 made the canvas work again. Firefox 5 now throws an OOM exception instead of turning black. Note that this fails in both Intel and Nvidia graphics modes.

OS X 10.6.8, Firefox 5.0, MacBook Pro, NVIDIA GeForce GT 330M/Intel HD Graphics.
Tested in Firefox 3.6 and there's no exception, but the canvas is also entirely transparent and nothing is drawn. Seems similar to the entirely black case from Firefox 4.
I can reproduce with the Youtube steps. Seems very likely that it's simply the texture size limit. I wrote the obvious patch to fall back to Thebes contexts and image surfaces if normal context creation fails. That works, but we die horribly painting the page since CanvasLayerD3D9/D3D10/OGL all assume that they can copy the source surface to a texture and composite that :-(.
I suppose it's worth noting that Chrome fails if the canvas height is greater than 32767 pixels.
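Limits like these differ per browser and per GPU, so a page that genuinely needs a large canvas could probe for the limit at runtime. This is a hypothetical sketch: in a browser, the probe predicate would attempt to create a canvas of the given height and report whether a usable context came back; here it is any monotone boolean predicate.

```javascript
// Binary search for the largest value the probe accepts. Assumes
// probe(lo) is true and probe is monotone (true up to some limit,
// then false), as with "can I create a canvas this tall?".
function findMaxDimension(probe, lo, hi) {
  while (lo < hi) {
    const mid = Math.floor((lo + hi + 1) / 2);
    if (probe(mid)) {
      lo = mid; // mid works; the limit is at least mid
    } else {
      hi = mid - 1; // mid fails; the limit is below mid
    }
  }
  return lo;
}
```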
> Seems very likely that it's simply the texture size limit.

Except comment 31 claims to be on Mac... I wouldn't think that we'd run into the texture size limit there. If the issue in comment 31 appears on Mac, then it's a different bug, no?
Probably drawing into the canvas works, but GL compositing the canvas onto the window doesn't work.
(In reply to comment #36)
> Probably drawing into the canvas works, but GL compositing the canvas onto
> the window doesn't work.

Yeah, the all-black canvas did this. Drawing and toDataURL() both worked fine, but the user sees all black.
I have a patch that fixes this partially; we can create canvases up to 30000px high at least, probably up to 32767px. It doesn't fix the Youtube STR in comment #31, but then, that doesn't work for me in Chrome dev either.
(In reply to comment #38)
> I have a patch that fixes this partially; we can create canvases up to
> 30000px high at least, probably up to 32767px. It doesn't fix the Youtube
> STR in comment #31, but then, that doesn't work for me in Chrome dev either.

It works for me in Chrome 14.0.803.0 dev with heights in the 200000px range, but drawing is incredibly slow. Around 300000px the canvas fails to draw (it's entirely transparent). Chrome never throws exceptions, though. The machine is as described in comment #31.

32767 seems more than reasonable; we're looking into ways to avoid creating massive canvases too. The spec should probably be updated to specify the failure mode when the canvas is too large and there's not enough memory. Documentation about the exception for OOM situations would be very helpful. Chrome should probably throw this exception too instead of just drawing nothing.

Thanks for looking into this!
Attached patch fixSplinter Review
Assignee: bas.schouten → roc
Attachment #544680 - Flags: review?
Attachment #544680 - Flags: review? → review?(bas.schouten)
I'm okay with this patch, but can we -please- move towards adding an API to Canvas for a maximum size?

Personally I'd much prefer a mechanism in which a website that -really- needs a super-large canvas tiles it itself, and canvases themselves would be limited to the texture size supported by the underlying hardware. I still believe that when you want to create huge graphical displays vastly larger than the window size, you're usually looking at things which are static. These should be done through a declarative system like SVG, where they also won't use insane amounts of memory.

The bottom line is that keeping huge rasterizations much larger than the visible area in memory is a waste, and we should in my opinion encourage people not to do that.
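The tiling approach suggested above amounts to splitting a tall logical surface into horizontal strips that each fit under the texture limit. A minimal sketch of just the strip arithmetic (a real implementation would create one canvas per strip and offset drawing into it):

```javascript
// Split a logical height into horizontal strips no taller than
// maxTile, so each strip fits within the hardware texture limit.
function tileHeights(totalHeight, maxTile) {
  const tiles = [];
  for (let y = 0; y < totalHeight; y += maxTile) {
    tiles.push(Math.min(maxTile, totalHeight - y));
  }
  return tiles;
}
```

Doing this at the JS level is exactly the burden roc objects to in the following comment: every drawing call has to be translated into each strip's coordinate space.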
Comment on attachment 544680 [details] [diff] [review]
fix

Review of attachment 544680 [details] [diff] [review]:
-----------------------------------------------------------------

See my previous comment on creating a max size on canvases.

::: gfx/layers/d3d10/LayerManagerD3D10.h
@@ +117,5 @@
>
> + // D3D10 guarantees textures can be at least this size
> + enum {
> +   MAX_TEXTURE_SIZE = 8192
> + };

Why not just make this a const PRUint32 or something like that?
Attachment #544680 - Flags: review?(bas.schouten) → review+
(In reply to comment #41)
> I'm okay with this patch, but can we -please- move towards adding an API to
> Canvas for having a maximum size?
>
> Personally I'd feel much more for a mechanism in which a website that
> -really- needs a super-large canvas tiles itself. And canvases itself would
> be limited to the texture size supported by the underlying hardware. I still
> believe when wanting to created huge graphical displays vastly larger than
> the window size we're usually looking at things which are static. These
> things should be done through a declarative system like SVG where they also
> won't use insane amounts of memory.

Yes, SVG would work much better for Elliott's use-case, I think.

However, I don't think it's a good idea to limit canvas sizes in the spec, or via some API. I don't think it would make sense for canvas users to tile explicitly; that's difficult to do at the JS level. I think, as much as possible, features should "just work". Canvas sizes that seem inappropriate now might seem just fine in ten years.

Actually, I think the long-term fix is to limit the canvas buffer size to the max texture size by automatically scaling down as necessary (or up, in high-DPI situations). Maybe we should do this in the Azure backend.
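The automatic scale-down roc proposes would pick a factor that brings the larger dimension under the texture limit. A sketch of just that calculation; downscaleFactor is a hypothetical helper, not Gecko code:

```javascript
// Scale factor that fits a logical canvas size within maxDim.
// Returns 1 when no scaling is needed; the backing store would be
// allocated at (width * f) x (height * f) and scaled back up when
// the layer is composited.
function downscaleFactor(width, height, maxDim) {
  return Math.min(1, maxDim / Math.max(width, height));
}
```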
(In reply to comment #42)
> ::: gfx/layers/d3d10/LayerManagerD3D10.h
> @@ +117,5 @@
> >
> > + // D3D10 guarantees textures can be at least this size
> > + enum {
> > +   MAX_TEXTURE_SIZE = 8192
> > + };
>
> Why not just make this a const PRUint32 or something like that?

You can't write "static const PRUint32 MAX_TEXTURE_SIZE = 8192;" inline in the class declaration in standard C++, AFAIK.
(In reply to comment #44)
> (In reply to comment #42)
> > ::: gfx/layers/d3d10/LayerManagerD3D10.h
> > @@ +117,5 @@
> > >
> > > + // D3D10 guarantees textures can be at least this size
> > > + enum {
> > > +   MAX_TEXTURE_SIZE = 8192
> > > + };
> >
> > Why not just make this a const PRUint32 or something like that?
>
> You can't write "static const PRUint32 MAX_TEXTURE_SIZE = 8192;" inline in
> the class declaration in standard C++, AFAIK.

Hrm, ISO C++ says you can for integral types. 9.4.2/4:

"If a static data member is of const integral or const enumeration type, its declaration in the class definition can specify a constant-initializer which shall be an integral constant expression (5.19). In that case, the member can appear in integral constant expressions. The member shall still be defined in a namespace scope if it is used in the program and the namespace scope definition shall not contain an initializer."

Anyway, if it could give compatibility issues we shouldn't do it, of course.
I think I'm just wrong!

But the latter sentence means we'd still have to write
static const PRUint32 LayerManagerD3D10::MAX_TEXTURE_SIZE;
somewhere, right? Which sucks.
http://hg.mozilla.org/integration/mozilla-inbound/rev/364d2debc2a7

Followup fix to mark passing test:
http://hg.mozilla.org/integration/mozilla-inbound/rev/cd7a7ad30fca
(thanks bz)

I think the reason test 579323-1.html started passing is that this patch makes never-drawn-into canvases return LAYER_INACTIVE from GetLayerState, which means the opacity layer will also be LAYER_INACTIVE, so the entire testcase is rendered by drawing into a single ThebesLayer. Effectively the test is no longer testing what it used to test. This is a good change, because it means canvases that aren't drawn into are handled more efficiently, but the test needs to be fixed.
Whiteboard: [inbound]
roc, can you either mark bug 623450 and bug 627560 as fixed (and file a new bug on fixing 579323-1.html) or mutate them to cover fixing 579323-1.html? I can't quite tell what the state of things is there at this point....
(In reply to comment #46)
> I think I'm just wrong!
>
> But the latter sentence means we'd still have to write
> static const PRUint32 LayerManagerD3D10::MAX_TEXTURE_SIZE;
> somewhere, right? Which sucks.

Yes, it does. Nor am I sure whether g++ adheres to the standard and allows that.
Status: NEW → RESOLVED
Closed: 13 years ago
Resolution: --- → FIXED
Target Milestone: --- → mozilla8
I'll mutate them.
Whiteboard: [inbound]