Bug 649924 - HTML5 Canvas slow/out of memory on Win7/64Bit
Status: RESOLVED FIXED
Keywords: regression
Product: Core
Classification: Components
Component: Canvas: 2D
Version: 2.0 Branch
Hardware: x86_64 Windows 7
Importance: -- normal
Target Milestone: mozilla8
Assigned To: Robert O'Callahan (:roc) (Exited; email my personal email if necessary)
Depends on: 633936
Blocks: 634581
Reported: 2011-04-13 23:21 PDT by marc
Modified: 2011-11-10 01:01 PST
CC: 17 users


Attachments
HTML Sample File (440 bytes, text/html)
2011-04-15 10:54 PDT, marc
Error Screenshot (85.10 KB, image/png)
2011-04-15 10:55 PDT, marc
fixed testcase (442 bytes, text/html)
2011-05-31 20:30 PDT, Robert O'Callahan (:roc) (Exited; email my personal email if necessary)
fix (19.49 KB, patch)
2011-07-07 18:54 PDT, Robert O'Callahan (:roc) (Exited; email my personal email if necessary)
bas: review+

Description marc 2011-04-13 23:21:40 PDT
User-Agent:       Mozilla/5.0 (Windows NT 6.1; WOW64; rv:2.0) Gecko/20100101 Firefox/4.0
Build Identifier: 4.0

I use a very large HTML5 Canvas (640*10000) for painting. Running this with WinXP/32Bit is no problem. With Win7/64Bit the canvas size is limited to 640x8192. A larger canvas results in an "0x8007000e (NS_ERROR_OUT_OF_MEMORY) [nsIDOMHTMLCanvasElement.getContext]" error.

Another problem is that painting on the large canvas is much slower on Win7/64 compared to WinXP/32.

Reproducible: Always

Steps to Reproduce:
<canvas width="640" height="10000" id="Console"></canvas>
...
<script>
  var objCanvas = document.getElementById("Console");
  var context = objCanvas.getContext("2d");
</script>
Actual Results:  
Error: uncaught exception: [Exception... "Component returned failure code: 0x8007000e (NS_ERROR_OUT_OF_MEMORY) [nsIDOMHTMLCanvasElement.getContext]"  nsresult: "0x8007000e (NS_ERROR_OUT_OF_MEMORY)"  location: "JS frame :: file:///C:/Users/Windows%207/Desktop/test.html :: <TOP_LEVEL> :: line 10"  data: no]

Expected Results:  
context object

Comment 1 Kevin Brosnan [:kbrosnan] 2011-04-15 00:12:07 PDT
Do you have a testcase that could be attached to the bug?
Comment 2 marc 2011-04-15 10:54:26 PDT
Created attachment 526305 [details]
HTML Sample File
Comment 3 marc 2011-04-15 10:55:28 PDT
Created attachment 526306 [details]
Error Screenshot
Comment 4 marc 2011-04-15 10:57:59 PDT
The attached sample page does not work on FF4, Windows 7, 64-bit.

(Chrome and IE have no problems)
Comment 5 Boris Zbarsky [:bz] 2011-04-15 18:14:23 PDT
This is more likely to be related to hardware acceleration (see bug 633936) than 32 vs 64 bit.  WinXP doesn't have direct2d; Win7 does.  Does turning off direct2d fix your issue with sizes larger than 8192?

If so, then we should focus this on the performance problem.
Comment 6 marc 2011-04-15 23:50:17 PDT
I have disabled "Options/...Use hardware acceleration when available"
-> No change, same error.
Comment 7 Boris Zbarsky [:bz] 2011-04-16 04:55:22 PDT
You restarted the browser after doing that?
Comment 8 marc 2011-04-17 12:55:18 PDT
Yes, I have restarted FF.  Same error.
Comment 9 Boris Zbarsky [:bz] 2011-04-21 18:36:25 PDT
That's really odd.  Bas, Benoit, are we limiting canvas sizes on platforms that support hardware accel even when hardware accel is off?
Comment 10 Mark Smith (:mcs) 2011-05-31 11:33:49 PDT
(In reply to comment #9)
> That's really odd.  Bas, Benoit, are we limiting canvas sizes on platforms
> that support hardware accel even when hardware accel is off?

bz,Bas,Benoit -- any progress on this issue?  I suspect that some users of our Page Saver extension are running into this bug (Page Saver uses drawWindow() with a potentially large canvas).  Unfortunately, I don't have access to a Win7 computer where hardware acceleration is possible, so I can't be sure.
Comment 11 Benoit Jacob [:bjacob] (mostly away) 2011-05-31 11:43:23 PDT
CC'd a few people who might know. Not my area :)
Comment 12 Boris Zbarsky [:bz] 2011-05-31 12:12:21 PDT
Not mine either (on at least two counts!)...
Comment 13 Bas Schouten (:bas.schouten) 2011-05-31 14:29:23 PDT
We limit the canvas size when using hardware acceleration; I don't see why it would be limited when not using hardware acceleration.
Comment 14 Bas Schouten (:bas.schouten) 2011-05-31 14:30:37 PDT
I cannot reproduce any slow painting fwiw, with HW acceleration at least.
Comment 15 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-05-31 20:30:54 PDT
Created attachment 536513 [details]
fixed testcase

The provided testcase is a bit misleading; Gecko doesn't support innerText, so "Hurray It Works!" is never displayed. Here is a fixed testcase that uses textContent instead.
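
(The attachment itself isn't inlined in this report; a minimal sketch of such a testcase, with the dimensions, element ids and success message assumed rather than copied from the attachment, would look like this:)

<!DOCTYPE html>
<canvas id="c" width="640" height="10000"></canvas>
<div id="status"></div>
<script>
  var status = document.getElementById("status");
  try {
    // On affected builds, getContext("2d") throws NS_ERROR_OUT_OF_MEMORY
    // when the canvas exceeds the texture-size limit.
    var ctx = document.getElementById("c").getContext("2d");
    ctx.fillRect(0, 0, 10, 10);
    // textContent works in Gecko; innerText did not at the time.
    status.textContent = "Hurray It Works!";
  } catch (e) {
    status.textContent = "Failed: " + e;
  }
</script>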
Comment 16 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-05-31 20:34:01 PDT
With that testcase, on Windows 7, I don't get this error with acceleration disabled, but I do get it with acceleration enabled.

So can we do anything about this? Could we revert to the cairo image backend when the canvas is too large to work with D2D?
Comment 17 Boris Zbarsky [:bz] 2011-05-31 21:08:59 PDT
Sounds like this is a duplicate of bug 633936, then.
Comment 18 Bas Schouten (:bas.schouten) 2011-05-31 23:28:13 PDT
(In reply to comment #16)
> With that testcase, on Windows 7, I don't get this error with acceleration
> disabled, but I do get it with acceleration enabled.
> 
> So can we do anything about this? Could we revert to the cairo image backend
> when the canvas is too large to work with D2D?

It's a possibility we debated. But back when I discussed this with Vlad, he was more in favor of limiting canvas sizes (possibly adding something to the spec to do it more sensibly). The problem is that having massive differences in performance between large and small canvases is also a bit annoying.

Another thing we could do is draw everything to the canvas 'scaled down': make the canvas the max texture size and then scale up when blending the layer. Obviously this causes resampling artifacts.

The other option is what IE9 does, display the first 'maxTextureWidth' pixels and leave out the rest.

In general I'm not sure that someone making a canvas significantly larger than the screen is using Canvas the way they should. These things will grow very big in memory and will usually be used for static content (after all, a user will have to be scrolling them), where in many cases SVG or some other markup-based approach is possibly a better option.

I don't know.
Comment 19 Boris Zbarsky [:bz] 2011-06-01 05:30:04 PDT
The way someone "should" use canvas is however one would use an image...  There are plenty of use cases for images that are significantly larger than the screen size.

Now repeatedly redrawing a huge canvas doesn't seem like it has much in the way of use cases.  But the numerous bug reports we've had about this problem aren't doing that.

Furthermore, having canvas behave completely differently for different platforms (e.g. the large canvases work on Mac and Linux, right?) seems like a bad thing for the web.  So I don't think we should do either what we do now or what IE9 does.
Comment 20 Bas Schouten (:bas.schouten) 2011-06-01 06:55:11 PDT
(In reply to comment #19)
> The way someone "should" use canvas is however one would use an image... 
> There are plenty of use cases for images that are significantly larger than
> the screen size.

The use cases, afaics, are generally very high-resolution images or poorly implemented tile maps (i.e. ones that tile in one dimension), neither of which seems to apply very well to canvas.

> 
> Now repeatedly redrawing a huge canvas doesn't seem like it has much in the
> way of use cases.  But the numerous bug reports we've had about this problem
> aren't doing that.

Well, not repeatedly drawing it sounds bad too, in the software case for example. Having a 10000x2000 pixel canvas will consume 80 MB of RAM -just to maintain the surface- (10000 × 2000 pixels × 4 bytes per pixel), or 80 MB of VRAM in the HW-accelerated case, plus roughly another 80 MB of VRAM for support surfaces. And unlike in the case of an image, you can't discard that memory once you've drawn to the screen, which is a bad practice for a bunch of pages to engage in, especially considering people love to leave a bunch of tabs open.

> 
> Furthermore, having canvas behave completely differently for different
> platforms (e.g. the large canvases work on Mac and Linux, right?) seems like
> a bad thing for the web.  So I don't think we should do either what we do
> now or what IE9 does.

Yes, consistency is important. I'm personally in favor of either capping canvas sizes completely, or adding something to the API to report max sizes.
Comment 21 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-06-01 17:50:07 PDT
(In reply to comment #18)
> (In reply to comment #16)
> > So can we do anything about this? Could we revert to the cairo image backend
> > when the canvas is too large to work with D2D?
> 
> It's a possibility we debated. But back when I discussed this with Vlad, he
> was more in favor of limiting canvas sizes (possibly adding something to the
> spec to do it more sensibly). The problem is having massive differences in
> performance for large and small canvases is also a bit annoying.

A massive performance difference would be annoying, but surely it's better for huge canvases to at least work?
Comment 22 Bas Schouten (:bas.schouten) 2011-06-01 18:02:45 PDT
(In reply to comment #21)
> (In reply to comment #18)
> > (In reply to comment #16)
> > > So can we do anything about this? Could we revert to the cairo image backend
> > > when the canvas is too large to work with D2D?
> > 
> > It's a possibility we debated. But back when I discussed this with Vlad, he
> > was more in favor of limiting canvas sizes (possibly adding something to the
> > spec to do it more sensibly). The problem is having massive differences in
> > performance for large and small canvases is also a bit annoying.
> 
> A massive performance difference would be annoying, but surely it's better
> for huge canvases to at least work?

It's a valid question. I don't know; this way we're at least not allocating huge amounts of memory. I certainly wouldn't object to a max size for canvases. Tiling software-backed canvases wouldn't make me happy, but I also wouldn't object strongly to that.

If we want to make them work I actually favour the sampling version, although that's going to look bad for text. And text is what some people use these huge canvases for (to draw timelines or graphs .... where I still think Canvas is a poor choice).
Comment 23 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-06-01 18:59:47 PDT
Other than memory allocation, using the cairo image backend for oversized canvases seems fine to me. It will look good, it won't impact performance of normal canvases, and it should be very easy to implement.
Comment 24 Bas Schouten (:bas.schouten) 2011-06-01 19:13:39 PDT
(In reply to comment #23)
> Other than memory allocation, using the cairo image backend for oversized
> canvases seems fine to me. It will look good, it won't impact performance of
> normal canvases, and it should be very easy to implement.

I agree. It will further encourage what I think is a bad thing, though. It'll make it harder to take away in the future, or to get it accepted that canvases are of limited size.

FWIW IE10 Preview also still does not support larger than 8192 in either dimension.
Comment 25 Chris Browet 2011-07-02 05:22:13 PDT
Apparently, this bug causes an issue with Google+, with their "Submit feedback" system. Clicking on the button, a dialog box is supposed to appear.

Summary of my findings, using FF5 on beta channel:

1) In normal conditions (addons + HW accel), the dialog does not show and a "(NS_ERROR_OUT_OF_MEMORY) [nsIDOMHTMLCanvasElement.getContext]" error appears on the web console
2) in safe mode => OK
3) with all addons/plugins disabled manually => NS_ERROR_OUT_OF_MEMORY
4) Normal with HW disabled => OK

Just my 2 cents...
Comment 26 Bas Schouten (:bas.schouten) 2011-07-03 01:36:03 PDT
(In reply to comment #25)
> Apparently, this bug causes an issue with Google+, with their "Submit
> feedback" system. Clicking on the button, a dialog box is supposed to appear.
> 
> Summary of my findings, using FF5 on beta channel:
> 
> 1) In normal conditions (addons + HW accel), the dialog do not show and a
> "(NS_ERROR_OUT_OF_MEMORY) [nsIDOMHTMLCanvasElement.getContext]" appear on
> the web console
> 2) in safe mode => OK
> 3) with all addons/plugins disabled manually => NS_ERROR_OUT_OF_MEMORY
> 4) Normal with HW disabled => OK
> 
> Just my 2 cents...

That's really strange, I wonder what the underlying cause is.
Comment 27 Chris Browet 2011-07-03 02:14:16 PDT
Would anyone with Google+ access confirm my issue, please (I can send invites if needed (providing G+ invites are open)).

Might be driver / GPU related. I have an ATI Radeon HD 46xx.
Comment 28 Jeff Muizelaar [:jrmuizel] 2011-07-03 05:07:31 PDT
(In reply to comment #27)
> Would anyone with Google+ access confirm my issue, please (I can send
> invites if needed (providing G+ invites are open)).
> 
> Might be driver / GPU related. I have an ATI Radeon HD 46xx.

I can take a look but need an invite. (I also have a Radeon at work, so can test that if need be)
Comment 29 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-07-03 22:40:47 PDT
I can't invite anyone currently.

The Feedback button works for me, so this problem doesn't affect everyone.
Comment 30 Chris Browet 2011-07-04 01:39:42 PDT
Indeed, it is working at work, too. Maybe a driver issue...
Comment 31 Elliott Sprehn 2011-07-07 01:42:21 PDT
I'm the lead for Google Feedback and we've been getting a large number of reports of Feedback on Firefox 5 failing. The issue is as described by Chris Browet. We create a canvas that covers the entire page in a glass effect and use composite operations to cut sections out to make the annotator UI work. You can see the UI on http://www.youtube.com/ (at the bottom, click "Report a bug").
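
(Google Feedback's actual code isn't part of this report; a rough sketch of that kind of "glass" overlay, with all names, sizes and styling assumed, is:)

// Page-sized overlay canvas; on long pages its height can far exceed 8192px.
var glass = document.createElement("canvas");
glass.width = document.documentElement.scrollWidth;
glass.height = document.documentElement.scrollHeight;
glass.style.cssText = "position:absolute;top:0;left:0;z-index:9999";
document.body.appendChild(glass);

// On affected Firefox 5 builds this getContext() call throws NS_ERROR_OUT_OF_MEMORY.
var ctx = glass.getContext("2d");
ctx.fillStyle = "rgba(0, 0, 0, 0.5)";
ctx.fillRect(0, 0, glass.width, glass.height);    // dim the whole page
ctx.globalCompositeOperation = "destination-out";
ctx.fillRect(100, 200, 300, 150);                 // cut a clear section out of the glass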

To reproduce on Google+:

1. Go to http://plus.google.com/
2. Make sure you're on the Stream and not the welcome.
3. Scroll to the bottom of the page repeatedly and click "More" to load more posts into the stream.
4. Click "Send Feedback" in the bottom right corner.

To reproduce on YouTube:

1. Go to http://www.youtube.com/
2. Open the console and execute: document.body.style.height = '51721px';
3. Click "Report a bug"

Note: the 51721 number is a real value taken from a stream page that had More clicked a few times.

Chrome and Safari are both fine with creating these massive canvases. We saw issues in Firefox 4 where the context would be created and not throw this exception, but the entire canvas was black. Making the canvas height less than 8192 made the canvas work again.

Firefox 5 now throws an OOM exception instead of turning black. Note that this fails on both Intel and Nvidia graphics modes.

OS X 10.6.8, Firefox 5.0, Macbook Pro, NVIDIA GeForce GT 330M/Intel HD Graphics.
Comment 32 Elliott Sprehn 2011-07-07 01:53:12 PDT
Tested in Firefox 3.6 and there's no exception, but the canvas is also entirely transparent and nothing is drawn. Seems similar to the entirely black case from Firefox 4.
Comment 33 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-07-07 03:31:52 PDT
I can reproduce with the Youtube steps. Seems very likely that it's simply the texture size limit.

I wrote the obvious patch to fall back to Thebes contexts and image surfaces if normal context creation fails. That works, but we die horribly painting the page since CanvasLayerD3D9/D3D10/OGL all assume that they can copy the source surface to a texture and composite that :-(.
Comment 34 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-07-07 03:42:28 PDT
I suppose it's worth noting that Chrome fails if the canvas height is greater than 32767 pixels.
Comment 35 Boris Zbarsky [:bz] 2011-07-07 08:21:46 PDT
> Seems very likely that it's simply the texture size limit.

Except comment 31 claims to be on Mac... I wouldn't think that we'd run into the texture size limit there.  If the issue in comment 31 appears on Mac, then it's a different bug, no?
Comment 36 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-07-07 15:27:46 PDT
Probably drawing into the canvas works, but GL compositing the canvas onto the window doesn't work.
Comment 37 Elliott Sprehn 2011-07-07 15:54:53 PDT
(In reply to comment #36)
> Probably drawing into the canvas works, but GL compositing the canvas onto
> the window doesn't work.

Yeah, the all black canvas did this. Drawing and toDataURL() both worked fine, but the user sees all black.
Comment 38 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-07-07 17:40:30 PDT
I have a patch that fixes this partially; we can create canvases up to 30000px high at least, probably up to 32767px. It doesn't fix the Youtube STR in comment #31, but then, that doesn't work for me in Chrome dev either.
Comment 39 Elliott Sprehn 2011-07-07 18:01:34 PDT
(In reply to comment #38)
> I have a patch that fixes this partially; we can create canvases up to
> 30000px high at least, probably up to 32767px. It doesn't fix the Youtube
> STR in comment #31, but then, that doesn't work for me in Chrome dev either.

It works for me in Chrome 14.0.803.0 dev with heights in the 200000px range, but drawing is incredibly slow. Around 300000px the canvas fails to draw (it's entirely transparent). Chrome never throws exceptions though. The machine is as described in comment #31

32767 seems more than reasonable; we're looking into ways to avoid creating massive canvases too.

The spec should probably be updated to specify the failure mode when the canvas is too large and there's not enough memory. Documentation about the exception for OOM situations would be very helpful. Chrome should probably throw this exception too instead of just drawing nothing.

Thanks for looking into this!
Comment 40 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-07-07 18:54:51 PDT
Created attachment 544680 [details] [diff] [review]
fix
Comment 41 Bas Schouten (:bas.schouten) 2011-07-07 23:57:07 PDT
I'm okay with this patch, but can we -please- move towards adding an API to Canvas for having a maximum size?

Personally I'd feel much more for a mechanism in which a website that -really- needs a super-large canvas tiles it itself, and canvases themselves would be limited to the texture size supported by the underlying hardware. I still believe that when wanting to create huge graphical displays vastly larger than the window size, we're usually looking at things which are static. These things should be done through a declarative system like SVG, where they also won't use insane amounts of memory.

The bottom line is that keeping huge rasterizations much larger than the visible area in memory is a waste, and we should in my opinion encourage people not to do that.
Comment 42 Bas Schouten (:bas.schouten) 2011-07-08 00:06:21 PDT
Comment on attachment 544680 [details] [diff] [review]
fix

Review of attachment 544680 [details] [diff] [review]:
-----------------------------------------------------------------

See my previous comment on creating a max size on canvases.

::: gfx/layers/d3d10/LayerManagerD3D10.h
@@ +117,5 @@
>  
> +  // D3D10 guarantees textures can be at least this size
> +  enum {
> +    MAX_TEXTURE_SIZE = 8192
> +  };

Why not just make this a const PRUint32 or something like that?
Comment 43 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-07-08 03:34:02 PDT
(In reply to comment #41)
> I'm okay with this patch, but can we -please- move towards adding an API to
> Canvas for having a maximum size?
> 
> Personally I'd feel much more for a mechanism in which a website that
> -really- needs a super-large canvas tiles itself. And canvases itself would
> be limited to the texture size supported by the underlying hardware. I still
> believe when wanting to created huge graphical displays vastly larger than
> the window size we're usually looking at things which are static. These
> things should be done through a declarative system like SVG where they also
> won't use insane amounts of memory.

Yes, SVG would work much better for Elliott's use-case, I think.

However, I don't think it's a good idea to limit canvas sizes in the spec, or via some API. I don't think it would make sense for canvas users to tile explicitly; that's difficult to do at the JS level. I think as much as possible, features should "just work". Canvas sizes that seem inappropriate now might seem just fine in ten years.

Actually, I think the long-term fix is to limit the canvas buffer size to the max texture size by automatically scaling down as necessary (or up, in high-DPI situations). Maybe we should do this in the Azure backend.
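
(A rough illustration of that idea, not actual Gecko/Azure code; the limit and the rounding behaviour here are assumed:)

// Hypothetical sketch: clamp the canvas backing store to the max texture size
// and remember the scale so the layer can be composited back up at draw time.
var MAX_TEXTURE_SIZE = 8192;   // assumed hardware/API limit

function backingStoreFor(canvasWidth, canvasHeight) {
  var scale = Math.min(1, MAX_TEXTURE_SIZE / Math.max(canvasWidth, canvasHeight));
  return {
    width: Math.max(1, Math.floor(canvasWidth * scale)),
    height: Math.max(1, Math.floor(canvasHeight * scale)),
    scale: scale   // compositor would scale the layer by 1/scale (losing detail)
  };
}

// e.g. backingStoreFor(640, 10000) -> { width: 524, height: 8192, scale: 0.8192 }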
Comment 44 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-07-08 03:35:07 PDT
(In reply to comment #42)
> ::: gfx/layers/d3d10/LayerManagerD3D10.h
> @@ +117,5 @@
> >  
> > +  // D3D10 guarantees textures can be at least this size
> > +  enum {
> > +    MAX_TEXTURE_SIZE = 8192
> > +  };
> 
> Why not just make this a const PRUint32 or something like that?

You can't write "static const PRUint32 MAX_TEXTURE_SIZE = 8192;" inline in the class declaration in standard C++, AFAIK.
Comment 45 Bas Schouten (:bas.schouten) 2011-07-08 04:01:02 PDT
(In reply to comment #44)
> (In reply to comment #42)
> > ::: gfx/layers/d3d10/LayerManagerD3D10.h
> > @@ +117,5 @@
> > >  
> > > +  // D3D10 guarantees textures can be at least this size
> > > +  enum {
> > > +    MAX_TEXTURE_SIZE = 8192
> > > +  };
> > 
> > Why not just make this a const PRUint32 or something like that?
> 
> You can't write "static const PRUint32 MAX_TEXTURE_SIZE = 8192;" inline in
> the class declaration in standard C++, AFAIK.

Hrm, so ISO C++ says you can for integral types:

9.4.2/4: "If a static data member is of const integral or const enumeration type, its declaration in the class definition can specify a constant-initializer which shall be an integral constant expression (5.19). In that case, the member can appear in integral constant expressions. The member shall still be defined in a namespace scope if it is used in the program and the namespace scope definition shall not contain an initializer."

Anyway, if it could give compatibility issues we shouldn't do it of course.
Comment 46 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-07-08 15:18:44 PDT
I think I'm just wrong!

But the latter sentence means we'd still have to write
  static const PRUint32 LayerManagerD3D10::MAX_TEXTURE_SIZE;
somewhere, right? Which sucks.
Comment 47 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-07-08 16:25:56 PDT
http://hg.mozilla.org/integration/mozilla-inbound/rev/364d2debc2a7
Followup fix to mark passing test:
http://hg.mozilla.org/integration/mozilla-inbound/rev/cd7a7ad30fca
(thanks bz)

I think the reason this test 579323-1.html started passing is that this patch makes never-drawn-into canvases return LAYER_INACTIVE for GetLayerState, which means the opacity layer will also be LAYER_INACTIVE, so the entire testcase is rendered by drawing into a single ThebesLayer. Effectively the test is no longer testing what it used to test.

This is a good change, because it means canvases that aren't drawn into are handled more efficiently, but the test needs to be fixed.
Comment 48 Boris Zbarsky [:bz] 2011-07-08 20:28:42 PDT
roc, can you either mark bug 623450 and bug 627560 as fixed (and file a new bug on fixing 579323-1.html) or mutate them to cover fixing 579323-1.html?  I can't quite tell what the state of things is there at this point....
Comment 49 Bas Schouten (:bas.schouten) 2011-07-09 10:41:36 PDT
(In reply to comment #46)
> I think I'm just wrong!
> 
> But the latter sentence means we'd still have to write
>   static const PRUint32 LayerManagerD3D10::MAX_TEXTURE_SIZE;
> somewhere, right? Which sucks.

Yes, it does. Nor am I sure if g++ adheres to the standard and allows that.
Comment 50 Kyle Huey [:khuey] (khuey@mozilla.com) 2011-07-09 20:21:09 PDT
http://hg.mozilla.org/mozilla-central/rev/364d2debc2a7
Comment 51 Robert O'Callahan (:roc) (Exited; email my personal email if necessary) 2011-07-10 04:19:39 PDT
I'll mutate them.
