Bug 554513 (Closed): Opened 10 years ago, Closed 10 years ago

[HTML5] DoS limit on buffer sizes is too small for attribute values on client.schwab.com to fit

Categories

(Core :: DOM: HTML Parser, defect, P1)

x86
Linux
defect

Tracking


VERIFIED FIXED

People

(Reporter: info, Unassigned)

References


Details

(Keywords: regression)

User-Agent:       Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.3a4pre) Gecko/20100323  Minefield/3.7a1pre
Build Identifier: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.3a4pre) Gecko/20100323  Minefield/3.7a1pre ID:20100323030550

Some links on this URL stopped working with recent (last few days or so?) nightly Linux 64-bit builds.  The site responds with a web page displaying "An error has been detected... please try again."  If I set html5.enable to false, these links work fine, and they also work fine in the Konqueror browser.  No errors show in Tools > Error Console.

All the links that break use JavaScript to do a POST, e.g.:
javascript:__doPostBack('ctl00$wpm$AccountSummary$AccountSummary$repBankAccounts$ctl00$lnkBankAccountId','')
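For context, ASP.NET pages emit a small helper along these lines. This is an illustrative reconstruction, not the site's actual script, and it takes the form as a parameter so the sketch runs without a DOM:

```javascript
// Illustrative reconstruction of ASP.NET's __doPostBack helper (not the
// site's actual script). The real helper looks the form up via the DOM;
// here it is passed in so the sketch is self-contained.
function doPostBack(theForm, eventTarget, eventArgument) {
  // Record which server-side control triggered the postback...
  theForm.__EVENTTARGET.value = eventTarget;
  theForm.__EVENTARGUMENT.value = eventArgument;
  // ...then submit the whole form, hidden __VIEWSTATE field included.
  theForm.submit();
}
```

The key point is that the POST carries the page's hidden __VIEWSTATE field verbatim from the parsed DOM, so anything that mangles that field during parsing also mangles the POST.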

Reproducible: Always

Steps to Reproduce:
1.  Set up an account at schwab.com
2.  Login, go to account summary page
3.  Click on any of the links under Account or Next Steps
Actual Results:  
If html5.enable is true, the site displays an error page -- it didn't like something it got from the browser.  With html5.enable false it works fine.


I've struggled to puzzle out the interaction in Firebug.
The POST body is smaller with html5 enabled:
  Content-Type: application/x-www-form-urlencoded
  Content-Length: 4305, vs. 12114 with html5 disabled.

The length difference is nearly all in a __VIEWSTATE= parameter in the POST that's much shorter (3850 vs. 9772).  And it has some weird encoded stuff at the front:
  %E2%80%A6%EF%BF%BDjaH...
vs.
  %2FwEPDwUJODI4NjMyMzY...
When Firebug displays the former in POST > Post Summary > Parameters, it shows "…�jaH"; %EF%BF%BD decodes to U+FFFD, the Unicode replacement character that gets substituted for invalid byte sequences.  But all this could be irrelevant.  It could be going wrong before the POST as the page introspects/builds up/encodes its __VIEWSTATE.
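Those leading bytes can be checked directly in Node or any browser console:

```javascript
// Percent-decode the prefix seen at the front of the shortened __VIEWSTATE.
// decodeURIComponent interprets %-escapes as UTF-8.
const prefix = decodeURIComponent('%E2%80%A6%EF%BF%BD');

// First character: U+2026 HORIZONTAL ELLIPSIS ("…")
console.log(prefix.charCodeAt(0).toString(16));  // "2026"
// Second character: U+FFFD REPLACEMENT CHARACTER ("�")
console.log(prefix.charCodeAt(1).toString(16));  // "fffd"
```

An ellipsis followed by a replacement character at the front of the value is consistent with the parser having cut the attribute value short, rather than the server having sent bad data.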

I can reproduce at will so let me know if there's anything more I can do to help!

about:buildconfig reports

Source

Built from http://hg.mozilla.org/mozilla-central/rev/e9b7e0b5821d
Build platform
target
x86_64-unknown-linux-gnu

Build tools
Compiler 	Version 	Compiler flags
/tools/gcc/bin/gcc 	gcc version 4.3.3 (GCC) 	-Wall -W -Wno-unused -Wpointer-arith -Wcast-align -W -Wno-long-long -pedantic -gstabs+ -fno-strict-aliasing -pthread -pipe -DNDEBUG -DTRIMMED -gstabs+ -Os -freorder-blocks -fno-reorder-functions
/tools/gcc/bin/g++ 	gcc version 4.3.3 (GCC) 	-fno-rtti -fno-exceptions -Wall -Wpointer-arith -Woverloaded-virtual -Wsynth -Wno-ctor-dtor-privacy -Wno-non-virtual-dtor -Wcast-align -Wno-invalid-offsetof -Wno-variadic-macros -Werror=return-type -Wno-long-long -pedantic -gstabs+ -fno-strict-aliasing -fshort-wchar -pthread -pipe -DNDEBUG -DTRIMMED -gstabs+ -Os -freorder-blocks -fno-reorder-functions

Configure arguments
--enable-application=browser --enable-optimize --enable-update-channel=nightly --enable-update-packaging --disable-debug --enable-tests --enable-codesighs --enable-debug-symbols=-gstabs+
skierpage, can you maybe hunt down the build id of the last working nightly and the build id of the first broken one?
It seems that I made the denial-of-service avoidance limit on attribute value length too small.

Marking NEW without actually confirming on the site, because the described symptoms are obvious enough.

All that's needed to proceed is a better guess at what the length limit should be, so that real sites work but pages with immense attribute values can't drive the HTML5 parser into memory exhaustion and kill the app.
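As a minimal sketch of the trade-off (hypothetical names and cap value, not Gecko's actual code): any fixed cap on the tokenizer's attribute-value buffer silently truncates values longer than the cap, which is exactly what a multi-kilobyte __VIEWSTATE would trip over.

```javascript
// Hypothetical tokenizer-side DoS guard on attribute values.
// MAX_ATTR_VALUE_LEN is an invented figure; the point is only that a
// fixed cap drops every character past the limit.
const MAX_ATTR_VALUE_LEN = 4096;

function appendToAttributeValue(buf, ch) {
  if (buf.length >= MAX_ATTR_VALUE_LEN) {
    return buf;  // over the cap: character dropped, value truncated
  }
  buf.push(ch);
  return buf;
}

// A ~10 KB value (like the site's __VIEWSTATE) loses everything past the cap.
const value = [];
for (const ch of 'x'.repeat(10000)) appendToAttributeValue(value, ch);
console.log(value.length);  // 4096
```

The fix that eventually landed removed these tokenizer-side caps entirely and kept only the tree builder's stack-depth limit, sidestepping the guessing game.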
Status: UNCONFIRMED → NEW
Ever confirmed: true
Priority: -- → P1
Summary: [html5] JavaScript POSTs lead to site error → [HTML5] DoS limit on buffer sizes is too small for attribute values on client.schwab.com to fit
How does the old parser handle this?
(In reply to comment #3)
> How does the old parser handle this?

It seems to me the old parser has no limit on the length of attribute values.
I guess I could remove all limits from the tokenizer and leave the stack depth limit (copied from the old parser) in the tree builder. 

The other things to consider are the length of the list of formatting elements (currently unlimited) and a length-based flushing threshold for text node contents (copied from the old parser, though it doesn't behave the same way there).
I removed the limits that didn't have an exact corresponding limit in the old parser. That is, I reverted the limit additions for everything except the stack depth.
http://hg.mozilla.org/mozilla-central/rev/560598d37063
Blocks: 483209
Status: NEW → RESOLVED
Closed: 10 years ago
Resolution: --- → FIXED
Those POST links work again, thanks!
Status: RESOLVED → VERIFIED