User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.1b2) Gecko/20060821 Firefox/2.0b2
Build Identifier: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.1b2) Gecko/20060821 Firefox/2.0b2

I have a web page I'm trying to view which consists of some invalid HTML. It is basically a series of several hundred tables, with each table containing a couple of other tables. Some of the <table> tags are not terminated with a </table>. This means that eventually the DOM reaches 128 nodes deep (rather than the 10-12 intended by whoever wrote the page's code).

It turns out that Firefox is pruning the DOM at a depth of 128. The page displays fine in Konqueror and IE, but is pruned in Firefox. This is not likely to be the best way to handle the situation, even though I realize it was probably intentional, to avoid deep layouts bringing the browser to its knees. Perhaps the maximum number of levels should be raised to 1024, or lifted entirely? Or child nodes beyond a certain depth could be added as siblings, so that at least the info doesn't actually disappear?

(I thought it was a missing-record problem, then a server problem, then a browser bug, then I realized what was actually going on...)

Reproducible: Always
Can you create a minimized testcase that demonstrates the bug?
Created attachment 240074 [details] Minimalistic test case

This test case contains 64 repeats of an (erroneously nested) HTML fragment; only 32 repeats actually show. Additionally, repeats 1-30 show txt0 to txt7 correctly; repeat 31 shows only txt0, txt5, txt6 and txt7; and repeat 32 shows only txt0. The text that actually shows is what remains above the depth of 128 at that point in the document. Thanks!
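A testcase of this shape can be generated mechanically. The sketch below is a guess at the structure (unterminated `<table><tr><td>` wrappers around eight text spans per repeat), based on the description above, not the actual contents of attachment 240074:

```python
REPEATS = 64          # matches the attachment's description
TEXTS_PER_REPEAT = 8  # txt0 .. txt7

def build_testcase():
    # Each repeat opens a <table> that is deliberately never closed, so
    # the nesting depth grows by a few levels with every repeat.
    parts = ["<html><body>"]
    for _ in range(REPEATS):
        parts.append("<table><tr><td>")
        for j in range(TEXTS_PER_REPEAT):
            parts.append("<span>txt%d</span>" % j)
    parts.append("</body></html>")
    return "\n".join(parts)

if __name__ == "__main__":
    print(build_testcase())
```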
Dup of bug 256180?
Let me try that again...
The limit is now 200. The limit is going to stay in the parser as long as layout has algorithms that are recursive along the depth of the tree. :-(
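The parser-side guard described here can be sketched as follows. This is an illustrative model, not Gecko's actual code; the policy of simply refusing to open elements past the limit (which produces the pruning seen in the testcase) is an assumption:

```python
MAX_DEPTH = 200  # the current limit mentioned above

class DepthLimitedBuilder:
    """Illustrative tree builder that refuses to nest past MAX_DEPTH."""

    def __init__(self):
        self.depth = 0
        self.dropped = 0  # elements pruned because the tree got too deep

    def start_tag(self, tag):
        # Returns True if the element was opened, False if it was pruned.
        if self.depth >= MAX_DEPTH:
            self.dropped += 1
            return False
        self.depth += 1
        return True

    def end_tag(self, tag):
        if self.depth > 0:
            self.depth -= 1
```

With a hard cap like this, a runaway sequence of unterminated tags stops adding depth once the limit is hit, which is why content past that point disappears instead of rendering.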
The DOM is presumably a DAG or tree, but should definitely be acyclic. If so, why prune the depth at all? If it is a limitation of the system stack running out of space, then why not implement a separate stack in the heap? (I'm sure that's a lot of work: all the recursive calls have to be moved into a loop that works from the heapified stack, etc. -- how much code would this touch?)
(In reply to comment #8)
> The DOM is presumably a DAG or tree, but should definitely be acyclic.

The DOM is a tree (with parent pointers).

> If so, why prune the depth at all?

As already mentioned in the comment above yours, layout has algorithms that are recursive along the depth of the tree. This kind of recursion easily runs out of stack space on Windows in particular. To keep the behavior of Gecko consistent across platforms, the limit of 200 is the same even on platforms that have deeper runtime stacks than Windows has.

> If it is a limitation from the system stack running out of space, then why not implement a separate stack in the heap?

I don't know, but my best guess is that there are always more important layout problems to deal with than rewriting recursive algorithms into iterative ones.
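The rewrite asked about above, replacing call-stack recursion with an explicit stack allocated on the heap, looks roughly like this. The `Node` type and the node-counting traversal are hypothetical stand-ins for the real layout algorithms:

```python
class Node:
    """Hypothetical stand-in for a DOM/layout tree node."""
    def __init__(self):
        self.children = []

def count_nodes_recursive(node):
    # Recursion depth equals tree depth, so a pathologically deep tree
    # (like the runaway-<table> page) exhausts the thread's stack.
    return 1 + sum(count_nodes_recursive(c) for c in node.children)

def count_nodes_iterative(root):
    # Same traversal driven by an explicit stack on the heap: depth is
    # now bounded by available memory, not by the thread's stack size.
    total, stack = 0, [root]
    while stack:
        node = stack.pop()
        total += 1
        stack.extend(node.children)
    return total
```

The cost, as the comment notes, is that every depth-recursive layout algorithm would need this kind of mechanical transformation, which is a large amount of code churn for rarely-seen pathological pages.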
As a note, the MSVC linker has an option (/STACK) to increase the stack size if necessary.
Setting dependency to bug 256180, which is pointed to in comment #3, for ease of tracking and search.