Closed
Bug 76457
Opened 23 years ago
Closed 23 years ago
Doesn't load entire page
Categories: Core :: DOM: HTML Parser, defect
People: Reporter: klaas; Assigned: harishd
Keywords: testcase
Attachments: 2 files
From Bugzilla Helper:
User-Agent: Mozilla/5.0 (Windows; U; WinNT4.0; en-US; 0.8) Gecko/20010215
BuildID: 2001021508

At the left, there's a dynamically generated list of names. This page doesn't load completely. The total number of members is shown at the top; you'll notice the list only gets to 193 or so.

Reproducible: Always

Steps to Reproduce:
1. Just go there.
2. Look at the name list at the left.
3. Note the number at the top.
4. Scroll down to the bottom.
5. Hover over the last name; you'll see the numbers don't match.

Actual Results: Page displayed partially.
Expected Results: Page displayed completely.

Okay, it's a big list, but other browsers display it correctly.
Reporter
Updated • 23 years ago
The problem lies in the HTML: the closing tag is missing for every entry in the file, and adding the closing tags solves the problem. However, this may indicate a weakness in the parser (something like insufficient buffer/stack space for parsing such buggy HTML).
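To see why unclosed tags matter here, a minimal sketch (not Gecko's actual parser) of a naive tree builder shows that every unclosed open tag adds a nesting level, so a page with hundreds of unclosed entries produces a tree hundreds of levels deep. The token format and tag name are illustrative assumptions.

```python
def build_depth(tokens):
    """Return the maximum nesting depth produced by a token stream.
    Tokens are ('open', name) or ('close', name) pairs."""
    depth = max_depth = 0
    for kind, name in tokens:
        if kind == 'open':
            depth += 1
            max_depth = max(max_depth, depth)
        elif kind == 'close' and depth > 0:
            depth -= 1
    return max_depth

# 200 entries, none closed: the tree grows to 200 levels deep.
unclosed = [('open', 'font')] * 200
print(build_depth(unclosed))  # 200

# The same entries properly closed: depth never exceeds 1.
closed = [('open', 'font'), ('close', 'font')] * 200
print(build_depth(closed))    # 1
```

This is why adding the missing closing tags fixes the page: the tree stays shallow instead of nesting once per entry.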
Reporter
Comment 4 • 23 years ago
Thanks for this, doctor; it indeed shows a certain weakness in the parser related to unclosed tags. Reminds me of Netscape, which crashed when it found a </div> tag that just stood there on its own. It would be good to know whether this happens with ALL tags after 193 unclosed occurrences... the delights of dynamically generated pages. Should this 'bug' or weakness in the parser be confirmed to NEW?
Comment 5 • 23 years ago
This limitation on the number of levels of nesting is by design, as a workaround for a serious crash bug (bug 18480). Basically, a large number of nested tags means a large number of recursive function calls the way we do things now, and we end up blowing the runtime stack. This is not a parser issue -- it's a content model issue; the parser handles arbitrary levels of nesting fine. This is probably a WONTFIX unless all of Gecko is rearchitected (see the discussion in bug 18480). CCing jst for his comments.
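The stack-exhaustion failure mode described above can be reproduced in miniature: a recursive tree walk uses one call frame per nesting level, so a tree nested deeper than the runtime's limit blows the stack. This is a hedged illustration in Python (RecursionError plays the role of the runtime-stack crash), not Gecko's content model code; the dict-based tree is an assumption for the sketch.

```python
import sys

def render(node):
    # Recursive layout-style walk: one stack frame per nesting level.
    for child in node.get('children', []):
        render(child)

# Build a single chain nested deeper than the interpreter's recursion limit.
root = {'children': []}
node = root
for _ in range(sys.getrecursionlimit() + 100):
    child = {'children': []}
    node['children'].append(child)
    node = child

try:
    render(root)
    print("ok")
except RecursionError:
    print("stack exhausted")  # analogous to blowing the runtime stack
```

Capping the nesting depth, as described in the comment above, sidesteps this by refusing to build trees deep enough to trigger the overflow.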
Comment 6 • 23 years ago
*** This bug has been marked as a duplicate of 58917 ***
Severity: blocker → normal
Status: UNCONFIRMED → RESOLVED
Closed: 23 years ago
Keywords: testcase
Resolution: --- → DUPLICATE
That reminds me of the brilliant insight from Bill Gates: "640kb should be enough for everybody." But I wonder how Netscape 4 can do that without problems... Can we at least perform like NS4, if not out-perform it?
Comment 8 • 23 years ago
Netscape 4's layout engine may be using an iterative approach instead of a recursive one....
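A sketch of the iterative alternative hinted at here: walking an arbitrarily deep tree with an explicit stack kept on the heap instead of recursive calls, so nesting depth cannot exhaust the call stack. The dict-based tree is an illustrative assumption, not any engine's actual data structure.

```python
def count_nodes_iterative(root):
    """Visit every node using an explicit stack; depth is bounded only
    by available heap memory, not by the runtime call stack."""
    count = 0
    stack = [root]
    while stack:
        node = stack.pop()
        count += 1
        stack.extend(node.get('children', []))
    return count

# A chain nested 100,000 levels deep, far beyond any recursion limit,
# yet the iterative walk handles it without overflowing.
root = {'children': []}
node = root
for _ in range(100_000):
    child = {'children': []}
    node['children'].append(child)
    node = child

print(count_nodes_iterative(root))  # 100001
```

The trade-off is that the traversal state (the stack) must be managed by hand, which is part of why rearchitecting a recursive engine this way is a large job.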