Closed Bug 407231 Opened 17 years ago Closed 17 years ago

Loading large XML file extremely slow, freezing browser for minutes, 100% CPU

Categories

(Core :: XSLT, defect)

Platform: x86 Windows Server 2003
Type: defect
Priority: Not set
Severity: normal

Tracking


RESOLVED DUPLICATE of bug 197956

People

(Reporter: mariusads, Unassigned)

Details

User-Agent:       Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US; rv:1.8.1.11) Gecko/20071127 Firefox/2.0.0.11
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US; rv:1.8.1.11) Gecko/20071127 Firefox/2.0.0.11

Not really a bug, but probably a performance issue.
While testing a Sitemap generator written in PHP, I tried loading the dynamically generated URL in Firefox to check that the output was correct, and I was surprised to see Firefox freeze for about 8-10 minutes, using about 420 MB of memory the whole time.
The dynamic URL is http://directory.helpedia.com/xmlsitemap.php and it generates a file of about 4.5 MB (sent to the browser gzipped, about 360 KB). The generated file is attached to this post, so please don't abuse my web server; it's the same XML file.
The browser is extremely slow, takes 8-10 minutes to load the file, keeps one core of an Intel D805 at 2.66 GHz at 96-100%, and is unresponsive the entire time, unlike Internet Explorer 7, which loads the file in about 20 seconds.
Internet Explorer also uses about 400 MB of memory, so I guess Firefox is no worse at memory usage, though I don't see how a browser could need that much memory to parse 30,000 lines of XML with a very simple structure.
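For reference, each entry in the generated sitemap is a standard sitemaps.org <url> block, so those 30,000 lines are mostly repetitions of something like this (placeholder values shown here, not real entries, and the optional fields may differ):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://directory.helpedia.com/example-listing.html</loc>
    <lastmod>2007-12-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.5</priority>
  </url>
  <!-- ...thousands more <url> entries of the same shape... -->
</urlset>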

I suppose some will ask who would be crazy enough to open a 4-5 MB XML file, but there may well be people out there who run a report and accidentally generate a huge XML file.

Reproducible: Always

Steps to Reproduce:
1. Download the attached XML file and load with Firefox

Actual Results:  
Loads extremely slowly, 10-20 times slower than Internet Explorer, freezes, and uses over 400 MB of memory.

Expected Results:  
Should load faster, perhaps by not re-rendering the XML on screen so often (if that is the cause), and should use less memory.

The upload system won't allow files larger than 300 KB, so the link to the XML file is here:

http://directory.helpedia.com/xmlsitemap.rar 308 KB (315,788 bytes)

Regarding dupe of bug 197956, it's possible.
Severity: enhancement → normal
Component: File Handling → XSLT
Product: Firefox → Core
QA Contact: file.handling → xslt
Also note that we're already lots and lots faster in Firefox 3.
Status: UNCONFIRMED → RESOLVED
Closed: 17 years ago
Resolution: --- → DUPLICATE
Oh, and the reason both IE and Firefox use so much memory is that they are web browsers, not simple XML readers. So we both turn that 4.5 MB XML file into a gigantic web page. You wouldn't be surprised if a 4.5 MB HTML page used lots of memory, right?
Actually, I would be surprised if a 4.5 MB HTML page used 400 MB of memory to render.
Are you saying that if I took a 4-5 MB textbook from Project Gutenberg and converted it to HTML using only paragraphs and indents, it would be normal to see Firefox use 400 MB of memory and freeze for over 10 minutes? To me that doesn't sound normal, especially when there's no JavaScript to initialize and run, no CSS, no nothing, just plain HTML.

I'm a programmer, but compared to the people who work on Firefox I would be a lousy one. However, it's hard for me to accept this as normal.

It seems to me that Firefox retrieves the XML file from the server in chunks and renders each chunk as it is received.

Does it make sense for Firefox to keep refreshing the rendering of the XML file with every chunk of data received from the server, when the user is only looking at the top of the page? I believe that's why it loads so slowly: the rendering keeps getting refreshed. But maybe I'm wrong, I don't know.
Maybe the first few screenfuls could be refreshed quickly, and the rest of the file rendered less often.

Sorry for the rant... you guys are doing a great job anyway and don't need to be annoyed by people like me.
Actually, the XML display here is a one-shot thing: we apply the XSLT transformation once it all loads.

For your textbook example, I'd expect 4-5 MB of textbook to end up as 40-50 MB in memory, based on my experience. The XML prettyprinter is particularly bloaty, especially on the 1.8 branch; on trunk it uses a good bit less memory. Even so, it generates nested tags for every level of tag nesting you have (to allow the expand/collapse behavior), which sadly takes up a good bit of RAM.
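Roughly speaking (this is only an illustration of the idea; the element names below are made up, not the actual markup the prettyprinter generates), a single source element such as

  <url>
    <loc>http://example.com/page.html</loc>
  </url>

ends up wrapped in extra container elements so that each level can be expanded or collapsed, along the lines of

  <collapsible>
    <tagline>&lt;url&gt;</tagline>
    <children>
      <collapsible>
        <tagline>&lt;loc&gt;http://example.com/page.html&lt;/loc&gt;</tagline>
      </collapsible>
    </children>
    <tagline>&lt;/url&gt;</tagline>
  </collapsible>

so every node in the source file turns into several nodes in memory, and over 30,000 lines of XML that multiplies quickly.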
Rendering web pages uses a lot of memory. We always try to improve that, and we will keep doing so. However, the web is a complex place, and as you pointed out, other browsers struggle with this problem too.

The 10-minute freeze is not good, though, and as I said before, it's much, much better in Firefox 3. It certainly wouldn't be acceptable if it happened when loading an HTML page, and it's definitely not good when viewing XML either. But it's not something that affects most people.