User-Agent: Mozilla/5.0 (X11; U; Linux x86_64; pl-PL; rv:1.9.1.5) Gecko/20091109 Ubuntu/9.10 (karmic) Firefox/3.5.5
Build Identifier: Mozilla/5.0 (X11; U; Linux x86_64; pl-PL; rv:1.9.1.5) Gecko/20091109 Ubuntu/9.10 (karmic) Firefox/3.5.5

Firefox is vulnerable to gzip bombs. Imagine a very large, sparse (mostly spaces) HTML document. When served with Content-Encoding: gzip, the size actually transferred is roughly 100 times smaller than what the browser has to render. This means that after downloading 1 MB from the network (which usually takes seconds), the browser then renders 100 MB of HTML. Since rendering takes much longer than downloading, the GUI freezes and never even lets you close the faulty tab after the first few seconds. The good news is that it does not eat RAM the way Opera does, for example. On the other hand, Opera has its UI thread separated from the rendering thread and lets you simply close the tab, freeing all allocated memory.

Reproducible: Always

Steps to Reproduce:
1. Go to http://lastlook.pl/~quake/die.php
2. Wait 10 seconds

Actual Results:
Firefox stops responding.

Expected Results:
Firefox should remain responsive and at least allow closing the tab. Ideally it could detect gzip bombs and stop loading them after a certain uncompressed size (100 MB of HTML?). When dealing with legitimate files, it should process them in chunks (for example, at most 10 MB of plain text each, which means 99% of web pages are a single chunk) and wait for one chunk to finish rendering before continuing to download the file. This would keep the UI responsive at least between each 10 MB of plain source.

PHP source: http://lastlook.pl/~quake/die.php.txt
The served file: http://lastlook.pl/~quake/die.html.gz
To view it safely: curl http://lastlook.pl/~quake/die.html.gz 2>/dev/null | gunzip | less
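The size-cap mitigation proposed in the report can be sketched in a few lines. This is a minimal illustration, not Firefox code: the function name, the 100 MB cap, and the chunk size are all assumptions chosen for the example. The key idea is to decompress the gzip stream incrementally and abort as soon as the uncompressed output exceeds the limit, so a tiny compressed payload can never force the consumer to materialize gigabytes.

```python
import zlib

MAX_UNCOMPRESSED = 100 * 1024 * 1024  # hypothetical 100 MB cap from the report
CHUNK = 64 * 1024                     # read the compressed stream in small pieces

def read_gzip_capped(stream, limit=MAX_UNCOMPRESSED):
    """Decompress a gzip stream, aborting once output exceeds `limit` bytes."""
    # 16 + MAX_WBITS tells zlib to expect gzip (not raw deflate) framing.
    decomp = zlib.decompressobj(16 + zlib.MAX_WBITS)
    total = 0
    parts = []
    while True:
        chunk = stream.read(CHUNK)
        if not chunk:
            break
        data = decomp.decompress(chunk)
        total += len(data)
        if total > limit:
            # Bail out before the bomb expands fully.
            raise ValueError("gzip bomb suspected: uncompressed size exceeds limit")
        parts.append(data)
    return b"".join(parts)
```

Because decompression happens chunk by chunk, a browser-style consumer could also yield control (render, process UI events) between chunks, which is the second half of the report's suggestion.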
I'm fairly sure this is a dupe; we've talked about this problem before. We're unlikely to fix it until we split the UI into its own thread, or even its own process like Chrome.