AERAsec's article about decompression bombs contains tests where Mozilla crashes. Mozilla needs a hardcoded way to stop decompressing compressed data (PNG, HTML, and so on) at some point. http://www.aerasec.de/security/advisories/decompression-bomb-vulnerability.html#Web_browsers
(In reply to comment #0)
> Mozilla needs hardcoded way to stop decompressing compressed data (png,html and
> so on) at some point.

The problem has little to do with decompressing data and everything to do with trying to load a 1GB (or whatever) file. Doing this is going to hog resources until:
1. the system decides Mozilla is using too many resources and shuts it down
2. Mozilla notices that it ran out of memory and stops loading
3. Mozilla runs out of memory and crashes
4. Mozilla successfully loads the file
#3 is the only thing that can be fixed in Mozilla. On Linux, the crash mentioned on the page (1GB PNG) is actually a gdk crash, so even that is difficult to fix. And there is already a bug about not trying to load huge images on Linux.
1) It's not only about images, so I wouldn't call it a dupe. 2) Is it really impossible to implement your point 2?
2 and 4 are considered Mozilla behaving correctly; there's nothing there to 'fix'. The only fixable item on his list is 3. And you're sort of misunderstanding his other point: some crashes are in code we don't own. The example given is an image that crashes in gdk land; for a text file or HTML file we're more in control than for an image.
I understand that. I just wanted to point out that this problem exists and that maybe it should be fixed where we are able to fix it. According to the tests, Mozilla doesn't stop loading gzipped HTML when it runs out of memory, so point 2 from Andrew's list is not working. That's what I'm asking for.
Summary: Mozilla is vunerable on bzip2 bombs → Mozilla is vulnerable on bzip2 bombs
> According to tests Mozilla's don't stop loading gzipped HTML when it runs out
> of memory

What sort of tests? On what OS? How did you get it to run out of memory, exactly?
1. I'm not sure we need a hardcoded limit. What if I want to view the content?
2. The report did not indicate that Mozilla actually crashed on Windows for HTML input.
3. The comment about running out of memory is misleading, if not outright false. If Windows decides to grow VM instead of telling Mozilla that it doesn't have memory, then that's not running out of memory from Mozilla's perspective; there's not much we can or should do there.
Summary: Mozilla is vulnerable on bzip2 bombs → Mozilla is vulnerable to bzip2 bombs
This summary is outright misleading: those files use gzip, not bzip2. Mozilla doesn't support a bzip2 content-encoding.
Summary: Mozilla is vulnerable to bzip2 bombs → Mozilla is vulnerable to gzip bombs
BTW, Mozilla crashed here (Win2k, current CVS) loading the 1GB HTML "bomb". I tried to get a stack but failed, as MSDEV.EXE also crashed. I got alerts telling me to increase the amount of virtual memory...
As a user, the last thing I want to do is rely on the OS to police memory usage -- that typically means it sits there and thrashes for a while first, making my system useless.

Seems to me the answer is to have a user preference: the maximum size allowed for one web page's data. Set it to, say, 1MB. Any time Mozilla hits that limit, it stops loading/decompressing/displaying/whatever. Downloads (saved to disk) would not count.

Yes, this means all the routines need to have a limit in them (all the strcpy()s need to become strncpy()s, so to speak). This "sanitizing" is similar to what the Linux kernel people have had to do to combat buffer-overflow attacks.

Re: the "not in our code" problem -- agreed that this could be a pain to get done. Pushing the relevant parties to make their code safer would benefit everyone, however.
> As a user, the last thing I want to do is rely on the OS to police memory usage
> -- that typically means it sits there and thrashes for a while, first, making my
> system useless.

As a user, I find that perfectly fine behaviour.
> the last thing I want to do is rely on the OS to police memory usage

Excuse me? This is one of the few things the OS _should_ be doing: handling the allocation of hardware resources to applications. Applications have no business determining the hardware limitations of the system themselves -- that's all handled by the OS.

Setting an artificial limit is not the answer: it won't help people who have fewer resources than the limit, it will hinder those who need to view content bigger than the limit (1MB is ludicrously low), and there is no way to set the limit dynamically (since the OS abstracts away the actual hardware).

Let's not make up problems that don't exist, OK? We should not crash on out-of-memory, yes. But we should not be in the business of working around OSes that can't manage to schedule a swapping process such that other processes can run at the same time.
While it's certainly impossible to completely eliminate the possibility that the OS or some system library will abort Mozilla, it should be possible to reduce the likelihood. The single-large-image problem seems addressable: Mozilla already checks that 4*width*height fits in an int32 for every image. It wouldn't be hard to also check whether that value is greater than some configurable number; if so, Mozilla could do additional checks to determine whether it should attempt to load the image. These checks are probably platform-specific, but they are possible; at least on Linux you can use fairly good heuristics.
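A minimal sketch of the check described above, in Python for illustration (the names and the 64MB ceiling are hypothetical; only the 4*width*height int32 check is from the comment):

```python
INT32_MAX = 2**31 - 1
MAX_IMAGE_BYTES = 64 * 1024 * 1024   # hypothetical configurable ceiling

def image_buffer_ok(width, height, bytes_per_pixel=4):
    """Mirror of the existing 4*width*height int32 overflow check,
    plus the additional configurable-limit check suggested above."""
    size = bytes_per_pixel * width * height
    if size > INT32_MAX:
        return False    # would overflow an int32
    if size > MAX_IMAGE_BYTES:
        return False    # exceeds the configurable ceiling
    return True
```

Anything rejected by the second check could then go through whatever platform-specific heuristics are available before Mozilla commits to allocating the buffer.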
Note that the linked-to page mostly talks about tiny gzip files that expand to huge HTML pages... which is a much harder problem than images.
There seem to be three main ways for Mozilla to be killed:
1) Mozilla exhausts the system virtual memory,
2) Mozilla exhausts free space in its own process, and
3) Mozilla exhausts free space in another process.
On Linux it's easy to get a reasonable approximation of the state of the system VM and of one's own process, so it's possible to detect approaching disaster in cases 1 and 2. Case 3 is much harder. The really hard problem is probably deciding when to start looking for disaster.
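For case 1, a sketch of the kind of Linux heuristic meant here, assuming /proc/meminfo as the data source (the function name and the MemFree+Cached fallback are illustrative choices, not anything Mozilla actually does):

```python
def mem_headroom_kb(meminfo_text):
    """Rough system-VM headroom estimate from /proc/meminfo contents.

    Values in /proc/meminfo are reported in kB. Modern kernels expose
    MemAvailable directly; as a crude fallback, MemFree + Cached
    approximates free plus easily reclaimable memory.
    """
    fields = {}
    for line in meminfo_text.splitlines():
        key, sep, rest = line.partition(":")
        if sep and rest.split():
            fields[key.strip()] = int(rest.split()[0])
    if "MemAvailable" in fields:
        return fields["MemAvailable"]
    return fields.get("MemFree", 0) + fields.get("Cached", 0)

# In a live check you would feed it the real file:
#   headroom = mem_headroom_kb(open("/proc/meminfo").read())
```

A loader could poll this periodically while decompressing and bail out once the headroom drops below some threshold, which sidesteps the "when to start looking" question by always looking cheaply.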
darin, is there any way checking this at the network level might help?
Assignee: security-bugs → darin
Chris, note that the basic problem is that the content involved is really perfectly valid content...
Is this old bug still an issue? I'm surprised to see so much resistance in the comments here. It comes down to preventing denial of service, and the user should be given an early warning.

When working with any compressed data, an exception should be thrown when a couple of conditions are met:
1. The decompressed size is over some hard-coded limit.
2. The compression ratio is over some hard-coded limit.

The second is the real flag that something malicious is happening, so we want to look at compression ratios of 1000% or even 10000%. The first limit is used to filter out things that have high compression ratios but no ill effects; something like 1MB should be high enough to skip most non-issues and low enough to catch the issue early, before memory-related problems arise.

When this exception is thrown, the error should propagate up to the user, warning of the possibly malicious content and offering the option to restart it (in case the user trusts the source and thinks the data is legitimate). If the user chooses to restart, the decompression limits should be disabled and the application allowed to thrash in virtual memory; at that point, the user was sufficiently warned.
Live example (don't enter!): http://lastlook.pl/~quake/die.php
The PHP source is: http://lastlook.pl/~quake/die.php.txt
The server file is: http://lastlook.pl/~quake/die.html.gz
(It's just some HTML header followed by 200 million spaces, a dot, another 200 million spaces, a dot, and so on, compressed with gzip.) Cheers
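A scaled-down model of that payload is easy to reproduce locally, and shows the extreme ratios that the ratio-based check discussed above would flag (1MB runs of spaces here instead of 200 million; the exact ratio depends on the gzip level):

```python
import gzip

# Scaled-down model of the linked die.html.gz payload:
# long runs of spaces separated by dots.
payload = (b" " * 1_000_000 + b".") * 10
compressed = gzip.compress(payload)

ratio = len(payload) / len(compressed)
print(f"{len(payload)} bytes -> {len(compressed)} bytes "
      f"(ratio {ratio:.0f}:1)")
```

Even at this reduced scale the ratio lands in the hundreds-to-thousands range, far beyond anything legitimate HTML produces.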
Ignoring the fact that Mozilla (or the whole system) is out of memory: the bug is that while Mozilla decompresses gzip bombs, the UI is frozen, so the user has no way to stop the process.
As a counter-example: Opera doesn't stop loading the gzip bomb (it has no hardcoded limit), but the UI remains responsive, so I can stop loading the unfriendly document at any time.