Closed Bug 1135354
Opened 9 years ago • Closed 9 years ago
OOM crash (or hang) when clicking link with a huge URL length
Categories: Core :: Networking, defect
Status: RESOLVED FIXED
Target Milestone: mozilla39
Tracking status: firefox39 fixed
People
(Reporter: nonameforgeorge, Assigned: valentin)
Attachments: 1 file, 1 obsolete file (patch, 7.40 KB)
User Agent: Mozilla/5.0 (Windows NT 6.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.115 Safari/537.36

Steps to reproduce:

Click the link in the attached file, or simply create your own test file.

Win 7: after you click the link, Firefox stops responding and crashes. Sometimes it doesn't show the crash dialog box; the program just restarts. Tested with an up-to-date version of Firefox (the About Firefox dialog reports that Firefox is up to date; tested on 20.2.2015).

Win Vista 32-bit Home Premium, Firefox 35.0.1: after clicking the link, Firefox goes "not responding" and seems to restart itself.

Actual results:

After clicking the link, Firefox crashes or goes "not responding" and seems to restart itself in some cases (the behavior differs between Win 7 and Vista).

Expected results:

The content should not load; at most the current tab should fail, with an error message saying the content could not be loaded. I believe Firefox should not crash even if the inserted link is totally ridiculous. What do you guys think: is this a memory corruption fault, or something else?
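The attached test file is no longer available, but based on the reporter's description, a minimal reproduction would be a page containing a link whose href is tens of megabytes long. A hypothetical Node.js sketch that builds such a page (the `file:///` scheme and the 44MB size are assumptions, chosen to match the allocation size mentioned later in this bug):

```javascript
// Hypothetical reproduction of the page the reporter describes: a single
// link whose href is a multi-megabyte URL. Clicking such a link in an
// affected Firefox build triggered the OOM crash discussed below.
function buildHugeLinkPage(megabytes) {
  // Repeat one character until the URL is roughly `megabytes` MB long.
  const hugePath = 'A'.repeat(megabytes * 1024 * 1024);
  return `<!DOCTYPE html>\n<a href="file:///${hugePath}">click me</a>\n`;
}

// Write the result to disk (e.g. fs.writeFileSync('test1.html', ...)) and
// open it in an affected build to reproduce.
const page = buildHugeLinkPage(44);
console.log(page.length); // a bit over 44 MB of markup
```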
Reporter
Comment 1 • 9 years ago
Link to file http://www.filedropper.com/test1_3
Comment 2 • 9 years ago
Could you look at about:crashes and paste the link for one of these crashes? I was unable to reproduce it myself. Odds are it is some kind of out-of-memory crash and not memory corruption, but the stack will tell us more.
Flags: needinfo?(nonameforgeorge)
Updated • 9 years ago
Summary: memory corruption fault → crash when clicking on file download link
Reporter
Updated • 9 years ago
Flags: needinfo?(nonameforgeorge)
Reporter
Comment 4 • 9 years ago
If this ticket becomes publicly available, please remove the crash-stats link, as I would prefer that information not be associated with this ticket. Thanks.
Comment 5 • 9 years ago
Thanks for the link. Yes, that's just a safe OOM crash. It sounds like a check should be made there so that it doesn't just crash, though. I've hidden your comment.
Group: core-security
Status: UNCONFIRMED → NEW
Ever confirmed: true
Comment 6 • 9 years ago
The crash was in the function net_GetFileFromURLSpec in nsURLHelperWin.cpp, at the line: rv = localFile->InitWithPath(NS_ConvertUTF8toUTF16(path));
Component: Untriaged → Networking
Product: Firefox → Core
Updated • 9 years ago
Crash Signature: [@ OOM | large | NS_ABORT_OOM(unsigned int) | net_GetFileFromURLSpec(nsACString_internal const&, nsIFile**) ]
Comment 7 • 9 years ago
The OOM allocation was about 44MB, so it isn't too surprising it crashed on Windows.
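If the failing 44MB allocation was the UTF-16 buffer produced by NS_ConvertUTF8toUTF16 (an assumption; the comment above does not say which buffer failed), then the source URL spec would have been roughly half that size, since each ASCII byte widens to two bytes in UTF-16:

```javascript
// Back-of-the-envelope check, assuming the 44MB failed allocation was the
// UTF-16 copy of an all-ASCII URL spec: ASCII widens 2x under UTF-16.
const utf16AllocBytes = 44 * 1024 * 1024;   // reported failed allocation
const asciiSpecBytes = utf16AllocBytes / 2; // implied source URL length
console.log(asciiSpecBytes);                // 23068672 bytes, i.e. 22 MB
```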
Reporter
Comment 8 • 9 years ago
Just a question: is the behavior the same when the user clicks the link locally and over the network?
Reporter
Comment 9 • 9 years ago
(In reply to Andrew McCreight [:mccr8] from comment #5)
> Thanks for the link. Yes, that's just a safe OOM crash. It sounds like a
> check should be made there so that it doesn't just crash, though.
>
> I've hidden your comment.

Thank you.
Comment 10 • 9 years ago
(In reply to nonameforgeorge from comment #8)
> Just a question. Is the behavior same when user clicks the link locale and
> over the network?

I'm not familiar with this code, but I would guess so. It seems like the page is creating a giant URL, then tries to copy it.
Comment 11 • 9 years ago
valentin - maybe you can find a place to put in a circuit breaker?
Flags: needinfo?(valentin.gosu)
Assignee
Comment 12 • 9 years ago
Yes, I remember we don't have any limits on the length of a URL. I'll start a patch for enforcing a maximum length, for which we can have a pref. IE currently enforces a max length of 2083 chars [1]. We could have a similar default.

[1] http://stackoverflow.com/questions/417142/what-is-the-maximum-length-of-a-url-in-different-browsers
Assignee: nobody → valentin.gosu
Flags: needinfo?(valentin.gosu)
Assignee
Comment 13 • 9 years ago
Attachment #8571947 - Flags: review?(mcmanus)
Assignee
Comment 14 • 9 years ago
Push to try with a 2083-char URL limit: https://treeherder.mozilla.org/#/jobs?repo=try&revision=2bddc7e63aed

Some tests are failing due to our use of URL for bookmarks and javascript: URIs. 65600 seems like a fair compromise between not having a limit and being able to hold a lot of information in a URL.

Push to try with a 65600-char URL limit: https://treeherder.mozilla.org/#/jobs?repo=try&revision=32adf3e79ec8
Comment 15 • 9 years ago
Well, there's always 640KB: who could ever need more than that? More seriously, I expect this will be fine, but it's the kind of thing that could have webcompat issues at the tail. Can we stop the crash with something much bigger? I don't think we want to suss out the actual effective limit, just something that stops the pathological behavior here. 1MB (guess)?
Assignee
Comment 16 • 9 years ago
The patch just returns early from parsing for a huge URL, so we don't allocate or make extra copies of that memory. We still hold a giant string in memory, from the HTML or JS parsing, but I don't know if we can do anything about that. This does a good job for the testcase. It only janks for a second or so when clicking the URL, and doesn't crash.
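The real patch lives in Gecko's C++ URL parser, but the shape of the circuit breaker can be sketched in a few lines. This is a hypothetical stand-in, not Gecko code: the constant mirrors the network.standard-url.max-length pref, and the parse function and error name are illustrative only.

```javascript
// Sketch of the circuit-breaker idea: refuse to parse specs longer than a
// configurable maximum, instead of allocating and copying a huge string.
// MAX_URL_LENGTH mirrors the network.standard-url.max-length pref.
const MAX_URL_LENGTH = 1024 * 1024; // 1MB, the value this bug settled on

function parseSpec(spec) {
  if (spec.length > MAX_URL_LENGTH) {
    // Early return: no allocation, no copy, no crash.
    return { ok: false, error: 'NS_ERROR_MALFORMED_URI' };
  }
  // A real parser would split scheme, host, path, etc. here.
  return { ok: true, spec };
}
```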
Comment 17 • 9 years ago
Sorry for being unclear. I just meant: can we use a larger value and get the same benefit? (minimize tail risk)
Assignee
Comment 18 • 9 years ago
Oh, definitely. I think 1MB is a fair limit.
Comment 19 • 9 years ago
Comment on attachment 8571947 [details] [diff] [review]
Crash/Hang when clicking URL with a huge length

Review of attachment 8571947 [details] [diff] [review]:
-----------------------------------------------------------------

lgtm

::: modules/libpref/init/all.js
@@ +1591,5 @@
> // UTF-8.
> pref("network.standard-url.encode-utf8", true);
>
> +// The maximum allowed length for a URL.
> +pref("network.standard-url.max-length", 65600);

1mb

::: netwerk/base/nsURLHelper.cpp
@@ +23,5 @@
> static bool gInitialized = false;
> static nsIURLParser *gNoAuthURLParser = nullptr;
> static nsIURLParser *gAuthURLParser = nullptr;
> static nsIURLParser *gStdURLParser = nullptr;
> +static int32_t gMaxLength = 65600;

1mb
Attachment #8571947 - Flags: review?(mcmanus) → review+
Assignee
Comment 20 • 9 years ago
Try:
https://treeherder.mozilla.org/#/jobs?repo=try&revision=43890e95ad2b
https://treeherder.mozilla.org/#/jobs?repo=try&revision=e5d21c9e7ac5

Inbound:
https://hg.mozilla.org/integration/mozilla-inbound/rev/172eb08eb395
Assignee
Updated • 9 years ago
Attachment #8571947 - Attachment is obsolete: true
Comment 21 • 9 years ago
https://hg.mozilla.org/mozilla-central/rev/172eb08eb395
Status: NEW → RESOLVED
Closed: 9 years ago
status-firefox39: --- → fixed
Resolution: --- → FIXED
Target Milestone: --- → mozilla39
Comment 22 • 9 years ago
We rely on long data: URLs for reftest image comparisons, and for turning bookmarklets into greasemonkey scripts. I hope 1MB is enough.
Assignee
Comment 23 • 9 years ago
640K seems to be the max used in the test suite. 1MB should be enough, I think.

I have updated the MDN page to note this change: https://developer.mozilla.org/en-US/docs/Web/HTTP/data_URIs#Common_problems

Please file bugs if you find any problems.
Comment 25 • 9 years ago
> 4.14 + * Returns the max length of a URL. The default is 2083.

The default is not 2083, afaict....

> 1MB should be enough, I think.

Have you tested what other browsers do? What happens to long form GET URIs?
Flags: needinfo?(valentin.gosu)
Assignee
Comment 26 • 9 years ago
(In reply to Not doing reviews right now from comment #25)
> > 4.14 + * Returns the max length of a URL. The default is 2083.
>
> The default is not 2083, afaict....

I forgot to update that comment. It is actually 1MB (1048576).

> > 1MB should be enough, I think.
>
> Have you tested what other browsers do? What happens to long form GET URIs?

http://stackoverflow.com/questions/417142/what-is-the-maximum-length-of-a-url-in-different-browsers

Considering other browsers and search engines have a much lower limit, I'd say this should not be a problem. We only seem to be abusing it in our own test suite, by encoding stuff in the hash of the URL, or with bookmarklets.
Flags: needinfo?(valentin.gosu)
Comment 27 • 9 years ago
My open source project TiddlyWiki makes extensive use of the ability to create multi-megabyte data URIs. I'm not familiar with the procedure in cases like this, but is there any chance the limit could be relaxed to 10MB or so?

TiddlyWiki is a complete wiki written in JavaScript, packaged as a single HTML file, that is capable of running directly from a file:// URI.

http://tiddlywiki.com/
https://github.com/Jermolene/TiddlyWiki5

Saving changes is done by creating an <a> element with the href being a data URI containing the data to be saved, and also having a "download" attribute. A download is triggered by calling the .click() method. The technique is illustrated in this standalone demo:

https://github.com/Jermolene/FileSavingDemo/blob/master/filesavingdemo.html#L84

The technique has worked well in Firefox and Chrome. (Safari doesn't implement the "download" attribute, and Internet Explorer imposes a length limit as noted above.) The requirement for manual intervention means that it's not convenient for repetitive use, but it's a good solution for saving occasional snapshots of information from a browser-based web application. (For regular use we have a Firefox extension which handles saving without manual intervention.)

I appreciate that other browsers place similar limits on the size of data URIs, but Firefox's liberal support has been very welcome, and has enabled an ecosystem of people working with and customizing TiddlyWiki while working entirely within Firefox, not needing a server.
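The saving technique described above, reduced to its essentials. This is a sketch modeled on the linked demo, not the exact TiddlyWiki code; in a browser you would use btoa or a TextEncoder rather than Node's Buffer, and the DOM portion is shown as comments because it only runs in a browser:

```javascript
// Sketch of the data-URI save technique: build a data: URL embedding the
// whole document, then trigger a download via a temporary <a download>.
function buildDataUri(html) {
  const base64 = Buffer.from(html, 'utf8').toString('base64');
  return 'data:text/html;base64,' + base64;
}

// In a browser, the download step would look roughly like:
//   const a = document.createElement('a');
//   a.href = buildDataUri(documentText);
//   a.setAttribute('download', 'wiki.html');
//   a.click();
//
// Note that the URL string grows with the document, which is exactly what
// the 1MB standard-URL limit discussed in this bug would have capped.
```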
Comment 28 • 9 years ago
Jeremy, are blob urls an option for you?

Valentin, you still need to fix the comments, right? And it sounds like "other browsers" doesn't include Chrome? What, exactly, is Chrome's limit?
Flags: needinfo?(valentin.gosu)
Updated • 9 years ago
Summary: crash when clicking on file download link → OOM crash (or hang) when clicking link with a huge URL length
Comment 29 • 9 years ago
bzbarsky - our implementation does use blob URLs where available. Here's the relevant code (it's very short!):

https://github.com/Jermolene/TiddlyWiki5/blob/master/core/modules/savers/download.js#L35

I had not appreciated that blob URLs were not subject to the same limit. If there is no limit for blob URLs, or that limit is significantly higher than 1MB, then I don't think there is a problem after all.
Comment 30 • 9 years ago
There's no such limit for blob: URLs, because the string form of those does not grow with the size of the data, and the issue here was the size of the string showing up in the URL bar and other places. You should be able to test a current nightly to make sure that things work as you expect.
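The distinction is easy to quantify: a data: URL embeds its payload (base64-encoded, so about 4/3 its size, plus a fixed prefix), while a blob: URL is a short opaque reference regardless of payload size. A sketch of the arithmetic, assuming a plain base64 data: URL with the 22-character "data:text/html;base64," prefix:

```javascript
// data: URL length grows linearly with payload: base64 emits 4 characters
// per 3 input bytes (padded to a multiple of 4), plus a fixed prefix.
// blob: URLs stay short ('blob:' + origin + UUID) no matter the payload.
const PREFIX = 'data:text/html;base64,'.length; // 22 characters

function dataUrlLength(payloadBytes) {
  return PREFIX + 4 * Math.ceil(payloadBytes / 3);
}

// Under the 1MB URL cap from this bug, the largest payload a base64
// data: URL of this form can carry is about 768KB:
const maxPayload = Math.floor((1024 * 1024 - PREFIX) / 4) * 3;
console.log(maxPayload); // 786414 bytes
```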
Assignee
Comment 32 • 9 years ago
(In reply to Not doing reviews right now from comment #28)
> Jeremy, are blob urls an option for you?
>
> Valentin, you still need to fix the comments, right?

Fixed in rev 8b7b70870fd2.

> And it sounds like "other browsers" doesn't include Chrome? What, exactly,
> is Chrome's limit?

It seems that Chrome has no limitation on URL lengths. http://wiert.me/2010/04/20/maximum-url-lengths/

Also, it seems I was mistaken changing the wiki in comment 23. data: URI parsing does not go through the same code path, so the limit does not apply. We only enforce the 1MB limit for standard URLs.
Flags: needinfo?(valentin.gosu)
Comment 34 • 9 years ago
> data URI parsing does not go through the same code path, so the limit does not apply. We only enforce the limit of 1MB for standard URLs.
Phew, that's good to hear. Thank you everybody for your helpful, speedy response.