Closed Bug 498036 Opened 13 years ago Closed 12 years ago
"ASSERTION: there is already a low-level error" [@ nsCSSScanner::SetLowLevelError] with malformed data URIs
###!!! ASSERTION: there is already a low-level error: 'mLowLevelError == NS_OK', file /Users/jruderman/central/layout/style/nsCSSScanner.cpp, line 264
Looks like CSSParserImpl::ParseImportRule should return the result of ProcessImport. I think the same thing needs to happen analogously in CSSParserImpl::ParseNamespaceRule; otherwise these probably assert too:

@namespace foo "data:css";
@namespace bar "data:css";

ParseGroupRule's call to ParseRuleSet looks similarly suspicious. There are probably more, but that's the limit of how much I want to skim for the moment.
OS: Mac OS X → All
Hardware: x86 → All
Assigning this to myself so I don't lose it.
Assignee: nobody → zweinberg
Status: NEW → ASSIGNED
At the point where ParseImportRule calls ProcessImport, we have already consumed the entire @import rule, including the semicolon. If it were to return PR_FALSE when ProcessImport fails, that would cause ParseAtRule to skip the *next* top-level construct in the file, which would be wrong. Rather, what we should be doing is issuing an error console message directly from ProcessImport and *not* setting the low-level error.

Same deal for ParseNameSpaceRule/ProcessNameSpace, except that we're already not setting the low-level error there, so we don't get the assertion. [Did you notice that at present ProcessNameSpace always returns PR_FALSE?]

I'm not sure about ParseRuleSet. None of its callers look at its return value, which says to me that on error it's probably already skipping everything that needs to be skipped, and maybe it should always return PR_TRUE (or not return anything at all).

It occurs to me that just about every use of SetLowLevelError(NS_ERROR_OUT_OF_MEMORY) has the potential to trigger this assertion, because higher layers don't try to distinguish "failed because of out-of-memory" from "failed because of a parse error", so they're just going to go on parsing, and we're going to hit out-of-memory again. Supposedly, infallible malloc is happening Real Soon Now, so I'm inclined not to worry about that much. Except maybe it's an excuse to start ignoring the possibility of out-of-memory in the CSS parser already. dbaron, what do you think?
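To make the skipping hazard concrete, here is a minimal self-contained sketch (MiniParser and its members are invented names, not the real nsCSSParser API) of the control flow described above: by the time ProcessImport runs, the whole "@import ...;" construct, semicolon included, has already been consumed, so ParseImportRule returns true and reports the bad URI to the console instead of failing. A false return would trigger the caller's skip-to-next recovery and wrongly drop the following, valid construct.

```cpp
#include <string>
#include <vector>

struct MiniParser {
  std::vector<std::string> rules;   // each entry stands for one top-level construct
  size_t pos = 0;
  std::vector<std::string> console; // stand-in for the error console

  // Consumes one whole construct, then validates it
  // (here, "bad" in the text marks a bogus URI).
  bool ParseImportRule() {
    const std::string& rule = rules[pos++];  // already consumed through ';'
    if (rule.find("bad") != std::string::npos)
      console.push_back("invalid @import URI: " + rule); // report directly
    return true;  // position is consistent; nothing for the caller to skip
  }

  // Caller-side loop: a false return would mean "skip to the next
  // construct", advancing past a rule that was never at fault.
  std::vector<std::string> ParseAll() {
    std::vector<std::string> accepted;
    while (pos < rules.size()) {
      if (ParseImportRule())
        accepted.push_back(rules[pos - 1]);
      else
        ++pos;  // recovery path; never taken with the fix above
    }
    return accepted;
  }
};
```

With the true-plus-console-message behavior, a stylesheet like {"@import bad;", "@import good;"} keeps the second rule and records one console error, instead of silently losing the valid import.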
Also, CSSNameSpaceRule objects treat the URL as an opaque string, rather than converting it to an actual URI object, so we won't detect an error in that case. And the css3-namespaces draft makes me think we aren't even supposed to: http://dev.w3.org/csswg/css3-namespace/#syntax

# ...
# A URI string parsed from the URI syntax must be treated as a
# literal string: as with the STRING syntax, no URI-specific
# normalization is applied.
#
# All strings—including the empty string and strings representing
# invalid URIs—are valid namespace names in @namespace declarations.
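A tiny sketch of the spec behavior quoted above (NameSpaceMap and ProcessNameSpace are invented names, not the real CSSNameSpaceRule interface): the @namespace URI string is kept as an opaque literal, so no URI parsing happens, no error can be detected here, and the empty string and "invalid" URIs are all legal namespace names.

```cpp
#include <map>
#include <string>

struct NameSpaceMap {
  std::map<std::string, std::string> prefixes;

  // Always succeeds: per css3-namespace, any string is a valid
  // namespace name, so the URI is stored verbatim -- no
  // normalization, no nsIURI object ever created.
  void ProcessNameSpace(const std::string& prefix, const std::string& uri) {
    prefixes[prefix] = uri;
  }
};
```

This is why the "@namespace foo "data:css"" cases mentioned earlier can't fail the same way @import does: there is simply no URI-parsing step to fail.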
Here's a potential fix. As described above, I don't think anything else needs messing with. Note the added string in css.properties. Do I have to do anything special to get that on translators' radar?
Attachment #427021 - Flags: review?(bzbarsky)
Comment on attachment 427021 [details] [diff] [review] patch r=bzbarsky
Attachment #427021 - Flags: review?(bzbarsky) → review+
Status: ASSIGNED → RESOLVED
Closed: 12 years ago
Resolution: --- → FIXED