User-Agent: Mozilla/5.0 (Macintosh; U; PPC Mac OS X; en) AppleWebKit/523.12.2 (KHTML, like Gecko) Version/3.0.4 Safari/523.12.2
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9b2) Gecko/2007121120 Firefox/3.0b2

I have a well-formed XHTML document (with an XHTML doctype and xmlns declaration), served with the text/html MIME type: http://hardi.org/mozilla/test.html

Previous versions of Firefox render the page normally, in standards compliance mode. FF3 also claims standards compliance mode, but the <script /> causes parsing to break. The same thing happens when http-equiv specifies either the text/xml or application/xhtml+xml type: http://hardi.org/mozilla/test-equiv.html Files saved locally with a .html extension also fail to render. (For comparison, when the webserver sends the content-type as XML, FF3 renders the document correctly: http://hardi.org/mozilla/test.xhtml)

I understand that the "100% correct" thing to do is serve the document with an XML MIME type, but this is a practical impossibility because a certain majority-marketshare web browser chokes on such documents. As a result, serving XHTML as text/html is a vastly more common practice than serving it as XML. Furthermore, many people operate on shared hosting and in other situations where they have no access to their server configuration.

I know I can write "<script></script>" instead of "<script />", but that's not what libxml2 likes to output, and it feels like a hack.

Is this a bug or a policy change? (To me, this change [if it's intentional] seems like a regression from FF2 to FF3. A document with an XML doctype and xmlns declaration ought to be parsed as XML. Or the http-equiv ought to be honored if it's present.)

Reproducible: Always

Steps to Reproduce:
1. Go to http://hardi.org/mozilla/test.html
2. Instead of a page of text, you will see an empty page.
Status: UNCONFIRMED → RESOLVED
Last Resolved: 11 years ago
Resolution: --- → DUPLICATE
Duplicate of bug: 399232
Doesn't IE do the same thing when it encounters an unclosed <script> tag? Comments in bug 327796 indicate that it does...
Yes, it does, but that is generally seen as a bug in IE. All the standards-compliant browsers (including Mozilla) have supported the XML syntax for years; to break it now seems a bad move.

To be strict about the XML, the tag isn't "unclosed": like <br /> or any other empty element, the trailing slash closes it. Both <script/> and <script></script> are valid XML and ought to be supported, IMHO.

The problem is that if you now have Apache send an XML MIME type to make FF3 happy, IE will break on pages like http://hardi.org/mozilla/test.xhtml .
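(The equivalence claimed here is easy to check with any XML parser. For example, with Python's xml.etree — an illustration, not part of the original report:)

```python
import xml.etree.ElementTree as ET

# In XML, <script/> and <script></script> denote the same thing:
# an empty element. A conforming XML parser treats them identically.
a = ET.fromstring('<head><script type="text/javascript"/></head>')
b = ET.fromstring('<head><script type="text/javascript"></script></head>')

# Both parse trees serialize to the same canonical form.
print(ET.tostring(a) == ET.tostring(b))  # True
```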
You're not using XML when you have text/html as media type.
Status: RESOLVED → VERIFIED
Well, technically speaking I understand that's what's going on, but I disagree. I would suggest that:

* an XHTML doctype
* an XML namespace declaration
* an http-equiv content-type saying XML (i.e. why is Gecko ignoring the content-type in <meta http-equiv> when it is supposed to override content-type headers sent by the webserver?)

... are all reasons for interpreting the document as XHTML.

As a practical matter, setting the type to anything other than text/html will break IE. The current standard practice for using XHTML is to send it as text/html to make IE happy, and then the doctype puts all the browsers into standards mode. So, as a practical matter, requiring an XML content-type makes it impossible to use XHTML (barring hacks like user-agent content negotiation, which is complicated and drags on webserver performance).

This is a change to longtime Mozilla behavior, it's inconsistent with the other standards-compliant browsers, it's not required by any W3C spec I can turn up, and I don't see where the reason has been explained. Please point me to one if it's out there and I'm missing it. :)
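(The content negotiation dismissed above as a hack is usually keyed on the request's Accept header rather than the User-Agent string. A minimal sketch — the function name is illustrative, and the Accept strings are typical examples, not exhaustive:)

```python
def pick_content_type(accept_header: str) -> str:
    """Serve real XHTML only to clients that explicitly accept it.

    Browsers that cannot handle XHTML (e.g. IE6/7) send an Accept
    header without application/xhtml+xml, so they fall through to
    text/html; Gecko-based browsers advertise it and get the XML
    media type (and hence XML parsing of <script/>).
    """
    if "application/xhtml+xml" in accept_header:
        return "application/xhtml+xml"
    return "text/html"

# A Firefox-style Accept header:
print(pick_content_type("text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"))
# application/xhtml+xml

# An IE6-style Accept header:
print(pick_content_type("image/gif, image/jpeg, */*"))
# text/html
```

A production version would parse q-values properly instead of doing a substring match, and would send "Vary: Accept" so caches keep the two variants apart.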
It's not done yet, but HTML 5 requires this behavior: http://www.w3.org/TR/html5/
Ah, the link of death ... the W3C draft spec! I have to say, though, the XHTML2 spec is even sketchier than HTML5 at this point. And I don't think it should apply when my documents specify XHTML 1.0 in both the doctype and the XML namespace (to do otherwise seems broken):

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">

(My test pages happily pass the W3C validator because there's no such requirement in XHTML 1.) And there's still the issue of http-equiv being ignored.

From the web developer perspective, I'm more than happy for you to pipeline a change like this for when these specs are actually released, but causing breakage before the fact, and a conflict with how IE renders, seems like imaginary gain at the cost of real problems. If I *could* change all my file extensions to .xml and send everything as application/xhtml+xml without 2/3 of the world getting raw XML in their browsers, I would *happily* do it.

Anyway, I feel like I'm just repeating myself here. :) I appreciate your attention and the link.