User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:2.0b11) Gecko/20100101 Firefox/4.0b11
Build Identifier: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:2.0b11) Gecko/20100101 Firefox/4.0b11

When viewing a page served by a webserver that returns an invalid HTTP version, the rendering differs from the previous version (3.6.x) and from all other browsers. This is what I see in b11:

HTTP/2.0 302 Found
Server: SmarterTools/2.0.3974.16813
Date: Tue, 22 Feb 2011 14:03:41 GMT
X-AspNet-Version: 4.0.30319
Location: /Login.aspx
Set-Cookie: ASP.NET_SessionId=2bpm251hk15dkc1rylz1upya; path=/; HttpOnly
Cache-Control: private
Content-Type: text/html; charset=utf-8
Content-Length: 128
Connection: Close

<html><head><title>Object moved</title></head><body>
<h2>Object moved to <a href="/Login.aspx">here</a>.</h2>
</body></html>

If visiting /Login.aspx directly, it instead tries to render the page as XML:

XML Parsing Error: not well-formed
Location: http://logmachine/Login.aspx
Line Number 1, Column 5:

HTTP/2.0 200 OK
----^

Notice the nonexistent HTTP version 2.0. I don't know why they send it, but I believe the switch to text/plain and text/xml is wrong. This is somewhat related to https://bugzilla.mozilla.org/show_bug.cgi?id=629842 and https://bugzilla.mozilla.org/show_bug.cgi?id=628832 but isn't working as of b11 for me. I don't have a test URL to offer today, but a trial version of the software I'm using is available for download: http://www.smartertools.com/smarterstats/web-analytics-seo-software-download.aspx

Reproducible: Always

Steps to Reproduce:
1. Visit a page/site that returns HTTP/2.0
Actual Results:

HTTP/2.0 302 Found
Server: SmarterTools/2.0.3974.16813
Date: Tue, 22 Feb 2011 14:03:41 GMT
X-AspNet-Version: 4.0.30319
Location: /Login.aspx
Set-Cookie: ASP.NET_SessionId=2bpm251hk15dkc1rylz1upya; path=/; HttpOnly
Cache-Control: private
Content-Type: text/html; charset=utf-8
Content-Length: 128
Connection: Close

<html><head><title>Object moved</title></head><body>
<h2>Object moved to <a href="/Login.aspx">here</a>.</h2>
</body></html>

Expected Results:

A redirect to /Login.aspx
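Since no public test URL is available, a throwaway local server can stand in for the broken product. The sketch below is a hypothetical reproduction aid (it is not the SmarterTools software): it answers any request with the bogus "HTTP/2.0" status line captured above, so a browser pointed at it should exhibit the reported behavior.

```python
# Minimal reproduction server (hypothetical; mimics the captured response).
import socket
import threading

# Response copied from the capture in this report; with CRLF line endings
# the body is exactly 128 bytes, matching the Content-Length header.
RESPONSE = (
    b"HTTP/2.0 302 Found\r\n"
    b"Location: /Login.aspx\r\n"
    b"Content-Type: text/html; charset=utf-8\r\n"
    b"Content-Length: 128\r\n"
    b"Connection: Close\r\n"
    b"\r\n"
    b"<html><head><title>Object moved</title></head><body>\r\n"
    b'<h2>Object moved to <a href="/Login.aspx">here</a>.</h2>\r\n'
    b"</body></html>\r\n"
)

def serve_once(server_sock):
    """Accept one connection, ignore the request, send the bogus response."""
    conn, _ = server_sock.accept()
    conn.recv(4096)          # read (and discard) the request
    conn.sendall(RESPONSE)
    conn.close()

def start_server():
    """Start a one-shot server on a free port; returns the port number."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))   # let the OS pick a free port
    server.listen(1)
    threading.Thread(target=serve_once, args=(server,), daemon=True).start()
    return server.getsockname()[1]
```

Visiting http://127.0.0.1:&lt;port&gt;/ in the affected build should then show the mis-rendered response instead of following the redirect.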
Correcting platform, since your user agent indicates a 32-bit build (platform refers to the Minefield build, not the OS).

Could you check whether the issue occurs using the latest nightly: http://nightly.mozilla.org/

If it does, would you be able to try to find the regression range: http://harthur.github.com/mozregression/

Thanks!
Hardware: x86_64 → x86
Version: unspecified → Trunk
Same issue in latest nightly. I have been running 4.0 beta since it came out and have always had the issue, so I'm guessing it has something to do with 4.0? I'll try to find a regression range where it was introduced.
Okay, so basically it worked in the build from 2010-11-06, but was broken in the build from 2010-12-12. Both were b8pre versions:

running nightly from 2010-12-12
Was this nightly good or bad? (type 'good' or 'bad' and press Enter): bad
Downloading nightly...
running nightly from 2010-11-06
Was this nightly good or bad? (type 'good' or 'bad' and press Enter): good
Could you run mozregression a bit longer and get the range down to 24 hours, if you are able to spare the time? To restart mozregression using the results you've found so far, use:

mozregression --good=2010-11-06 --bad=2010-12-12

(A 1-month range takes ~5 iterations to get down to 24 hours.)
Oh, I didn't understand that it was narrowing down the window. Here is the result:

Last good nightly: 2010-12-02
First bad nightly: 2010-12-03
And the pushlog bit? If you've closed the window already, I can always look them up, but I'd need to confirm you were using Windows builds for that range, yeah?
Ok, presuming win32, I just looked them up; the revs/pushlog is:
http://hg.mozilla.org/mozilla-central/pushloghtml?fromchange=baa5ae44f0ba&tochange=0ff6d5984287

Possible candidate:
Michal Novotny — bug 363109 - body of HTTP 304 response is treated as a HTTP/0.9 response to subsequent HTTP request. r=biesi, sr=bz, a=blocker
http://hg.mozilla.org/mozilla-central/rev/37aefdd76a22
Severity: minor → normal
Status: UNCONFIRMED → NEW
blocking2.0: --- → ?
Component: General → Networking: HTTP
Ever confirmed: true
Product: Firefox → Core
QA Contact: general → networking.http
> 1. Visit a page/site that returns HTTP/2.0

A change in the major protocol version number in HTTP means an incompatible change in the parsing or structure of the protocol. Honestly, we should throw an error in this case instead of reverting to the 3.6 behavior of treating it as HTTP/1.x. There is no reason to think that the HTTP/1.x parsing rules apply (they are defined to apply to minor version changes; i.e. HTTP/1.13 would be OK). I don't see anything to block on.
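The distinction drawn here — minor-version bumps within HTTP/1.x are parse-compatible, a major-version bump is not — can be sketched as a status-line classifier. This is my own illustration of the rule, not Necko code; the category names are made up:

```python
import re

# A response status line: HTTP/<major>.<minor> <3-digit status>
STATUS_LINE = re.compile(r"^HTTP/(\d+)\.(\d+) (\d{3})")

def classify_version(status_line):
    """Classify a response per the HTTP/1.x versioning rules: minor-version
    changes are parse-compatible, major-version changes are not."""
    m = STATUS_LINE.match(status_line)
    if m is None:
        return "http/0.9"       # no recognizable status line at all
    if int(m.group(1)) == 1:
        return "http/1.x"       # HTTP/1.0, 1.1, even 1.13 parse the same way
    return "unknown-major"      # e.g. HTTP/2.0: incompatible by definition
```

Under this reading, the server in this bug lands in the "unknown-major" bucket, which is exactly the case the comment argues should be an error rather than a guess.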
Well, we should at least contact the author of the relevant software, right? Do we need a separate tech evang bug?

Also, while you're technically right per spec, it seems like the de-facto behavior is to treat "HTTP/2.0" as ... what? 1.0? 1.1? Some testing across browsers might be a good idea. :(
But yes, I think failing an HTTP/2.0 response with an NS_ERROR_NET_UNKNOWN_HTTP_VERSION or whatever and a corresponding error page would be better than treating it as 0.9, which is what we do now, right?
(In reply to comment #9)
> Well, we should at least contact the author of the relevant software, right?
> Do we need a separate tech evang bug?

Makes sense.

(In reply to comment #10)
> But yes, I think failing an HTTP/2.0 response with an
> NS_ERROR_NET_UNKNOWN_HTTP_VERSION or whatever and a corresponding error page
> would be better than treating it as 0.9, which is what we do now, right?

I think the error is the right thing to do. It clearly isn't either of the two things we understand (HTTP/1.x and HTTP/0.9). I don't think this is widespread - I've never seen a 2.0 response before, and HTTP clearly prohibits servers from replying to a 1.x request like that, so I'm not really eager to add a bugwards-compatible path that might make life difficult for somebody down the road.
I do agree that the software in question should be fixed; however, all other tested browsers do treat it as a regular page. Tested:

Chrome 9
Safari 4
Internet Explorer 8
Firefox 3.6

I don't see what the HTTP version has to do with content parsing; it should only specify the protocol, right? Content-Type should still apply, in my opinion. Maybe it's a long shot to try to be "future-proof" by making sense of the response, but it couldn't hurt. I have already tried to contact the manufacturer and will submit a formal ticket as well, but since any discrepancy in HTTP version might break the parsing, it maybe needs an overhaul.

Regards, Jimmy
Personally, I think this is the wrong action to take, but holding up the awesomeness of Firefox 4 would be a much, much larger transgression. The internet indicates they've got some serious proxy interop problems for the same reason. I'll patch this to the minimal case I can scrounge up.
Proxy problems: http://forums.smartertools.com/p/15863/38795.aspx I also reached out to them on twitter, will also try other methods so that they fix the long-term problem.
blocking2.0: final+ → ?
I don't consider this a big issue, but the irregular handling of different responses should be looked into. As I said, I got plain-text rendering (including headers) on the 302 page, but on the 200 page /Login.aspx it tried to parse the gzipped content, including headers, as XML. That seems broken to me. I have filed an official support ticket with them and am awaiting their response, and as you said, some firewalls/content-inspection servers block this as well.
I just went through their semi-automated (I suspect) live chat, hoping that it will make it through to a person who can route it to the right place.
blocking2.0: ? → final+
Created attachment 514299 [details] [diff] [review]
635892-http.2.0.v1

This will accept the bare "HTTP/2.0" at the front of a response and treat it as if it were HTTP/1.1. I intentionally didn't overgeneralize it to accept the various leading-garbage cases or other versions we haven't seen - there is no reason to open the garbage can wider than necessary. (For instance, when scanning the buffer looking for an HTTP signature, I don't want to accept http/2 when http/1 might be there farther along. One hacky workaround per customer, please.) Compliant HTTP/1.x responses only incur an extra PRBool assignment, so it's pretty much free on well-behaved servers; some of the workaround paths will pick up an extra strcmp().
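For readers following along without the diff, the narrow workaround described above can be sketched roughly as follows. This is an illustrative Python model of the stated behavior, not the actual C++ change to Necko's response parser, and the function name is made up:

```python
def parse_status_version(raw):
    """Model of the workaround: exactly 'HTTP/2.0' at the front of a
    response is treated as HTTP/1.1; everything else follows the usual
    rules, falling back to HTTP/0.9 for unrecognized responses."""
    if raw.startswith("HTTP/2.0 "):   # the one observed broken case
        return (1, 1)                 # bug-compat path: parse as HTTP/1.1
    if raw.startswith("HTTP/1.0 "):
        return (1, 0)
    if raw.startswith("HTTP/1."):     # 1.1 and any higher minor version
        return (1, 1)
    return (0, 9)                     # no status line we understand
```

The deliberate narrowness - matching the literal "HTTP/2.0" token rather than any unknown major version - keeps the compatibility hack from widening into a general garbage-acceptance path.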
Attachment #514299 - Flags: review?(bzbarsky)
I forgot to mention the attachment passes the xpcshell netwerk tests and the resource I created for the URL field as a test. httpd.js of course isn't so hot on returning http/2.0
Status: NEW → ASSIGNED
Comment on attachment 514299 [details] [diff] [review] 635892-http.2.0.v1 I did something wrong. Will update.
Attachment #514299 - Flags: review?(bzbarsky) → review-
Created attachment 514308 [details] [diff] [review] 635892-http.2.0.v2
Comment on attachment 514308 [details] [diff] [review] 635892-http.2.0.v2 r=me
Attachment #514308 - Flags: review?(bzbarsky) → review+
(In reply to comment #12)
> I don't see what HTTP version has to do with content parsing, it should only
> specify the protocol, right? Content-Type should still apply in my opinion.

The point is that HTTP/2.0 is not backwards-compatible, so there isn't any way to even know what the Content-Type is, or indeed what the payload is; the bits on the wire can and probably will be completely different. It's not inconceivable that HTTP/2.0 will come down the pipe at some point; Google is already working on SPDY, after all. Making sites that erroneously advertise this fail early will make that transition significantly easier down the road, and preclude a lot of bugs then.

> Maybe it might be a long-shot in order to try to be "future-proof", trying to
> make sense of the response, but couldn't hurt.

As per above, it can. Pretending that we understand HTTP/2.0 when we don't makes the future more complex.
I share mnot's concern here: this seems like one broken site that will fix its stuff. Have we seen it elsewhere?

On the other hand, it seems like we're the only browser that doesn't behave this way, so conservatism would be to follow them. I'm pretty conservative right now.
Whiteboard: [hardblocker] → [hardblocker][needs landing][has patch]
(In reply to comment #24)
> I share mnot's concern here: this seems like one broken site that will fix its
> stuff. Have we seen it elsewhere?
>
> On the other hand, it seems like we're the only browser that doesn't behave
> this way, so conservativism would be to follow them. I'm pretty conservative
> right now.

We should raise bugs against the other UAs. Ceterum censeo, HTTP needs a test suite.
Status: ASSIGNED → RESOLVED
Resolution: --- → FIXED
Whiteboard: [hardblocker][needs landing][has patch] → [hardblocker][has patch]