Closed Bug 221765 Opened 21 years ago Closed 17 years ago

HTTP header parsing code broken with multi-packet HTTP headers

Categories: Core :: Networking, defect
Severity: major
Status: RESOLVED INCOMPLETE
People: Reporter: ebrandsberg; Assignee: Unassigned
Keywords: dataloss
Attachments: (2 files)

User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.5) Gecko/20030916
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.5) Gecko/20030916

This is actually a duplicate of other bugs, but those were never fixed or pinned down, although the symptoms differ depending on the headers. If Mozilla receives an HTTP header block that spans multiple packets (due to its size), and any packet after the first starts with the data "\r\n", then in nsHttpTransaction.cpp, nsHttpTransaction::ParseLineSegment(char *segment, PRUint32 len) assumes that a buffer containing only "\r\n" means end of headers, which is false given the way the header block is split. Close examination of several bugs, including bug 180831, will reveal some of the effects of dropped headers on parsing.

Reproducible: Always
Confirming to get on Darin's radar....
Blocks: 180831
Status: UNCONFIRMED → NEW
Ever confirmed: true
Keywords: dataloss
Mail from Erik:

Some more info: I've verified that rendering of gzip-encoded content breaks when the Content-Encoding header is in the second packet after the \r\n, and cookies that are set are dropped. We have several customers that have hit this issue as a result of a package called SiteMinder that sets a 1K cookie containing security information, and in the odd case where the packet is broken on just the right boundary, the compressed pages don't render properly. I've tracked down many bugs in the Bugzilla database that seem related to this, including several involving compression, since it is so easy to simply append the Content-Encoding header when compressing content using external devices. As a result of this, a competitor of ours apparently stopped supporting compression on Mozilla and simply turns compression off -- apparently they were never able to figure out what the problem was in order to solve it.

Erik, please make comments on this bug itself instead of mailing people -- that way all the relevant people will see them...
For a limited time, the IP address 65.219.20.44 is available as a proxy server IP where any text/html content requested through it will be run through a caching/compressing proxy of sorts that helps break things when compression is in the loop. Set your MTU to 576 to simulate a dialup client, then run through the smokescreen tests with this IP set as your HTTP proxy, and several pages will end up broken. Some are broken in IE also, but that is another issue. There is the possibility of code issues on our side; however, I'm working to make sure these aren't an issue.

From a test run today this way, I saw Amazon and MSN both breaking in Mozilla (Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.5) Gecko/20030916). All these problems appear to be related to the HTTP header parsing, although in this situation the second packet doesn't start with \r\n, but the compression header is still in the second packet. Erik
Failed to mention the port is on port 80. If that doesn't work, it will probably be on port 3128. This will probably be removed within 48 hours, so please generate any traces you want to analyze, as you can use this to test compression against basically any website for compatibility. Erik
Erik: can you please collect a HTTP log for us? there are instructions available here: http://www.mozilla.org/projects/netlib/http/http-debugging.html thx!!
Status: NEW → ASSIGNED
Target Milestone: --- → mozilla1.6alpha
Using a tool like DR TCP (http://www.dslreports.com/front/drtcp.html), you can set the MTU on your box to 576 and test any website you want. Nearly any site that issues large cookies will not be rendered correctly, which lets you duplicate the error extremely easily. Sites that I've now verified break: www.dell.com, cgi.ebay.com, www.amazon.com, www.msn.com. Note: most of these will not break unless you lower your MTU so that the response can't fit in one packet. Erik
New working theory: in the Mozilla code, the failure is triggered when either:
a) the first packet ends with \r\n, or
b) the second packet starts with \r\n.
IE appears to have an issue with a) above, but not b). Outside of this, everything is the same (except for the cookies, which naturally change). Erik
Erik: i'm unable to reproduce this bug. can you please provide the HTTP log per comment #5. it might really help me see what is going wrong. thanks!
erik: any update since our last discussion on this bug?
untargeting... i'm not yet convinced that there is a bug in mozilla. erik: i will likely close this bug as invalid if i don't hear from you by the end of the year.
Target Milestone: mozilla1.6alpha → ---
-> default owner
Assignee: darin → nobody
Status: ASSIGNED → NEW
Component: Networking: HTTP → Networking
QA Contact: networking.http → networking
=> incomplete, no response from Erik
Status: NEW → RESOLVED
Closed: 17 years ago
Resolution: --- → INCOMPLETE
