
HTTP header parsing code broken with multipacket HTTP headers

RESOLVED INCOMPLETE

Status

Product: Core
Component: Networking
Priority: --
Severity: major
Status: RESOLVED INCOMPLETE
Opened: 14 years ago
Closed: 10 years ago

People

Reporter: Erik Brandsberg
Assignee: Unassigned

Tracking

Keywords: dataloss
Version: Trunk


Attachments

(2 attachments)

(Reporter)

Description

14 years ago
User-Agent:       Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.5) Gecko/20030916
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.5) Gecko/20030916

This is actually a duplicate of other bugs, but those were never fixed or pinned
down, although the symptoms differ depending on the headers.  If Mozilla
receives a multi-packet HTTP header (split because of its size), and any packet
following the first starts with the data "\r\n", then in nsHttpTransaction.cpp,
nsHttpTransaction::ParseLineSegment(char *segment, PRUint32 len) assumes that a
buffer containing only \r\n means end of headers, which is false given how the
header is split (a sketch of the problem follows at the end of this
description).  Close examination of several bugs, including bug 180831, will
reveal some of the effects of dropped headers on parsing.


Reproducible: Always

Steps to Reproduce:
1.
2.
3.
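
A minimal sketch of the state such a parser has to keep.  This is an
illustrative reconstruction, not the actual nsHttpTransaction code, and the
packet contents (Set-Cookie value, split point) are made up.  The point it
shows is that "end of headers" must be decided from a completed empty line,
not from the raw bytes of the current network segment; a shortcut that treats
"this segment is just \r\n" as the terminator drops everything after the split.

// Sketch only -- reconstruction for illustration, not the Necko source.
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

struct HeaderParser {
    std::string partial;               // header text carried over from the previous segment
    std::vector<std::string> headers;  // completed header lines (incl. status line here)
    bool done = false;                 // true once the empty line ending the headers is seen

    void Feed(const char* data, size_t len) {
        for (size_t i = 0; i < len && !done; ++i) {
            if (data[i] != '\n') {
                partial += data[i];
                continue;
            }
            if (!partial.empty() && partial.back() == '\r')
                partial.pop_back();            // strip the CR of the CRLF
            if (partial.empty())
                done = true;                   // a genuinely empty line ends the headers
            else
                headers.push_back(partial);    // otherwise it is just another header line
            partial.clear();
        }
    }
};

int main() {
    // The split described in this report: the second segment begins with the
    // CRLF that terminates a header started in the first segment.  A parser
    // that treats "segment contains only \r\n" as end-of-headers would stop
    // here and drop Content-Encoding; carrying per-line state does not.
    HeaderParser p;
    const char seg1[] = "HTTP/1.1 200 OK\r\nSet-Cookie: SMSESSION=abc";
    const char seg2[] = "\r\nContent-Encoding: gzip\r\n\r\n";
    p.Feed(seg1, sizeof(seg1) - 1);
    p.Feed(seg2, sizeof(seg2) - 1);
    for (const std::string& h : p.headers)
        std::cout << h << '\n';
    return 0;
}
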
Confirming to get on Darin's radar....
Blocks: 180831
Status: UNCONFIRMED → NEW
Ever confirmed: true
Keywords: dataloss
Mail from Erik:

Some more info:  I've verified that rendering of gzip-encoded content breaks
when the Content-Encoding header is in the second packet after the \r\n, and
cookies that are set are dropped.  We have several customers that have had this
issue as a result of a package called SiteMinder that sets a 1K cookie
containing security information, and in the odd case where the packet is broken
on just the right boundary, the compressed pages don't render properly.  I've
tracked down many bugs in the Bugzilla database that seem related to this,
including several involving compression, because it is so easy to simply append
the Content-Encoding header when compressing the content using external
devices.  As a result of this, a competitor of ours apparently stopped
supporting compression on Mozilla and simply turns it off; apparently they were
never able to figure out what the problem was, let alone solve it.

Erik, please make comments on this bug itself instead of mailing people -- that
way all the relevant people will see them...
(Reporter)

Comment 3

14 years ago
For a limited time, the IP address 65.219.20.44 is available as a proxy server
IP where any text/html content requested through it will be run through a
caching/compressing proxy of sorts that helps break things when compression is
in the loop.  Set your MTU to 576 to simulate a dialup client, then run through
the smokescreen tests with this IP set as your HTTP proxy, and several pages
will end up broken.  Some are broken in IE also, but that is another issue. 
There is the possibility of code issues on our side; however, I'm working to
make sure these aren't an issue.  From a test run today this way, I saw Amazon
and MSN both breaking in Mozilla (Mozilla/5.0 (Windows; U; Windows NT 5.1;
en-US; rv:1.5) Gecko/20030916).  All these problems appear to be related to the
HTTP header parsing, although in this situation, the second packet doesn't have
the \r\n, but the compression header is still in the second packet.

Erik
(Reporter)

Comment 4

14 years ago
Failed to mention that the proxy is on port 80.  If that doesn't work, it will
probably be on port 3128.  This will probably be removed within 48 hours, so
please generate any traces you want to analyze, as you can use this to test
compression against basically any website for compatibility.

Erik

Comment 5

14 years ago
Erik: can you please collect a HTTP log for us?  there are instructions
available here:

 http://www.mozilla.org/projects/netlib/http/http-debugging.html

thx!!

Updated

14 years ago
Status: NEW → ASSIGNED
Target Milestone: --- → mozilla1.6alpha
(Reporter)

Comment 6

14 years ago
Using a tool like Dr TCP (http://www.dslreports.com/front/drtcp.html), you can
set up the MTU on your box to 576 and test any website you want.  Nearly any site
that issues any sort of large cookies will not be rendered correctly, which will
allow you to duplicate the error extremely easily.  Sites that I've now verified
break:

www.dell.com
cgi.ebay.com
www.amazon.com
www.msn.com

Note:  Most of these will not break unless you lower your MTU so that the
response can't fit in one packet.  (A sketch of a local server that forces the
same split without changing the MTU follows this comment.)

Erik
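
For reference, here is a sketch of a throwaway local server that reproduces the
split deterministically without touching the MTU.  It is a hypothetical test
harness (POSIX sockets; the port, cookie value, and split point are all made
up, and the two writes are only likely, not guaranteed, to arrive as separate
TCP segments): it sends the response headers in two writes, the second
beginning with the CRLF that closes the oversized Set-Cookie line.

// Hypothetical test server: serves one page with the headers split across
// two writes, mimicking the packet boundary described in this bug.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <string>

int main() {
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    int yes = 1;
    setsockopt(srv, SOL_SOCKET, SO_REUSEADDR, &yes, sizeof(yes));

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(8080);                      // hypothetical port
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    bind(srv, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));
    listen(srv, 1);

    std::string body = "<html><body>hello</body></html>";
    std::string part1 =
        "HTTP/1.1 200 OK\r\n"
        "Set-Cookie: SMSESSION=" + std::string(1024, 'x');  // large cookie, CRLF deferred
    std::string part2 =
        "\r\n"                                              // closes the Set-Cookie line
        "Content-Type: text/html\r\n"
        "Content-Length: " + std::to_string(body.size()) + "\r\n"
        "\r\n" + body;

    for (;;) {
        int cli = accept(srv, nullptr, nullptr);
        if (cli < 0) break;
        write(cli, part1.data(), part1.size());             // first "packet"
        usleep(200 * 1000);   // pause so the second write tends to go out separately
        write(cli, part2.data(), part2.size());             // second "packet"
        close(cli);
    }
    close(srv);
    return 0;
}
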
(Reporter)

Comment 7

14 years ago
Created attachment 133035 [details]
ethereal capture of traffic when Mozilla failed to render page
(Reporter)

Comment 8

14 years ago
Created attachment 133036 [details]
Ethereal capture of the same page after a refresh, when a good render occurred.
(Reporter)

Comment 9

14 years ago
New working theory:

In the Mozilla code, the failure is triggered when either:
a) the first packet ends with \r\n, or
b) the second packet starts with \r\n
(a concrete sketch of both splits follows this comment).

IE appears to have an issue with a) above, but not b).  Outside of this,
everything is the same (except for the cookies, which naturally change).

Erik
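
To make the two boundary cases in the working theory concrete, here is a
hypothetical reconstruction (not taken from the Mozilla or IE sources; the
header values are invented).  A check that looks only at the bytes of the
current network segment misreads case (b), because the leading CRLF of the
second packet closes the Set-Cookie line rather than ending the header block.

// Illustration of the two packet splits from the working theory.
#include <cstdio>
#include <cstring>

int main() {
    // Case (a): the first packet ends with the CRLF of a header line.
    const char a1[] = "HTTP/1.1 200 OK\r\nSet-Cookie: SMSESSION=abc\r\n";
    const char a2[] = "Content-Encoding: gzip\r\n\r\n";

    // Case (b): the second packet begins with the CRLF that terminates a
    // header started in the first packet.
    const char b1[] = "HTTP/1.1 200 OK\r\nSet-Cookie: SMSESSION=abc";
    const char b2[] = "\r\nContent-Encoding: gzip\r\n\r\n";

    (void)a1; (void)a2; (void)b1;   // unused in this tiny check

    // A per-segment shortcut like this one trips on case (b): the first two
    // bytes of b2 are a bare CRLF, but they close the Set-Cookie line --
    // they do not terminate the header block.
    bool misreadAsEndOfHeaders = std::strncmp(b2, "\r\n", 2) == 0;
    std::printf("case (b) misread as end of headers: %s\n",
                misreadAsEndOfHeaders ? "yes" : "no");
    return 0;
}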

Comment 10

14 years ago
Erik: i'm unable to reproduce this bug.  can you please provide the HTTP log per
comment #5.  it might really help me see what is going wrong.  thanks!

Comment 11

14 years ago
erik: any update since our last discussion on this bug?

Comment 12

14 years ago
untargeting... i'm not yet convinced that there is a bug in mozilla.

erik:  i will likely close this bug as invalid if i don't hear from you by the
end of the year.
Target Milestone: mozilla1.6alpha → ---

Comment 13

11 years ago
-> default owner
Assignee: darin → nobody
Status: ASSIGNED → NEW
Component: Networking: HTTP → Networking
QA Contact: networking.http → networking
=> incomplete, no response from Erik
Status: NEW → RESOLVED
Last Resolved: 10 years ago
Resolution: --- → INCOMPLETE