Closed Bug 300438 Opened 19 years ago Closed 19 years ago

[FIXr]Gzipped Content-Encoding Renders Acrobat PDF Useless

Categories

(Core :: Networking: HTTP, defect, P1)

defect

Tracking

()

RESOLVED FIXED
mozilla1.8beta4

People

(Reporter: mozilla, Assigned: bzbarsky)

References

Details

Attachments

(2 files)

20050707 trunk

All PDFs no longer load in Firefox. I get a message from Acrobat that the file
does not begin with %PDF, and then Firefox hangs at 99% CPU. The same links
work in IE with no issues whatsoever.
Updating summary
Summary: All PDF Loadings Fail in Firefox → All Adobe Acrobat PDF Loadings Fail in Firefox
WFM. Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8b3) Gecko/20050711
Firefox/1.0+ ID:2005071121
Hmm. Seems to work as long as the PDF doesn't come from a loopback address. If
the PDF is served from Apache and you access it via 127.0.0.1, you get an error.
If you access it by the FQDN, it works fine. It's not a virtual-host issue,
since it works fine in IE either way.
Status: NEW → RESOLVED
Closed: 19 years ago
Resolution: --- → WORKSFORME
It doesn't even work from the FQDN if the IP is local.

PDFs also load if you go through a proxy to reach the local server, but not if
you access it directly. I can't sniff packets on a loopback interface, so I'm
not sure how to troubleshoot the issue.
Status: RESOLVED → REOPENED
Resolution: WORKSFORME → ---
Summary: All Adobe Acrobat PDF Loadings Fail in Firefox → All Adobe Acrobat PDF Loadings Fail From Local Address
OK. Tracked it down. Sorry for all the spam.

Apparently the content transfer encoding is not being undone before handing the
bytes off to the plugin. Of course, Acrobat has no idea what to do with gzipped
data.

I'm guessing this is a core issue.
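The failure mode described above can be sketched in a few lines: Acrobat checks for the %PDF magic bytes at the start of the stream, so if the gzip-encoded bytes are handed over without being decoded first, the check fails. This is a hypothetical illustration, not the actual plugin-stream code:

```python
import gzip

# A minimal stand-in for a PDF, as Acrobat expects it to begin.
pdf_bytes = b"%PDF-1.4\n% fake document\n"

# What the server actually sends with Content-Encoding: gzip.
wire_bytes = gzip.compress(pdf_bytes)

# Acrobat's sanity check: the stream must begin with the %PDF magic.
print(wire_bytes.startswith(b"%PDF"))                    # False -- gzip framing
print(gzip.decompress(wire_bytes).startswith(b"%PDF"))   # True -- decoded first
```

The gzip container starts with its own magic bytes (0x1f 0x8b), so the undecoded stream can never pass the %PDF check.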
Component: General → Networking: HTTP
Product: Firefox → Core
Summary: All Adobe Acrobat PDF Loadings Fail From Local Address → Gzipped Transfer Encoding Renders Acrobat PDF Useless
Are you sure you mean Transfer-Encoding and not Content-Encoding?  Mozilla has
never handled Transfer-Encoding: gzip.
Assignee: nobody → darin
Status: REOPENED → NEW
QA Contact: general → networking.http
Jerry, can you post a sample URL where this problem occurs?
Yes, I meant Content-Encoding. Updating summary. I was confusing it with
Transfer-Encoding from mail.

Try this:
http://jerbaker.dhs.org/test.pdf
Summary: Gzipped Transfer Encoding Renders Acrobat PDF Useless → Gzipped Content-Encoding Renders Acrobat PDF Useless
Attached patch PatchSplinter Review
Assignee: darin → bzbarsky
Status: NEW → ASSIGNED
Blocks: 275516
OS: Windows XP → All
Priority: -- → P1
Hardware: PC → All
Summary: Gzipped Content-Encoding Renders Acrobat PDF Useless → [FIX]Gzipped Content-Encoding Renders Acrobat PDF Useless
Target Milestone: --- → mozilla1.8beta4
Attached patch Same as diff -wSplinter Review
So what's going on here is that the test URL in question sends no
Content-Length header. The plugin code failed to reach the code added in bug
165094 when that happened.  All the patch does is move the length check into
the one place it matters -- deciding whether to do byte-range requests.  This
means that we'll now look for Content-Encoding unconditionally, as well as
unconditionally set the last-modified date, which I think is desirable.
Attachment #189079 - Flags: superreview?(darin)
Attachment #189079 - Flags: review?(jst)
Blocks: 224296
My build of Apache isn't sending Content-Length headers when the transfer
encoding is chunked. I don't know whether that's right or not, however. It is
alpha code pulled from Apache's CVS (the 2.1 dev tree).
That is correct. Transfer-Encoding: chunked is specifically for unknown-length
transfers. They should not be seen together: if you have Content-Length, you
have no need for the extra work of chunking. :)
Yeah. It appears that anything under around 10k gets a Content-Length header,
but larger files are chunked. I assume that's somewhere around the buffer
size for mod_deflate: once the file to be compressed is larger than the
buffer, there's no way to know, at the time the request is answered, how many
times the buffer will fill and flush during the course of the request.
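The constraint described above can be sketched with Python's streaming compressor (standing in for mod_deflate, which is an assumption about its behavior rather than its actual code): the compressed size is only known after the last flush, so a streaming response cannot state Content-Length up front and must be chunked instead.

```python
import zlib, gzip

# wbits=31 selects the gzip container, matching Content-Encoding: gzip.
compressor = zlib.compressobj(wbits=31)
body_chunks = [b"x" * 8192, b"y" * 8192, b"z" * 8192]

sent = b""
for chunk in body_chunks:
    # Each call may emit zero or more bytes; the total is still unknown.
    sent += compressor.compress(chunk)
sent += compressor.flush()   # only now is the final length knowable

# The accumulated stream is a valid gzip entity for the whole body.
print(gzip.decompress(sent) == b"".join(body_chunks))   # True
```

This is why larger mod_deflate responses arrive chunked: the server starts sending compressed output before it can know the total.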
Given that the Content-Length header is generally optional, as I understand it,
we want to deal well even when it's absent...
"because range request on compressed content is irrelevant,"

Actually, Apache mod_gzip has (or had) a bug in it where it mistakenly treated
the range request as applying to the uncompressed document instead of the
compressed document :-(  Technically, range requests on an entity served with
Content-Encoding: gzip are perfectly reasonable.  A range request would be an
offset into the compressed document, which in this case is the entity.
Darin, I can update the comment to reflect that if you want.  Just let me
know...  If it's the plugin that makes the range requests, though, then they are
indeed "irrelevant", since the plugin only sees the uncompressed data.
The plugin is the one issuing the range requests, and I guess what you are
saying is that we need to suppress such range requests when the entity is
compressed, because known plugins (OK, Adobe Acrobat) cannot handle range
requests on compressed documents?
Right.  More precisely, by the time the plugin gets the data it's always
uncompressed (in the current setup), so it can't make meaningful range requests
on compressed documents -- it has no way to compute the ranges....
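The offset mismatch behind this exchange can be made concrete with a hypothetical document: a byte range applies to the entity (the compressed bytes on the wire), but the plugin only ever sees decompressed data, so any offsets it computes refer to the uncompressed document and do not map onto the entity.

```python
import gzip, hashlib

# A deterministic, hard-to-compress 16 KB document, so the gzip entity
# stays roughly the same size as the original.
doc = b"".join(hashlib.sha256(i.to_bytes(2, "big")).digest()
               for i in range(512))
entity = gzip.compress(doc)   # what the server actually serves

# A range like bytes=1000-1999 selects entity bytes, per the HTTP spec.
entity_slice = entity[1000:2000]

# The plugin's offsets refer to `doc`; the two slices are unrelated.
print(entity_slice == doc[1000:2000])
```

Hence the patch's choice: when Content-Encoding is present, simply don't let the plugin issue byte-range requests at all.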
OK, changing the comment to say that would probably be good :)
I can do that, no problem.
Comment on attachment 189079 [details] [diff] [review]
Same as diff -w

r=jst
Attachment #189079 - Flags: review?(jst) → review+
Comment on attachment 189079 [details] [diff] [review]
Same as diff -w

sr=darin (sorry for the delay)
Attachment #189079 - Flags: superreview?(darin) → superreview+
Summary: [FIX]Gzipped Content-Encoding Renders Acrobat PDF Useless → [FIXr]Gzipped Content-Encoding Renders Acrobat PDF Useless
Comment on attachment 189079 [details] [diff] [review]
Same as diff -w

This is a pretty safe fix that should make content-encoding play nice with
plugins in general...
Attachment #189079 - Flags: approval1.8b4?
Comment on attachment 189079 [details] [diff] [review]
Same as diff -w

What would be involved in creating a testcase for this behavior?
Attachment #189079 - Flags: approval1.8b4? → approval1.8b4+
Depends on what you want to test. If you want to see the difference in behavior
between encoding and no encoding, I can set my server up for that. It's just my
home PC on a DSL line, so it can't handle a huge amount of traffic, but it
should suffice for testing.
Bsmedberg, you mean a testcase that becomes part of our basic unit tests?  We
need an https-enabled server, preferably one capable of running CGIs (so that we
can control the headers) and which enables gzip compression (in a controllable
way) for the output of said CGIs.
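A CGI of the sort described above could be sketched as follows; the `enc=gzip` query parameter and the response-building helper are hypothetical names for illustration, not part of any actual test harness:

```python
import gzip

def respond(query_string):
    """Build CGI-style (headers, body); ?enc=gzip toggles the encoding."""
    body = b"%PDF-1.4\n% fake test document\n"
    headers = [("Content-Type", "application/pdf")]
    if "enc=gzip" in query_string:
        # Serve the same body compressed, with the matching header.
        body = gzip.compress(body)
        headers.append(("Content-Encoding", "gzip"))
    headers.append(("Content-Length", str(len(body))))
    return headers, body

headers, body = respond("enc=gzip")
print(dict(headers)["Content-Encoding"])            # gzip
print(gzip.decompress(body).startswith(b"%PDF"))    # True
```

Since the script controls its own headers, the same URL can exercise both the encoded and unencoded paths, which is exactly what the fix needs covered.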
Fixed.
Status: ASSIGNED → RESOLVED
Closed: 19 years ago19 years ago
Resolution: --- → FIXED
*** Bug 310871 has been marked as a duplicate of this bug. ***