Closed Bug 300438 • Opened 19 years ago • Closed 19 years ago

[FIXr]Gzipped Content-Encoding Renders Acrobat PDF Useless

Categories: Core :: Networking: HTTP, defect, P1
Status: RESOLVED FIXED
Target Milestone: mozilla1.8beta4
People: Reporter: mozilla; Assigned: bzbarsky

Attachments (2 files):
- patch, 4.16 KB
- patch, 2.98 KB (jst: review+, darin.moz: superreview+, benjamin: approval1.8b4+)
20050707 trunk: no PDFs load in Firefox anymore. I get a message from Acrobat that the file does not begin with %PDF, and then Firefox hangs at 99% CPU. The same links work in IE with no issues whatsoever.
Reporter
Comment 1•19 years ago
Updating summary
Summary: All PDF Loadings Fail in Firefox → All Adobe Acrobat PDF Loadings Fail in Firefox
Comment 2•19 years ago
WFM. Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8b3) Gecko/20050711 Firefox/1.0+ ID:2005071121
Reporter
Comment 3•19 years ago
Hmm. Seems to work as long as the PDF doesn't come from a loopback address. If the PDF is served from Apache and you access it via 127.0.0.1, you get an error. If you access it via the FQDN, it works fine. It's not a virtual-host issue, since it works fine in IE either way.
Status: NEW → RESOLVED
Closed: 19 years ago
Resolution: --- → WORKSFORME
Reporter
Comment 4•19 years ago
It doesn't even work from the FQDN if the IP is local. PDFs also load if you go through a proxy to reach the local server, but not if you access it directly. I can't sniff packets on a loopback interface, so I'm not sure how to troubleshoot this.
Status: RESOLVED → REOPENED
Resolution: WORKSFORME → ---
Summary: All Adobe Acrobat PDF Loadings Fail in Firefox → All Adobe Acrobat PDF Loadings Fail From Local Address
Reporter
Comment 5•19 years ago
OK. Tracked it down. Sorry for all the spam. Apparently the content transfer encoding is not being undone before handing the bytes off to the plugin. Of course, Acrobat has no idea what to do with gzipped data. I'm guessing this is a core issue.
Component: General → Networking: HTTP
Product: Firefox → Core
Summary: All Adobe Acrobat PDF Loadings Fail From Local Address → Gzipped Transfer Encoding Renders Acrobat PDF Useless
Comment 6•19 years ago
Are you sure you mean Transfer-Encoding and not Content-Encoding? Mozilla has never handled Transfer-Encoding: gzip.
Assignee: nobody → darin
Status: REOPENED → NEW
QA Contact: general → networking.http
Comment 7•19 years ago
Jerry, can you post a sample URL where this problem occurs?
Reporter
Comment 8•19 years ago
Yes, I meant Content-Encoding. Updating summary. I was confusing it with Transfer-Encoding in mail. Try this: http://jerbaker.dhs.org/test.pdf
Summary: Gzipped Transfer Encoding Renders Acrobat PDF Useless → Gzipped Content-Encoding Renders Acrobat PDF Useless
Assignee
Comment 9•19 years ago
Assignee: darin → bzbarsky
Status: NEW → ASSIGNED
Assignee
Updated•19 years ago
Blocks: 275516
OS: Windows XP → All
Priority: -- → P1
Hardware: PC → All
Summary: Gzipped Content-Encoding Renders Acrobat PDF Useless → [FIX]Gzipped Content-Encoding Renders Acrobat PDF Useless
Target Milestone: --- → mozilla1.8beta4
Assignee
Comment 10•19 years ago
So what's going on here is that the test URL in question sends no Content-Length header. The plugin code failed to reach the code added in bug 165094 when that happened. All the patch does is move the length check into the one place it matters -- deciding whether to do byte-range requests. This means that we'll now look for Content-Encoding unconditionally, as well as unconditionally setting the last-modified date, which I think is desirable.
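The control-flow change described above can be sketched roughly like this. This is a hypothetical Python illustration, not the actual Mozilla C++ patch: the function name `prepare_plugin_stream` and the returned fields are invented for the sketch. The point is that Content-Encoding detection and the last-modified date no longer hide behind a Content-Length check; the length only gates the byte-range decision.

```python
def prepare_plugin_stream(headers):
    """Decide how to hand an HTTP response to a plugin.

    `headers` is a dict of lowercased response header names to values.
    """
    info = {}

    # Unconditional: note whether the entity is content-encoded, so byte-range
    # requests can be suppressed for compressed documents.
    encoded = "content-encoding" in headers

    # Unconditional: record the last-modified date if the server sent one.
    info["last_modified"] = headers.get("last-modified")

    # The length check now only matters here: without a known length (or with
    # a compressed entity), byte-range requests make no sense for the plugin.
    length = headers.get("content-length")
    info["seekable"] = length is not None and not encoded
    info["length"] = int(length) if length is not None else None
    return info
```

With this shape, a response that omits Content-Length (like the chunked responses discussed below in this bug) still gets its encoding and date inspected; it simply becomes non-seekable.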
Attachment #189079 - Flags: superreview?(darin)
Attachment #189079 - Flags: review?(jst)
Reporter
Comment 11•19 years ago
My build of Apache isn't sending Content-Length headers when Transfer-Encoding is chunked. I don't know if that's right or not, however. It is alpha code pulled from Apache's CVS (the 2.1 dev tree).
Comment 12•19 years ago
That is correct. Transfer-Encoding: chunked is specifically for unknown-length transfers. The two should not be seen together: if you have Content-Length, you have no need for the extra work of chunking. :)
Reporter
Comment 13•19 years ago
Ya. It appears that anything under around 10k will have a Content-Length header, but larger files are chunked. I assume this is somewhere around the buffer size for mod_deflate: once the file to be compressed is larger than the buffer, there's no way to know, at the time the request is answered, how many times the buffer will fill and flush during the course of the request.
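The scenario in comments 12 and 13 can be illustrated with a small sketch. This is not Apache's mod_deflate code; the 8 KB buffer size and the function name are stand-ins. It shows why a server compressing as it streams cannot send Content-Length: the compressed size is only known after the whole input has gone through the compressor.

```python
import zlib

def gzip_stream(data, bufsize=8192):
    """Yield compressed chunks the way a streaming server might flush them."""
    comp = zlib.compressobj(wbits=31)  # wbits=31 selects the gzip container
    for i in range(0, len(data), bufsize):
        chunk = comp.compress(data[i:i + bufsize])
        if chunk:  # the compressor emits output at its own discretion
            yield chunk
    yield comp.flush()  # remaining buffered bytes, only available at the end

payload = b"%PDF-1.4 " + b"0" * 50_000
chunks = list(gzip_stream(payload))
# Each chunk would go out as one HTTP/1.1 chunk; the total compressed length
# is only the sum of the parts, known after the fact.
total = sum(len(c) for c in chunks)
```

Each yielded chunk maps naturally onto one HTTP/1.1 chunk, which is exactly the unknown-length case the Firefox patch has to handle.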
Assignee
Comment 14•19 years ago
Given that the Content-Length header is generally optional, as I understand, we want to deal well even when it's absent...
Comment 15•19 years ago
"because range request on compressed content is irrelevant" -- actually, Apache mod_gzip has (or had) a bug where it mistakenly treated the range request as applying to the uncompressed document instead of the compressed document :-( Technically, range requests on an entity served with Content-Encoding: gzip are perfectly reasonable: a range is an offset into the compressed document, which in this case is the entity.
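A small sketch of the semantics comment 15 is describing (the byte values and range size here are arbitrary illustrations, not from the bug): with Content-Encoding: gzip, the HTTP entity is the compressed byte stream, so a Range request addresses compressed bytes; the mod_gzip bug applied the range to the uncompressed document instead.

```python
import gzip

document = b"%PDF-1.4 hello world " * 100   # the uncompressed document
entity = gzip.compress(document)             # the entity is the compressed bytes

# Correct semantics: Range: bytes=0-9 means the first 10 bytes of `entity`
# (which begin with the gzip magic number, not with %PDF).
correct_range = entity[:10]

# The mod_gzip bug: the first 10 bytes of the uncompressed document, which is
# useless to a client trying to reassemble the compressed entity.
buggy_range = document[:10]
```

This is also why, as the following comments conclude, the plugin cannot issue meaningful range requests: it only ever sees the decompressed bytes and has no way to compute offsets into the entity.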
Assignee
Comment 16•19 years ago
Darin, I can update the comment to reflect that if you want. Just let me know... If it's the plugin that makes the range requests, though, then they are indeed "irrelevant", since the plugin only sees the uncompressed data.
Comment 17•19 years ago
The plugin is the one issuing the range requests, and I guess what you are saying is that we need to suppress such range requests when the entity is compressed, because known plugins (OK, Adobe Acrobat) cannot handle range requests on compressed documents?
Assignee
Comment 18•19 years ago
Right. More precisely, by the time the plugin gets the data it's always uncompressed (in the current setup), so it can't make meaningful range requests on compressed documents -- it has no way to compute the ranges....
Comment 19•19 years ago
OK, changing the comment to say that would probably be good :)
Assignee
Comment 20•19 years ago
I can do that, no problem.
Comment 21•19 years ago
Comment on attachment 189079 [details] [diff] [review]
Same as diff -w

r=jst
Attachment #189079 - Flags: review?(jst) → review+
Comment 22•19 years ago
Comment on attachment 189079 [details] [diff] [review]
Same as diff -w

sr=darin (sorry for the delay)
Attachment #189079 - Flags: superreview?(darin) → superreview+
Assignee
Updated•19 years ago
Summary: [FIX]Gzipped Content-Encoding Renders Acrobat PDF Useless → [FIXr]Gzipped Content-Encoding Renders Acrobat PDF Useless
Assignee
Comment 23•19 years ago
Comment on attachment 189079 [details] [diff] [review]
Same as diff -w

This is a pretty safe fix that should make content-encoding play nice with plugins in general...
Attachment #189079 - Flags: approval1.8b4?
Comment 24•19 years ago
Comment on attachment 189079 [details] [diff] [review]
Same as diff -w

What would be involved in creating a testcase for this behavior?
Attachment #189079 - Flags: approval1.8b4? → approval1.8b4+
Reporter
Comment 25•19 years ago
Depends on what you want to test. If you want to see the difference in behavior between encoding and no encoding, I can set my server up for that. It's just my home PC on a DSL line, so it can't handle a huge amount of traffic, but it should suffice for testing.
Assignee
Comment 26•19 years ago
Bsmedberg, you mean a testcase that becomes part of our basic unit tests? We need an https-enabled server, preferably one capable of running CGIs (so that we can control the headers) and which enables gzip compression (in a controllable way) for the output of said CGIs.
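A minimal sketch of the kind of server comment 26 asks for, simplified to plain HTTP with Python's standard library (no TLS or CGI; the handler class name, port, and placeholder payload are all invented here). It serves a gzip-encoded response with no Content-Length, which is precisely the combination that triggered this bug.

```python
import gzip
from http.server import BaseHTTPRequestHandler, HTTPServer

PDF_BYTES = b"%PDF-1.4\n%%EOF\n"  # tiny stand-in payload, not a real PDF

class GzipNoLengthHandler(BaseHTTPRequestHandler):
    """Serve a gzip-encoded body while deliberately omitting Content-Length."""

    def do_GET(self):
        body = gzip.compress(PDF_BYTES)
        self.send_response(200)
        self.send_header("Content-Type", "application/pdf")
        self.send_header("Content-Encoding", "gzip")
        # No Content-Length header: with this class's default HTTP/1.0
        # protocol, closing the connection delimits the body instead.
        self.end_headers()
        self.wfile.write(body)

# To serve manually (hypothetical port):
#   HTTPServer(("127.0.0.1", 8080), GzipNoLengthHandler).serve_forever()
```

A client that hands the body to a plugin without first undoing the Content-Encoding would reproduce the original symptom: Acrobat receives gzip bytes and complains that the file does not begin with %PDF.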
Assignee
Comment 27•19 years ago
Fixed.
Status: ASSIGNED → RESOLVED
Closed: 19 years ago → 19 years ago
Resolution: --- → FIXED
Comment 28•19 years ago
*** Bug 310871 has been marked as a duplicate of this bug. ***