Closed Bug 292407 Opened 20 years ago Closed 9 years ago

I get an error reading "document contains no data" whenever uploading to this website

Categories: Core :: Networking: HTTP (defect, severity: normal)

Tracking: RESOLVED WONTFIX

People: Reporter: michaelj; Unassigned

Details

User-Agent:       Mozilla/5.0 (Windows; U; Windows NT 5.0; rv:1.7.3) Gecko/20041001 Firefox/0.10.1
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 5.0; rv:1.7.3) Gecko/20041001 Firefox/0.10.1

This is a website where you can look up statistics on any city in the USA, and
you can also upload photos. 

Reproducible: Always

Steps to Reproduce:
1. Go to www.city-data.com
2. Click on the link that reads Do you have any pictures of this city?
Send them to us and we'll show them to thousands of people!
3. Fill out the information and check the appropriate boxes

Actual Results:  
I received an error window reading, "document contains no data"

Expected Results:  
The picture uploads normally, followed by a status message reading "your picture
has been uploaded successfully".
I think this may have something to do with the way Firefox handles (or doesn't
handle) errors when sending large requests to a webserver.

I've been having exactly this problem myself when trying to set a
LimitRequestBody directive in my apache 1.3.33 server config. This *should*
limit the permitted size of a request, but it causes the above symptoms with
Firefox.

Here's what happens:

* FF sends the request. In my case it is a multi-part POST which includes a file
too big for the limit set by LimitRequestBody.

* The server responds with the expected 413 error page. It then partially closes
the socket, causing a FIN packet to be sent to the browser. Although this means
that the server won't send any more data to the browser, it continues to accept
incoming data from the browser in order to try to avoid a situation that
happens later anyway (read on).

* Instead of displaying the returned 413 error page, FF ignores both
the 413 error page and the FIN packet and continues to send its request! The
server continues to soak it up (waiting for the connection to close).

* After a designated timeout period (30 seconds with Apache), the server decides
to give up waiting for the browser and closes the connection. When the TCP
stack at the server end then receives further incoming packets from the
browser, it responds with an RST packet. This is precisely the situation it
tried to prevent earlier, because the browser now believes the connection has
been unexpectedly reset. This causes the browser to discard any previous
response from the server (which is why no one ever sees an error page!) and pop
up the "Document contains no data" dialog.
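The server-side sequence described above can be sketched with a toy Python server (a minimal sketch with illustrative names and payloads, not Apache's actual implementation): send the error, half-close the write side so a FIN goes out, then keep draining whatever the client is still uploading.

```python
import socket
import threading

def reject_server(info, drained):
    """Accept one connection, reject it with a 413, half-close, then drain.

    `info` carries the OS-assigned port back to the caller; `drained`
    collects the sizes of body chunks soaked up after the rejection.
    """
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))           # OS-assigned port
    srv.listen(1)
    info["port"] = srv.getsockname()[1]
    info["ready"].set()
    conn, _ = srv.accept()
    conn.recv(4096)                      # read (at least) the request headers
    conn.sendall(b"HTTP/1.1 413 Request Entity Too Large\r\n"
                 b"Content-Length: 0\r\n\r\n")
    conn.shutdown(socket.SHUT_WR)        # half-close: FIN goes out, reads stay open
    while True:                          # keep soaking up the client's body
        data = conn.recv(65536)
        if not data:                     # client finished (or gave up)
            break
        drained.append(len(data))
    conn.close()
    srv.close()
```

A client that keeps uploading after the FIN is exactly the Firefox behaviour reported here; the server above only avoids the RST as long as the client finishes (or half-closes in turn) before the server's patience runs out.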

Although I've not tested it, I suspect FF will also behave the same with server
errors other than 413. IE suffers from the same problem. Interestingly though,
Opera waits for the server to send a "100 Continue" before attempting to upload
the rest of the request and as such it handles errors fine. Maybe this is the
way forward for Firefox?

There's a bug in Apache's Bugzilla tracking the same problem (but from the
server side obviously) at:
    http://issues.apache.org/bugzilla/show_bug.cgi?id=17722
it contains an Ethereal dump of packets showing exactly this behavior.

I'd be interested to hear what the FF developers think about all this.
(Excellent piece of software btw! :o) I've even bought one of your mugs to show
my support!)
*** Bug 297320 has been marked as a duplicate of this bug. ***
Does this still happen, and is there an RFC that Firefox is violating?

If the server has decided to send no more data, then it will not send the 100 Continue response.
Just tested with FF 1.5.0.3 and the problem still exists.

>If the server has decided to send no more data, then it will not send the 100
>Continue response.

I think you're missing the point a bit. AFAIK, the server will only respond with 100 Continue if this has been requested anyway. The problem isn't that the server doesn't return 100 Continue, it's that FF just keeps on sending a request despite the server's error response saying that the request is too big.

I only suggested that FF request a 100 Continue response (like Opera does) in order to give FF a chance to check for an error returned from the server.
(In reply to comment #4)
> Just tested with FF 1.5.0.3 and the problem still exists.
> 
> >If the server has decided to send no more data, then it will not send 
> >the 100 Continue response.
> 
> I think you're missing the point a bit. AFAIK, the server will only respond
> with 100 Continue if this has been requested anyway. The problem isn't that 
> the server doesn't return 100 Continue, it's that FF just keeps on sending a
> request despite the server's error response saying that the request is too 
> big.

I am certain that I am missing the point! I was merely asking for more 
detailed information. I haven't yet seen Firefox repeatedly sending 
over-sized requests after being sent a pertinent error response, but that does 
sound like a fixable problem.
(In reply to comment #5)
> I am certain that I am missing the point! I was merely asking for more 
> detailed information. I haven't yet seen Firefox repeatedly sending 
> over-sized requests after being sent a pertinent error response, but that does 
> sound like a fixable problem

Right, well, I think that FF is ignoring the error response from the server because of a flaw in the way in which it sends its HTTP requests.

I'm not familiar with the code that sends the requests, but I *assume* that what is happening is that FF assembles and sends the *entire* request (this includes headers, multipart MIME encoding boundaries, form submission data, file data, et al.) and only *after* sending the entire request does it check the response from the server.

This normally works fine, except in the following circumstance: if the request is bigger than the server allows (because, for example, it includes a large file), the server responds immediately after the request *headers* are received (i.e., before the data arrives) with "413 Request Entity Too Large". FF obviously receives and buffers this response, but doesn't actually check it (because it hasn't finished sending the whole request) and continues to send the request. Having sent an error, the server waits for the browser to stop sending the request but eventually gives up (times out) and resets the connection.

These assumptions are based on 2 things:
1) Observed packet dumps.
   see http://issues.apache.org/bugzilla/show_bug.cgi?id=17722
   (actually from FF 1.0.7, but I don't think anything has changed)
2) Observed behaviour:
   If you upload a file that is too big for the server, but small enough that the entire request can be uploaded before the server times out (30 seconds with Apache), you actually do get to see the 413 error message returned by Apache! If the browser can't upload the file in this time, though, you get the familiar "connection reset" message. I have tested this with FF 1.5.0.3.

As for RFCs, RFC 2616 (http://www.ietf.org/rfc/rfc2616.txt) states:

"10.4.14  413 Request Entity Too Large
   The server is refusing to process a request because the request
   entity is larger than the server is willing or able to process. The
   server MAY close the connection to prevent the client from continuing
   the request."

So there is certainly an implication here that the above behaviour should "prevent the client from continuing the request".
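The client-side fix implied by that RFC text can be sketched as follows (a hypothetical helper, not Firefox's actual necko code; the function name and chunk size are illustrative assumptions): between body chunks, poll the socket for an early server response and abort the upload if one has already arrived.

```python
import select
import socket

def send_body_with_early_abort(sock, body, chunk=4096):
    """Send `body` in chunks, but stop and return any response that arrives
    early (e.g. a 413 error page).  Returns b"" if the whole body was sent
    with no early response from the server."""
    for off in range(0, len(body), chunk):
        readable, _, _ = select.select([sock], [], [], 0)
        if readable:                     # the server has already replied:
            return sock.recv(65536)      # abort the upload and surface it
        sock.sendall(body[off:off + chunk])
    return b""
```

With a helper like this, a 413 sent after the headers would be noticed within one chunk, and the error page could be rendered instead of the "Document contains no data" dialog.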
That is a lot clearer.

Clearly, Firefox ought to anticipate that a server might close the connection 
(e.g. in the face of Firefox sending a request after the server has asked 
it not to).

This should not be regarded as unexpected.

I would agree that this looks like a (perhaps minor) standards violation within
necko, and it is to be hoped that someone with the power to fix it comes along.

It looks to me as though Firefox should either i) wait for a response after
sending the headers (in at least some cases) or ii) regard the 'connection
close' as not abnormal and process the data received.

Arguably, we should do the same as you report for Opera, or at least document
why not.

Uploading files, especially large files, is not a frequent use case for a 
browser, and performance ought not to be the priority here.
*** Bug 309705 has been marked as a duplicate of this bug. ***
is Bug 311192 a dup?
confirming (not finding a duplicate)
Status: UNCONFIRMED → NEW
Component: General → Networking: HTTP
Ever confirmed: true
Product: Firefox → Core
QA Contact: general → networking.http
Version: unspecified → Trunk
related to (or duplicate of) Bug 213339?  (is bug 135182 in the same vein?)
No progress on this?

I believe the problem here is simply that FF doesn't use the "100 continue" mechanism during HTTP POST requests.

As I understand the specs, the "Expect: 100-continue" header (in the request), which should result in an intermediate "HTTP/1.1 100 Continue" response from the server, was designed specifically for this situation.

I think FF should make use of this mechanism when POSTing. Not using this mechanism should be considered a bug, because it causes a valid server error message (and possibly accompanying error page) not to be shown.
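The handshake suggested above can be sketched in a few lines (a hypothetical illustration of RFC 2616 section 8.2.3, not Firefox's actual networking code; both function names are assumptions): send the headers first with the Expect field, then upload the body only if the interim "100 Continue" status arrives.

```python
def build_post_headers(host, path, content_length):
    """Headers-only first leg of the request: ask permission to send the body."""
    return ("POST {} HTTP/1.1\r\n"
            "Host: {}\r\n"
            "Content-Length: {}\r\n"
            "Expect: 100-continue\r\n"
            "\r\n").format(path, host, content_length).encode("ascii")

def should_send_body(status_line):
    """Per RFC 2616 s8.2.3: upload the body only after an interim 100 response.
    Any final status (413, 403, ...) means abort the upload and show it."""
    parts = status_line.split(None, 2)      # e.g. "HTTP/1.1 100 Continue"
    return len(parts) >= 2 and parts[1] == "100"
```

This is essentially the behaviour described above for Opera: the server's 413 would be read before a single byte of the file is sent.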
OS: Windows 2000 → All
Hardware: PC → All
Status: NEW → RESOLVED
Closed: 9 years ago
Resolution: --- → WONTFIX