Bug 184452 - Necko - Allow handling of files > 2gig (>2 GB)
Status: RESOLVED FIXED
Product: Core
Classification: Components
Component: Networking
Version: Trunk
Hardware: All All
Importance: -- normal with 69 votes
Target Milestone: Future
Assigned To: Nobody; OK to take it and work on it
QA Contact: benc
Triage Owner: Patrick McManus [:mcmanus]
Duplicates: 131439 205443 215091 225866 226391 229979 231788 242859 245115 248482 256338 260859 277785 286187 288939 290236 293036 293615 301543 304161 308424 311344 317085 336001 350903
Depends on: 238853 243974 264599
Blocks: 207400 215450 NegativeDownload 232371 263967
Reported: 2002-12-09 09:41 PST by Doug Turner (:dougt)
Modified: 2013-06-28 21:30 PDT
CC: 74 users


Attachments
Download Screenshot (16.12 KB, image/jpeg), 2005-03-11 19:37 PST, Marc
Firefox 3.5 passed the 2GB mark successfully (18.34 KB, image/png), 2009-10-17 21:00 PDT, Joe Amenta

Description Doug Turner (:dougt) 2002-12-09 09:41:11 PST
we may need to create a 64 bit version of the necko interfaces.
Comment 1 Darin Fisher 2002-12-09 09:48:59 PST
the only way to really support >2G downloads would be to switch all interfaces
over and make those interfaces be the only way to do things.  this means
deprecating several key xpcom/necko interfaces (namely nsIInputStream and
nsIStreamListener).
Comment 2 David Bradley 2002-12-09 09:53:00 PST
Someone needs to really see how common these large files are now, and/or gauge
when they will be common. I've seen one game demo that broke the 1gig mark, and
I suspect this to be more and more common. That's still not 2 gigs, but I'm sure
it's coming in the not too distant future.

Another alternative is to do something similar to what Microsoft did. Provide
alternatives that allow a secondary 32 bit value to be passed. This isn't as
clean, but allows existing code to work unchanged if it's known it doesn't have
to deal with such large files.
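
For illustration, here is a minimal C++ sketch of the "secondary 32 bit value" approach described above, using the documented Win32 SetFilePointer signature (the helper name SeekTo is hypothetical; new Win32 code would normally use SetFilePointerEx with a LARGE_INTEGER instead):

  #include <windows.h>
  #include <cstdint>

  // Seek to a 64-bit offset through a 32-bit API by passing the upper
  // 32 bits in a secondary LONG parameter.
  bool SeekTo(HANDLE file, int64_t offset) {
      LONG low  = static_cast<LONG>(offset & 0xFFFFFFFF);
      LONG high = static_cast<LONG>(offset >> 32);
      DWORD result = SetFilePointer(file, low, &high, FILE_BEGIN);
      // INVALID_SET_FILE_POINTER is also a valid low half, so GetLastError()
      // must be consulted to distinguish an error from a real offset.
      return !(result == INVALID_SET_FILE_POINTER && GetLastError() != NO_ERROR);
  }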
Comment 3 Christian :Biesinger (don't email me, ping me on IRC) 2002-12-09 12:23:12 PST
darin: are those interfaces frozen?
Comment 4 Darin Fisher 2002-12-09 13:14:29 PST
biesi: last i checked ;-)
Comment 5 benc 2003-03-25 08:12:28 PST
Shouldn't this really be "allow 2GB filesizes via 64 bit interfaces" ?
Comment 6 David Bradley 2003-03-25 08:15:22 PST
I guess it depends on what you mean by 64 bit interfaces. Win32 is a 32 bit
API, yet it provides 32 bit API functions that handle access to files larger than 2 gigs.
Comment 7 Darin Fisher 2003-05-12 22:48:17 PDT
*** Bug 205443 has been marked as a duplicate of this bug. ***
Comment 8 Frank Wein [:mcsmurf] 2003-08-08 11:03:51 PDT
*** Bug 215450 has been marked as a duplicate of this bug. ***
Comment 9 Frank Wein [:mcsmurf] 2003-08-08 11:11:41 PDT
*** Bug 131439 has been marked as a duplicate of this bug. ***
Comment 10 David Bradley 2003-08-08 11:32:10 PDT
Going to make this more general, so that it can apply to upload as well as download.
Comment 11 Boris Zbarsky [:bz] (still a bit busy) 2003-11-16 01:54:25 PST
*** Bug 215091 has been marked as a duplicate of this bug. ***
Comment 12 Boris Zbarsky [:bz] (still a bit busy) 2003-11-16 01:54:41 PST
*** Bug 225866 has been marked as a duplicate of this bug. ***
Comment 13 Bill Mason 2003-11-20 22:57:50 PST
*** Bug 226391 has been marked as a duplicate of this bug. ***
Comment 14 David Bradley 2003-11-22 20:40:58 PST
Well I guess we've arrived at the not too distant future
Comment 15 Darin Fisher 2003-11-22 22:17:30 PST
Inf / 10 == Inf :(
Comment 16 Christian :Biesinger (don't email me, ping me on IRC) 2003-11-23 07:55:13 PST
ok, list of frozen interfaces that would require changes for files >2gb:

nsIChannel.idl:
  attribute long contentLength;

nsIStreamListener:
 67                          in unsigned long aOffset,
 68                          in unsigned long aCount);
(parameters of onDataAvailable)

nsIFile.idl: surprisingly this requires no changes.

nsIInputStream.idl: several functions + nsWriteSegmentFun
nsIOutputStream.idl: basically same as nsIInputStream

nsIScriptableInputStream:
 49     unsigned long available(); 
 55     string read(in unsigned long aCount); 

-afaik this is a complete list of the frozen interfaces that would require changes-

of the unfrozen ones, nsIWebProgressListener.idl comes to mind, but most likely
there are others.
Comment 17 Christian :Biesinger (don't email me, ping me on IRC) 2003-11-23 08:05:36 PST
unfrozen ifaces in xpcom:
nsIAsync{Input,Output}Stream: in unsigned long aRequestedCount
nsIByteArrayInputStream.idl: (maybe)
NS_NewByteArrayInputStream (nsIByteArrayInputStream ** aResult, char * buffer,
unsigned long size);
(size would be the part to change, but who would create a byte input stream with
more than 4 GB?)

nsIObjectInputStream: unlikely (putBuffer(in charPtr aBuffer, in PRUint32 aLength))

nsIObservableOutputStream.idl:     void onWrite(in nsIOutputStream outStr, in
unsigned long amount);

nsIPipe.idl: segmentSize, segmentCount

nsISeekableStream.idl:
    void seek(in long whence, in long offset);
    unsigned long tell();

nsIStringStream.idl: similar to nsIByteArrayInputStream
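
As a rough C++ illustration of what widening the interfaces listed in the two comments above amounts to (all type names here are hypothetical, not Mozilla's actual revised interfaces): offsets and total lengths move to fixed 64-bit integers, while per-call chunk counts can stay 32-bit.

  #include <cstdint>

  // 32-bit flavour: a signed content length tops out just below 2 GiB,
  // and a 32-bit offset wraps above 4 GiB.
  struct Listener32 {
      virtual void OnDataAvailable(uint32_t aOffset, uint32_t aCount) = 0;
      virtual ~Listener32() = default;
  };

  // 64-bit flavour: the stream offset and the channel's content length are
  // 64-bit, so multi-gigabyte transfers stay representable end to end.
  struct Listener64 {
      virtual void OnDataAvailable(uint64_t aOffset, uint32_t aCount) = 0;
      virtual ~Listener64() = default;
  };

  struct Channel64 {
      int64_t contentLength = -1;  // -1 keeps its role as "length unknown"
  };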
Comment 18 Manoj 2003-11-27 01:58:59 PST
*** Bug 215450 has been marked as a duplicate of this bug. ***
Comment 19 Christian :Biesinger (don't email me, ping me on IRC) 2003-12-19 08:30:52 PST
*** Bug 228968 has been marked as a duplicate of this bug. ***
Comment 20 Robert Accettura [:raccettura] 2003-12-19 16:44:51 PST
I think we're going to see more 2GB+ downloads in the future. As said earlier,
some gaming demos are already creeping up to the 1GB mark; 2GB is only a
matter of time.

DVD images could also be that large.
Comment 21 Frank Wein [:mcsmurf] 2004-01-03 12:59:21 PST
*** Bug 229979 has been marked as a duplicate of this bug. ***
Comment 22 David :Bienvenu 2004-01-22 10:53:27 PST
I think that list of frozen interfaces is wrong - if unsigned longs are used,
that gives us 4GB, not 2GB. The content length is an issue, though happily, not
for mailnews, since we only open streams on parts of a file :-)
Comment 23 Christian :Biesinger (don't email me, ping me on IRC) 2004-01-22 12:54:09 PST
sure, 4 GB are better than 2 GB, but I don't think we should limit these APIs to
4 GB either.
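
The 2 GB versus 4 GB distinction in the two comments above is just signed versus unsigned 32-bit storage; a small self-contained C++ example (standard library only):

  #include <cstdint>
  #include <cstdio>

  int main() {
      const int64_t fileSize = 2500000000LL;  // a 2.5 GB DVD image

      // Signed 32-bit field (the 2 GiB limit): the value no longer fits and
      // the truncated result is negative.
      int32_t asSigned32 = static_cast<int32_t>(fileSize);
      // Unsigned 32-bit field (the 4 GiB limit): 2.5 GB still fits, but a
      // 5.5 GB image would wrap around.
      uint32_t asUnsigned32 = static_cast<uint32_t>(fileSize);

      printf("int32:  %d\n", asSigned32);      // -1794967296
      printf("uint32: %u\n", asUnsigned32);    // 2500000000
      printf("int64:  %lld\n", (long long)fileSize);
      return 0;
  }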
Comment 24 Stefan Huszics 2004-03-02 03:04:08 PST
2GB -> 4GB is a bandaid, not a fix
For starters a DVD image is easily above 4 GB...

48 or 64bit sizes feels like the way to go here, unless people want to keep
revisiting this issue every 6 months.
Comment 25 David :Bienvenu 2004-04-21 15:15:45 PDT
nsISeekableStream now supports 64 bit streams (though some implementations will
truncate at 32 bits and ASSERT)
Comment 26 WD 2004-05-06 20:57:57 PDT
*** Bug 242859 has been marked as a duplicate of this bug. ***
Comment 27 Bill Mason 2004-05-30 15:53:42 PDT
*** Bug 245115 has been marked as a duplicate of this bug. ***
Comment 28 Bogdan Stroe 2004-06-20 09:49:29 PDT
Is bug 247599 related to this? Probably its the same issue of 32 vs 64 bits
representation, but maybe in a different part of Mozilla.
Comment 29 Christian :Biesinger (don't email me, ping me on IRC) 2004-06-20 12:35:37 PDT
(In reply to comment #28)
> Is bug 247599 related to this? Probably its the same issue of 32 vs 64 bits
> representation, but maybe in a different part of Mozilla.

um, 4 MB can fit easily into a 32 bit variable. that bug is not related.
Comment 30 Mike Connor [:mconnor] 2004-06-24 07:50:16 PDT
*** Bug 248482 has been marked as a duplicate of this bug. ***
Comment 31 Peter van der Woude [:Peter6] 2004-08-20 14:04:14 PDT
*** Bug 256338 has been marked as a duplicate of this bug. ***
Comment 32 Jo Hermans 2004-09-21 15:03:13 PDT
*** Bug 260859 has been marked as a duplicate of this bug. ***
Comment 33 Marek Beyer 2004-10-07 02:44:18 PDT
Here is an example of an application:

- a web-site with a form to choose software/update packages
- after submitting, the server sends you an ISO image ready for burning to CD/DVD

Problems:

- the Content-Length header is limited to 2GB in Mozilla (some browsers: 4GB)
- without a Content-Length header, the download stops after 2.4GB (of 5.5GB)

So it's time for the future :) or we have to use CDs forever.
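
A small C++ sketch of the Content-Length problem described above: the header value itself is fine on the wire; the damage happens when it is stored in a 32-bit field (the parsing code below is illustrative only, not Mozilla's):

  #include <cstdint>
  #include <cstdio>
  #include <cstdlib>

  int main() {
      const char* header = "5500000000";   // Content-Length of a 5.5 GB image

      int64_t len64 = strtoll(header, nullptr, 10);    // survives intact
      int32_t len32 = static_cast<int32_t>(len64);     // silently truncated

      printf("64-bit: %lld\n", (long long)len64);  // 5500000000
      printf("32-bit: %d\n", len32);               // 1205032704 (~1.2 GB)
      return 0;
  }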
Comment 34 David Bradley 2004-10-07 05:51:35 PDT
I really think this needs some serious attention. This will be just another 
excuse for people not to use Gecko based browsers. I'm sure in intranet 
environments such large files are going to be more and more common.
Comment 35 Bill Mason 2004-10-27 14:17:22 PDT
*** Bug 266323 has been marked as a duplicate of this bug. ***
Comment 36 Olivier Vanderstraeten 2004-11-12 10:49:20 PST
I just ran into this bug.

While I'm just a user, I'm commenting to agree with the fact that while
currently uncommon, files in excess of 2 GB (or 4 GB) will be seen with
increasing regularity.  In my case, FC3 DVD ISO at 2.4 GB.  I realize the bug is
mostly cosmetic, but everything else in Mozilla/Firefox is so polished that it
really stood out.
Comment 37 Aleksanteri Aaltonen 2004-11-12 14:03:32 PST
This probably will never get fixed, although it's constantly being reported as a
new bug (even I couldn't find it on Bugzilla the first time I reported it).

So much for the 1.0 hype:
- Tabs still open to windows which have no toolbars or tab bars, making it
impossible to use them (i.e. opening links in a popup window)
- DM does not behave correctly with files larger than 4GB
- DM retry function seldom works
- Crashes with known _overflow_ exploits
(http://lcamtuf.coredump.cx/mangleme/gallery/)
- on win32, does not recover quickly from being minimized for a few hours (over
night); it's rather funny how I can leave Opera or IE windows open and start
using them in the morning with no lag whatsoever, but with Mozilla? no no.
Comment 38 logan 2004-11-12 14:10:20 PST
This is not a Firefox download manager tracking bug nor is it a place for you to
rant about the hype surrounding Firefox 1.0 and the problems you've had with it.
Comment 39 Aleksanteri Aaltonen 2004-11-12 14:27:56 PST
Oh, sorry. I thought this was the Firefox product. When I reported the bug I
put it under Firefox/DM, but this seems to be general Browser/Networking.

Once again, I'm truly sorry for the confusion here.
Comment 40 Phil Ringnalda (:philor) 2004-11-29 15:57:37 PST
*** Bug 272315 has been marked as a duplicate of this bug. ***
Comment 41 Peter van der Woude [:Peter6] 2005-01-10 10:29:58 PST
*** Bug 277785 has been marked as a duplicate of this bug. ***
Comment 42 Marc 2005-03-11 19:37:25 PST
Created attachment 177200 [details]
Download Screenshot

Screenshot of this bug in action
Comment 43 Frank Wein [:mcsmurf] 2005-03-12 01:08:11 PST
thewulf@gmail.com: That's Bug 228968 AND DON'T ATTACH ANY SCREENSHOTS ON THAT
BUG EITHER. We really know it's displaying negative values, no need for more
screenshots.
Comment 44 Eero Volotinen 2005-03-12 02:52:37 PST
This bug has been open for a very long time; it should be fixed _fast_.

Just my .5 cents.
Comment 45 Marc 2005-03-12 05:13:27 PST
I just didn't see any existing SS so I figured might as well. No biggie.
Comment 46 Kevin Brosnan 2005-03-14 21:49:44 PST
*** Bug 286187 has been marked as a duplicate of this bug. ***
Comment 47 Matthias Versen [:Matti] 2005-04-04 15:29:42 PDT
*** Bug 288939 has been marked as a duplicate of this bug. ***
Comment 48 Gabriel Chadwick 2005-04-05 18:59:57 PDT
*** Bug 288939 has been marked as a duplicate of this bug. ***
Comment 49 Matthias Versen [:Matti] 2005-04-13 15:25:57 PDT
*** Bug 290236 has been marked as a duplicate of this bug. ***
Comment 50 Andreas Fink 2005-04-13 15:48:05 PDT
I have already run into this bug multiple times: once with a Fedora DVD image and once
with Mac OS X DVD disk images. Both are around 2.5GB. So it's definitely time
to fix this.

It is VERY ANNOYING if you download a 2.5GB file (took me 4h) and you end up
with a 2.0GB file without any error message whatsoever. You will burn it to DVD,
try to boot it. You will burn it again and again until you realize your image
file is too small for what it should be.

So more time has already been wasted on this than it would take to fix it :).
Comment 51 Matthias Versen [:Matti] 2005-05-05 11:54:36 PDT
*** Bug 293036 has been marked as a duplicate of this bug. ***
Comment 52 Jaime Mitchell (use bugmail@jaimem.org.uk for email) 2005-05-10 07:17:00 PDT
*** Bug 293615 has been marked as a duplicate of this bug. ***
Comment 53 dredd 2005-05-30 07:25:29 PDT
2.5 Years into this bug, and I still have to go use "some other browser" if I
want to download DVD ISOs (for things like Linux, etc.). 

Just another user voting that this really really needs to get fixed at some
point in the near future.
Comment 54 Christian :Biesinger (don't email me, ping me on IRC) 2005-05-30 09:16:01 PDT
(In reply to comment #53)
> 2.5 Years into this bug, and I still have to go use "some other browser" if I
> want to download DVD ISOs (for things like Linux, etc.). 

that is fixed (in versions newer than 1.0.x). my understanding is that this bug
refers to other places as well, not just downloads, and not all of those are fixed.
Comment 55 Andreas Fink 2005-05-30 11:37:45 PDT
(In reply to comment #54)
> (In reply to comment #53)
> > 2.5 Years into this bug, and I still have to go use "some other browser" if I
> > want to download DVD ISOs (for things like Linux, etc.). 
> 
> that is fixed (in versions newer than 1.0.x). my understanding is that this bug
> refers to other places as well, not just downloads, and not all of those are
fixed.

This is not true. It is not fixed. Of course you can download 2.5GB files with
Firefox. The download just stops after 2048MB and you THINK it has downloaded
everything. It would have been nice to have a dialog box pop up at the
beginning saying the file is too big or such. But no, you have to wait hours and
hours to realize that your whole download is wasted bandwidth.

A very SERIOUS bug, especially since it's so old by now.
I had this again in Firefox 1.0.2. And I'm sure it's still there in 1.0.4 (and
no, I won't try it until someone confirms it's fixed, as downloading >2GB takes an
awful lot of time for me).
Comment 56 Robert Parenton 2005-05-30 11:54:29 PDT
He said newer than 1.0.x, which means 1.1, which is still in alpha
Comment 57 tachyon.eagle 2005-05-31 09:22:17 PDT
Yes, I can confirm that it's fixed for the 1.1 tree. I am using a nightly snapshot of
Deer Park alpha 1 (2005.05.30) and I've just downloaded a 2.5 GB ISO which passes
its own CRC test OK. However, I haven't found a > 4GB file on a fast nearby network,
so I don't know anything about > 4GB files yet, but if there are 64-bit
interfaces already (and it seems like that from the source code to me) it will be
OK too.
Comment 58 Christian :Biesinger (don't email me, ping me on IRC) 2005-05-31 10:18:55 PDT
> but if there are 64bit
> interfaces already (and it seems like that from source codes to me)

yeah, for downloads there are. not for uploads... and maybe some other stuff too. 
Comment 59 Ethan T 2005-06-08 13:29:10 PDT
(In reply to comment #55)
> And I'm sure it's still there in 1.0.4

I am using 1.0.4, and I just downloaded FC3 via FTP, which is 2.3GB. The
download counter went negative near the end, but the download was still valid.
Firefox produced a 2.3GB file that contains data throughout the entire file, so
I believe it's legit. That would suggest the bug is /partially/ fixed.
Comment 60 Andreas Fink 2005-06-10 15:32:39 PDT
I tried the same but in my case it used HTTP instead of FTP.
I have a file of 2117734496 bytes (1.97GB) instead of 2.7GB as it should be.
So the protocol type does make a difference.
Comment 61 Bojan Antonovic 2005-06-11 02:03:09 PDT
(In reply to comment #57)
> Yes, I can confirm that it's fixed for the 1.1 tree. I am using a nightly snapshot of
> Deer Park alpha 1 (2005.05.30) and I've just downloaded a 2.5 GB ISO which passes
> its own CRC test OK. However, I haven't found a > 4GB file on a fast nearby network,
> so I don't know anything about > 4GB files yet, but if there are 64-bit
> interfaces already (and it seems like that from the source code to me) it will be
> OK too.

Debian 3.1 is >4 GB. See:
http://cdimage.debian.org/debian-cd/3.1_r0a/i386/iso-dvd/debian-31r0a-i386-binary-1.iso

(or use a mirror)

Firefox 1.0.4 stops at 2 GB.
Comment 62 tachyon.eagle 2005-06-11 04:30:26 PDT
(In reply to comment #61)
> Firefox 1.0.4 stops at 2 GB.

Yes, indeed, but use the 1.1 version tree (
http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/latest-trunk/ ), which
has it fixed by now, and you'll see that it works for > 4 GB with FTP or HTTP
(I've tried your ISO link with both and ran verification on the files; they are
absolutely OK and over 4 GB).
Comment 63 Ken Johanson 2005-06-16 17:22:46 PDT
It appears that the download manager.. or at least support for byte-range-resume
does not work yet. When I try to resume ('retry' as download mgr calls it), the
download manager app does not actually send the new (range) request...

sample: http://up.ascentmedia.com/upweb/test.jsp?file=FC4-i386-DVD.iso
Comment 64 Ken Johanson 2005-06-16 18:33:34 PDT
(In reply to comment #63)
> .. or at least support for byte-range-resume does not work yet.

Seems to be true for files less than 2GB as well.
Comment 65 Elmar Ludwig 2005-07-21 04:09:18 PDT
*** Bug 301543 has been marked as a duplicate of this bug. ***
Comment 66 Ryan Flint [:rflint] (ping via IRC for reviews) 2005-09-13 22:04:35 PDT
*** Bug 308424 has been marked as a duplicate of this bug. ***
Comment 67 jrl 2005-09-29 19:04:32 PDT
*** Bug 231788 has been marked as a duplicate of this bug. ***
Comment 68 Heinz Stüßer 2005-10-01 07:50:07 PDT
I think there is another serious problem - and reason for fixing this bug. When
trying to download files > 4GB (for example
ftp://sunsite.informatik.rwth-aachen.de/pub/Linux/suse/i386/9.3/iso/SUSE-9.3-Eval-DVD.iso)
the browser (firefox 1.0.7, running under SuSE Linux 9.1) crashes when reaching
the 4GB border. I suppose the reason might be a "division by zero", as the
average transfer rate flipped to negative numbers after passing 2GB and then grew
until it reached zero, and the browser crashed.
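
A hypothetical C++ sketch of the failure mode guessed at above (not Mozilla's actual progress code; the variable names are made up): a wrapped 32-bit byte counter drives the average rate through negative values down to zero, and an unguarded division by that rate, e.g. for an ETA, fails right at the 4 GB border.

  #include <cstdint>
  #include <cstdio>

  int main() {
      int64_t actualBytes = 4294967296LL;                   // 4 GiB transferred
      int32_t counted = static_cast<int32_t>(actualBytes);  // wraps to 0
      int32_t elapsed = 10800;                              // seconds
      int32_t rate = counted / elapsed;                     // 0 once wrapped

      int32_t remaining = 500000000;                        // bytes left
      if (rate != 0) {
          printf("eta: %d s\n", remaining / rate);
      } else {
          printf("rate wrapped to 0; dividing by it here would crash\n");
      }
      return 0;
  }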
Comment 69 Andrew Schultz 2005-10-01 09:36:21 PDT
Heinz: that's a different problem. Please file a new bug if you can reproduce it
with Firefox 1.5 beta.
Comment 70 Kevin Brosnan 2005-10-20 06:43:04 PDT
*** Bug 311344 has been marked as a duplicate of this bug. ***
Comment 71 Bojan Antonovic 2005-11-07 23:57:24 PST
Firefox RC1 (Mac) cuts files over 4 GB down to 4 GB when "downloading" from a file on the hard disk. The Debian ISOs are over 4 GB.

Bojan
Comment 72 Bojan Antonovic 2005-11-08 00:41:47 PST
Firefox 1.0.7 (Mac) cuts files over 4 GB down to 4 GB, like Firefox 1.5 RC1, when "downloading" from a file on the hard disk. It shows the negative size of the downloaded part as described elsewhere in this bug report.

The local download is a good trick to test the download manager. Can someone please, ideally with a local network at home, retest the 4 GB limit with all download methods (HTTP, FTP, hard disk) before Firefox 1.5 is released? Tests with HTTP and FTP seem to have been done and to work. However, this bug should be fixed no matter how unusual the download method is.

Bojan

PS: I meant Firefox 1.5 RC1 instead of Firefox RC1 on my previous post. :)
Comment 73 Ryan Flint [:rflint] (ping via IRC for reviews) 2005-11-19 00:16:05 PST
*** Bug 317085 has been marked as a duplicate of this bug. ***
Comment 74 Bojan Antonovic 2005-12-05 07:12:45 PST
Local-to-local downloads are still limited to 4 GB in Firefox 1.5.

Bojan
Comment 75 Ryan Flint [:rflint] (ping via IRC for reviews) 2005-12-13 10:29:29 PST
*** Bug 320136 has been marked as a duplicate of this bug. ***
Comment 76 Mike 2006-03-21 14:25:41 PST
This is still broken in Firefox 1.5... I just spent the past couple of hours trying to download the DVD image of Fedora Core 5 (~3GB in size), and it was running smoothly the entire time. Once it hit 2GB though, it just stopped and Firefox said the download was finished when in fact it really wasn't. The download manager showed the correct status and file size, but gave me no indication that it would quit automatically after 2GB.
Comment 77 Jon Watte 2006-03-22 07:51:35 PST
I'm using FireFox 1.5.0.1 on Windows, and was downloading the purchase of Oblivion (which is slightly bigger than 4 GB). After 4 GB (and three hours), it stopped with a write error that suggested I try saving the file somewhere else.

Two problems:

1) It really needs to support > 32 bits. fpos_t exists for a reason.

2) As long as it doesn't support > 32 bits, it needs to tell me that it won't work, with a good error message, ideally before it tries to download.
Comment 78 Christian :Biesinger (don't email me, ping me on IRC) 2006-03-22 08:47:37 PST
this is the wrong bug. but as everyone keeps posting to it...

What filesystem are you saving to when it doesn't work in 1.5?
Comment 79 cls 2006-03-22 15:37:03 PST
biesi, what's the correct bug? I noticed that after I downloaded the FC5 ISO images, the DVD ISO doesn't even show up in the directory lists. Via http, this could very well be a bug in Apache 2.0, but via file://, it's a moz bug. In fact, via file://, FF 1.5 won't even load the directory that contains the image. The FS is ext3.
Comment 80 Jon Watte 2006-03-23 07:39:32 PST
I'm saving to NTFS when it doesn't work in 1.5.0.1, running Windows XP SP2.

Btw: If this is the wrong bug, then what bug should I re-direct to? This bug was what came up when doing a search.
Comment 81 Ken Johanson 2006-03-23 08:18:00 PST
Yes - if this is the 'wrong bug', which one is correct? Is there one that is general to the file & stream interfaces and not just networking or Necko? The interfaces listed earlier appear to be core? I could not find a 'closer' bug...

Or do we need a new bug/RFE for the core file/stream (not network) interfaces? One that gets the attention of the specific owners? My searching did not find another bug/RFE for file/stream that suggests (rightfully) deprecating ALL 32 bit file/stream interfaces (deprecate the 32 bit ones and add new 64 bit interfaces, not change the existing ones, which seems to have been implied here and to be the reason for no action).

Otherwise this seems to be the closest match for well-doers (who are not product specialists) to express frustration (it is 3-1/2 years old and 'new', after all).
Comment 82 Christian :Biesinger (don't email me, ping me on IRC) 2006-03-23 08:22:47 PST
bug 243974 is right, and FIXED, which is what I thought the state of that issue is. the relevant download interfaces do support this. in fact last I tested this it worked for me. new issues for specific download problems should get new bugs.

comment 54 describes what this bug is about.
Comment 83 Jon Watte 2006-03-24 15:16:42 PST
This is not the right bug for file download disk writing problems with files > 4 GB. Also, bug 243974 was not consistent with the symptoms I saw in 1.5.0.1.

See new bug: https://bugzilla.mozilla.org/show_bug.cgi?id=331647
Comment 84 Wayne Mery (:wsmwk, NI for questions) 2006-04-03 15:49:32 PDT
*** Bug 299598 has been marked as a duplicate of this bug. ***
Comment 85 Valerio Messina 2006-04-15 12:11:35 PDT
This bug is directly linked from the kernel.org FAQ, so we get a lot of requests to fix it.
http://www.kernel.org/faq/#largefiles
Comment 86 Mime Čuvalo 2006-05-29 02:40:15 PDT
This 4GB size limit is a problem that I've run into in developing FireFTP. It seems the problem lies with nsIBinaryOutputStream.writeBytes:

Error: [Exception... "Component returned failure code: 0x80004005 (NS_ERROR_FAILURE) [nsIBinaryOutputStream.writeBytes]" nsresult: "0x80004005 (NS_ERROR_FAILURE)" location: "JS frame :: chrome://fireftp/content/js/connection/dataSocket.js :: anonymous :: line 258" data: no]
Comment 87 Ria Klaassen (not reading all bugmail) 2006-08-31 14:55:53 PDT
*** Bug 350903 has been marked as a duplicate of this bug. ***
Comment 88 Ryan Jones 2006-09-15 06:00:47 PDT
*** Bug 350903 has been marked as a duplicate of this bug. ***
Comment 89 wally 2006-09-26 00:50:29 PDT
Besides DVDs and video media files, I keep running into this bug downloading the English wikipedia for offline processing.  The compressed version of the database (one entry per article without revision history) has been over 2 Gbytes for some time.  A recent image is at http://download.wikipedia.org/enwiki/20060920/enwiki-20060920-pages-meta-current.xml.bz2

I hope SeaMonkey/Firefox/Mozilla will be able to download this before Microsoft fixes XP's built-in ftp utility....
Comment 90 dongjian 2007-04-15 00:58:45 PDT
Hello, giving my vote to this.
Comment 91 Henrik Skupin (:whimboo) 2007-05-10 11:17:35 PDT
*** Bug 304161 has been marked as a duplicate of this bug. ***
Comment 92 Carsten Book [:Tomcat] 2007-06-03 16:42:47 PDT
*** Bug 336001 has been marked as a duplicate of this bug. ***
Comment 93 Anton Lavrentiev 2007-07-12 07:31:47 PDT
Sites that post multi-gig data files are becoming common reality.
Firefox cannot handle them gracefully (neither can IE, but who cares?).
Of today's mainstream browsers (command line tools aside),
only the latest Opera was able to download this file correctly
(i.e. completely):

ftp://ftp.ncbi.nih.gov/pub/geo/DATA/supplementary/series/GSE2109/GSE2109_RAW.tar
(and there are many more similar huge files there)

So I am voting for this bug!
Comment 94 Josef Goldbrunner 2007-07-27 03:41:37 PDT
My download (MSDN Servicepack 1)
http://download.microsoft.com/download/8/5/4/854f7409-47bd-41a2-b3b2-1a4875294550/MSDVDEUDVDX1370478.img
stopped at 2 GB (90%).
But this download is 2,320,840,704 bytes.
So I cannot download files greater than 2 GB with Firefox.
 
Comment 95 Doug Turner (:dougt) 2007-10-08 16:15:44 PDT
mass reassigning to nobody.
Comment 96 Robert 2007-10-17 04:15:49 PDT
(In reply to comment #94 by Josef Goldbrunner)
> http://download.microsoft.com/download/8/5/4/854f7409-47bd-41a2-b3b2-1a4875294550/MSDVDEUDVDX1370478.img

This download (2.2G) works for me without problems (Firefox 2.0.0.6, actually Debian's Iceweasel). The download completes without errors, and the DVD image is ok. Which version of Firefox were you using?
Can this possibly be a build difference between Josef's version and mine (Windows vs. Linux)?

Also, I have no problems downloading files larger than 4G via http and ftp. Why am I not hitting this bug?
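
One plausible source of a build difference, offered as an assumption rather than a diagnosis of these particular builds: 32-bit POSIX builds only get a 64-bit off_t when compiled with large-file support. A minimal C++ check:

  // Must be defined before any system header is included.
  #define _FILE_OFFSET_BITS 64
  #include <sys/types.h>
  #include <cstdio>

  int main() {
      // Prints 8 when large-file support is on, even in a 32-bit build;
      // without it, a 32-bit build gets a 4-byte off_t and a 2 GiB ceiling.
      printf("sizeof(off_t) = %zu\n", sizeof(off_t));
      return 0;
  }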
Comment 97 Jo Hermans 2008-01-14 08:39:45 PST
*** Bug 412262 has been marked as a duplicate of this bug. ***
Comment 98 Jim Michaels 2008-06-20 00:34:20 PDT
I am having this problem with both Firefox 3.0 and 2.0.0.14 for Windows. Downloads get truncated to 2GiB.
Somebody's using a signed 32-bit integer somewhere...
One of the places I am having this problem with is www.opensuse.com, trying to download the openSUSE 11.0 DVD, which is 4.3GB.
I have files on my web site which are also 4.6GB.

*please* fix!
Comment 99 Joe Amenta 2009-10-17 21:00:09 PDT
Created attachment 406896 [details]
Firefox 3.5 passed the 2GB mark successfully

Running Firefox 3.5.3, I was able to download the 10.1 Gentoo Live DVD, a 2.6 GB file found at http://distfiles.gentoo.org/releases/amd64/10.1/livedvd-amd64-multilib-10.1.iso

User Agent string: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.3)  Gecko/20090910 Ubuntu/9.04 (jaunty) Firefox/3.5.3

Package information: 3.5.3+build1+nobinonly-0ubuntu0.9.04.2

This package was built from http://archive.ubuntu.com/ubuntu/pool/universe/f/firefox-3.5/firefox-3.5_3.5.3+build1+nobinonly.orig.tar.gz patched with http://archive.ubuntu.com/ubuntu/pool/universe/f/firefox-3.5/firefox-3.5_3.5.3+build1+nobinonly-0ubuntu0.9.04.2.diff.gz with cosmetic local modifications following the steps on the first post of http://ubuntuforums.org/showthread.php?t=1225754

Is it safe to mark this bug as fixed?
Comment 100 Jim Michaels 2009-10-18 16:45:18 PDT
I know the FF 3.52 download manager handles files over 2GiB.  I just tested it with a local web page and a DVD ISO file.
Comment 101 wally 2009-10-18 16:59:14 PDT
Agreed.  Just downloaded the 10,181.73 MiB (9.94 GiB) file from http://download.wikimedia.org/enwiki/20091009/enwiki-20091009-pages-meta-current.xml.bz2 using FF 3.5.3 on Windows XP SP2.  It's fixed.  It works great.
Comment 102 David Bradley 2009-10-19 09:07:58 PDT
Looks like all the bugs this depends on are fixed, so this is now working. Marking fixed.
Comment 103 Christian :Biesinger (don't email me, ping me on IRC) 2009-10-19 14:32:51 PDT
I always saw this bug as being about all parts of Necko that handle 32-bit file sizes, and not all of those are fixed. In particular, upload isn't...
Comment 104 David Bradley 2009-10-19 15:22:21 PDT
Oh, true, I completely forgot about upload. Can't seem to find any place where I could even begin to upload a >2gig file.
Comment 105 Guillaume Parent 2009-10-20 06:20:20 PDT
You do it locally, with a .html page saved on your disk (or a local webserver) that has some sort of upload system on it. Unfortunately the only upload system I know is Uber-Uploader, and I've never checked if it handles 2 GB files.
Comment 106 Mime Čuvalo 2009-11-23 15:45:43 PST
(In reply to comment #104)
> Oh, true, completely forgot about upload. Can't seem to find any place where I
> could even begin to upload >2gig file.

You can upload multi-GB files on YouTube (through regular POST, not flash upload): http://www.youtube.com/my_videos_upload?nobeta

Cheers from the YT team - we hope this can be fixed soon obviously :P


For reference, to make large files:
Linux:
dd if=/dev/zero of=4gbfile bs=1024 count=4194304
Windows:
fsutil file createnew d:\temp\4gbfile.txt 4294967296
Comment 107 Guillaume Parent 2009-11-23 16:25:31 PST
That page specifically states that the files are to be up to 2 GB.
Comment 108 Mime Čuvalo 2009-11-23 16:32:04 PST
(In reply to comment #107)
> That page specifically states that the files are to be up to 2 GB.

So, if you try testing with a larger than 2GB file, yes, it will be marked as 'too big'.  But the POST should upload completely (that is, if this bug were fixed :P)

If you need an additional partner account to test with, let me know (those are up to 20GB - email me directly).  Although, I do see that you guys have the http://www.youtube.com/firefox  channel - maybe you can use that one for testing?  (obviously making the test videos just private ones)
Comment 109 Christian :Biesinger (don't email me, ping me on IRC) 2009-11-24 10:09:46 PST
(In reply to comment #106)
> For reference, to make large files:
> Linux:
> dd if=/dev/zero of=4gbfile bs=1024 count=4194304

You really want to use:
dd if=/dev/zero of=/tmp/4gbfile bs=1024 seek=4194304 count=1
faster, and doesn't actually require 4 GB of disk space :)
Comment 110 silviumc 2012-09-12 05:44:41 PDT
So what's up with this bug?

Firefox 15 still truncates downloads to 4 GB

Chromium downloaded all the 4.2 GB. I'm not trolling anything here, just trying to prove that it wasn't a server or filesystem/OS problem (openSUSE 12.3 Factory)
Comment 111 Josh Aas 2012-09-12 06:05:35 PDT
This might work in Firefox 18 now that bug 784912 has been fixed.
Comment 112 Josh Aas 2012-09-12 06:09:01 PDT
Also probably relevant that bug 215450 was fixed recently.
Comment 113 Christian :Biesinger (don't email me, ping me on IRC) 2012-09-12 10:46:35 PDT
Guys, downloads are supposed to have been working for a long time now. If they're not, please do file a new bug with steps to reproduce.

Since uploads are also fixed, resolving this bug.
