ftp: very large file transfers (100MB+) use memory




Reported: 18 years ago
Last modified: 16 years ago


(Reporter: sergiojr, Assigned: law)



Windows 2000

Firefox Tracking Flags

(Not tracked)




18 years ago
Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US; 0.8) Gecko/20010215
Steps to reproduce:
Start downloading a very large file (>100MB).

Expected result:
While downloading, Mozilla copies small pieces (several MB) to the destination
and then appends the newly downloaded data to them.

What happens:
Mozilla copies the file into memory, which causes lots of swapping and
exhaustion of swap space.
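The expected behavior the reporter describes — writing each downloaded chunk to disk as it arrives, so memory use stays bounded no matter how large the file is — can be sketched roughly like this (a hypothetical illustration, not Mozilla's actual transfer code; the function name and chunk size are made up):

```python
import urllib.request

def download_streaming(url, dest_path, chunk_size=1024 * 1024):
    """Stream a download to disk in fixed-size chunks.

    Memory use is bounded by chunk_size regardless of the file's
    total size, unlike buffering the whole transfer in RAM.
    """
    with urllib.request.urlopen(url) as response, open(dest_path, "wb") as out:
        while True:
            chunk = response.read(chunk_size)
            if not chunk:  # empty read means end of stream
                break
            out.write(chunk)  # appended to the file on disk, not held in RAM
```
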

Comment 1

18 years ago
Would this be related to caching?

Comment 2

18 years ago
No, there is no cache involved.  This problem (if it is real) would be a bug 
in the xfer code.  Reassigning to law.

bill, is this even what happens?  I thought that you just hooked things up to 
the file transport and wrote out to disk.
Assignee: dougt → law

Comment 3

18 years ago
There may be caching problems.  When we begin the load, we don't know what the
content type is, or that it will be written to disk.  So our load request goes
through the cache, which is good if the file is in the cache.  If the file's not
in the cache, and will be written to disk, then I'm not sure how/if we could
tell the cache not to put it there.  I believe there was/is an old bug about
that problem (downloaded files filling the cache).

How do I observe this "Mozilla copies files to memory" business?

Is the behavior different if one turns off the cache?

Comment 4

18 years ago
Marking NEW.
Severity: normal → enhancement
Ever confirmed: true

Comment 5

17 years ago
Is this bug still valid? (I ask this because bug 82478 "No error on disk full
when downloading a large ftp file." _is_ still valid, and I doubt that mozilla
uses both memory and the /tmp dir at the same time for the same purpose.)

Comment 6

17 years ago
I get the same thing using Mozilla build 2001081715 for Linux (running Debian,
kernel 2.4.4)

Whenever I download a large file using Save As..., the file fills up memory,
and once no memory is available, it starts using swap. This is very annoying:
anything over 100MB I have to download using another browser, because it's just
too painfully slow (I have 128MB of memory) with Mozilla.

Comment 7

17 years ago
Requires investigation.  We should be doing "normal" Necko loads and we're not
putting the data into memory ourselves.  Maybe a memory cache issue?
Target Milestone: --- → mozilla1.0


17 years ago
QA Contact: tever → benc

Comment 8

17 years ago
This is a bug, so I consider this a perf issue, not an enhancement.
Let's get a full set of steps, and a sample URL of a public site w/ a large file, 
so we can all use the same example.
Severity: enhancement → normal
Keywords: perf
Summary: [RFE] While saving large files via ftp, Mozilla tries to keep them in memory, not on disk → ftp: very large file transfers (100MB+) use memory

Comment 9

17 years ago
Mozilla 0.9.5 doesn't have this problem, but it still doesn't have the desired
behavior. This causes lots of swapping at the end of the copying process, so I
wasn't able to do anything with Mozilla for about 1 minute until the copying
process finished. I downloaded a 100MB file from ftp via a 100Mbit LAN. OS: Win2k.

Comment 10

17 years ago
Presacari, could the behavior you observe be due to the fact that we first 
download to a temporary destination and then move it to the final destination?

Most likely that's the reason you see what you see.  If you take issue with 
that, then it's addressed by bug 55690.

Moving this one to future, unless some abnormal memory use and/or swapping is 
still observed aside from that due to the temp-to-final-destination copy.
Target Milestone: mozilla1.0 → Future
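The temp-to-final-destination behavior the previous comment refers to — downloading into a temporary file and only moving it into place once the transfer completes — follows a common pattern that looks roughly like this (a generic sketch, not Mozilla's implementation; the function name is made up):

```python
import os
import shutil
import tempfile

def save_via_temp(data_chunks, final_path):
    """Write a download into a temp file, then move it to the final path.

    Creating the temp file in the destination directory means the final
    move is usually a cheap same-filesystem rename; if the temp location
    were on a different filesystem, the move would degenerate into the
    full end-of-download copy that users perceive as a stall.
    """
    dest_dir = os.path.dirname(os.path.abspath(final_path))
    fd, tmp_path = tempfile.mkstemp(dir=dest_dir)
    try:
        with os.fdopen(fd, "wb") as tmp:
            for chunk in data_chunks:
                tmp.write(chunk)
        shutil.move(tmp_path, final_path)  # rename when on the same filesystem
    except BaseException:
        os.unlink(tmp_path)  # don't leave a partial download behind
        raise
```
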

Comment 11

17 years ago
I know this happens in Solaris, but I don't understand how it happens in Win32
systems, TMP isn't mapped to memory is it?

Comment 12

17 years ago
I am seeing this in
Gecko/20011123  Mozilla/5.0 (X11; U; Linux i686; en-US; rv:0.9.6+) 
(daily build from mozilla.org)
Linux 2.4.4 SuSE kernel, on a basic SuSE 7.0 system.
800 MHz, 256MB memory.

I wanted to download the Darwin development tools, which 
are a 180MB ftp download. 

The result is dreadful. During the download,
the size and RSS values in top drift up to 200MB,
CPU usage is at 90%, and Mozilla takes up to 30 seconds
to react to user input.

The whole system is unbearably slow. Mozilla is unusable
for large FTP downloads. This is NOT just at the end of the
download but for the whole hour that the file took to download.

Comment 13

17 years ago

Sounds like it goes to mem cache on all systems.

bill, if you own this, should I send this to "File Handling"?
Keywords: mozilla1.0

Comment 14

17 years ago
Last comment was in November. Does this problem still exist?

Comment 15

16 years ago
I think this was addressed in another bug somewhere...
Whiteboard: dupeme or dependsme

Comment 16

16 years ago
This should be fixed in a newer build


*** This bug has been marked as a duplicate of 91795 ***
Last Resolved: 16 years ago
Resolution: --- → DUPLICATE
Whiteboard: dupeme or dependsme

Comment 17

16 years ago
-> cache, because that is where the fix is...

(am I the only person who thinks the real problem is downloads shouldn't go to
memory cache?).
Component: Networking: FTP → Networking: Cache
QA Contact: benc → tever