Open Bug 435380 Opened 13 years ago Updated 4 years ago
estimated time and download speed stop reporting when download speed is zero
User-Agent: Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10.4; en-US; rv:1.9) Gecko/2008051202 Firefox/3.0
Build Identifier: Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10.4; en-US; rv:1.9) Gecko/2008051202 Firefox/3.0

Sometimes the speed of a download falls to zero, as correctly indicated by the Download Statusbar add-on. The Download Manager, however, keeps showing the last non-zero speed. Without the Download Statusbar add-on installed, or when looking only at the Download Manager, you would not know that your download has stalled.

Reproducible: Always

Steps to Reproduce:
1. As a proxy, install the Download Statusbar add-on: https://addons.mozilla.org/en-US/firefox/addon/26
2. Restart Firefox.
3. Download a reasonably large file.
4. When progress is about halfway through and clearly still active, unplug the ethernet cable, turn off wireless, or turn off the modem.
5. Observe that the progress indicator in the Firefox Download Manager stops but still shows a non-zero speed. Download Statusbar shows zero speed, as expected.

Actual Results:
The Download Manager indicated a non-zero speed in the middle of a file download when there was no internet connection.

Expected Results:
The speed indicator should be zero.
Status: UNCONFIRMED → NEW
Ever confirmed: true
OS: Mac OS X → All
Hardware: Macintosh → All
Summary: Inaccurate download speed indicator → Inaccurate download speed indicator if download speed is zero
In Firefox 3.6, a stalled download can still show the last measured speed instead of reporting no activity. After a while, the user should be warned that the download has stalled. Moreover, some websites do not allow resuming a download, so warning the user is all the more important.
When a download hangs, the reported speed is totally incorrect, as bug 807084 says. This bug dates from 2008 and nobody has bothered to fix it yet. "Inaccurate" sounds like a small problem, but that is not the case; the bug's title is misleading, and "inaccurate" should be replaced with "totally incorrect".
There are two related situations here:
1) If the download fully stalls, it keeps reporting the last known speed and time estimate indefinitely, giving no indication that it has stalled.
2) If the download stalls intermittently, it only reports a speed when it has one, giving a higher-than-true speed and an underestimated time to completion.
I've seen far too many downloads with more stall time than transfer time, and the result is a bogus download speed and an estimated remaining time that is wildly shorter than the download will really take.
Summary: Inaccurate download speed indicator if download speed is zero → estimated time and download speed stop reporting when download speed is zero
(In reply to Dave Garrett from comment #6) > 2) If the download stalls intermittently it only reports a speed when it has > one, thus giving a higher than true speed and an underestimated time to > completion. That's the case with Uploaded.net / Ul.to, a rather popular file host. To test, download any large file without logging in.
(In reply to :Paolo Amadini from bug 1281668, comment 1) > We use a simple algorithm to calculate the download speed and it may definitely have > some limitations like the one you described. [...] if you have any example of other > programs or algorithms doing a better job in the case you described, we might > definitely investigate and adopt them. Opera 45.0.2552.888 doesn't have this problem (I don't know whether it's something they developed on their own or inherited from Chrome). Though Opera never shows a speed of 0 bytes/s on the Downloads panel, on the Downloads details tab it flips back and forth between 0 bytes/s and non-zero bursts. Another example I've come across is Uploadgig.com.
A nice, open source example should be available in 'scp' (secure copy, part of 'ssh' I believe). It shows percentage and kB downloaded, download speed, and ETA. Download speed updates about twice each second. Download speed tapers to 0 if the transfer is held, and reaches 0 after about 5 seconds. This makes me suspect that it simply maintains a record of the last 10 measurements (at 2/second), and shows the average of those values.
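The windowed-average strategy guessed at above could be sketched as follows. This is an illustrative reconstruction, not scp's actual code; the class name, window size, and tick rate are assumptions based on the observed behavior (speed reaching 0 about 5 seconds after a stall, with updates twice per second).

```javascript
// Hypothetical sketch of the windowed-average strategy scp appears to use:
// keep the last N per-tick speed samples and report their mean, so the
// reported speed naturally decays to 0 within N ticks of a stall.
class WindowedSpeedEstimator {
  constructor(windowSize = 10) {
    this.windowSize = windowSize;   // e.g. 10 samples at 2/second ≈ 5 s
    this.samples = [];              // bytes/second measured at each tick
  }

  // Call once per tick with the bytes received since the previous tick
  // (0 when the transfer is stalled).
  addSample(bytesThisTick, tickSeconds) {
    this.samples.push(bytesThisTick / tickSeconds);
    if (this.samples.length > this.windowSize) {
      this.samples.shift();
    }
  }

  // Average bytes/second over the window; 0 before any sample arrives.
  get speed() {
    if (this.samples.length === 0) {
      return 0;
    }
    return this.samples.reduce((a, b) => a + b, 0) / this.samples.length;
  }
}
```

With a 10-sample window at 2 samples/second, feeding zeros during a stall drives the reported speed to exactly 0 after 5 seconds, matching the observed scp behavior.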
Firefox currently uses exponential smoothing, which is a very good strategy, but two changes need to be made:

1) To handle stuck downloads, DS_setProgressBytes should be called periodically, even when there is no new data. It could be called at a fixed interval, or at exponentially increasing intervals to save battery life. The interval should be reset to a small value whenever new data is received.

2) To handle bursty downloads, the smoothing factor must take the length of the update interval into account so that the smoothing time constant stays the same. In DownloadCore.jsm, change

  this.speed = rawSpeed * 0.1 + this.speed * 0.9;

into

  let k = Math.pow(0.9, intervalMs / kProgressUpdateIntervalMs);
  this.speed = rawSpeed * (1 - k) + this.speed * k;

The algorithm is duplicated in urlbarBindings.xml, so do the same thing there.
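The interval-aware smoothing proposed above could be sketched as a standalone function like this. The value of kProgressUpdateIntervalMs (400 ms) is an assumption for illustration, not a value taken from the Firefox source.

```javascript
// Sketch of interval-aware exponential smoothing. When the update interval
// equals the nominal interval, this reduces to the original fixed-factor
// formula (k = 0.9); longer gaps between updates give the new sample
// proportionally more weight, so the smoothing time constant stays the
// same in wall-clock terms regardless of how often updates arrive.
const kProgressUpdateIntervalMs = 400;  // assumed nominal update interval

function updateSpeed(previousSpeed, rawSpeed, intervalMs) {
  const k = Math.pow(0.9, intervalMs / kProgressUpdateIntervalMs);
  return rawSpeed * (1 - k) + previousSpeed * k;
}
```

For example, a sample arriving after a double-length gap (800 ms) is weighted with 1 - 0.9² = 0.19 instead of 0.1, which is what keeps bursty downloads from being systematically over- or under-weighted.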
Thank you very much for the suggestions! They make sense to me, but in the bursty downloads case I'm curious whether the estimated time to completion would tend to become very high while the download is stalled, then drop again during the next burst, which would be kind of the opposite problem to the one we have now. A side note: since this code is called often, we should probably test the performance of Math.pow and, if necessary, replace the formula with an approximate one that is faster to compute.
Priority: -- → P5
All of the major free download sites use a method of bursty download-speed regulation that interrupts the download with a period of less than 1 second, e.g. start transmitting every 0.5 seconds and stop after 25 kB has been transmitted, giving a bursty transmission at 50 kB/second. Several years ago the interval was larger [5 seconds], and a data-rate plot would look like a square wave with irregular peaks. Now the interval is shorter than the monitoring interval, and a data-rate plot looks like an interference pattern, with two signals at almost the same frequency beating against each other. In both cases, the download window showed an invalid expected download time, where the actual time was 2-5 times the estimate.

Any fix needs to be tested by trying actual [free] downloads from multiple sources:
rapidgator.com
uploaded.net
filefactory.com
My proposal definitely handles your use case.

Paolo: that depends on the time constant. As Douglas points out, downloads nowadays rarely stall over long periods, so most reasonable time constants should work. If you have a very erratic connection over which you have no influence, a very large time constant might be a bit better; but if you're troubleshooting connection issues, for example moving your laptop around to get good Wi-Fi quality, then large time constants are a nuisance. The only adjustment I would make to my proposal is to force the download speed to 0 if no packets have been received in the last 30 seconds, so that users can distinguish between slow and stuck downloads, and to avoid huge estimated times.

Let's not miss the forest for the trees: currently the download speed estimation is *completely broken* whenever the speed is not smoothly varying. Fixing that is more important than getting every single detail perfect, such as whether the time constant could be 30% larger, or whether Firefox could develop new techniques to become the world's best download speed estimation software. I'll take a "no longer broken" estimator that ships in 6 months over a "perfect" estimator that ships in 6 years.

About speed: you can always write

  let kFilterRate = 0.1 / kProgressUpdateIntervalMs; // precomputed constant
  let k = intervalMs < 100 ? 1 - kFilterRate * intervalMs
                           : Math.exp(-kFilterRate * intervalMs);

which never calls Math.exp more than 10 times per second per active download, but that might be premature optimization.
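The two refinements in this comment (the cheap linear approximation of the decay factor, and zeroing the speed after a stall timeout) could be combined into one update function as sketched below. The constant values here are illustrative assumptions, not values from the Firefox source; only the 30-second stall timeout is taken from the proposal above.

```javascript
// Sketch combining the two refinements proposed above: a cheap linear
// approximation of the exponential decay factor for short intervals, and
// forcing the reported speed to 0 after a stall timeout.
const kProgressUpdateIntervalMs = 400;           // assumed nominal interval
const kFilterRate = 0.1 / kProgressUpdateIntervalMs;
const kStallTimeoutMs = 30000;                   // 30 s, as suggested above

function decayFactor(intervalMs) {
  // exp(-x) ≈ 1 - x for small x; below 100 ms the error is negligible,
  // and we avoid a Math.exp call on every progress update.
  return intervalMs < 100
    ? 1 - kFilterRate * intervalMs
    : Math.exp(-kFilterRate * intervalMs);
}

function updateSpeed(previousSpeed, rawSpeed, intervalMs, msSinceLastData) {
  if (msSinceLastData >= kStallTimeoutMs) {
    return 0;  // stalled: report zero rather than a stale estimate
  }
  const k = decayFactor(intervalMs);
  return rawSpeed * (1 - k) + previousSpeed * k;
}
```

The timeout branch is what lets users tell a slow download from a stuck one, and it also caps the estimated time to completion instead of letting it grow without bound during a stall.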