Open
Bug 464977
Opened 16 years ago
Updated 2 years ago
time remaining calculation for multiple file download, being overestimated.
Categories
(Firefox :: Downloads Panel, enhancement, P5)
Tracking
()
NEW
People
(Reporter: simon.place, Unassigned)
References
Details
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1b2pre) Gecko/20081103 Minefield/3.1b2pre
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1b2pre) Gecko/20081103 Minefield/3.1b2pre
The time remaining indication appears to simply show the longest of the individual downloads' time-remaining estimates.
However, this often produces a large overestimate, because limited bandwidth does not affect all downloads equally; often one file downloads very slowly, and its estimate is then quoted as the time to complete all downloads.
Preferably, the TOTAL bytes remaining and the TOTAL bandwidth being used should be used for this calculation.
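The reporter's proposal can be sketched as follows. This is an illustrative function, not actual Firefox code; the `size`, `amountTransferred`, and `speed` field names are borrowed from the `nsIDownload` snippet quoted later in this bug.

```javascript
// Hypothetical sketch of the proposal: estimate time remaining from the
// TOTAL bytes left divided by the TOTAL current speed, instead of taking
// the longest per-download estimate.
function combinedTimeLeft(downloads) {
  let bytesLeft = 0;
  let totalSpeed = 0;
  for (const dl of downloads) {
    bytesLeft += dl.size - dl.amountTransferred;
    totalSpeed += dl.speed;
  }
  // -1 signals "unknown" when no download is making progress.
  return totalSpeed > 0 ? bytesLeft / totalSpeed : -1;
}
```

For example, 50 bytes left at 10 B/s plus 100 bytes left at 40 B/s gives 150 bytes over 50 B/s, i.e. 3 seconds, rather than the 5 seconds the slowest download alone would suggest.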
Reproducible: Always
Steps to Reproduce:
1.
2.
3.
Actual Results:
I noticed this in a situation where the quoted estimate was roughly 3 times the actual time.
Summary: time remaining calculation for multiple files being overestimated. → time remaining calculation for multiple file download, being overestimated.
Comment 1•14 years ago
This is a mass search for bugs which are in the Firefox General component, are
UNCO, have not been changed for 500 days and have an unspecified version.
Reporter, can you please update to Firefox 3.6.10 or later, create a fresh profile, http://support.mozilla.com/en-US/kb/managing+profiles, and test again. If you still see the issue, please update this bug. If the issue is gone, please set the status to RESOLVED > WORKSFORME.
Whiteboard: [CLOSEME 2010-11-01]
Updated•14 years ago
Whiteboard: [CLOSEME 2010-11-01]
Comment 2•11 years ago
That's a good idea.
However, dividing the total remaining bytes by the total speed could lead to an underestimate if some of the servers are slower.
That case could be handled by:
- Dividing the total remaining bytes by the total speed to get the combined remaining time.
- Dividing the total speed by the number of downloads to get the average speed.
- Comparing each download's speed to the average speed to find out whether they are similar.
- - If they are, use the combined remaining time.
- - If not, check whether the remaining time of any slower download is longer than the combined remaining time, and use the longer of the two.
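The steps above can be sketched as a hybrid estimator. This is not the actual Firefox code; the field names follow the quoted snippets, and the `slowFactor` threshold for "notably slower than average" is an assumed tuning parameter.

```javascript
// Illustrative sketch of the hybrid estimate described above: use the
// combined (pooled) estimate unless a notably slower download would
// outlast it, in which case report that download's own estimate.
function hybridTimeLeft(downloads, slowFactor = 0.5) {
  let bytesLeft = 0;
  let totalSpeed = 0;
  for (const dl of downloads) {
    bytesLeft += dl.size - dl.amountTransferred;
    totalSpeed += dl.speed;
  }
  if (totalSpeed <= 0) return -1; // unknown
  const combined = bytesLeft / totalSpeed;
  const avgSpeed = totalSpeed / downloads.length;
  let estimate = combined;
  for (const dl of downloads) {
    // A download well below the average speed may finish after the
    // combined estimate; take the longer of the two times.
    if (dl.speed > 0 && dl.speed < avgSpeed * slowFactor) {
      const own = (dl.size - dl.amountTransferred) / dl.speed;
      estimate = Math.max(estimate, own);
    }
  }
  return estimate;
}
```

With one download at 100 B/s and another at 1 B/s, the pooled estimate would be far too optimistic for the slow one, so the slow download's own 100-second estimate wins.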
The calculation is happening here:
http://mxr.mozilla.org/mozilla/source/browser/base/content/browser.js#6733
6733 // Find the download with the longest remaining time
6734 let numPaused = 0;
6735 let maxTime = -Infinity;
6736 let dls = gDownloadMgr.activeDownloads;
6737 while (dls.hasMoreElements()) {
6738 let dl = dls.getNext().QueryInterface(Ci.nsIDownload);
6739 if (dl.state == gDownloadMgr.DOWNLOAD_DOWNLOADING) {
6740 // Figure out if this download takes longer
6741 if (dl.speed > 0 && dl.size > 0)
6742 maxTime = Math.max(maxTime, (dl.size - dl.amountTransferred) / dl.speed);
6743 else
6744 maxTime = -1;
6745 }
6746 else if (dl.state == gDownloadMgr.DOWNLOAD_PAUSED)
6747 numPaused++;
6748 }
Updated•11 years ago
Status: UNCONFIRMED → NEW
Component: General → Downloads Panel
Ever confirmed: true
Comment 3•11 years ago
Hmm, the correct link and code are:
http://mxr.mozilla.org/mozilla-central/source/browser/components/downloads/src/DownloadsCommon.jsm#386
386 for (let dataItem of aDataItems) {
387 summary.numActive++;
388 switch (dataItem.state) {
389 case nsIDM.DOWNLOAD_PAUSED:
390 summary.numPaused++;
391 break;
392 case nsIDM.DOWNLOAD_SCANNING:
393 summary.numScanning++;
394 break;
395 case nsIDM.DOWNLOAD_DOWNLOADING:
396 summary.numDownloading++;
397 if (dataItem.maxBytes > 0 && dataItem.speed > 0) {
398 let sizeLeft = dataItem.maxBytes - dataItem.currBytes;
399 summary.rawTimeLeft = Math.max(summary.rawTimeLeft,
400 sizeLeft / dataItem.speed);
401 summary.slowestSpeed = Math.min(summary.slowestSpeed,
402 dataItem.speed);
403 }
404 break;
405 }
And the comment at line 406 made me think about another potential problem: an unknown download size for one or more downloads.
It seems better not to take those downloads' speeds into account when computing the combined remaining time.
I think it all depends on whether you are locally or remotely limited:
if the limit is remote (the file servers are at their limit), then the original algorithm is fine, but if it is local (the ISP connection is maxed out), then it performs poorly.
Unfortunately there doesn't seem to be a way to tell, although low throughput would tend to indicate local limiting.
Also, local limits seem to be rising faster than remote ones (more people have broadband), so the algorithm is now more appropriate than it used to be; but with enough connections you can still reach the local limit.
So you might do quite well if you had some idea of what the local limit was: if the total throughput was close to that limit, the second approach could be taken; if it wasn't, the old one could be used.
And with unknown file sizes, why not just indicate an unknown time remaining?
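The heuristic suggested above can be sketched like this. Everything here is hypothetical: `localLimit` is an assumed, externally supplied estimate of the local bandwidth cap (bytes/sec), and the 0.9 saturation threshold is an arbitrary illustrative choice.

```javascript
// Hypothetical sketch: if combined throughput is near a known local
// bandwidth cap, the link is saturated and the pooled estimate applies;
// otherwise fall back to the longest per-download estimate. Unknown
// file sizes yield an unknown (-1) time remaining.
function timeLeft(downloads, localLimit) {
  let bytesLeft = 0;
  let totalSpeed = 0;
  let maxTime = -1;
  for (const dl of downloads) {
    if (dl.size <= 0) return -1; // unknown size: report unknown time
    const left = dl.size - dl.amountTransferred;
    bytesLeft += left;
    totalSpeed += dl.speed;
    if (dl.speed > 0) maxTime = Math.max(maxTime, left / dl.speed);
  }
  if (totalSpeed <= 0) return -1;
  const saturated = totalSpeed >= 0.9 * localLimit;
  return saturated ? bytesLeft / totalSpeed : maxTime;
}
```

For two downloads totalling 100 B/s against a 100 B/s local cap, the link counts as saturated and the pooled estimate is used; against a 1000 B/s cap it is not, and the slowest download's own estimate is reported instead.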
Updated•6 years ago
Priority: -- → P5
Updated•2 years ago
Severity: normal → S3