Possible Duplicate:
Estimating/forecasting download completion time
We've all seen the running download-time estimate that initially says something like "7 days", but keeps dropping wildly (e.g. "23 hours", "45 minutes", "1 min. 50 sec.", etc.) with each successive estimate as the chunks are downloaded.
To avoid these initial (alarming) estimates, there are techniques one could try, like suppressing display of the first n estimates, or waiting for the delta between estimates to drop below some threshold before you start displaying them, but these don't seem like a general, robust solution. There are corner cases involving too few samples, or samples that actually are wildly varying...
I think I recall a general solution for this kind of thing in mathematics (statistics?) that reduced or eliminated these wild values.
Does anyone know?
OK, looks like this has already been asked and answered:
Estimating/forecasting download completion time
My question even starts out with the same wording as this one. Funny...
Algo for a stable ‘download-time-remaining’ in a download window
Use a filter; an exponential moving average can be good enough for calculating the speed:
S_filtered = S_filtered_previous*(1 - x) + S_current*x
where x is the inverse of the number of samples being averaged; try different values from 0.1 to 0.01 (i.e. roughly 10 to 100 samples).
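A minimal sketch of that smoothing step in Python (function names and the tick size are my own, assumed for illustration): the filtered speed is updated per tick with the formula above, and the remaining time is just bytes left divided by the filtered speed.

    # Sketch only: exponential moving average of download speed,
    # used to derive a more stable "time remaining" estimate.

    def update_speed(filtered_speed, current_speed, x=0.05):
        """One smoothing step: S_filtered = S_filtered_previous*(1-x) + S_current*x."""
        if filtered_speed is None:          # first sample: no history yet
            return current_speed
        return filtered_speed * (1 - x) + current_speed * x

    def time_remaining(bytes_left, filtered_speed):
        """Remaining time in seconds, or None if the speed is not yet meaningful."""
        if not filtered_speed or filtered_speed <= 0:
            return None
        return bytes_left / filtered_speed

    # Example: feed in the bytes downloaded during each 1-second tick.
    if __name__ == "__main__":
        total = 100 * 1024 * 1024           # hypothetical 100 MiB file
        downloaded = 0
        speed = None
        for chunk in [50_000, 400_000, 900_000, 1_200_000, 1_150_000, 1_180_000]:
            downloaded += chunk
            speed = update_speed(speed, chunk)      # chunk = bytes per 1 s tick
            eta = time_remaining(total - downloaded, speed)
            print(f"speed ~ {speed:,.0f} B/s, eta ~ {eta:,.0f} s")

With a small x the estimate reacts slowly and stays smooth; with a larger x it tracks the current speed more closely but jumps around more, which is the trade-off the answer's 0.1 to 0.01 range describes.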