Before you start laughing at such a simple question, let me explain:
I am trying to determine how much change (as a percentage) there is in an account across various indicators. This is not particularly hard, but how do you generally handle the cases where the current value or the previous value is zero?
i.e.
This week: Earnings = $25.6
Last week: Earnings = $0.0
I currently calculate the % difference by the following formula:
if (CurrentValue > 0.0 && PreviousValue > 0.0)
{
    return (CurrentValue - PreviousValue) / PreviousValue;
}
return 0.0;
If the earnings were zero in the previous week, what should the % difference be? +Infinity?
And conversely, if the current week's earnings are zero, should it be -Infinity?
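For illustration, this is roughly the shape I have in mind; the PercentChange name and the choice of returning double.PositiveInfinity are only placeholders, not a settled design:

// Sketch only: one possible way to special-case a zero previous value.
// Returning double.PositiveInfinity here is just one convention under consideration.
static double PercentChange(double currentValue, double previousValue)
{
    if (previousValue == 0.0)
    {
        // No baseline to compare against: 0 -> 0 is "no change";
        // 0 -> anything positive could be reported as +Infinity.
        return currentValue == 0.0 ? 0.0 : double.PositiveInfinity;
    }

    // When the current value is zero this naturally yields -1.0, i.e. -100%.
    return (currentValue - previousValue) / previousValue;
}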
Then, to complicate things, how would you handle this in a LINQ-to-SQL query?
Upside_Earnings =
    (statistics.Where(d => d.DateTime > first_startdate && d.DateTime <= first_enddate)
               .Average(e => (double)e.Earnings) > zero &&
     statistics.Where(d => d.DateTime > second_startdate && d.DateTime <= second_enddate)
               .Average(e => (double)e.Earnings) > zero)
    ? ((statistics.Where(d => d.DateTime > first_startdate && d.DateTime <= first_enddate)
                  .Average(e => (double)e.Earnings) -
        statistics.Where(d => d.DateTime > second_startdate && d.DateTime <= second_enddate)
                  .Average(e => (double)e.Earnings)) /
       statistics.Where(d => d.DateTime > second_startdate && d.DateTime <= second_enddate)
                 .Average(e => (double)e.Earnings))
    : zero,
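One way to tidy this up is sketched below, assuming statistics, first_startdate, first_enddate, second_startdate, second_enddate and zero are the same objects as in the query above; the local names currentAvg, previousAvg and upsideEarnings are mine. Each average is computed once and the guarded formula applied afterwards. If this expression currently sits inside a larger projection, hoisting the averages out like this changes how the query executes (separate round-trips instead of one combined query), so treat it purely as an illustration:

// Sketch only: evaluate each average a single time instead of repeating the sub-queries.
var currentAvg = statistics
    .Where(d => d.DateTime > first_startdate && d.DateTime <= first_enddate)
    .Average(e => (double)e.Earnings);

var previousAvg = statistics
    .Where(d => d.DateTime > second_startdate && d.DateTime <= second_enddate)
    .Average(e => (double)e.Earnings);

// Same guard as the original query: only report a change when both averages are positive.
var upsideEarnings = (currentAvg > zero && previousAvg > zero)
    ? (currentAvg - previousAvg) / previousAvg
    : zero;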
The change can be positive or negative, so the formula is:
var change = ((V2 - V1) / Math.Abs(V1)) * 100;
Examples:
[ ((V2 - V1) / |V1|) * 100 ]
= ((220 - 110) / |110|) * 100 = (110 / 110) * 100 = 1 * 100 = 100% change
= 100% increase
[ ((V2 - V1) / |V1|) * 100 ]
= ((75 - 150) / |150|) * 100 = (-75 / 150) * 100 = -0.5 * 100 = -50% change
= 50% decrease
NOTE: this doesn't handle zeros; you could handle that case yourself, for example as sketched below.
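A minimal sketch of one way to bolt the zero cases onto this formula; the PercentageChange name and the ±Infinity convention are my own choices here, not the only reasonable ones:

// Sketch only: the formula above plus explicit handling of a zero starting value.
static double PercentageChange(double v1, double v2)
{
    if (v1 == 0.0)
    {
        if (v2 == 0.0)
            return 0.0;                    // 0 -> 0: no change
        return v2 > 0.0
            ? double.PositiveInfinity      // 0 -> positive: unbounded increase
            : double.NegativeInfinity;     // 0 -> negative: unbounded decrease
    }

    return ((v2 - v1) / Math.Abs(v1)) * 100.0;
}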