So, I'm a tad confused. I was under the impression that this should work. In this case, both a and b are ints (counters, to be exact).
Since the result of a / b may contain decimal places, an int obviously won't hold it. Therefore, I declared a new double and performed the calculation into it, like this:
double texturefactor = ((a / b) * 10);
This doesn't work as I expected: the result of a / b is always what I would get if I stored the result in an int. On the other hand, this works:
double calculate1 = a;
double calculate2 = b;
double texturefactor = ((calculate1 / calculate2) * 10);
A couple of perhaps stupid questions:
1. I'm sure this ought to work. I know that in certain situations VS will complain that I've tried to implicitly convert from one type to another, but that's exactly what I'm trying to do here! Why doesn't it happen, and have I missed something? :)
2. Should I just declare the counters a and b as doubles and save myself the trouble of the conversion, or is that more trouble than it's worth?
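For reference, here's a minimal repro of what I'm seeing (the counter values 7 and 2 are just made up for illustration):
int a = 7;  // placeholder counter values
int b = 2;
double texturefactor = ((a / b) * 10);  // a / b is integer division: 7 / 2 == 3
Console.WriteLine(texturefactor);       // prints 30, not the 35 I expected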
The result of a / b is itself an integer, so even though you assign it to a double, the damage has already been done:
double texturefactor = ((a / b) * 10);
Try casting one of your inner variables to a double:
var texturefactor = (((double)a / b) * 10);
The result of (double)a / b will be a double, and you won't lose your fraction.
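For example, with assumed values of a = 7 and b = 2, the cast forces the division to happen in floating point:
int a = 7;  // illustrative values
int b = 2;
var texturefactor = (((double)a / b) * 10);  // (double)a / b == 3.5
Console.WriteLine(texturefactor);            // prints 35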
To answer your second question:
"Should I just convert the counters a and b to doubles"
That'd work too. If you declare those as double, then you won't have to perform the cast above.
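A quick sketch of that approach, again with made-up counter values:
double a = 7;  // counters declared as double from the start
double b = 2;
double texturefactor = ((a / b) * 10);  // a / b == 3.5, no cast needed
Console.WriteLine(texturefactor);       // prints 35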