I have this code snippet:
Random rand = new Random();
int chance = rand.Next(1, 101);
if (chance <= 25) // probability of 25%
{
Console.WriteLine("You win");
}
else
{
Console.WriteLine("You lose");
}
My question is: does this really give a 25% probability of winning? Is the player's chance of winning here really 25%?
Edit:
I just wrote this:
double total = 0;
double prob = 0;
Random rnd = new Random();
for (int i = 0; i < 100; i++)
{
int chance = rnd.Next(1, 101);
if (chance <= 25) prob++;
total++;
}
Console.WriteLine(prob / total);
Console.ReadKey();
And it's highly inaccurate: from one run to the next the result ranges from about 0.15 to 0.3.
But when I do more checks (change from (i < 100) to (i < 10000)) it's much more accurate.
Why is this? Why aren't 100 checks enough for it to be accurate?
This is very easy to check for yourself:
Random rand = new Random();
int yes = 0;
const int iterations = 10000000;
for (int i = 0; i < iterations; i++)
{
if (rand.Next(1, 101) <= 25)
{
yes++;
}
}
Console.WriteLine((float)yes/iterations);
the result:
0.2497914
The conclusion: Yes, yes it is.
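As for why 100 trials bounce around so much while 10,000,000 land on 0.2497914: Next(1, 101) returns one of 100 equally likely integers, and exactly 25 of them (1 through 25) pass the check, so the true probability is exactly 25/100. What you observe over n trials is that value plus sampling noise, and the typical size of the noise is the standard error sqrt(p * (1 - p) / n). A small sketch of that calculation (the trial counts are just the ones used here, the rest is standard binomial arithmetic):
double p = 0.25; // 25 of the 100 equally likely values pass the check
foreach (int n in new[] { 100, 10000, 10000000 })
{
    // typical run-to-run deviation of the observed win fraction over n trials
    double se = Math.Sqrt(p * (1 - p) / n);
    // roughly 95% of runs land within about two standard errors of 0.25
    Console.WriteLine("n = {0}: expect roughly 0.25 +/- {1:F4}", n, 2 * se);
}
For n = 100 that spread is about 0.09 either side of 0.25, which matches the 0.15 to 0.3 swing in the question's edit; for n = 10,000,000 it shrinks to about 0.0003, which is why the result above is so close to 0.25.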
Edit: Just for fun, the LINQy version (this needs using System.Linq at the top of the file):
Random rand = new Random();
const int iterations = 10000000;
int sum = Enumerable.Range(1, iterations)
.Count(i => rand.Next(1, 101) <= 25);
Console.WriteLine(sum / (float)iterations);
For most cases, I would say yes. However, you have to remember that most randomization APIs use a pseudo-random number generator, so to some extent you're at the mercy of the idiosyncrasies of that particular generator. I do agree with @AwokeKnowing that you could also just pick a random number between 1 and 4 and get the same result (a quick sketch of that is below, after the link). The .NET generator should suffice for most cases. For more info see:
http://en.wikipedia.org/wiki/Pseudorandom_number_generator
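As a footnote to the 1-to-4 suggestion, here is a minimal sketch of what that could look like (my own illustration, not code from that answer). Next(1, 5) returns one of the four equally likely values 1, 2, 3, 4, so testing for a single one of them gives the same 25% chance:
Random rand = new Random();
// Next(1, 5) yields 1, 2, 3 or 4 with equal probability,
// so matching exactly one of those values is a 1-in-4 (25%) chance
if (rand.Next(1, 5) == 1)
{
    Console.WriteLine("You win");
}
else
{
    Console.WriteLine("You lose");
}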