I decided to write a prime number generator as an easy exercise. The code is pretty simple:
static void generatePrimes (long min, long max)
{
    // Sieve of Eratosthenes over [2, max].
    bool[] prime = new bool[max + 1];
    for (long i = 2; i < max + 1; i++)
        prime [i] = true;
    for (long i = 2; i < max + 1; i++) {
        if (prime [i]) {
            if (i >= min)
                Console.WriteLine (i);
            // Mark every multiple of i as composite.
            for (long j = i * 2; j < max + 1; j += i)
                prime [j] = false;
        }
    }
    Console.WriteLine ();
}
It works just fine with input like 1..10000. However, around max = 1000000000 it becomes extremely slow, and Mono uses about 1 GB of memory. That seems strange to me: shouldn't a bool[1000000000] take 1,000,000,000 bits, not bytes? Or am I making some obvious mistake that makes it this inefficient?
The smallest unit of memory a computer can address is a byte, so a bool is stored as a whole byte. You would need special code to pack 8 bools into one byte; the BitArray class does exactly that for you.
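A rough sketch of what that looks like (the array length here is just illustrative, it is not a measurement from the question):

using System;
using System.Collections;

// bool[1000000000] needs roughly 1 GB (one byte per element),
// while a BitArray of the same length packs 8 flags per byte (~125 MB).
BitArray flags = new BitArray (1000000000, true);
flags [4] = false;                 // the indexer reads and writes individual bits
Console.WriteLine (flags [4]);     // False
Console.WriteLine (flags [5]);     // True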
Nope. Contrary to C++'s vector<bool>, in C# an array of bool really is an array of bools, one byte per element.
If you want your values packed (8 bits per bool), use a BitArray instead.
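For illustration, here is one way the method from the question could be adapted to use a BitArray. Treat it as a sketch, not a drop-in replacement: BitArray is indexed with int rather than long, so max has to fit in an int.

using System;
using System.Collections;

static void generatePrimes (int min, int max)
{
    // One bit per candidate instead of one byte: roughly max / 8 bytes.
    // The constructor initializes every bit to true.
    BitArray prime = new BitArray (max + 1, true);
    for (int i = 2; i <= max; i++) {
        if (prime [i]) {
            if (i >= min)
                Console.WriteLine (i);
            // Mark every multiple of i as composite; j is long so i * 2 cannot overflow.
            for (long j = (long)i * 2; j <= max; j += i)
                prime [(int)j] = false;
        }
    }
    Console.WriteLine ();
}

Note that this only addresses the memory footprint; the algorithm itself is unchanged, so a run with max = 1000000000 will still take a while.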