
rand() function in C is not random even when seeded

Tags:

c

random

This is most likely a machine dependent issue but I can't figure out what could be wrong.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(int argc, char** argv) {

  srand(time(NULL));
  int r1 = rand();
  int r2 = rand();
  printf("%d %d\n", r1, r2);
}

I compile the above piece of code using

gcc randd.c

Then I run it a few times manually; the first numbers look strikingly similar while the second ones look random:

1025720610 1435057801
1025737417 1717533050
1025754224 2000008299
1025771031 134999901
1025787838 417475150

The first call to rand() seems strongly correlated with the time and is strictly increasing as time passes. Any ideas as to why this occurs or how to resolve it?

This happens on OS X 10.11.

asked Oct 18 '22 by ᴘᴀɴᴀʏɪᴏᴛɪs
1 Answer

  1. rand() is quite bad; avoid it if possible. In any good RNG the first outputs are indistinguishable from random even when the seeds are close (in Hamming distance). With rand this is not the case.
  2. If you must use rand, seed it once, preferably with something higher-entropy than the time, and call rand() multiple times rather than reseeding before every call.

As an example of point 2, consider:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(int argc, char** argv) {
  time_t t = time(NULL);      /* time() returns time_t, not int */
  srand((unsigned int)t);     /* seed once */

  /* Draw repeatedly from the generator's internal state. */
  for (int i = 0; i < 10; i++) {
    float r = (float)rand() / (float)RAND_MAX;
    printf("%f\n", r);
  }
}

With the result:

0.460600
0.310486
0.339473
0.519799
0.258825
0.072276
0.749423
0.552250
0.665374
0.939103

It's still a bad RNG, but at least the distribution looks better when you let it advance its internal state instead of repeatedly feeding it another similar seed.

answered Oct 21 '22 by Thomas M. DuBuisson