
Random time delay

I'm trying to send signals between processes, and I need to allow a random time delay of between 0.01 and 0.1 seconds in my signal-generation loop. This is what I'm doing, but the delay is most certainly not between 0.01 and 0.1 seconds; it comes out as 1 second. Not sure what I'm missing here.

    sleepTime = 100000L+(long)((1e6-1e5)*rand()/(RAND_MAX+1.0));        
    usleep(sleepTime);
asked Sep 28 '11 by Mia

2 Answers

If you've got C++11:

#include <thread>
#include <random>
#include <chrono>

int main()
{
    std::mt19937_64 eng{std::random_device{}()};  // or seed however you want
    std::uniform_int_distribution<> dist{10, 100};
    std::this_thread::sleep_for(std::chrono::milliseconds{dist(eng)});
}

It may not be what your prof is looking for. :-)

answered Oct 25 '22 by Howard Hinnant


All of your constants are 10x too large! `usleep()` takes microseconds, so 0.01 s is 10,000 µs and 0.1 s is 100,000 µs; your version sleeps for 0.1 to 1 second. Try

    sleepTime = 10000L + (long)((1e5 - 1e4) * rand() / (RAND_MAX + 1.0));
answered Oct 25 '22 by Ernest Friedman-Hill