I'm looking for a set of portable distributions for the standard C++11 engines like `std::mt19937` (see http://en.cppreference.com/w/cpp/numeric/random).
The engine implementations behave consistently (i.e. the same sequence is generated on different platforms – tested with Clang and MSVC), but the distributions seem to be implemented differently on the different platforms. So, even though the engines produce the same sequence, a distribution (for example, `std::normal_distribution<double>`) does not consume the same number of engine samples (i.e. produces different results) on the different platforms, which is not acceptable in my case.
Is there maybe a third-party library I can use that follows the C++11 random templates, but that will deliver consistent values across popular platforms (I'm looking for support across GCC, MSVC and Clang/LLVM)?
I need uniform, normal, Poisson and Rayleigh distributions.
In C++11, we can use the random library to generate random numbers. Here we have used std::random_device once to seed the random number generator object called mt. std::random_device is slower than std::mt19937, but it does not need to be seeded itself, because it requests random data from the operating system.
The decomposed and pluggable design means that you can customize your random numbers by replacing only a small part of the random number generation pipeline. The standard also provides a wide range of random number engines and distributions, so you should be able to do most things you want out of the box.
The problem is that std::random_device is poorly specified, and inscrutable. In theory, it should serve as an abstraction over some external source of entropy. In practice, an implementation is allowed to use any deterministic random number engine to implement it, e.g. a Mersenne twister, and there is no way to find out.
At first glance, <random> seems exceedingly complex for a simple task: you have to pick a templated Uniform Random Bit Generator, possibly seed it, pick a templated distribution, and then pass an instance of your URBG to the distribution to get a number.
I have created my own C++11 distributions:
#include <cmath> // std::sqrt, std::log, std::sin

template <typename T>
class UniformRealDistribution
{
public:
    typedef T result_type;

    UniformRealDistribution(T _a = 0.0, T _b = 1.0)
        : m_a(_a),
          m_b(_b)
    {}

    void reset() {}

    template <class Generator>
    T operator()(Generator &_g)
    {
        // Map the engine's integer range [min, max] onto [m_a, m_b).
        T scale = (m_b - m_a) / ((T)(_g.max() - _g.min()) + (T)1);
        return (_g() - _g.min()) * scale + m_a;
    }

    T a() const { return m_a; }
    T b() const { return m_b; }

protected:
    T m_a;
    T m_b;
};
template <typename T>
class NormalDistribution
{
public:
    typedef T result_type;

    NormalDistribution(T _mean = 0.0, T _stddev = 1.0)
        : m_mean(_mean),
          m_stddev(_stddev)
    {}

    void reset()
    {
        m_distU1.reset();
    }

    template <class Generator>
    T operator()(Generator &_g)
    {
        // Box-Muller transform: two uniforms in [0, 1) -> one normal deviate.
        const T pi = (T)3.14159265358979323846264338327950288419716939937511;
        T u1 = m_distU1(_g);
        T u2 = m_distU1(_g);
        while (u1 <= (T)0)      // log(0) is undefined; redraw on u1 == 0
            u1 = m_distU1(_g);
        T r = std::sqrt((T)-2 * std::log(u1));
        return m_mean + m_stddev * r * std::sin((T)2 * pi * u2);
    }

    T mean() const { return m_mean; }
    T stddev() const { return m_stddev; }

protected:
    T m_mean;
    T m_stddev;
    UniformRealDistribution<T> m_distU1;
};
The uniform distribution seems to deliver good results and the normal distribution delivers very good results:
100000 values -> 68.159% within 1 sigma; 95.437% within 2 sigma; 99.747% within 3 sigma
The normal distribution uses the Box-Muller method which, according to what I have read so far, is not the fastest method, but it runs more than fast enough for my application. Both the uniform and normal distributions work with any C++11 engine (tested with std::mt19937) and produce the same sequence on all platforms, which is exactly what I wanted.