I've been hunting for a convenient way to sample from a multivariate normal distribution. Does anyone know of a readily available code snippet to do that? For matrices/vectors, I'd prefer to use Boost or Eigen or another phenomenal library I'm not familiar with, but I could use GSL in a pinch. I'd also like it if the method accepted nonnegative-definite covariance matrices rather than requiring positive-definite (e.g., as with the Cholesky decomposition). This exists in MATLAB, NumPy, and others, but I've had a hard time finding a ready-made C/C++ solution.
If I have to implement it myself, I'll grumble but that's fine. If I do that, Wikipedia makes it sound like I should generate a vector of independent standard-normal samples and then multiply it by a matrix square root of the covariance (the Cholesky factor if the covariance is positive-definite, or something built from an eigendecomposition otherwise), adding the mean at the end.
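For concreteness, here's a minimal sketch of that recipe using Eigen's LLT solver for the Cholesky factor and C++11 <random> for the standard normals. The function name sample_mvn is just something I made up for illustration, and it assumes the covariance is positive-definite so the Cholesky factorization succeeds:

    #include <Eigen/Dense>
    #include <random>

    // Sketch: draw one sample x ~ N(mean, covar), assuming covar is positive-definite.
    Eigen::VectorXd sample_mvn(const Eigen::VectorXd& mean, const Eigen::MatrixXd& covar)
    {
        static std::mt19937 gen{ std::random_device{}() };
        static std::normal_distribution<double> dist;   // standard normal N(0,1)

        // z is a vector of independent standard-normal draws
        Eigen::VectorXd z(mean.size());
        for (int i = 0; i < z.size(); ++i)
            z(i) = dist(gen);

        // L is the lower-triangular Cholesky factor, covar = L * L^T
        Eigen::MatrixXd L = covar.llt().matrixL();

        return mean + L * z;   // x = mean + L z has covariance L L^T = covar
    }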
I would like this to work quickly. Does anyone have an intuition for when it would be worthwhile to check whether the covariance matrix is positive-definite and, if so, use the Cholesky decomposition instead?
To simulate a multivariate normal distribution in R, you can use the mvrnorm() function from the MASS package, which generates random draws from a multivariate normal distribution with a specified mean vector and covariance matrix.
As for applications: the multivariate normal distribution is useful for analyzing relationships among multiple normally distributed variables, which is why it is used heavily in fields such as biology and economics, where the relationships between approximately normal variables are of great interest.
Since this question has garnered a lot of views, I thought I'd post code for the final answer that I found, in part, by posting to the Eigen forums. The code uses Boost for the univariate normal and Eigen for matrix handling. It feels rather unorthodox, since it involves using the "internal" namespace, but it works. I'm open to improving it if someone suggests a way.
    #include <iostream>
    #include <Eigen/Dense>
    #include <boost/random/mersenne_twister.hpp>
    #include <boost/random/normal_distribution.hpp>

    /*
      We need a functor that can pretend it's const, but to be a good
      random number generator it needs mutable state.
    */
    namespace Eigen {
    namespace internal {

    template<typename Scalar>
    struct scalar_normal_dist_op
    {
      static boost::mt19937 rng;                        // The uniform pseudo-random algorithm
      mutable boost::normal_distribution<Scalar> norm;  // The Gaussian combinator

      EIGEN_EMPTY_STRUCT_CTOR(scalar_normal_dist_op)

      template<typename Index>
      inline const Scalar operator() (Index, Index = 0) const { return norm(rng); }
    };

    template<typename Scalar>
    boost::mt19937 scalar_normal_dist_op<Scalar>::rng;

    template<typename Scalar>
    struct functor_traits<scalar_normal_dist_op<Scalar> >
    {
      enum { Cost = 50 * NumTraits<Scalar>::MulCost, PacketAccess = false, IsRepeatable = false };
    };

    } // end namespace internal
    } // end namespace Eigen

    /*
      Draw nn samples from a size-dimensional normal distribution
      with a specified mean and covariance
    */
    int main()
    {
      int size = 2;  // Dimensionality (rows)
      int nn = 5;    // How many samples (columns) to draw

      Eigen::internal::scalar_normal_dist_op<double> randN;        // Gaussian functor
      Eigen::internal::scalar_normal_dist_op<double>::rng.seed(1); // Seed the rng

      // Define mean and covariance of the distribution
      Eigen::VectorXd mean(size);
      Eigen::MatrixXd covar(size, size);

      mean  << 0, 0;
      covar << 1, .5,
               .5, 1;

      Eigen::MatrixXd normTransform(size, size);

      Eigen::LLT<Eigen::MatrixXd> cholSolver(covar);

      // We can only use the Cholesky decomposition if the covariance matrix
      // is symmetric positive-definite. But a covariance matrix might only be
      // positive semi-definite; in that case, we fall back to an eigen solver.
      if (cholSolver.info() == Eigen::Success) {
        // Use the Cholesky solver
        normTransform = cholSolver.matrixL();
      } else {
        // Use the eigen solver
        Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> eigenSolver(covar);
        normTransform = eigenSolver.eigenvectors()
                        * eigenSolver.eigenvalues().cwiseSqrt().asDiagonal();
      }

      Eigen::MatrixXd samples = (normTransform
                                 * Eigen::MatrixXd::NullaryExpr(size, nn, randN)).colwise()
                                + mean;

      std::cout << "Mean\n"    << mean    << std::endl;
      std::cout << "Covar\n"   << covar   << std::endl;
      std::cout << "Samples\n" << samples << std::endl;

      return 0;
    }
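One caveat about the eigen-solver branch: for a positive semi-definite covariance, rounding error can make the smallest eigenvalues slightly negative, and cwiseSqrt() then produces NaNs. A common guard, which is my own tweak and not part of the original answer, is to clamp the eigenvalues at zero before taking the square root:

    // Hedged variant of the eigen-solver branch above: clamp tiny negative
    // eigenvalues (which can appear through rounding) to zero before the sqrt.
    // This is my own addition, not part of the original answer.
    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> eigenSolver(covar);
    Eigen::VectorXd evals = eigenSolver.eigenvalues().cwiseMax(0.0);  // clamp at zero
    normTransform = eigenSolver.eigenvectors() * evals.cwiseSqrt().asDiagonal();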
Here is a class to generate multivariate normal random variables in Eigen which uses C++11 random number generation and avoids the Eigen::internal stuff by using Eigen::MatrixBase::unaryExpr():
    #include <Eigen/Dense>
    #include <random>

    struct normal_random_variable
    {
        normal_random_variable(Eigen::MatrixXd const& covar)
            : normal_random_variable(Eigen::VectorXd::Zero(covar.rows()), covar)
        {}

        normal_random_variable(Eigen::VectorXd const& mean, Eigen::MatrixXd const& covar)
            : mean(mean)
        {
            Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> eigenSolver(covar);
            transform = eigenSolver.eigenvectors() * eigenSolver.eigenvalues().cwiseSqrt().asDiagonal();
        }

        Eigen::VectorXd mean;
        Eigen::MatrixXd transform;

        Eigen::VectorXd operator()() const
        {
            static std::mt19937 gen{ std::random_device{}() };
            static std::normal_distribution<> dist;

            return mean + transform * Eigen::VectorXd{ mean.size() }.unaryExpr([&](auto x) { return dist(gen); });
        }
    };
It can be used as
    int size = 2;
    Eigen::MatrixXd covar(size, size);
    covar << 1, .5,
             .5, 1;

    normal_random_variable sample { covar };

    std::cout << sample() << std::endl;
    std::cout << sample() << std::endl;
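To check that the class behaves as expected, one can draw a large batch of samples and compare the empirical moments against the inputs. This is just a sketch I'm adding for illustration (the sample count and variable names are arbitrary), assuming the normal_random_variable struct above is in scope:

    // Sanity check (my own addition, not part of the original answer):
    // draw many samples and compare the empirical mean/covariance with the inputs.
    #include <iostream>
    #include <Eigen/Dense>

    int main()
    {
        int size = 2;
        Eigen::MatrixXd covar(size, size);
        covar << 1, .5,
                 .5, 1;

        normal_random_variable sample { covar };

        const int n = 100000;
        Eigen::MatrixXd draws(size, n);
        for (int i = 0; i < n; ++i)
            draws.col(i) = sample();

        Eigen::VectorXd empMean  = draws.rowwise().mean();
        Eigen::MatrixXd centered = draws.colwise() - empMean;
        Eigen::MatrixXd empCovar = (centered * centered.transpose()) / double(n - 1);

        std::cout << "Empirical mean:\n"       << empMean  << "\n";
        std::cout << "Empirical covariance:\n" << empCovar << std::endl;
    }

With a large n, the printed mean should be close to zero and the printed covariance close to the covar matrix passed in.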