I am working on some statistical code and exploring different ways of creating samples from random distributions, starting from a random number generator that produces uniform floating-point values in [0, 1).
I know it is possible to generate approximate samples from a normal distribution by adding together a sufficiently large number of independent, identically distributed uniform random variables (by the central limit theorem).
Is it possible to do something similar to create samples from the logistic distribution? I'm assuming the samples to be added would need to be weighted or correlated somehow in order to avoid ending up with a normal.
P.S. I'm also aware there may be more efficient ways of generating random samples. I'm asking because I'm more interested in understanding how such a generator would work than in efficiency.
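For reference, the CLT approach mentioned in the question can be sketched as follows. This is a minimal illustration, not a recommended generator: summing twelve independent Uniform(0, 1) variates and subtracting 6 gives mean 0 and variance 1 (each uniform has variance 1/12), and the result is approximately standard normal. The count 12 is a conventional convenience, not something fixed by the theorem.

```python
import random

def approx_normal(n_terms=12):
    """Approximate a standard normal draw by summing n_terms uniforms."""
    s = sum(random.random() for _ in range(n_terms))
    # Centre and scale: the sum has mean n/2 and variance n/12.
    return (s - n_terms / 2) / (n_terms / 12) ** 0.5

random.seed(42)
samples = [approx_normal() for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
# mean should be close to 0 and var close to 1
```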
The inverse CDF of the logistic distribution isn't hard to find, so you can use inverse transform sampling. The basic algorithm is:

for each desired random variate x ~ Logistic:
    generate a random variate y ~ Uniform(0, 1)
    x := F^-1(y)

where F^-1 is the inverse CDF of the logistic (or whatever distribution you want). Most programming languages will let you generate a Uniform variate between 0 and 1 through some kind of rand function.
Here's some Python code which generates 1000 random variates from a logistic distribution:

from random import random
import math
import pylab

loc, scale = 0, 1
randvars = []
for i in range(1000):
    x = random()                              # x ~ Uniform(0, 1)
    y = loc + scale * math.log(x / (1 - x))   # inverse logistic CDF
    randvars.append(y)

pylab.hist(randvars)
pylab.show()
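Beyond eyeballing the histogram, you can sanity-check the transform against the logistic's known moments: for Logistic(loc=0, scale=1) the mean is 0 and the variance is pi^2 / 3 (about 3.29). A quick check along those lines, assuming only the standard library:

```python
import math
import random

random.seed(0)
loc, scale = 0, 1
samples = []
for _ in range(200_000):
    u = random.random()
    # Inverse logistic CDF, as in the snippet above.
    samples.append(loc + scale * math.log(u / (1 - u)))

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
# mean should be near 0; var should be near pi^2 / 3 ≈ 3.29
```

If the sample moments land near those theoretical values, the inverse CDF has most likely been coded correctly.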