 

Python difference between randn and normal

Tags:

python

numpy

I'm using the randn and normal functions from Python's numpy.random module. The functions are pretty similar from what I've read in the http://docs.scipy.org manual (they both concern the Gaussian distribution), but are there any subtler differences that I should be aware of? If so, in what situations would I be better off using a specific function?

Medulla Oblongata asked Feb 12 '14


People also ask

What does randn mean in Python?

The randn() function creates an array of the specified shape and fills it with random values drawn from the standard normal distribution.

What is the difference between Rand and randn in Python?

numpy.random.randn generates samples from the standard normal distribution, while numpy.random.rand draws from a uniform distribution over the range [0, 1).

What differences do you see between the numbers generated by rand and randn?

The difference between rand and randn is (besides the letter n) that rand returns random numbers sampled from a uniform distribution over the interval [0,1), while randn instead samples from a normal (a.k.a. Gaussian) distribution with a mean of 0 and a variance of 1.
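A quick sketch contrasting the two (the printed numbers are random, so they will only be approximately as described in the comments):

import numpy as np

# rand: uniform over [0, 1) -- every value falls inside that interval.
u = np.random.rand(100000)
print(u.min(), u.max())   # both within [0, 1)

# randn: standard normal -- values cluster around 0 and can be negative or greater than 1.
g = np.random.randn(100000)
print(g.mean(), g.std())  # roughly 0 and roughly 1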

What is NP random normal () used for?

np.random.normal allows us to create normally distributed data while specifying important parameters like the mean and standard deviation.




2 Answers

Description

Looking at the docs that you linked in your question, I'll highlight some of the key differences:

normal:

numpy.random.normal(loc=0.0, scale=1.0, size=None)
# Draw random samples from a normal (Gaussian) distribution.
#
# Parameters:
#   loc : float -- Mean ("centre") of the distribution.
#   scale : float -- Standard deviation (spread or "width") of the distribution.
#   size : tuple of ints -- Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn.

So in this case, you're generating a GENERIC normal distribution (more details on what that means later).
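For instance, a minimal sketch of drawing from a non-standard normal (the mean of 10 and standard deviation of 2 below are arbitrary values picked just for illustration):

import numpy as np

# Five draws from a normal distribution with mean 10 and standard deviation 2.
samples = np.random.normal(loc=10.0, scale=2.0, size=5)
print(samples)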

randn:

numpy.random.randn(d0, d1, ..., dn)
# Return a sample (or samples) from the "standard normal" distribution.
#
# Parameters:
#   d0, d1, ..., dn : int, optional -- The dimensions of the returned array; should all be positive. If no argument is given, a single Python float is returned.
#
# Returns:
#   Z : ndarray or float -- A (d0, d1, ..., dn)-shaped array of floating-point samples from the standard normal distribution, or a single such float if no parameters were supplied.

In this case, you're generating a SPECIFIC normal distribution, the standard normal distribution.
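A corresponding sketch for randn; note that the shape is passed as separate positional arguments rather than as a size tuple:

import numpy as np

# A 2 x 3 array of draws from the standard normal (mean 0, variance 1).
z = np.random.randn(2, 3)
print(z)

# With no arguments, a single Python float is returned.
print(np.random.randn())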

(Brief) Math

Now some of the math, which is really needed to get at the heart of your question:

A normal distribution is a distribution where the values are more likely to occur near the mean value. There are a bunch of cases of this in nature. E.g., the average high temperature in Dallas in June is, let's say, 95 F. It might reach 100, or even 105, in a given year, but it will more typically be near 95 or 97. Similarly, it might reach as low as 80, but 85 or 90 is more likely.

So, it is fundamentally different from, say, a uniform distribution (rolling an honest 6-sided die).


A standard normal distribution is just a normal distribution where the average value is 0, and the variance (the mathematical term for the variation) is 1.

So,

numpy.random.normal(size=(10, 10))

is the exact same thing as writing

numpy.random.randn(10, 10) 

because the default values (loc=0, scale=1) for numpy.random.normal are in fact those of the standard normal distribution.
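You can check this empirically; the sketch below compares only the sample statistics, since the individual draws are random (the 0.1 tolerance is an arbitrary choice for samples of this size):

import numpy as np

a = np.random.normal(size=(1000, 1000))   # defaults: loc=0, scale=1
b = np.random.randn(1000, 1000)           # standard normal

# Both arrays should have a mean near 0 and a standard deviation near 1.
for x in (a, b):
    assert abs(x.mean()) < 0.1
    assert abs(x.std() - 1) < 0.1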

History

To make matters more confusing, as the numpy random documentation states:

sigma * np.random.randn(...) + mu 

is the same as

np.random.normal(loc=mu, scale=sigma, ...)
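A rough empirical check of that equivalence (the mu and sigma values below are arbitrary; only the sample statistics are compared, not individual draws):

import numpy as np

mu, sigma = 5.0, 3.0

x = sigma * np.random.randn(100000) + mu
y = np.random.normal(loc=mu, scale=sigma, size=100000)

# Both samples should have a mean near 5 and a standard deviation near 3.
print(x.mean(), x.std())
print(y.mean(), y.std())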

The problem is really specialization: in statistics, Gaussian distributions are so common that terminology cropped up to enable discussions:

  • Many distributions are Gaussian; so many, in fact, that the Gaussian came to be called the normal distribution.
  • Good modeling, especially linear modeling, requires that all elements are "of the same size" (similar mean and variance). So it became standard practice to rescale distributions to mean=0 and variance=1.

Final note: I used the term variance to mathematically describe variation. Some folks say standard deviation. Variance simply equals the square of the standard deviation. Since the variance is 1 for the standard normal distribution, in that case variance == standard deviation.
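To see the relationship numerically (any sample works; the one below is arbitrary):

import numpy as np

data = np.random.randn(100000)

# Variance is the square of the standard deviation.
print(np.var(data))       # roughly 1
print(np.std(data) ** 2)  # the same value, up to floating-point rounding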

Mike Williamson answered Sep 21 '22


randn seems to give samples from the standard normal distribution (mean 0 and variance 1). normal takes more parameters, giving more control. So randn seems to simply be a convenience function.

M4rtini answered Sep 20 '22