 

Fitting a Gaussian distribution to data

I have a vector y containing 1440 values (between 0 and 1) that looks like a Gaussian distribution, so I would like to find the best-fitting Gaussian distribution to use as a model.

x=1:1440;
[sigma_,mu_] = gaussfit(x,y);
norm = normpdf(x,mu_,sigma_);

My problem is that the values in norm are much smaller than the values in y: the values in norm are on the order of 10^-3, while the values in y are between 0 and 1.

I then have to add an extra step to normalize the values in norm between 0 and 1:

norm_data = (norm - min(norm)) / ( max(norm) - min(norm) );

Is my procedure (estimating sigma and mu, calling normpdf, then normalizing) correct? Is there a way to directly get a fit to the original data that expresses the probability?

y can be downloaded here

gabboshow asked Mar 17 '26 14:03


1 Answer

Assuming you are using this gaussfit, if you check the head of the function:

% REMARKS:
% The function does not always converge in which case try to use initial
% values sigma0, mu0. Check also if the data is properly scaled, i.e. p.d.f
% should approx. sum up to 1

This means that before fitting you need to make sure that sum(y) == 1 + err, where err is something small.

Your y has a sum(y) of 470.1964, which is kind of far from 1. Normalize your data so the sum is equal to one before fitting it.
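A minimal sketch of that pre-normalization step, assuming the same gaussfit function and the x and y from the question:

    % Scale y so its values sum to 1 before fitting,
    % as the remarks in gaussfit require.
    x = 1:1440;
    y_scaled = y ./ sum(y);            % now sum(y_scaled) == 1
    [sigma_, mu_] = gaussfit(x, y_scaled);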

EDIT

Indeed, the function does normalize the data if it is not already normalized (more or less; it accepts data whose sum is in the range 0.5-1.5), and it works perfectly fine. Since y is normalized inside the function, if you want to compare the result norm to y you need to either normalize y or denormalize norm.

 % normalize y
 plot(x,norm,x,y./sum(y))
 % denormalize norm
 plot(x,norm*sum(y),x,y)

In either case the curves match (just at a different scale):

(plot: the fitted Gaussian overlaid on y, showing the same shape in both cases)
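As for the second part of the question (fitting the original, unnormalized data directly): a sketch of one way to do it, assuming the Curve Fitting Toolbox is available. Its built-in 'gauss1' model fits the amplitude as a free parameter, so no pre- or post-normalization is needed:

    % Sketch, assuming the Curve Fitting Toolbox is installed.
    % 'gauss1' fits y = a1*exp(-((x-b1)/c1)^2), with the
    % amplitude a1 as a free parameter.
    f = fit(x(:), y(:), 'gauss1');
    plot(f, x, y)      % fitted curve on top of the raw data
    % Note: c1 relates to the standard deviation as sigma = c1/sqrt(2).

Because the amplitude is fitted directly, f(x) is already on the same 0-1 scale as y, which avoids the min-max normalization step from the question.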

Ander Biguri answered Mar 19 '26 09:03


