I was looking for a way to perform a linear curve fit in JavaScript. I found several libraries, but none of them propagate errors. What I mean is, I have data and associated measurement errors, like:
x = [ 1.0 +/- 0.1, 2.0 +/- 0.1, 3.1 +/- 0.2, 4.0 +/- 0.2 ]
y = [ 2.1 +/- 0.2, 4.0 +/- 0.1, 5.8 +/- 0.4, 8.0 +/- 0.1 ]
where my notation a +/- b means { value: a, error: b }.
I want to fit this to y = mx + b, and find m and b with their propagated errors.
with their propagated errors. I know the Least Square Method algorithm, that I could implement, but it only take errors on the y variable, and I have distinct errors in both.
I also could not find a JavaScript library that does this; but if there is an open-source lib in another language, I can inspect it to find out how it works and implement it in JS.
Programs like Origin or Plotly are able to do this, but I don't know how. The result for this example dataset is:
m = 1.93 +/- 0.11
b = 0.11 +/- 0.30
Typically, error bars are used to display either the standard deviation, standard error, confidence intervals, or the minimum and maximum values in a ranged dataset. To visualise this information, error bars work by drawing cap-tipped lines that extend from the centre of the plotted data point (or from the edge, with bar charts).
Linear curve fitting, or linear regression, is when the data is fit to a straight line. Although there might be some curve to your data, a straight line provides a reasonable enough fit to make predictions.
The error bars are markers that visually show the uncertainty around each data point. You should expect your best fit line to pass through roughly two-thirds of the error bars (about 68% for 1σ error bars).
The very useful book Numerical Recipes provides a method to fit data to a straight line, with uncertainties in both X and Y coordinates. It can be found online in both its C and Fortran versions.
The method is based on minimizing the χ² (chi-square), which is similar to least squares but takes into account the individual uncertainty of each data point. When the uncertainty σᵢ is on the Y axis only, each point is assigned a weight proportional to 1/σᵢ² in the calculations. When the data has uncertainties in both the X and Y coordinates, given by σₓᵢ and σᵧᵢ respectively, the fit to a straight line
y(x) = a + b · x
uses a χ² where each point has a weight proportional to
1 / (σᵧᵢ² + b² · σₓᵢ²)
The detailed method and the code (in C or Fortran) can be found in the book. Due to copyright, I cannot reproduce them here.
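Without reproducing the book's routine, the weighting idea above can be sketched independently as an iterative "effective variance" fit: start from a rough slope, weight each point by 1/(σᵧᵢ² + b²·σₓᵢ²), run a weighted least-squares fit, and repeat until the slope settles. The function name `fitWithXYErrors` is hypothetical, and this scheme is a common approximation of the chi-square minimization, not the book's exact algorithm:

```javascript
// Iterative "effective variance" straight-line fit with errors in both
// coordinates. x, sx, y, sy are arrays of values and their uncertainties.
function fitWithXYErrors(x, sx, y, sy, iterations = 20) {
  const n = x.length;
  // Initial slope guess from the endpoints.
  let m = (y[n - 1] - y[0]) / (x[n - 1] - x[0]);
  let b = 0, sm = 0, sb = 0;
  for (let iter = 0; iter < iterations; iter++) {
    // Weight each point by 1 / (sy_i^2 + m^2 * sx_i^2),
    // folding the x-uncertainty into an effective y-variance.
    let S = 0, Sx = 0, Sy = 0, Sxx = 0, Sxy = 0;
    for (let i = 0; i < n; i++) {
      const w = 1 / (sy[i] * sy[i] + m * m * sx[i] * sx[i]);
      S += w; Sx += w * x[i]; Sy += w * y[i];
      Sxx += w * x[i] * x[i]; Sxy += w * x[i] * y[i];
    }
    // Standard weighted least-squares closed form.
    const delta = S * Sxx - Sx * Sx;
    b = (Sxx * Sy - Sx * Sxy) / delta;  // intercept
    m = (S * Sxy - Sx * Sy) / delta;    // slope
    sb = Math.sqrt(Sxx / delta);        // intercept uncertainty
    sm = Math.sqrt(S / delta);          // slope uncertainty
  }
  return { m, b, sm, sb };
}

// Example data from the question:
const r = fitWithXYErrors(
  [1.0, 2.0, 3.1, 4.0], [0.1, 0.1, 0.2, 0.2],
  [2.1, 4.0, 5.8, 8.0], [0.2, 0.1, 0.4, 0.1]
);
console.log(r);
```

On this dataset the slope comes out near the quoted m ≈ 1.93; the uncertainties differ somewhat from the quoted ones because this is only an approximation of the full chi-square treatment.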
It seems that the least squares (LS) method is indeed a good direction. Given a list of x & y values, least squares returns the values of m & b that minimize $$\sum_{i} (m \cdot x_{i}+b -y_{i})^{2}.$$
The benefits of the LS method are that you will find the optimal values for the parameters, the computation is fast, and you will probably be able to find an implementation in JavaScript, like this one.
Now you should take care of the margins of error that you have. Note that the way you treat the margins of error is more of a "business question" than a mathematical question: different people might choose different treatments based on their needs, and all of them can be equally valid from a mathematical point of view.
Without more knowledge about your needs, I suggest that you turn each point (x, y) with errors (ex, ey) into 4 points based on the margins: (x+ex, y+ey), (x-ex, y+ey), (x+ex, y-ey), (x-ex, y-ey).
The benefits of this representation are that it is simple, it gives weight to the edges of the margin boundaries (which are typically the more sensitive parts), and, best of all, it is a reduction: once you generate the new points, you can use a regular LS implementation without having to implement such an algorithm on your own.
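This reduction can be sketched as follows. The helper names `leastSquares` and `expandAndFit` are hypothetical; the point is only that after expanding each point into its four corners, any ordinary least-squares routine applies unchanged:

```javascript
// Ordinary (unweighted) least squares: slope and intercept in closed form.
function leastSquares(xs, ys) {
  const n = xs.length;
  const mx = xs.reduce((a, v) => a + v, 0) / n;
  const my = ys.reduce((a, v) => a + v, 0) / n;
  let num = 0, den = 0;
  for (let i = 0; i < n; i++) {
    num += (xs[i] - mx) * (ys[i] - my);
    den += (xs[i] - mx) * (xs[i] - mx);
  }
  const m = num / den;
  return { m, b: my - m * mx };
}

// Expand each point into the four corners of its error box,
// then fit the expanded set with plain least squares.
function expandAndFit(points) {
  const xs = [], ys = [];
  for (const { x, ex, y, ey } of points) {
    for (const [dx, dy] of [[1, 1], [-1, 1], [1, -1], [-1, -1]]) {
      xs.push(x + dx * ex);
      ys.push(y + dy * ey);
    }
  }
  return leastSquares(xs, ys);
}

// Example data from the question:
const fit = expandAndFit([
  { x: 1.0, ex: 0.1, y: 2.1, ey: 0.2 },
  { x: 2.0, ex: 0.1, y: 4.0, ey: 0.1 },
  { x: 3.1, ex: 0.2, y: 5.8, ey: 0.4 },
  { x: 4.0, ex: 0.2, y: 8.0, ey: 0.1 },
]);
console.log(fit);
```

Note that because the four corners are symmetric around each point, this scheme slightly flattens the slope (the x-errors inflate the spread in x) and, unlike the chi-square approach, it does not produce uncertainties on m and b by itself.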