I have a bunch of x, y points that represent a sigmoidal function:
x=[ 1.00094909 1.08787635 1.17481363 1.2617564 1.34867881 1.43562284
1.52259341 1.609522 1.69631283 1.78276102 1.86426648 1.92896789
1.9464453 1.94941586 2.00062852 2.073691 2.14982808 2.22808316
2.30634034 2.38456905 2.46280126 2.54106611 2.6193345 2.69748825]
y=[-0.10057627 -0.10172142 -0.10320428 -0.10378959 -0.10348456 -0.10312503
-0.10276956 -0.10170055 -0.09778279 -0.08608644 -0.05797392 0.00063599
0.08732999 0.16429878 0.2223306 0.25368884 0.26830932 0.27313931
0.27308756 0.27048902 0.26626313 0.26139534 0.25634544 0.2509893 ]

I use scipy.interpolate.UnivariateSpline() to fit a cubic spline as follows:
import numpy as np
import matplotlib.pyplot as plt
from scipy.interpolate import UnivariateSpline

s = UnivariateSpline(x, y, k=3, s=0)
xfit = np.linspace(x.min(), x.max(), 200)
plt.scatter(x, y)
plt.plot(xfit, s(xfit))
plt.show()
This is what I get: 
Since I specify s=0, the spline adheres completely to the data, but there are too many wiggles. Using a higher k value leads to even more wiggles.
So my questions are --
- Should I use scipy.interpolate.UnivariateSpline() to fit my data? More precisely, how do I make the spline minimise its wiggling?
- Or should I use scipy.optimize.curve_fit() with a trial tanh(x) function instead?

There are several options; I list a few below, and the last one seems to give the best output. Whether you should use a spline or an actual function depends on what you want to do with the output. I list two analytical functions below that could be used, but I don't know in which context the data were derived, so it is hard to tell which one suits you best.
You can play with s, e.g. for s=0.005, the plot looks like this (still not extremely pretty but you could further adjust):
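If you want a more systematic way to choose s, you can compare the residual and the number of knots the spline ends up with for a few candidate values. This is only a rough sketch; the candidate values below are arbitrary examples, and x, y are assumed to be the NumPy arrays from the full listing further down:

from scipy.interpolate import UnivariateSpline
import numpy as np

for s_val in (0, 0.001, 0.005, 0.01):        # example smoothing factors, not recommendations
    spl = UnivariateSpline(x, y, k=3, s=s_val)
    resid = np.sum((y - spl(x)) ** 2)        # sum of squared residuals of the smoothed spline
    print(f"s={s_val}: {len(spl.get_knots())} knots, residual={resid:.2e}")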

But I would indeed use a "proper" function and fit using e.g. curve_fit. The function below is still not ideal as it is monotonically increasing, so we miss the decrease at the end; the plot looks as follows:

This is the entire code, for both the spline and the actual fit:
from scipy.interpolate import UnivariateSpline
from scipy.optimize import curve_fit
import matplotlib.pyplot as plt
import numpy as np

def func(x, ymax, n, k, c):
    # Hill-type function: monotonically increasing sigmoid plus an offset c
    return ymax * x ** n / (k ** n + x ** n) + c

x = np.array([1.00094909, 1.08787635, 1.17481363, 1.2617564, 1.34867881, 1.43562284,
              1.52259341, 1.609522, 1.69631283, 1.78276102, 1.86426648, 1.92896789,
              1.9464453, 1.94941586, 2.00062852, 2.073691, 2.14982808, 2.22808316,
              2.30634034, 2.38456905, 2.46280126, 2.54106611, 2.6193345, 2.69748825])
y = np.array([-0.10057627, -0.10172142, -0.10320428, -0.10378959, -0.10348456, -0.10312503,
              -0.10276956, -0.10170055, -0.09778279, -0.08608644, -0.05797392, 0.00063599,
              0.08732999, 0.16429878, 0.2223306, 0.25368884, 0.26830932, 0.27313931,
              0.27308756, 0.27048902, 0.26626313, 0.26139534, 0.25634544, 0.2509893])

# Fit of the analytical function
popt, pcov = curve_fit(func, x, y, p0=[y.max(), 2, 2, -0.1],
                       bounds=([0, 0, 0, -0.2], [0.4, 45, 2000, 10]))
xfit = np.linspace(x.min(), x.max(), 200)
plt.scatter(x, y)
plt.plot(xfit, func(xfit, *popt))
plt.show()

# Smoothed spline (s > 0 lets the spline deviate from the data points)
s = UnivariateSpline(x, y, k=3, s=0.005)
xfit = np.linspace(x.min(), x.max(), 200)
plt.scatter(x, y)
plt.plot(xfit, s(xfit))
plt.show()
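For comparison, the tanh(x) trial function mentioned in the question can be fitted in exactly the same way. Below is a minimal sketch reusing the imports and arrays from the listing above; the four-parameter form and the starting guess are my own assumptions, and since tanh is also monotonic it misses the final decrease just like func above:

def tanh_func(x, a, b, c, d):
    # a: half-amplitude, b: steepness, c: midpoint, d: vertical offset (assumed parametrisation)
    return a * np.tanh(b * (x - c)) + d

popt_t, pcov_t = curve_fit(tanh_func, x, y, p0=[0.2, 5, 1.9, 0.1])
plt.scatter(x, y)
plt.plot(xfit, tanh_func(xfit, *popt_t))
plt.show()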
A third option is to use a more advanced function that can also reproduce the decrease at the end, together with differential_evolution for the fit; that seems to give the best result:

The code is as follows (using the same data as above):
from scipy.optimize import curve_fit, differential_evolution

def sigmoid_with_decay(x, a, b, c, d, e, f):
    # Product of a rising and a falling logistic plus an offset f,
    # so the curve can decrease again after the plateau
    return a * (1. / (1. + np.exp(-b * (x - c)))) * (1. / (1. + np.exp(d * (x - e)))) + f

def error_sigmoid_with_decay(parameters, x_data, y_data):
    # Sum of squared residuals, minimised by differential_evolution
    return np.sum((y_data - sigmoid_with_decay(x_data, *parameters)) ** 2)

res = differential_evolution(error_sigmoid_with_decay,
                             bounds=[(0, 10), (0, 25), (0, 10), (0, 10), (0, 10), (-1, 0.1)],
                             args=(x, y),
                             seed=42)
xfit = np.linspace(x.min(), x.max(), 200)
plt.scatter(x, y)
plt.plot(xfit, sigmoid_with_decay(xfit, *res.x))
plt.show()
The fit is quite sensitive to the bounds, so be careful when you play with them.
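One way to reduce that sensitivity is to use the differential_evolution result only as a starting point and refine it locally; a sketch, assuming the arrays and functions defined above:

# Local refinement of the global solution; res.x serves as the initial guess,
# and pcov_de gives a rough idea of the parameter uncertainties.
popt_de, pcov_de = curve_fit(sigmoid_with_decay, x, y, p0=res.x)
print("refined parameters:", popt_de)
print("residual sum of squares:", error_sigmoid_with_decay(popt_de, x, y))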