
Curve fit in Python minimizing the uniform norm or L1 norm (not least squares)

I have a set of data points (x_i,y_i) and would like to fit a curve. I have a parametrization of the function, so f(x;a) is defined by the parameters a.

I'm interested in the parameters minimizing the uniform norm, i.e., the objective min_a max_i |f(x_i; a) - y_i|. If that's not possible, I'm also interested in the L1 norm, i.e., the objective min_a sum_i |f(x_i; a) - y_i|.

I'm aware of curve_fit from scipy.optimize, but it only supports a least-squares objective, i.e., the L2 norm.

The problems I want to solve are of small size, approx 100-200 data points and 4-5 parameters, so it's not a big deal if the algorithm is slow. Any library recommendations would be much appreciated.

Thanks!

Rostov asked Feb 03 '26 03:02


1 Answer

One way to approach this (I only tackle the L1 norm here):

Convert:

  • non-differentiable (because of L1-norm) unconstrained optimization problem
  • to: differentiable constrained optimization problem!

Basic idea:

  • instead of min(Sum(|r|)) over the residuals r:
    • introduce auxiliary variables u with the shape of r (aux-vars)
    • introduce constraints: -u <= r <= u
    • solve: min(Sum(u))
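
As an aside (my own illustration, not part of the original answer): when the model is affine in its parameters, this reformulation is a plain linear program and can be solved exactly, e.g. with scipy.optimize.linprog. A minimal sketch with a made-up toy dataset for f(x; a, b) = a*x + b:

```python
import numpy as np
from scipy.optimize import linprog

# Toy data (hypothetical); L1 fit of an affine model f(x; a, b) = a*x + b.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.1, 1.0, 2.1, 2.9])
M = len(x)

# Variables: [a, b, u_1..u_M]; minimize sum(u) s.t. -u_i <= a*x_i + b - y_i <= u_i.
c = np.concatenate([np.zeros(2), np.ones(M)])  # objective: sum of aux vars u
A_model = np.column_stack([x, np.ones(M)])     # rows: [x_i, 1]
#  a*x_i + b - y_i <= u_i   →   A_model @ p - u <=  y
# -(a*x_i + b - y_i) <= u_i →  -A_model @ p - u <= -y
A_ub = np.block([[ A_model, -np.eye(M)],
                 [-A_model, -np.eye(M)]])
b_ub = np.concatenate([y, -y])

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * 2 + [(0, None)] * M)
a_fit, b_fit = res.x[:2]  # fitted slope and intercept
```

This is deterministic and robust, unlike the general non-convex case below; the answer's approach is needed precisely because f there is nonlinear in its parameters.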

A basic example using scipy:

import numpy as np
import matplotlib.pyplot as plt
from scipy import optimize
np.random.seed(0)

""" Example function and curve_fit """

def f(t, omega, phi):
    return np.cos(omega * t + phi)

x_ = np.linspace(0, 3, 50)
y = f(x_, 1.5, 1) + .5*np.random.normal(size=50)
y[5] += 5   # synthetic outlier
y[10] += 5  # synthetic outlier

params, params_cov = optimize.curve_fit(f, x_, y)
t = np.linspace(0, 3, 1000)

""" Prototype l1 solver """
N = 2  # n-vars
M = x_.shape[0]  # number of samples

def g(x):
    return sum(x[N+M:])

cons = ({'type': 'eq',
         'fun': lambda x: x[N:N+M] - (f(x_, *x[:N]) - y),        # r_i = f(x_i) - y_i
        }
        ,
        {'type': 'ineq',
         'fun': lambda x: x[N+M:] + x[N:N+M],                    # -u_i <= r_i
        },
        {'type': 'ineq',
         'fun': lambda x: -x[N:N+M] + x[N+M:]}                   # r_i <= u_i
       )

res = optimize.minimize(g, np.random.uniform(high=0.05, size=N+M+M), constraints=cons, options={'disp': True})
params_ = res.x[:N]

print(res.x[N:N+M])  # residuals r_i = f(x_i) - y_i
print(res.x[N+M:])   # aux vars u_i = |r_i| at the optimum

""" Plot results """

plt.plot(x_, y, color='red', label='orig')
plt.plot(t, f(t, *params), color='blue', label='curve_fit')
plt.plot(t, f(t, *params_), color='black', label='l1')
plt.legend()
plt.show()

Output

(plot comparing the original data with the curve_fit and l1 fits; image not included)

Remarks

  • As a general solver is used on a non-convex problem, the result is highly dependent on the initial values / not robust
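
The uniform norm from the question can be tackled with the same idea: bound all residuals by a single scalar t and minimize t (an epigraph formulation). A sketch in the spirit of the code above, using my own choice of starting point and the same synthetic data without the outliers (a max-norm objective would be dominated by them); the same non-convexity caveat applies:

```python
import numpy as np
from scipy import optimize

np.random.seed(0)

def f(t, omega, phi):
    return np.cos(omega * t + phi)

x_ = np.linspace(0, 3, 50)
y = f(x_, 1.5, 1) + .5*np.random.normal(size=50)  # no outliers here

N = 2  # number of model parameters
# Variables z = [omega, phi, t]; minimize t subject to |f(x_i; params) - y_i| <= t.
def g(z):
    return z[N]

cons = ({'type': 'ineq', 'fun': lambda z: z[N] - (f(x_, *z[:N]) - y)},  # r_i <= t
        {'type': 'ineq', 'fun': lambda z: z[N] + (f(x_, *z[:N]) - y)})  # -t <= r_i

t0 = np.max(np.abs(f(x_, 1.0, 1.0) - y))  # feasible starting value for t
res = optimize.minimize(g, np.array([1.0, 1.0, t0]), constraints=cons)
params_inf = res.x[:N]  # fitted (omega, phi) under the uniform norm
```

At the optimum, res.x[N] is the largest absolute residual, and several residuals typically attain it (the characteristic "equioscillation" of minimax fits).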
sascha answered Feb 04 '26 18:02



