Optimization Algorithm vs Regression Models

Currently, I'm dealing with forecasting problems. I have a reference that uses a linear function to represent the relationship between the input and output data:

y = p0 + p1*x1 + p2*x2

Both x1 and x2 are known inputs; y is the output; p0, p1, and p2 are the coefficients. The author then used all the training data and the Least Squares Estimation (LSE) method to find the optimal coefficients (p0, p1, p2) to build the model.
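For concreteness, the fitting step the reference describes can be sketched in a few lines of NumPy. The data below is made up purely for illustration; `np.linalg.lstsq` is the standard least-squares solver.

```python
import numpy as np

# Hypothetical training data (x1, x2 -> y); any real dataset works the same way.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y  = np.array([5.1, 5.9, 9.2, 9.8, 13.1])

# Design matrix: a column of ones (for the intercept p0), then x1 and x2.
X = np.column_stack([np.ones_like(x1), x1, x2])

# Least-squares estimate of (p0, p1, p2).
p, *_ = np.linalg.lstsq(X, y, rcond=None)
print(p)  # fitted coefficients (p0, p1, p2)
```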

My question is: since he already used the LSE algorithm, can I improve on his method by using an optimization algorithm (PSO or GA, for example) to try to find better coefficient values?

Eldeanor asked Oct 12 '25 07:10

1 Answer

You answered this yourself:

> Then, he used all the training data and Least Square Estimation (LSE) method to find the optimal coefficient (p0, p1, p2) to build the model.

Because a linear model is easy to optimize, the LSE method already obtains a global optimum (up to subtle rounding errors and early-stopping/tolerance effects). Without changing the model, there is nothing to gain from other coefficients, regardless of whether you use metaheuristics like GA or PSO.
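You can see this convexity directly: the sum of squared errors is a convex function of the coefficients, so no perturbation of the LSE solution can reduce it. A small self-checking sketch with synthetic data (made up here; any dataset behaves the same):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data purely for illustration.
X = np.column_stack([np.ones(20), rng.normal(size=20), rng.normal(size=20)])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.1, size=20)

p_lse, *_ = np.linalg.lstsq(X, y, rcond=None)

def sse(p):
    """Sum of squared errors for coefficient vector p."""
    r = y - X @ p
    return r @ r

# The SSE is convex in p, so no other coefficient vector can beat p_lse:
best = sse(p_lse)
for _ in range(1000):
    candidate = p_lse + rng.normal(scale=0.5, size=3)
    assert sse(candidate) >= best - 1e-9
```

This is exactly why a PSO/GA search over the same three coefficients cannot improve on LSE: it would at best rediscover the same minimum, more slowly.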

So you could instead modify the model, or engineer additional features (feature engineering: e.g. the product of two variables; kernel methods).
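As a sketch of why feature engineering helps where coefficient search cannot: if the true relationship contains an interaction term, adding the product x1*x2 as a new column lets the same LSE machinery fit it. The data-generating process below is invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.uniform(0, 2, 50)
x2 = rng.uniform(0, 2, 50)
# Hypothetical true relation with an interaction the plain linear model misses.
y = 1.0 + x1 + x2 + 3.0 * x1 * x2 + rng.normal(scale=0.05, size=50)

def fit_sse(X):
    """Fit by least squares and return the residual sum of squares."""
    p, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ p
    return r @ r

X_plain   = np.column_stack([np.ones_like(x1), x1, x2])
X_feature = np.column_stack([X_plain, x1 * x2])  # add the product term

print(fit_sse(X_plain), fit_sse(X_feature))  # the richer model fits far better
```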

One thing to try: support-vector machines (SVMs). Their training problem is also convex and can be solved efficiently (for moderately sized datasets), and they are designed to work well with kernels. An additional advantage compared with more complex (e.g. non-convex) models: they generalize quite well, which seems important here because you don't appear to have much data (it sounds like a very small dataset).
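A minimal sketch of kernelized support-vector regression, assuming scikit-learn is available; the data and the hyperparameter values (C, epsilon) are placeholders you would tune by cross-validation:

```python
import numpy as np
from sklearn.svm import SVR  # assumes scikit-learn is installed

rng = np.random.default_rng(2)
# Synthetic non-linear data for illustration only.
X = rng.uniform(-1, 1, size=(80, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.05, size=80)

# RBF kernel lets the model capture non-linear structure;
# C and epsilon control regularization and tube width (tune these).
model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)
print(model.predict(X[:3]))
```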

See also @ayhan's comment!

sascha answered Oct 15 '25 12:10