I am currently working on a problem where I have to solve either an L2-regularized logistic regression or an L2-regularized linear SVM problem with an added affine term.
So my problem for example is:
min_w { C * sum_i max(1 - y_i * (w * x_i), 0) + 0.5 * ||w||_2^2 + w * v }
where v is a constant vector.
Of course this is a convex problem and can be solved with the usual methods, but I have to solve many large problems of this type, so I would very much like to use a standard library such as liblinear.
My question is: is there a way to transform the data x, the labels y, or the weighting factor C (perhaps into a different C_i for each instance) such that this problem becomes equivalent to a standard hinge-loss SVM or logistic regression problem?
I can't think of a way to transform it into something liblinear can handle. However, you could easily solve this optimization problem with one of the general-purpose cutting plane optimization libraries. All you have to do is write code that computes the objective value and an element of the subgradient, which in your case is w + v - C * sum_i y_i * x_i, where the sum runs only over the margin-violating examples (those with y_i * (w * x_i) < 1). Then a cutting plane routine can find the optimal w.
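For concreteness, here is a minimal sketch of that value/subgradient computation. It is in Python/NumPy purely for illustration (the function name and the choice of NumPy are my own, not part of the original answer); the same logic ports directly to whatever language your cutting plane library expects.

```python
import numpy as np

def objective_and_subgradient(w, X, y, v, C):
    """Value and one subgradient of
    f(w) = C * sum_i max(1 - y_i * (w . x_i), 0) + 0.5 * ||w||^2 + w . v

    X: (n, d) data matrix; y: (n,) labels in {-1, +1}; v: (d,) constant vector.
    """
    margins = 1.0 - y * (X @ w)        # hinge terms, shape (n,)
    violating = margins > 0            # examples inside the margin
    value = C * margins[violating].sum() + 0.5 * (w @ w) + w @ v
    # Each violating example contributes -C * y_i * x_i to the subgradient;
    # the regularizer contributes w and the affine term contributes v.
    subgrad = w + v - C * (y[violating][:, None] * X[violating]).sum(axis=0)
    return value, subgrad
```

A cutting plane routine repeatedly queries this pair at candidate points, using each (value, subgradient) to add a linear lower bound on the objective, so this one function is all the problem-specific code you need to supply.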
There is a cutting plane algorithm (CPA) optimizer in Shogun and also one in dlib. I haven't used Shogun's version, but I have used dlib's on a lot of problems (I'm also the author of dlib).