
Convert sklearn.svm SVC classifier to Keras implementation

I'm trying to convert some old code from sklearn to a Keras implementation. Since it is crucial that the new code behaves the same way, I want to check whether I'm doing it correctly.

I've converted most of the code already, but I'm having trouble converting the sklearn.svm SVC classifier. Here is how it looks right now:

from sklearn.svm import SVC
model = SVC(kernel='linear', probability=True)
model.fit(X, Y_labels)

Super easy, right? However, I couldn't find an analog of the SVC classifier in Keras, so here is what I've tried:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(64, activation='relu'))
model.add(Dense(1, activation='softmax'))
model.compile(loss='squared_hinge',
              optimizer='adadelta',
              metrics=['accuracy'])
model.fit(X, Y_labels)

But I don't think this is correct. Could you please help me find a Keras alternative to sklearn's SVC classifier?

Thank you.

asked Jan 29 '19 by none32
2 Answers

If you are making a classifier, you need the squared_hinge loss together with a regularizer to get the complete SVM loss function, as can be seen here. So you will also need to split your last layer in two: a Dense layer carrying the regularization parameter, followed by a separate Activation layer. I have added the code here.

These changes should give you the output:

from keras.regularizers import l2
from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(64, activation='relu'))
# the regularizer is an argument of Dense; the activation is its own layer
model.add(Dense(1, kernel_regularizer=l2(0.01)))
model.add(Activation('softmax'))  # note: softmax over a single unit is constant; use more units for multi-class
model.compile(loss='squared_hinge',
              optimizer='adadelta',
              metrics=['accuracy'])
model.fit(X, Y_labels)
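One detail worth checking that the snippet above does not show: Keras' hinge-family losses compute mean(max(1 - y_true * y_pred, 0)) (squared for squared_hinge), which assumes labels encoded as -1/+1 rather than 0/1. A minimal numpy sketch of the loss and the label mapping (the variable names here are illustrative, not part of the Keras API):

```python
import numpy as np

def squared_hinge(y_true, y_pred):
    # same formula Keras uses: mean(max(1 - y_true * y_pred, 0)^2)
    return np.mean(np.maximum(1.0 - y_true * y_pred, 0.0) ** 2)

# 0/1 labels break the loss: every 0-label term is max(1 - 0, 0)^2 = 1
# no matter what the model predicts, so map {0, 1} -> {-1, +1} first.
y01 = np.array([0.0, 1.0, 1.0, 0.0])
y_pm = 2.0 * y01 - 1.0

perfect = squared_hinge(y_pm, y_pm)  # margins exactly on target -> zero loss
```

With 0/1 labels the 0-class terms contribute a constant penalty of 1 regardless of the prediction, so the model gets no gradient signal for half the data.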

hinge is also implemented in Keras for binary classification, so if you are working on a binary classification model, use the code below.

from keras.regularizers import l2
from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(64, activation='relu'))
model.add(Dense(1, kernel_regularizer=l2(0.01)))
model.add(Activation('linear'))
model.compile(loss='hinge',
              optimizer='adadelta',
              metrics=['accuracy'])
model.fit(X, Y_labels)
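Because the last layer uses a linear activation, model.predict returns raw margins rather than class labels. As with an SVM decision function, the predicted class is the sign of the margin. A small numpy sketch (the margin values here are made up for illustration):

```python
import numpy as np

# raw margins, as a linear-activation model would emit them from predict()
margins = np.array([[2.3], [-0.7], [0.1], [-1.9]])

# SVM-style decision rule: class = sign of the margin
classes_pm = np.where(margins.ravel() >= 0, 1, -1)  # {-1, +1} labels
classes_01 = (classes_pm + 1) // 2                  # back to {0, 1} if needed
```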

If you cannot understand the article or have issues with the code, feel free to comment. I had this same issue a while back, and this GitHub thread helped me understand it, so it may be worth going through too; some of the ideas here come directly from it: https://github.com/keras-team/keras/issues/2588

answered Oct 01 '22 by anand_v.singh

If you are using Keras 2.0, then you need to change the following lines of anand_v.singh's answer:

W_regularizer -> kernel_regularizer

Github link

from keras import regularizers
from keras.layers import Activation

model.add(Dense(nb_classes, kernel_regularizer=regularizers.l2(0.0001)))
model.add(Activation('linear'))
model.compile(loss='squared_hinge',
              optimizer='adadelta', metrics=['accuracy'])

Or you can use the following functional-API version:

from keras.models import Model
from keras.layers import Flatten, Dropout, Dense, Activation
from keras.regularizers import l2

top_model = bottom_model.output
top_model = Flatten()(top_model)
top_model = Dropout(0.5)(top_model)
top_model = Dense(64, activation='relu')(top_model)
top_model = Dense(2, kernel_regularizer=l2(0.0001))(top_model)
top_model = Activation('linear')(top_model)

model = Model(bottom_model.input, top_model)
model.compile(loss='squared_hinge',
              optimizer='adadelta', metrics=['accuracy'])
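The original question used SVC(probability=True), but a linear-activation head only produces margins. sklearn recovers probabilities with Platt scaling, i.e. fitting a sigmoid over held-out margins. A rough numpy sketch of that idea; the coefficients a and b below are placeholders that would normally be fit to (margin, label) pairs, not these fixed values:

```python
import numpy as np

def platt_probability(margin, a=-1.0, b=0.0):
    # Platt scaling: p(y=1 | margin) = 1 / (1 + exp(a * margin + b)),
    # where a (typically negative) and b are fit on held-out data.
    return 1.0 / (1.0 + np.exp(a * margin + b))

p_mid = platt_probability(0.0)  # zero margin -> probability 0.5 when b = 0
```

This is an approximation of what probability=True does internally; it is not part of Keras itself and would need to be calibrated separately after training.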
  

answered Oct 01 '22