How can I add orthogonality regularization in Keras?

I would like to regularize the layers of a CNN with

||W^T W - I||_F

(the Frobenius norm of W^T W minus the identity)

How can I do that in Keras?

asked Mar 20 '17 by Martin Thoma

1 Answer

From the docs:

Any function that takes in a weight matrix and returns a loss contribution tensor can be used as a regularizer

Here is the example from the docs (with the missing closing parenthesis added):

from keras import backend as K

def l1_reg(weight_matrix):
    return 0.01 * K.sum(K.abs(weight_matrix))

model.add(Dense(64, input_dim=64,
                kernel_regularizer=l1_reg))

The regularizer from your post would be:

from keras import backend as K

def fro_norm(w):
    return K.sqrt(K.sum(K.square(w)))

def cust_reg(w):
    # K.eye needs the static number of columns of w;
    # np.eye(w.shape) fails because w.shape is not an integer
    m = K.dot(K.transpose(w), w) - K.eye(K.int_shape(w)[1])
    return fro_norm(m)
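A quick way to sanity-check the penalty is to compute the same quantity in plain NumPy (the function name here is illustrative, not part of the original post):

```python
import numpy as np

def orth_penalty(w):
    """Frobenius norm of W^T W - I: zero iff W has orthonormal columns."""
    m = w.T @ w - np.eye(w.shape[1])
    return np.sqrt(np.square(m).sum())

# the identity matrix has orthonormal columns, so the penalty is 0
print(orth_penalty(np.eye(4)))                    # 0.0

# a random matrix generally does not
print(orth_penalty(np.random.randn(4, 4)) > 0)    # True
```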

Here is a minimal example:

import numpy as np
from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense, Activation

X = np.random.randn(100, 100)
y = np.random.randint(2, size=(100, 1))

model = Sequential()

# apply regularization here; an activity_regularizer applies the
# penalty to the output (activation) of the layer, while a
# kernel_regularizer would apply it to the weight matrix
model.add(Dense(32, input_shape=(100,),
                activity_regularizer=fro_norm))
model.add(Dense(1))
model.add(Activation('sigmoid'))  # softmax on a single unit always outputs 1

model.compile(loss="binary_crossentropy",
              optimizer='sgd',
              metrics=['accuracy'])

model.fit(X, y, epochs=1, batch_size=32)

The snippet below would not work, as hinted by @Marcin's comment: a regularizer must return a tensor, but LA.norm() returns a plain NumPy scalar.

import numpy as np
from numpy import linalg as LA
from keras import backend as K

def orth_norm(w):
    m = K.dot(K.transpose(w), w) - np.eye(K.int_shape(w)[1])
    return LA.norm(m, 'fro')  # NumPy scalar, not a tensor
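To see why this fails, note that LA.norm evaluates eagerly and returns a NumPy float, whereas Keras needs a symbolic tensor it can differentiate. A small illustration, independent of Keras:

```python
import numpy as np
from numpy import linalg as LA

w = np.random.randn(4, 4)
m = w.T @ w - np.eye(4)

val = LA.norm(m, 'fro')
print(type(val))  # <class 'numpy.float64'> - a plain scalar, not a tensor
```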

Keras regularizers

Frobenius norm

answered Dec 04 '22 by parsethis