Keras - how to get unnormalized logits instead of probabilities

I am creating a model in Keras and want to compute my own metric (perplexity). This requires the unnormalized probabilities/logits. However, my Keras model only returns the softmax probabilities:

model = Sequential()
model.add(embedding_layer)
model.add(LSTM(n_hidden, return_sequences=False))
model.add(Dropout(dropout_keep_prob))
model.add(Dense(vocab_size))
model.add(Activation('softmax'))
optimizer = RMSprop(lr=self.lr)

model.compile(optimizer=optimizer,
              loss='sparse_categorical_crossentropy')

The Keras FAQ has a solution for getting the output of intermediate layers here, and another approach is given here. However, those answers store the intermediate outputs in a separate model, which is not what I need. I want to use the logits for my custom metric, and the custom metric should be passed to model.compile() so that it is evaluated and displayed during training. So I don't need the output of the Dense layer in a separate model, but as part of my original model.

In short, my questions are:

  • When defining a custom metric as outlined here using def custom_metric(y_true, y_pred), does y_pred contain logits or normalized probabilities?

  • If it contains normalized probabilities, how can I get the unnormalized probabilities, i.e. the logits output by the Dense layer?
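For reference, the metric itself only needs the per-token cross-entropy, which can be computed from logits by applying softmax explicitly. A minimal NumPy sketch of the calculation (the function names and the toy 4-word vocabulary are my own, not from Keras):

```python
import numpy as np

def softmax(logits):
    # Shift by the row max for numerical stability before exponentiating.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def perplexity_from_logits(logits, targets):
    """Perplexity = exp(mean negative log-likelihood of the true tokens)."""
    probs = softmax(logits)
    nll = -np.log(probs[np.arange(len(targets)), targets])
    return np.exp(nll.mean())

# Uniform logits over a 4-word vocabulary: every token has probability 1/4,
# so the perplexity is exactly 4.
logits = np.zeros((3, 4))
targets = np.array([0, 1, 2])
print(perplexity_from_logits(logits, targets))
```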

Lemon asked Oct 31 '17 13:10

1 Answer

Try changing the last activation from softmax to linear, so the model outputs raw logits:

model = Sequential()
model.add(embedding_layer)
model.add(LSTM(n_hidden, return_sequences=False))
model.add(Dropout(dropout_keep_prob))
model.add(Dense(vocab_size))
model.add(Activation('linear'))
optimizer = RMSprop(lr=self.lr)

model.compile(optimizer=optimizer, loss='sparse_categorical_crossentropy')
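With a linear final activation, y_pred in a custom metric receives the raw logits, so the loss must be told to apply the softmax itself via from_logits=True, and the metric can work on logits directly. A sketch assuming TensorFlow's Keras API (the layer sizes here are placeholders, not from the question):

```python
import tensorflow as tf

def perplexity(y_true, y_pred):
    # y_pred holds raw logits because the model's last activation is linear.
    ce = tf.keras.losses.sparse_categorical_crossentropy(
        y_true, y_pred, from_logits=True)
    return tf.exp(tf.reduce_mean(ce))

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(1000, 32),   # placeholder vocab/embedding sizes
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1000),           # no softmax: outputs are logits
])
model.compile(
    optimizer='rmsprop',
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=[perplexity],
)
```

During training the metric is then evaluated and displayed alongside the loss, which is what the question asks for; if normalized probabilities are needed at inference time, softmax can be applied to the model's output afterwards.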
Ioannis Nasios answered Oct 04 '22 02:10