 

Sudden 50% accuracy drop while training convolutional NN

I'm training a convolutional neural network from scratch on my own dataset with Keras and TensorFlow.

learning rate = 0.0001, 5 classes to sort, no Dropout used; the dataset was checked twice and no wrong labels were found

Model:

from tensorflow.keras import layers, models, optimizers

model = models.Sequential()
model.add(layers.Conv2D(16,(2,2),activation='relu',input_shape=(75,75,3)))
model.add(layers.MaxPooling2D((2,2)))
model.add(layers.Conv2D(16,(2,2),activation='relu'))
model.add(layers.MaxPooling2D((2,2)))
model.add(layers.Conv2D(32,(2,2),activation='relu'))
model.add(layers.MaxPooling2D((2,2)))
model.add(layers.Flatten())
model.add(layers.Dense(128,activation='relu'))
model.add(layers.Dense(5,activation='sigmoid'))

model.compile(optimizer=optimizers.Adam(lr=0.0001),
              loss='categorical_crossentropy',
              metrics=['acc'])

history = model.fit_generator(train_generator,
                              steps_per_epoch=100,
                              epochs=50,
                              validation_data=val_generator,
                              validation_steps=25)

Every time the model reaches epoch 25-35 (80-90% accuracy), this happens:

Epoch 31/50
100/100 [==============================] - 3s 34ms/step - loss: 0.3524 - acc: 0.8558 - val_loss: 0.4151 - val_acc: 0.7992
Epoch 32/50
100/100 [==============================] - 3s 34ms/step - loss: 0.3393 - acc: 0.8700 - val_loss: 0.4384 - val_acc: 0.7951
Epoch 33/50
100/100 [==============================] - 3s 34ms/step - loss: 0.3321 - acc: 0.8702 - val_loss: 0.4993 - val_acc: 0.7620
Epoch 34/50
100/100 [==============================] - 3s 33ms/step - loss: 1.5444 - acc: 0.3302 - val_loss: 1.6062 - val_acc: 0.1704
Epoch 35/50
100/100 [==============================] - 3s 34ms/step - loss: 1.6094 - acc: 0.2935 - val_loss: 1.6062 - val_acc: 0.1724

There are similar questions with answers, but most of them recommend lowering the learning rate, which doesn't help at all here.

[Plot: accuracy drop over epochs]

UPDATE: almost all weights and biases in the network became NaN. The network somehow died inside.
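(Not from the original post: a minimal NumPy sketch of how one could confirm this diagnosis by scanning the arrays returned by `model.get_weights()`. Keras also provides a `TerminateOnNaN` callback that stops training as soon as the loss becomes NaN.)

```python
import numpy as np

def count_nan_weights(weight_arrays):
    """Count NaN entries across a list of weight arrays,
    e.g. the list returned by model.get_weights()."""
    return sum(int(np.isnan(w).sum()) for w in weight_arrays)

# Plain NumPy arrays standing in for layer weights:
weights = [np.ones((3, 3)), np.array([1.0, np.nan, 2.0])]
print(count_nan_weights(weights))  # -> 1
```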

Asked Mar 16 '19 by Derfaut



1 Answer

Solution in this case:

I changed the sigmoid activation in the last layer to softmax, and the drops are gone.

Why did this work?

The sigmoid activation function is meant for binary (two-class) classification. For multiclass problems we should use the softmax function, a generalization of sigmoid that produces a proper probability distribution over all classes. With sigmoid outputs, the five class scores are independent and do not sum to 1, which mismatches the assumptions of categorical cross-entropy; a saturated sigmoid can then drive a predicted probability to exactly 0, making the loss blow up and producing the NaN weights described above.
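The difference can be seen with a small NumPy sketch (hypothetical logits for the 5 classes): softmax outputs always sum to 1, while independent sigmoid outputs do not.

```python
import numpy as np

logits = np.array([2.0, 1.0, 0.5, 0.1, -1.0])  # hypothetical logits for 5 classes

sigmoid = 1.0 / (1.0 + np.exp(-logits))          # independent per-class scores
softmax = np.exp(logits) / np.exp(logits).sum()  # a proper distribution over classes

print(sigmoid.sum())  # well above 1 -- not a probability distribution
print(softmax.sum())  # sums to 1
```

The corresponding one-line fix in the model is `layers.Dense(5, activation='softmax')` as the final layer.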

More information: Sigmoid vs Softmax

Special thanks to @desertnaut and @Shubham Panchal for pointing out the error.

Answered Oct 17 '22 by Derfaut