I am using a big dataset, so I'm trying to use train_on_batch (or fit with nb_epoch=1):
from keras.models import Sequential
from keras.layers import LSTM, Dense, Activation

model = Sequential()
model.add(LSTM(size, input_shape=input_shape, return_sequences=False))
model.add(Dense(output_dim))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

for e in range(nb_epoch):
    for batch_X, batch_y in batches:
        model.train_on_batch(batch_X, batch_y)
        # or
        # model.fit(batch_X, batch_y, batch_size=batch_size, nb_epoch=1, verbose=1, shuffle=True)
But when training starts, this happens:
(0, 128)
Epoch 1/1
128/128 [==============================] - 2s - loss: 0.3262 - acc: 0.1130
(129, 257)
Epoch 1/1
128/128 [==============================] - 2s - loss: -0.0000e+00 - acc: 0.0000e+00
It doesn't matter how many epochs I wait; nothing changes. Even if I change the batch size, the same thing happens: the first batch has good values, and then it just goes to "loss: -0.0000e+00 - acc: 0.0000e+00" again.
Can someone help me understand what's happening here?
The train_on_batch method runs a single gradient update on a single batch of data. Arguments: x: input data. It could be a NumPy array (or array-like), or a list of arrays (in case the model has multiple inputs).
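As a minimal sketch (batch_X and batch_y are illustrative names, and it assumes the model was compiled with metrics=["accuracy"]), each call performs exactly one gradient update and returns the metrics for that batch:

# one gradient update on this batch; returns [loss, accuracy] for the batch
loss, acc = model.train_on_batch(batch_X, batch_y)
print('batch loss: %.4f, batch acc: %.4f' % (loss, acc))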
Epoch: an arbitrary cutoff, generally defined as "one pass over the entire dataset", used to separate training into distinct phases, which is useful for logging and periodic evaluation. When using validation_data or validation_split with the fit method of Keras models, evaluation will be run at the end of every epoch.
Here we are training our network for 10 epochs with the default batch size of 32. For small, less complex datasets it is recommended to use Keras' fit method.
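A hedged sketch of that setup (X_train and y_train are placeholder names), where the held-out validation split is evaluated at the end of each of the 10 epochs:

# batch_size=32 is the Keras default; validation_split holds out 10% for end-of-epoch evaluation
model.fit(X_train, y_train, nb_epoch=10, batch_size=32, validation_split=0.1)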
By default, Keras' model.fit() returns a History callback object. This object keeps track of the accuracy, loss, and other training metrics for each epoch, in memory.
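A quick sketch of reading those per-epoch metrics back out (history is just the return value of fit; X_train and y_train are placeholders):

history = model.fit(X_train, y_train, nb_epoch=10, batch_size=32)
print(history.history['loss'])  # list with one loss value per epoch
print(history.history['acc'])   # list with one accuracy value per epoch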
This seems like the exploding/vanishing gradient problem. As others have said, try tuning your learning rate and/or the depth/width of your NN layers.
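For instance, a hedged sketch of one way to tune the learning rate in Keras: recompile the model above with a smaller Adam learning rate plus gradient-norm clipping (the exact values here are assumptions to experiment with, not known-good settings):

from keras.optimizers import Adam

# smaller learning rate and gradient-norm clipping to tame exploding gradients
opt = Adam(lr=1e-4, clipnorm=1.0)
model.compile(loss='categorical_crossentropy', optimizer=opt, metrics=['accuracy'])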