I'm using Anaconda Python 2.7 on Windows 10.
I am training a language model using the Keras example:
print('Build model...')
model = Sequential()
model.add(GRU(512, return_sequences=True, input_shape=(maxlen, len(chars))))
model.add(Dropout(0.2))
model.add(GRU(512, return_sequences=False))
model.add(Dropout(0.2))
model.add(Dense(len(chars)))
model.add(Activation('softmax'))

model.compile(loss='categorical_crossentropy', optimizer='rmsprop')


def sample(a, temperature=1.0):
    # helper function to sample an index from a probability array
    a = np.log(a) / temperature
    a = np.exp(a) / np.sum(np.exp(a))
    return np.argmax(np.random.multinomial(1, a, 1))


# train the model, output generated text after each iteration
for iteration in range(1, 3):
    print()
    print('-' * 50)
    print('Iteration', iteration)
    model.fit(X, y, batch_size=128, nb_epoch=1)

    start_index = random.randint(0, len(text) - maxlen - 1)

    for diversity in [0.2, 0.5, 1.0, 1.2]:
        print()
        print('----- diversity:', diversity)

        generated = ''
        sentence = text[start_index: start_index + maxlen]
        generated += sentence
        print('----- Generating with seed: "' + sentence + '"')
        sys.stdout.write(generated)

        for i in range(400):
            x = np.zeros((1, maxlen, len(chars)))
            for t, char in enumerate(sentence):
                x[0, t, char_indices[char]] = 1.

            preds = model.predict(x, verbose=0)[0]
            next_index = sample(preds, diversity)
            next_char = indices_char[next_index]

            generated += next_char
            sentence = sentence[1:] + next_char

            sys.stdout.write(next_char)
            sys.stdout.flush()
        print()
According to the Keras documentation, the model.fit method returns a History callback, which has a history attribute containing the lists of successive losses and other metrics.
hist = model.fit(X, y, validation_split=0.2)
print(hist.history)
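As a quick illustration of what that dict is supposed to look like when validation_split is set (hypothetical numbers, one value per epoch):

print(hist.history)
# {'loss': [1.52, 1.41], 'val_loss': [1.60, 1.49]}   <- hypothetical output, not real results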
After training my model, if I run print(model.history)
I get the error:
AttributeError: 'Sequential' object has no attribute 'history'
How do I return my model history after training my model with the above code?
UPDATE
The issue was that the following had to be defined first:
from keras.callbacks import History
history = History()
and the callbacks option had to be passed to model.fit:
model.fit(X_train, Y_train, nb_epoch=5, batch_size=16, callbacks=[history])
But now if I print
print(history.history)
it returns
{}
even though I ran an iteration.
It's been solved: the losses only get saved to the History across epochs. I was running my own loop of iterations instead of using Keras's built-in epochs option.
So instead of doing 4 iterations, I now have:
model.fit(......, nb_epoch=4)
Now it returns the loss for each epoch run:
print(hist.history)
{'loss': [1.4358016599558268, 1.399221191623641, 1.381293383180471, 1.3758836857303727]}
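Putting it together, the key change is a single fit() call covering all epochs rather than an outer loop of one-epoch fits. A minimal sketch, assuming X and y are the training arrays built earlier in the script:

# one fit() call for all epochs; Keras records the loss after each epoch
hist = model.fit(X, y, batch_size=128, nb_epoch=4)
print(hist.history['loss'])   # one loss value per epoch, four entries here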
Here is just an example, starting from:
history = model.fit(X, Y, validation_split=0.33, nb_epoch=150, batch_size=10, verbose=0)
You can use
print(history.history.keys())
to list all data in history.
Then, you can print the history of validation loss like this:
print(history.history['val_loss'])
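If you also want a visual check, the same dict can be plotted directly. A minimal sketch, assuming matplotlib is installed and the fit above used validation_split:

import matplotlib.pyplot as plt

# plot the per-epoch training and validation loss recorded in history.history
plt.plot(history.history['loss'], label='train loss')
plt.plot(history.history['val_loss'], label='validation loss')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.show()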