After training, I wanted to check the accuracy by loading the created model.h5
and running an evaluation procedure. However, I am getting the following warning:
/usr/local/lib/python3.5/dist-packages/keras/engine/saving.py:269: UserWarning: No training configuration found in save file: the model was not compiled. Compile it manually. warnings.warn('No training configuration found in save file:
The warning comes from dist-packages/keras/engine/saving.py, so the problem is in loading the created model, i.e. this line of code:
train_model = load_model('model.h5')
The warning says the model was not compiled, but I did compile it:
optimizer = Adam(lr=lr, clipnorm=0.001)
train_model.compile(loss=dummy_loss, optimizer=optimizer)
I can't understand what I am doing wrong... Please help me! SOS :-(
I'd like to add to olejorgenb's answer, for a specific scenario where you don't want to train the model but just use it (e.g. in production).
"Compile" means "prepare for training", which mainly means setting up the optimizer. The optimizer (and its state) can also be saved with the model, in which case you can continue the same training after loading the saved model.
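To illustrate, here is a minimal sketch of that "continue the same training" round trip. The model architecture, file name, and data are made up for the example; the point is that saving a compiled model stores the training configuration, so `load_model` returns a model that is ready to `.fit()` again without re-compiling:

```python
import numpy as np
from tensorflow import keras

# A tiny toy model; shapes and sizes are arbitrary for this sketch.
model = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(4, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(loss="mse", optimizer="adam")

x = np.random.rand(16, 8).astype("float32")
y = np.random.rand(16, 1).astype("float32")
model.fit(x, y, epochs=1, verbose=0)

model.save("model.h5")  # training config (loss, optimizer) is stored too

restored = keras.models.load_model("model.h5")  # no warning: it comes back compiled
restored.fit(x, y, epochs=1, verbose=0)         # training simply continues
```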
But what about the scenario where I just want to run the model? In that case, pass the compile=False argument to load_model, like this:
trained_model = load_model('model.h5', compile=False)
You won't be able to .fit() this model without calling trained_model.compile(...) first, but most importantly, the warning will go away.
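A self-contained sketch of this inference-only path (the save step is included just so the snippet runs on its own; names and shapes are invented):

```python
import numpy as np
from tensorflow import keras

# Create and save a small model so the example is self-contained.
model = keras.Sequential([keras.Input(shape=(8,)), keras.layers.Dense(1)])
model.compile(loss="mse", optimizer="adam")
model.save("model.h5")

# compile=False skips restoring the training configuration, so the
# "No training configuration found" warning is never triggered and
# the model is ready for inference only.
trained_model = keras.models.load_model("model.h5", compile=False)
preds = trained_model.predict(np.zeros((2, 8), dtype="float32"), verbose=0)

# To train again, compile first (loss/optimizer here are placeholders):
trained_model.compile(loss="mse", optimizer="adam")
```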
By the way, in my Keras version the include_optimizer argument of model.save() defaults to True. This should also hold for training callbacks like ModelCheckpoint. So when loading a model saved by Keras, you can usually count on the optimizer being included (except for the situation described in Hull Gasper's answer).
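Conversely, you can reproduce the warning on purpose by saving without the optimizer. This sketch (toy model, invented file name) saves with include_optimizer=False; loading it back produces an un-compiled model, which is exactly the situation the warning describes:

```python
from tensorflow import keras

model = keras.Sequential([keras.Input(shape=(4,)), keras.layers.Dense(1)])
model.compile(loss="mse", optimizer="adam")

# include_optimizer defaults to True; False drops the training
# configuration from the saved file.
model.save("no_opt.h5", include_optimizer=False)

# Loading this file emits "No training configuration found in save
# file" and returns an un-compiled model (no optimizer attached).
loaded = keras.models.load_model("no_opt.h5")
```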
But when you have a model that was not trained by Keras (e.g. a model converted from Darknet), it is saved un-compiled. This produces the warning, and you can get rid of it as described above.