I have trained a Sequential model in Keras on Google Colab using the Google Doodle dataset; I am doing some simple image classification.
The following function defines the architecture of my model:
import keras
from keras import layers

def create_model(input_shape):
    model = keras.Sequential()
    model.add(layers.Conv2D(16, (3, 3), padding='same', input_shape=input_shape, activation='relu'))
    model.add(layers.BatchNormalization(axis=3))
    model.add(layers.MaxPooling2D(pool_size=(2, 2)))
    model.add(layers.Conv2D(32, (3, 3), padding='same', activation='relu'))
    model.add(layers.BatchNormalization(axis=3))
    model.add(layers.MaxPooling2D(pool_size=(2, 2)))
    model.add(layers.Conv2D(64, (3, 3), padding='same', activation='relu'))
    model.add(layers.BatchNormalization(axis=3))
    model.add(layers.MaxPooling2D(pool_size=(2, 2)))
    model.add(layers.Flatten())
    model.add(layers.Dense(128, activation='relu'))
    model.add(layers.Dense(28, activation='softmax'))  # 28 doodle classes
    return model
and the following piece of code creates, compiles, and fits it:
doodle_model = create_model((image_size, image_size, 1))  # image_size = 28
doodle_model.compile(optimizer="Adam", loss="categorical_crossentropy", metrics=["accuracy"])
doodle_model.fit(x=X_train, y=Y_train, epochs=4, batch_size=256)
I was getting good results with this model, so I decided to save it for later deployment in a web app. I used:
doodle_model.save("my_model.h5")
to save the model.
However, when I tried to load the model again with:
from keras.models import load_model
model = load_model("my_model.h5")
to make sure it was saved correctly, I got the following error:
KeyError Traceback (most recent call last)
<ipython-input-61-6cde554a8add> in <module>()
1 from keras.models import load_model
2
----> 3 model = load_model ("my_model.h5")
/usr/local/lib/python3.6/dist-packages/keras/models.py in load_model(filepath, custom_objects, compile)
268 raise ValueError('No model found in config file.')
269 model_config = json.loads(model_config.decode('utf-8'))
--> 270 model = model_from_config(model_config, custom_objects=custom_objects)
271
272 # set weights
/usr/local/lib/python3.6/dist-packages/keras/models.py in model_from_config(config, custom_objects)
345 'Maybe you meant to use '
346 '`Sequential.from_config(config)`?')
--> 347 return layer_module.deserialize(config, custom_objects=custom_objects)
348
349
/usr/local/lib/python3.6/dist-packages/keras/layers/__init__.py in deserialize(config, custom_objects)
53 module_objects=globs,
54 custom_objects=custom_objects,
---> 55 printable_module_name='layer')
/usr/local/lib/python3.6/dist-packages/keras/utils/generic_utils.py in deserialize_keras_object(identifier, module_objects, custom_objects, printable_module_name)
142 return cls.from_config(config['config'],
143 custom_objects=dict(list(_GLOBAL_CUSTOM_OBJECTS.items()) +
--> 144 list(custom_objects.items())))
145 with CustomObjectScope(custom_objects):
146 return cls.from_config(config['config'])
/usr/local/lib/python3.6/dist-packages/keras/models.py in from_config(cls, config, custom_objects)
1404 @classmethod
1405 def from_config(cls, config, custom_objects=None):
-> 1406 if 'class_name' not in config[0] or config[0]['class_name'] == 'Merge':
1407 return cls.legacy_from_config(config)
1408
KeyError: 0
As far as I know, KeyErrors are related to Python dictionaries, but I am not sure why I am encountering one here. Any help on why I am getting this error and how it can be resolved would be greatly appreciated.
Every piece of code after the line
----> 3 model = load_model ("my_model.h5")
in this error message is internal Keras code, not code written by me.
It sounds like you may be loading the saved model with a version of Keras that is incompatible with the version used to save it.
What versions are you using in each case? You can check using:
import keras
print(keras.__version__)
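You can also inspect the version recorded inside the saved file itself; standalone Keras writes keras_version and backend attributes into the HDF5 file when it saves a model. A quick check, assuming h5py is installed (Keras requires it for HDF5 saving anyway):
import h5py

# Keras stores the version and backend it used as attributes on the HDF5 file
with h5py.File("my_model.h5", "r") as f:
    print("Saved with Keras:", f.attrs.get("keras_version"))
    print("Backend:", f.attrs.get("backend"))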
If the versions differ, the fix may be as simple as upgrading your version of Keras to match the one used for saving.
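If upgrading is not an option, a common workaround is to export the architecture and the weights separately in the training environment and rebuild the model in the deployment environment, since JSON architecture files and weight files tend to survive version differences better than a full saved model. A minimal sketch reusing the names from your code (the file names are just placeholders):
# In the environment that trained the model (saving side):
with open("my_model.json", "w") as f:
    f.write(doodle_model.to_json())               # architecture only, as JSON
doodle_model.save_weights("my_model_weights.h5")  # weights only

# In the deployment environment (loading side):
from keras.models import model_from_json

with open("my_model.json") as f:
    model = model_from_json(f.read())
model.load_weights("my_model_weights.h5")
model.compile(optimizer="Adam", loss="categorical_crossentropy", metrics=["accuracy"])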