Hi, I am trying to build a super-resolution model in Keras.
I am referring to https://github.com/titu1994/Image-Super-Resolution.
After I compile and save a new model, a metric error occurs when I load it:
Traceback (most recent call last):
File "autoencoder2.py", line 56, in <module>
load_model("./ani.model")
File "/home/simmani91/anaconda2/lib/python2.7/site-packages/keras/models.py", line 155, in load_model
sample_weight_mode=sample_weight_mode)
File "/home/simmani91/anaconda2/lib/python2.7/site-packages/keras/engine/training.py", line 665, in compile
metric_fn = metrics_module.get(metric)
File "/home/simmani91/anaconda2/lib/python2.7/site-packages/keras/metrics.py", line 84, in get
return get_from_module(identifier, globals(), 'metric')
File "/home/simmani91/anaconda2/lib/python2.7/site-packages/keras/utils/generic_utils.py", line 14, in get_from_module
str(identifier))
Exception: Invalid metric: PSNRLoss
Here is my code for the metric (PSNRLoss), model creation, and execution:

def PSNRLoss(y_true, y_pred):
    # np.log10 cannot operate on a symbolic tensor; use K.log and divide
    # by np.log(10.) to get log10.
    return -10. * K.log(K.mean(K.square(y_pred - y_true))) / np.log(10.)
def create_model():
    shape = (360, 640, 3)
    input_img = Input(shape=shape)
    x = Convolution2D(64, shape[0], shape[1], activation='relu', border_mode='same', name='level1')(input_img)
    x = Convolution2D(32, shape[0], shape[1], activation='relu', border_mode='same', name='level2')(x)
    out = Convolution2D(3, shape[0], shape[1], border_mode='same', name='output')(x)
    model = Model(input_img, out)
    #model.compile(optimizer='adadelta', loss='binary_crossentropy')
    adam = optimizers.Adam(lr=1e-3)
    model.compile(optimizer=adam, loss='mse', metrics=[PSNRLoss])
    return model
path = "./picture/"
if not os.path.exists("./ani.model"):
    ani_model = create_model()
    ani_model.save("./ani.model")
load_model("./ani.model")
Is there any way to load a model with the PSNR metric?
Thank you for reading.
Load the model with load_model("ani.model", custom_objects={"PSNRLoss": PSNRLoss}) instead. Keras stores only the name of a custom metric in the saved file, so at load time you must tell it which function that name refers to via the custom_objects mapping.
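A minimal sketch of the fix (shown here with tf.keras imports; with the standalone Keras version in the question, `from keras import backend as K` and `from keras.models import load_model` work the same way). Note the metric is written with backend ops rather than np.log10, so it also works on symbolic tensors:

```python
import os
import numpy as np
from tensorflow.keras import backend as K
from tensorflow.keras.models import load_model

def PSNRLoss(y_true, y_pred):
    # PSNR = -10 * log10(MSE). K.log is the natural log, so divide by
    # np.log(10.) (a plain Python float) to convert to base 10.
    return -10. * K.log(K.mean(K.square(y_pred - y_true))) / np.log(10.)

if os.path.exists("./ani.model"):
    # Map the metric name stored in the file back to the function.
    model = load_model("./ani.model", custom_objects={"PSNRLoss": PSNRLoss})
```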