I'm studying GANs with the keras-gan/wgan-gp example, using my own dataset. I save the models with
wgan.generator.save('generator.h5')
wgan.critic.save('critic.h5')
and load with
model = load_model('generator.h5')
model = load_model('critic.h5')
This only works the first time. When I save the models again after a second round of training and run
model = load_model('generator.h5')
model = load_model('critic.h5')
again, the following error occurs:
ValueError                                Traceback (most recent call last)
 in ()
----> 1 model = load_model('generator.h5')

D:\keras\engine\saving.py in load_model(filepath, custom_objects, compile)
    262
    263     # set weights
--> 264     load_weights_from_hdf5_group(f['model_weights'], model.layers)
    265
    266     if compile:

D:\keras\engine\saving.py in load_weights_from_hdf5_group(f, layers, reshape)
    914                                                 original_keras_version,
    915                                                 original_backend,
--> 916                                                 reshape=reshape)
    917         if len(weight_values) != len(symbolic_weights):
    918             raise ValueError('Layer #' + str(k) +

D:\keras\engine\saving.py in preprocess_weights_for_loading(layer, weights, original_keras_version, original_backend, reshape)
    555         weights = convert_nested_time_distributed(weights)
    556     elif layer.__class__.__name__ in ['Model', 'Sequential']:
--> 557         weights = convert_nested_model(weights)
    558
    559     if original_keras_version == '1':

D:\keras\engine\saving.py in convert_nested_model(weights)
    543                         weights=weights[:num_weights],
    544                         original_keras_version=original_keras_version,
--> 545                         original_backend=original_backend))
    546                 weights = weights[num_weights:]
    547         return new_weights

D:\keras\engine\saving.py in preprocess_weights_for_loading(layer, weights, original_keras_version, original_backend, reshape)
    555         weights = convert_nested_time_distributed(weights)
    556     elif layer.__class__.__name__ in ['Model', 'Sequential']:
--> 557         weights = convert_nested_model(weights)
    558
    559     if original_keras_version == '1':

D:\keras\engine\saving.py in convert_nested_model(weights)
    531                         weights=weights[:num_weights],
    532                         original_keras_version=original_keras_version,
--> 533                         original_backend=original_backend))
    534                     weights = weights[num_weights:]
    535

D:\keras\engine\saving.py in preprocess_weights_for_loading(layer, weights, original_keras_version, original_backend, reshape)
    673             weights[0] = np.reshape(weights[0], layer_weights_shape)
    674         elif layer_weights_shape != weights[0].shape:
--> 675             weights[0] = np.transpose(weights[0], (3, 2, 0, 1))
    676             if layer.__class__.__name__ == 'ConvLSTM2D':
    677                 weights[1] = np.transpose(weights[1], (3, 2, 0, 1))

c:\users\administrator\appdata\local\programs\python\python35\lib\site-packages\numpy\core\fromnumeric.py in transpose(a, axes)
    596
    597     """
--> 598     return _wrapfunc(a, 'transpose', axes)
    599
    600

c:\users\administrator\appdata\local\programs\python\python35\lib\site-packages\numpy\core\fromnumeric.py in _wrapfunc(obj, method, *args, **kwds)
     49 def _wrapfunc(obj, method, *args, **kwds):
     50     try:
---> 51         return getattr(obj, method)(*args, **kwds)
     52
     53     # An AttributeError occurs if the object does not have

ValueError: axes don't match array
I'm using
Python 3.5.3
Keras 2.2.2
h5py 2.8.0
tensorflow-gpu 1.9.0
keras-contrib 2.0.8
Keras-Applications 1.0.4
Keras-Preprocessing 1.0.2
Any advice or suggestions would be appreciated.
Looks like the issue described in:
https://github.com/keras-team/keras/pull/11847
and
https://github.com/tensorflow/tensorflow/issues/27769
While the bug has yet to be fixed, the problem only occurs when there are both trainable and non-trainable weights in the model. If you do not need to train the model further, you can work around the problem by freezing all the weights prior to saving:
from keras import models

def freeze(model):
    """Freeze model weights in every layer, recursing into nested models."""
    for layer in model.layers:
        layer.trainable = False
        if isinstance(layer, models.Model):
            freeze(layer)
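As a rough sketch of how this could be applied to the wgan object from your question (the attribute names are taken from the keras-gan example), freeze each sub-model just before saving; because every weight is then non-trainable, the reloaded models are only suitable for inference:

from keras.models import load_model

# Freeze all weights, then save as usual
freeze(wgan.generator)
freeze(wgan.critic)
wgan.generator.save('generator.h5')
wgan.critic.save('critic.h5')

# The frozen models should now reload without the transpose error
generator = load_model('generator.h5')
critic = load_model('critic.h5')

If you plan to continue training later, keep an unfrozen copy in memory or save the weights separately with save_weights instead.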