I'm trying to follow this example with my own model, which looks like this:
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_2 (InputLayer) (None, 150, 150, 3) 0
_________________________________________________________________
block1_conv1 (Conv2D) (None, 150, 150, 64) 1792
_________________________________________________________________
block1_conv2 (Conv2D) (None, 150, 150, 64) 36928
_________________________________________________________________
block1_pool (MaxPooling2D) (None, 75, 75, 64) 0
_________________________________________________________________
block2_conv1 (Conv2D) (None, 75, 75, 128) 73856
_________________________________________________________________
block2_conv2 (Conv2D) (None, 75, 75, 128) 147584
_________________________________________________________________
block2_pool (MaxPooling2D) (None, 37, 37, 128) 0
_________________________________________________________________
block3_conv1 (Conv2D) (None, 37, 37, 256) 295168
_________________________________________________________________
block3_conv2 (Conv2D) (None, 37, 37, 256) 590080
_________________________________________________________________
block3_conv3 (Conv2D) (None, 37, 37, 256) 590080
_________________________________________________________________
block3_pool (MaxPooling2D) (None, 18, 18, 256) 0
_________________________________________________________________
block4_conv1 (Conv2D) (None, 18, 18, 512) 1180160
_________________________________________________________________
block4_conv2 (Conv2D) (None, 18, 18, 512) 2359808
_________________________________________________________________
block4_conv3 (Conv2D) (None, 18, 18, 512) 2359808
_________________________________________________________________
block4_pool (MaxPooling2D) (None, 9, 9, 512) 0
_________________________________________________________________
block5_conv1 (Conv2D) (None, 9, 9, 512) 2359808
_________________________________________________________________
block5_conv2 (Conv2D) (None, 9, 9, 512) 2359808
_________________________________________________________________
block5_conv3 (Conv2D) (None, 9, 9, 512) 2359808
_________________________________________________________________
block5_pool (MaxPooling2D) (None, 4, 4, 512) 0
_________________________________________________________________
sequential_1 (Sequential) (None, 1) 2097665
=================================================================
But I get this error:
AttributeError: Layer sequential_2 has multiple inbound nodes, hence the notion of "layer output" is ill-defined. Use get_output_at(node_index) instead.
I am clueless about where to start. After some searching, I think it has to do with the last layer being a Sequential model instead of a Dense layer, as it is in the VGG16 model in the example.
The model was built following the Keras cats-vs-dogs fine-tuning example.
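For reference, the model was assembled roughly like this (a sketch of the fine-tuning recipe, not my exact code; the Dense sizes are inferred from the summary above, since Flatten over 4x4x512 into Dense(256) and Dense(1) gives exactly the 2,097,665 parameters shown for sequential_1):

from keras.applications import VGG16
from keras.models import Model, Sequential
from keras.layers import Dense, Flatten

# VGG16 convolutional base without its fully connected head
conv_base = VGG16(weights='imagenet', include_top=False,
                  input_shape=(150, 150, 3))

# Small classifier trained on top of the base
top_model = Sequential()
top_model.add(Flatten(input_shape=conv_base.output_shape[1:]))
top_model.add(Dense(256, activation='relu'))
top_model.add(Dense(1, activation='sigmoid'))

# Calling the Sequential on the base's output attaches it as a single
# layer, which is why the summary ends with `sequential_1 (Sequential)`
model = Model(inputs=conv_base.input,
              outputs=top_model(conv_base.output))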
Any help or ideas how I could proceed from here would be much appreciated!
EDIT: In case it helps to see the code:
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.cm as cm
from keras.models import load_model
from keras import activations
from vis.utils import utils
from vis.visualization import visualize_cam, overlay

model = load_model('final_finetuned_model.h5')

# Swap the final activation for a linear one so gradients are not squashed
layer_idx = utils.find_layer_idx(model, 'sequential_1')
model.layers[layer_idx].activation = activations.linear
model = utils.apply_modifications(model)

plt.rcParams['figure.figsize'] = (18, 6)

img1 = utils.load_img('test1/cat/5.jpg', target_size=(150, 150))
img2 = utils.load_img('test1/cat/6.jpg', target_size=(150, 150))

for modifier in [None, 'guided', 'relu']:
    plt.figure()
    f, ax = plt.subplots(1, 2)
    plt.suptitle("vanilla" if modifier is None else modifier)
    for i, img in enumerate([img1, img2]):
        # 20 is the ImageNet index corresponding to `ouzel`
        grads = visualize_cam(model, layer_idx, filter_indices=20,
                              seed_input=img, backprop_modifier=modifier)
        # Let's overlay the heatmap onto the original image.
        jet_heatmap = np.uint8(cm.jet(grads)[..., :3] * 255)
        ax[i].imshow(overlay(jet_heatmap, img))
plt.show()
I had a similar error with a very similar network that had two output nodes, dense_1_1/Relu:0 and sequential_2/dense_1/Relu:0. The solution for me was to go into keras-vis's losses.py and change

layer_output = self.layer.output

to

layer_output = self.layer.get_output_at(-1)

This is more of a workaround than a solution. When there is only one output node, taking the last one with -1 is fine, and when there are two nodes, taking the last one happened to work for me; but this should at least give you a lead. Also try layer_output = self.layer.get_output_at(0), or other indices if you have more nodes.
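To check how many output nodes the layer actually has before picking an index, something along these lines should work (a sketch; depending on your Keras version the attribute is inbound_nodes or _inbound_nodes):

layer = model.get_layer('sequential_1')

# Each time the layer/model was called it gained one inbound node,
# and each node has its own output tensor
nodes = getattr(layer, '_inbound_nodes', None) or layer.inbound_nodes
print('inbound nodes:', len(nodes))
for i in range(len(nodes)):
    print(i, layer.get_output_at(i))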
There is a relevant open issue on the keras-vis GitHub tracker.
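If you would rather not patch keras-vis itself, another option is to rebuild the network so the classifier layers sit directly in the top-level graph instead of inside a nested Sequential; the last layer is then a plain Dense, as in the keras-vis example. A rough, untested sketch, assuming the nested top model is Flatten -> Dense(256) -> Dense(1) as in the question (the layer names here are illustrative):

from keras.models import load_model, Model
from keras.layers import Dense, Flatten

old = load_model('final_finetuned_model.h5')
top = old.get_layer('sequential_1')

# Graft fresh, un-nested classifier layers onto the last pooling layer
x = Flatten(name='flat')(old.get_layer('block5_pool').output)
x = Dense(256, activation='relu', name='fc1')(x)
out = Dense(1, activation='sigmoid', name='predictions')(x)
flat_model = Model(old.input, out)

# Copy the trained weights over from the nested Sequential
# (the Flatten layer has no weights)
flat_model.get_layer('fc1').set_weights(top.layers[1].get_weights())
flat_model.get_layer('predictions').set_weights(top.layers[2].get_weights())

flat_model can then be passed to visualize_cam, with layer_idx pointing at the final Dense layer.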