
Keras - Retrieve layers that the layer connected to

I have a model built in Keras; it can be either sequential or functional and is accessible through the model variable. I want to implement a method that walks through the model from the output to the input and does something with the weights along the way.

Is there any way to get the predecessor layer of a specific layer? I would like to do something like this:

x = <some number>
layer_x = model.layers[x] 
predecessor_layers = ???
Asked Jun 12 '18 by Primoz

1 Answer

The solution suggested by @Mitiku returns only the input tensor, but we need the predecessor layer itself. The predecessor layer can be found in the following way:

x = <some number>
layer_x = model.layers[x]
# The first inbound node connects layer_x to the layer(s) feeding into it
int_node = layer_x._inbound_nodes[0]
# Take the first (and, here, only) layer on the input side of that node
predecessor_layers = int_node.inbound_layers[0]

This assumes that layer_x has only one predecessor layer. To get that layer, we first access the node that connects the two layers (int_node) and then take the layer on its input side (int_node.inbound_layers[0]).

Note: this solution is not pretty, since it accesses a protected attribute, but it works.
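
To address the original goal of walking the whole model from the output to the input, the same idea can be extended into a small loop. The sketch below is only an illustration built on the same protected _inbound_nodes attribute, so it may need tweaks across Keras/TensorFlow versions (in some versions inbound_layers is a single layer rather than a list). The walk_backwards helper and the toy model are hypothetical names used just for this example.

from tensorflow import keras  # standalone keras works analogously

def walk_backwards(model):
    """Visit layers from the output layer back to the input, printing weight shapes."""
    layer = model.layers[-1]          # start at the output layer
    visited = set()
    while layer is not None and layer.name not in visited:
        visited.add(layer.name)
        print(layer.name, [w.shape for w in layer.get_weights()])
        if not layer._inbound_nodes:  # nothing left to follow
            break
        inbound = layer._inbound_nodes[0].inbound_layers
        # Depending on the version, inbound_layers is a single layer or a list;
        # follow the first predecessor in either case.
        if isinstance(inbound, (list, tuple)):
            layer = inbound[0] if inbound else None
        else:
            layer = inbound

# Hypothetical usage on a small functional model
inputs = keras.Input(shape=(4,))
x = keras.layers.Dense(8, activation="relu")(inputs)
outputs = keras.layers.Dense(1)(x)
walk_backwards(keras.Model(inputs, outputs))

Keep in mind that a layer with several inbound layers (for example after a concatenation) has more than one entry in inbound_layers, so the loop would need to branch instead of always following the first predecessor.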

Answered Sep 28 '22 by Primoz