I have a model built in Keras that can be either sequential or functional. The model is accessible through the model
variable. I want to implement a method that walks through the model from the output to the input and does something with the weights of the model.
Is there any way to get the predecessor layer of a specific layer? I would like to do something like this:
x = <some number>
layer_x = model.layers[x]
predecessor_layers = ???
The solution suggested by @Mitiku returns only the input tensor, but we need the predecessor layer. The predecessor layer can be found in the following way:
x = <some number>
layer_x = model.layers[x]
int_node = layer_x._inbound_nodes[0]
predecessor_layers = int_node.inbound_layers[0]
In the proposed solution, we assume that layer_x has only one predecessor layer. To get that layer, we first access the node which connects the two layers (int_node) and then take the layer on its input side (int_node.inbound_layers[0]).
Note: This solution is not pretty since it accesses a protected attribute, but it works.
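For completeness, here is a minimal sketch of how the same idea could be used to walk an entire model from the output layer back to the input, as asked in the question. It assumes tf.keras (Keras 2.x) and relies on the same internal _inbound_nodes / inbound_layers attributes, so it may need adjusting for other Keras versions; the walk_backwards helper and the small example model below are illustrative, not part of the Keras API.

from tensorflow import keras

def walk_backwards(layer, visit, _seen=None):
    # Recursively visit `layer` and every layer that feeds into it.
    # Relies on the internal _inbound_nodes / inbound_layers attributes,
    # which may differ between Keras versions.
    if _seen is None:
        _seen = set()
    if id(layer) in _seen:
        return
    _seen.add(id(layer))
    visit(layer)
    for node in layer._inbound_nodes:
        inbound = node.inbound_layers
        # Depending on the Keras version this is a single layer or a list.
        if not isinstance(inbound, (list, tuple)):
            inbound = [inbound]
        for pred in inbound:
            walk_backwards(pred, visit, _seen)

# Example usage: print each layer's name and weight shapes, starting at the output.
model = keras.Sequential([
    keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    keras.layers.Dense(1),
])
walk_backwards(model.layers[-1],
               lambda l: print(l.name, [w.shape for w in l.get_weights()]))

The _seen set is there because, in a functional model with shared layers or branches, the same layer can be reached through more than one path; without it the walk would visit (and process the weights of) that layer more than once.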