I'm seeing differing outputs when comparing a model with its stored protobuf version (converted via this conversion script). To debug, I'm comparing the layers of both models one by one. For the weights and for what I assumed were the layer outputs during a test sequence I get identical results, so I'm not sure I'm accessing the hidden layers correctly.
Here is how I load the layers:
    input  = graph.get_tensor_by_name("lstm_1_input_1:0")
    layer1 = graph.get_tensor_by_name("lstm_1_1/kernel:0")
    layer2 = graph.get_tensor_by_name("lstm_1_1/recurrent_kernel:0")
    layer3 = graph.get_tensor_by_name("time_distributed_1_1/kernel:0")
    output = graph.get_tensor_by_name("activation_1_1/div:0")
Here is how I thought I could display the respective elements.
show weights:
    with tf.Session(graph=graph) as sess:
        print(sess.run(layer1))
        print(sess.run(layer2))
        print(sess.run(layer3))
show outputs:
    with tf.Session(graph=graph) as sess:
        y_out, l1_out, l2_out, l3_out = sess.run([output, layer1, layer2, layer3], feed_dict={input: X_test})
With this code, sess.run(layer1) == sess.run(layer1, feed_dict={input: X_test}), which I thought shouldn't be the case.
Can someone help me out?
When you run sess.run(layer1), you're telling TensorFlow to compute the value of the layer1 tensor, which is ...
    layer1 = graph.get_tensor_by_name("lstm_1_1/kernel:0")
... according to your definition. Note that the LSTM kernel is the weights variable. It does not depend on the input, which is why you get the same result from sess.run(layer1, feed_dict={input: X_test}). TensorFlow is not computing "the output" whenever an input is provided -- it computes exactly the tensor(s) you ask for, in this case layer1.
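To make the distinction concrete, here is a minimal self-contained sketch (TF 1.x; the names x, w and y are made up for illustration). Fetching a variable ignores the feed entirely, while fetching a tensor that depends on a placeholder requires one:

    import tensorflow as tf

    x = tf.placeholder(tf.float32, shape=[None, 3], name="x")
    w = tf.Variable(tf.ones([3, 2]), name="w")  # plays the role of a kernel: no dependency on x
    y = tf.matmul(x, w, name="y")               # plays the role of an output: depends on x

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # Both calls print the same value -- the feed is simply ignored,
        # because w does not depend on x.
        print(sess.run(w))
        print(sess.run(w, feed_dict={x: [[1., 2., 3.]]}))
        # Fetching y without feeding x would raise an error;
        # with the feed it actually computes the matmul.
        print(sess.run(y, feed_dict={x: [[1., 2., 3.]]}))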
When does the input matter, then? When there is a dependency on it. For example:
- sess.run(output). It simply won't work without the input, or without feeding some tensor from which the input can be computed.
- tf.train.AdamOptimizer(...).minimize(loss). Running this op will change layer1, but it also needs the input to do so.
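Coming back to your actual goal of inspecting the hidden layers: the tensors you loaded ("lstm_1_1/kernel:0" etc.) are the weight variables, not the layer activations. You need the tensors produced by the layer ops instead. A sketch, reusing graph, input and X_test from your question -- note that the LSTM output name below is only a guess; list the ops first to find the real name in your graph:

    # Print every op name in the imported graph to locate the ops that
    # produce the layer activations.
    for op in graph.get_operations():
        print(op.name)

    # Then fetch the activation tensor by name and feed the input.
    # "lstm_1_1/transpose_1:0" is a hypothetical name -- substitute
    # whatever the listing above shows for your LSTM's output op.
    with tf.Session(graph=graph) as sess:
        hidden = graph.get_tensor_by_name("lstm_1_1/transpose_1:0")
        h_out = sess.run(hidden, feed_dict={input: X_test})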