 

How do I use TensorBoard with tf.layers?

As the weights are not explicitly defined, how can I pass them to a summary writer?

For example:

conv1 = tf.layers.conv2d(
    tf.reshape(X,[FLAGS.batch,3,160,320]),
    filters = 16,
    kernel_size = (8,8),
    strides=(4, 4),
    padding='same',
    kernel_initializer=tf.contrib.layers.xavier_initializer(),
    bias_initializer=tf.zeros_initializer(),
    kernel_regularizer=None,
    name = 'conv1',
    activation = tf.nn.elu
    )

=>

summarize_tensor(
    ??????
)

Thanks!

asked Jun 24 '17 by Malo Marrec


People also ask

How do you visualize a TensorFlow model?

TensorBoard's Graphs dashboard is a powerful tool for examining your TensorFlow model. You can quickly view a conceptual graph of your model's structure and ensure it matches your intended design. You can also view an op-level graph to understand how TensorFlow understands your program.

Is TensorBoard part of TensorFlow?

TensorBoard is TensorFlow's visualization toolkit. It supports viewing histograms of weights, biases, or other tensors as they change over time; projecting embeddings to a lower-dimensional space; displaying images, text, and audio data; and profiling TensorFlow programs.


2 Answers

While Da Tong's answer is complete, it took me a while to realize how to use it. To save time for another beginner: add the following to your code to include all trainable variables in the TensorBoard summary:

# Record a histogram summary for every trainable variable (the kernels and
# biases created by tf.layers are picked up automatically).
for var in tf.trainable_variables():
    tf.summary.histogram(var.name, var)
# Merge all summaries into a single op that can be run and written to disk.
merged_summary = tf.summary.merge_all()
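To actually see these in TensorBoard, the merged op still has to be evaluated and handed to a writer. Here is a minimal TF 1.x sketch of that part; train_op, feed_dict, num_steps, and the ./logs directory are placeholders for whatever your own training loop uses:

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Write the graph and summaries to ./logs (placeholder path);
    # view them with: tensorboard --logdir ./logs
    writer = tf.summary.FileWriter('./logs', sess.graph)
    for step in range(num_steps):
        _, summary = sess.run([train_op, merged_summary], feed_dict=feed_dict)
        writer.add_summary(summary, step)
    writer.close()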
answered Nov 03 '22 by John


That depends on what you want to record in TensorBoard. If you want to put every variable into TensorBoard, tf.all_variables() (deprecated in favor of tf.global_variables()) or tf.trainable_variables() will give you all the variables. Note that tf.layers.conv2d is just a wrapper that creates a Conv2D instance and calls its apply method. You can unwrap it like this:

conv1_layer = tf.layers.Conv2D(
    filters = 16,
    kernel_size = (8,8),
    strides=(4, 4),
    padding='same',
    kernel_initializer=tf.contrib.layers.xavier_initializer(),
    bias_initializer=tf.zeros_initializer(),
    kernel_regularizer=None,
    name = 'conv1',
    activation = tf.nn.elu
)

conv1 = conv1_layer.apply(tf.reshape(X,[FLAGS.batch,3,160,320]))

Then you can use conv1_layer.kernel to access the kernel weights.
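For example, if you only want histograms of that one layer's parameters, something like the following should work (the summary tags 'conv1_kernel' and 'conv1_bias' are just names chosen here):

# The kernel and bias attributes exist once the layer has been applied/built.
tf.summary.histogram('conv1_kernel', conv1_layer.kernel)
tf.summary.histogram('conv1_bias', conv1_layer.bias)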

answered Nov 03 '22 by Da Tong