I want to use Keras together with TensorBoard. My architecture looks like this:
import tensorflow as tf
from keras import backend as K
from keras.models import Model
from keras.layers import Input, Conv2D, Flatten, Dense
from keras.callbacks import TensorBoard

# Log histograms every 2 epochs and write the layer weights as images (the Images tab).
tbCallBack = TensorBoard(log_dir='./logs', histogram_freq=2, batch_size=32,
                         write_graph=True, write_grads=True, write_images=True)

K.clear_session()
sess = tf.Session()
K.set_session(sess)

input_img = Input(shape=(augmented_train_data[0].shape[0], augmented_train_data[0].shape[1], 3))
x = Conv2D(8, (1, 1), padding='same', activation='relu', name="1x1_1")(input_img)
x = Conv2D(16, (3, 3), padding='same', activation='relu', name="3x3_1")(x)
x = Conv2D(32, (3, 3), padding='same', activation='relu', name="3x3_2")(x)
x = Conv2D(1, (1, 1), padding='same', activation='relu', name="1x1_2")(x)
x = Flatten()(x)
x = Dense(16, activation='relu')(x)
output = Dense(2)(x)

model = Model(inputs=input_img, outputs=output)
model.compile(optimizer='adam', loss='mean_squared_error')
# tbCallBack.set_model(model)  # not needed; fit() attaches the callback itself
model.summary()

history = model.fit(augmented_train_data, augmented_train_label,
                    validation_data=(augmented_validation_data, augmented_validation_label),
                    epochs=20, batch_size=32, callbacks=[tbCallBack])
When I look at the TensorBoard Images tab, it looks like this. I can't quite interpret it, though: I thought this tab would show how the weights of my convolutions develop over the epochs. So how should I interpret these images, or did I make a mistake in setting up TensorBoard?
A histogram is basically a collection of values represented by the frequency/density with which each value occurs in the collection. In TensorBoard (logged via tf.summary.histogram), histograms are used to visualize how the weights are distributed and how that distribution changes over time.
It looks like that is exactly what you are getting. Because you passed write_images=True, each image in the Images tab is a layer's weight matrix rendered in grayscale, so the pixel intensities are the weight values. The slider at the top lets you step back and forth through the logged epochs and watch how the weights change as training progresses.
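If you want to sanity-check what those images correspond to, you can pull the kernel weights out of the model yourself and render them in grayscale. This is just a minimal sketch, assuming the model from your question (with layer names like "3x3_1") and that matplotlib is available; it is not part of the TensorBoard callback, only a manual way to reproduce the same kind of picture:

import matplotlib.pyplot as plt

# Kernel of one conv layer; shape is (kh, kw, in_channels, out_channels).
kernel, bias = model.get_layer("3x3_1").get_weights()

# Render every (kh, kw) filter slice as a small grayscale tile, similar in
# spirit to what write_images=True puts on the Images tab.
kh, kw, n_in, n_out = kernel.shape
fig, axes = plt.subplots(n_in, n_out, figsize=(n_out, n_in))
for i in range(n_in):
    for j in range(n_out):
        ax = axes[i, j] if n_in > 1 else axes[j]
        ax.imshow(kernel[:, :, i, j], cmap="gray")
        ax.axis("off")
plt.show()

Plotting the same layer before and after training (or at different epochs, if you save checkpoints) shows the same progression you can scrub through with the slider in TensorBoard.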