
Hparams plugin with tf.keras (tensorflow 2.0)

I'm trying to follow the example from the TensorFlow docs to set up hyperparameter logging. The docs also mention that, if you use tf.keras, you can just use the callback hp.KerasCallback(logdir, hparams). However, when I use that callback my metrics aren't logged (only the outcome).
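For reference, this is roughly the kind of setup I mean; the tiny model and random data are just placeholders to make the sketch self-contained:

import numpy as np
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

HP_UNITS = hp.HParam('units', hp.Discrete([16, 32]))
hparams = {HP_UNITS: 32}  # values for this run

# dummy data, just to make the example runnable
x, y = np.random.rand(256, 8).astype('float32'), np.random.randint(2, size=256)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(hparams[HP_UNITS], activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

logdir = 'logs/run-0'
model.fit(x, y, validation_split=0.25, epochs=3,
          callbacks=[
              tf.keras.callbacks.TensorBoard(logdir),  # logs the metrics
              hp.KerasCallback(logdir, hparams),       # logs the hparams
          ])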

asked Oct 16 '19 by Roelant



2 Answers

The trick is to define the HParams config with the path in which the TensorBoard callback saves its validation logs.

So, if your TensorBoard callback is set up as:

from tensorflow.keras.callbacks import TensorBoard

log_dir = 'path/to/training-logs'
tensorboard_cb = TensorBoard(log_dir=log_dir)

Then you should set up HParams like this:

import os
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

# HPARAMS is assumed to be defined as in the HParams docs
hparams_dir = os.path.join(log_dir, 'validation')

with tf.summary.create_file_writer(hparams_dir).as_default():
    hp.hparams_config(
        hparams=HPARAMS,
        metrics=[hp.Metric('epoch_accuracy')]  # metric saved by tensorboard_cb
    )

hparams_cb = hp.KerasCallback(
    writer=hparams_dir,
    hparams=HPARAMS
)
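For a full run you then pass both callbacks to model.fit. A minimal sketch (the model and data below are stand-ins, just to show the wiring):

import numpy as np

# stand-in model and data, just to show the callback wiring
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics=['accuracy'])  # TensorBoard logs this as epoch_accuracy

x, y = np.random.rand(128, 4).astype('float32'), np.random.randint(2, size=128)
model.fit(x, y, validation_split=0.25, epochs=2,
          callbacks=[tensorboard_cb, hparams_cb])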
answered Sep 28 '22 by Julian Ferry


I got it working, though I'm not entirely sure which part was the magic word. Here's my flow in case it helps.

from tensorboard.plugins.hparams import api as hp

HP_NUM_LATENT = hp.HParam('num_latent_dim', hp.Discrete([2, 5, 100]))
hparams = {
    HP_NUM_LATENT: num_latent,  # the value chosen for this run
}

callbacks.append(hp.KerasCallback(log_dir, hparams))

model = create_simple_model(latent_dim=hparams[HP_NUM_LATENT])  # returns compiled model
model.fit(x, y, validation_data=validation_data,
          epochs=4,
          verbose=2,
          callbacks=callbacks)
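If it helps, the same flow extends to sweeping all the hp.Discrete values, giving each run its own log dir (a sketch: run_base and the run naming are mine; create_simple_model, x, y, and validation_data are as above):

import os
import tensorflow as tf

run_base = 'logs/latent-sweep'  # hypothetical base directory
for i, num_latent in enumerate(HP_NUM_LATENT.domain.values):
    hparams = {HP_NUM_LATENT: num_latent}
    run_dir = os.path.join(run_base, f'run-{i}')
    callbacks = [
        tf.keras.callbacks.TensorBoard(run_dir),  # metrics for this run
        hp.KerasCallback(run_dir, hparams),       # hparams for this run
    ]
    model = create_simple_model(latent_dim=num_latent)
    model.fit(x, y, validation_data=validation_data,
              epochs=4, verbose=2, callbacks=callbacks)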
answered Sep 28 '22 by Roelant