To log hparams without using Keras, I'm doing the following as suggested in the tf code here:
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

# model_dir is the same directory later passed to the estimator
with tf.summary.create_file_writer(model_dir).as_default():
    hp_learning_rate = hp.HParam("learning_rate", hp.RealInterval(0.00001, 0.1))
    hp_distance_margin = hp.HParam("distance_margin", hp.RealInterval(0.1, 1.0))

    hparams_list = [
        hp_learning_rate,
        hp_distance_margin,
    ]

    metrics_to_monitor = [
        hp.Metric("metrics_standalone/auc", group="validation"),
        hp.Metric("loss", group="train", display_name="training loss"),
    ]

    hp.hparams_config(hparams=hparams_list, metrics=metrics_to_monitor)

    hparams = {
        hp_learning_rate: params.learning_rate,
        hp_distance_margin: params.distance_margin,
    }
    hp.hparams(hparams)
Note that params is a dictionary-like object here that I'll also pass to the estimator.
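The question doesn't show how params is built; the attribute access above (params.learning_rate) suggests something like a simple namespace. A hypothetical sketch:

from types import SimpleNamespace

# Hypothetical stand-in for the real params object; the fields and
# values here are assumptions, not from the question.
params = SimpleNamespace(
    learning_rate=0.001,
    distance_margin=0.5,
    model_dir="/tmp/model",
)

# vars(params) yields the plain-dict form ({'learning_rate': 0.001, ...})
# that appears later in the answer.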
Then I train the estimator as usual:
config = tf.estimator.RunConfig(model_dir=params.model_dir)
estimator = tf.estimator.Estimator(model_fn, params=params, config=config)
train_spec = tf.estimator.TrainSpec(...)
eval_spec = tf.estimator.EvalSpec(...)
tf.estimator.train_and_evaluate(estimator, train_spec=train_spec, eval_spec=eval_spec)
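The model_fn isn't shown in the question; here is a hypothetical sketch of one that would produce the two monitored tags (the estimator logs training loss under loss automatically, and eval_metric_ops entries land under the ./eval run):

def model_fn(features, labels, mode, params):
    # Placeholder model; the real one presumably uses params.distance_margin
    # in a metric-learning loss, which is omitted here.
    labels = tf.cast(labels, tf.float32)
    logits = tf.compat.v1.layers.dense(features["x"], 1)
    loss = tf.compat.v1.losses.sigmoid_cross_entropy(labels, logits)

    if mode == tf.estimator.ModeKeys.EVAL:
        # Appears in the Scalars page as metrics_standalone/auc under ./eval.
        auc = tf.compat.v1.metrics.auc(labels, tf.sigmoid(logits))
        return tf.estimator.EstimatorSpec(
            mode, loss=loss, eval_metric_ops={"metrics_standalone/auc": auc})

    optimizer = tf.compat.v1.train.AdamOptimizer(params.learning_rate)
    train_op = optimizer.minimize(
        loss, global_step=tf.compat.v1.train.get_global_step())
    return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)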
After training, when I launch TensorBoard, the hparams are logged, but I do not see any metrics logged against them. I further confirmed that the metrics show up in the Scalars page with the same tag name for both train and validation, i.e. under the runs . and ./eval, but the HParams page doesn't pick up those logged tensors.
How do I use hparams with estimators?
I'm using
tensorboard 2.1.0
tensorflow 2.1.0
tensorflow-estimator 2.1.0
tensorflow-metadata 0.15.2
on Python 3.7.5
Attempt 1:
After some googling, I saw some older tf code that passed hparams as the params argument of Estimator. To check whether tf2 logs those hparams by itself when they are passed this way, I looked at the Estimator docs, which say:
The params argument contains hyperparameters. It is passed to the model_fn, if the model_fn has a parameter named "params", and to the input functions in the same manner. Estimator only passes params along, it does not inspect it. The structure of params is therefore entirely up to the developer.
So passing hparams as params will not log them by itself.
Attempt 2:
I suspected that, since estimators use tensorflow.python.summary instead of tf.summary (the default in v2), tensors logged by the v1 module were probably not accessible to the HParams plugin, so I also tried to use
with tensorflow.python.summary.FileWriter(model_dir).as_default():
However, that failed with RuntimeError: tf.summary.FileWriter is not compatible with eager execution. Use tf.contrib.summary instead. (And tf.contrib no longer exists in TF 2.x.)
Update: I re-ran it with eager execution disabled. Now even the initial hparams logging did not happen: there was no HParams tab in TensorBoard, which failed with this error:
E0129 13:03:07.656290 21584 hparams_plugin.py:104] HParams error: Can't find an HParams-plugin experiment data in the log directory. Note that it takes some time to scan the log directory; if you just started Tensorboard it could be that we haven't finished scanning it yet. Consider trying again in a few seconds.
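For completeness, the standard way to disable eager execution in TF 2.x (presumably what was done for this update) is:

import tensorflow as tf

# Must run before any other TF calls; switches the program back to
# graph-mode (v1-style) execution.
tf.compat.v1.disable_eager_execution()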
Is there a way to make tensorboard read already logged metric tensors and link them with hparams?
The culprit seems to be
# This doesn't seem to be compatible with the Estimator API
hp.hparams_config(hparams=hparams_list, metrics=metrics_to_monitor)
Simply calling hp.hparams is enough: the HParams dashboard then picks up all metrics logged with tf.summary. In TensorBoard, you can filter down to only the metrics you need and compare trials.
with tf.summary.create_file_writer(train_folder).as_default():
    # params is a dict which contains
    # {'learning_rate': 0.001, 'distance_margin': 0.5, ...}
    hp.hparams(hparams=params)
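Putting it together, a minimal end-to-end sketch of the fix (assuming the plain-dict params from the comment above; train_folder, estimator, train_spec, and eval_spec are the objects from the question):

import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

train_folder = params["model_dir"]  # same run directory as the training scalars

# Record this trial's hyperparameter values; note there is no
# hparams_config call.
with tf.summary.create_file_writer(train_folder).as_default():
    hp.hparams({
        "learning_rate": params["learning_rate"],
        "distance_margin": params["distance_margin"],
    })

# Train as before; scalars written by the estimator (training loss,
# eval metrics) now show up against these hparams in the dashboard.
tf.estimator.train_and_evaluate(estimator, train_spec=train_spec, eval_spec=eval_spec)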