
Hyperparameter tuning using tensorboard.plugins.hparams api with custom loss function

I am building a neural network with my own custom loss function (pretty long and complicated). My network is unsupervised, so my input and expected output are identical; at the moment I am also using just a single input (trying to optimize the loss for one input).

I am trying to use the tensorboard.plugins.hparams API for hyperparameter tuning and don't know how to incorporate my custom loss function there. I'm trying to follow the code suggested on the TensorFlow 2.0 website.

This is what the website suggests:

    HP_NUM_UNITS = hp.HParam('num_units', hp.Discrete([16, 32]))
    HP_DROPOUT = hp.HParam('dropout', hp.RealInterval(0.1, 0.2))
    HP_OPTIMIZER = hp.HParam('optimizer', hp.Discrete(['adam', 'sgd']))

    METRIC_ACCURACY = 'accuracy'

    with tf.summary.create_file_writer('logs/hparam_tuning').as_default():
      hp.hparams_config(
        hparams=[HP_NUM_UNITS, HP_DROPOUT, HP_OPTIMIZER],
        metrics=[hp.Metric(METRIC_ACCURACY, display_name='Accuracy')],
          )

I need to change that, as I don't want a dropout layer, so I can just delete HP_DROPOUT. As for METRIC_ACCURACY, accuracy is of no use in my model; I want to use my custom loss function instead. If I were compiling the model normally, it would look like this:

    model.compile(optimizer=adam, loss=dl_tf_loss, metrics=[dl_tf_loss])
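(For context, `dl_tf_loss` is my own function; Keras only requires a callable with the `(y_true, y_pred)` signature that returns a tensor. A minimal stand-in, purely for illustration and not my actual loss:)

```python
import tensorflow as tf

# Hypothetical stand-in for dl_tf_loss; the real function is long and
# complicated. Wrapping it in tf.function matches the Function object
# seen in the traceback below.
@tf.function
def dl_tf_loss(y_true, y_pred):
    # Mean squared error between prediction and target.
    return tf.reduce_mean(tf.square(y_true - y_pred))
```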

So I tried to change the suggested code into the following code but I get an error and am wondering how I should change it so that it suits my needs. Here is what I tried:

    HP_NUM_UNITS = hp.HParam('num_units', hp.Discrete([16, 32]))
    HP_OPTIMIZER = hp.HParam('optimizer', hp.Discrete(['adam', 'sgd']))

    #METRIC_LOSS = dl_tf_loss

    with tf.summary.create_file_writer('logs/hparam_tuning').as_default():
      hp.hparams_config(
          hparams=[HP_NUM_UNITS, HP_OPTIMIZER],
          metrics=[hp.Metric(dl_tf_loss, display_name='Loss')],
      )

It gives me the following error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-26-27d079c6be49> in <module>()
      5 
      6 with tf.summary.create_file_writer('logs/hparam_tuning').as_default():
----> 7   hp.hparams_config(hparams=[HP_NUM_UNITS, HP_OPTIMIZER],metrics=[hp.Metric(dl_tf_loss, display_name='Loss')])
      8 

3 frames
/usr/local/lib/python3.6/dist-packages/tensorboard/plugins/hparams/summary_v2.py in hparams_config(hparams, metrics, time_created_secs)
    127       hparams=hparams,
    128       metrics=metrics,
--> 129       time_created_secs=time_created_secs,
    130   )
    131   return _write_summary("hparams_config", pb)

/usr/local/lib/python3.6/dist-packages/tensorboard/plugins/hparams/summary_v2.py in hparams_config_pb(hparams, metrics, time_created_secs)
    161       domain.update_hparam_info(info)
    162     hparam_infos.append(info)
--> 163   metric_infos = [metric.as_proto() for metric in metrics]
    164   experiment = api_pb2.Experiment(
    165       hparam_infos=hparam_infos,

/usr/local/lib/python3.6/dist-packages/tensorboard/plugins/hparams/summary_v2.py in <listcomp>(.0)
    161       domain.update_hparam_info(info)
    162     hparam_infos.append(info)
--> 163   metric_infos = [metric.as_proto() for metric in metrics]
    164   experiment = api_pb2.Experiment(
    165       hparam_infos=hparam_infos,

/usr/local/lib/python3.6/dist-packages/tensorboard/plugins/hparams/summary_v2.py in as_proto(self)
    532         name=api_pb2.MetricName(
    533             group=self._group,
--> 534             tag=self._tag,
    535         ),
    536         display_name=self._display_name,

TypeError: <tensorflow.python.eager.def_function.Function object at 0x7f9f3a78e5c0> has type Function, but expected one of: bytes, unicode

I also tried running the following code:

    with tf.summary.create_file_writer('logs/hparam_tuning').as_default():
      hp.hparams_config(
          hparams=[HP_NUM_UNITS, HP_OPTIMIZER],
          metrics=[dl_tf_loss],
      )

but got the following error:

AttributeError                            Traceback (most recent call last)
<ipython-input-28-6778bdf7f1b1> in <module>()
      8 
      9 with tf.summary.create_file_writer('logs/hparam_tuning').as_default():
---> 10   hp.hparams_config(hparams=[HP_NUM_UNITS, HP_OPTIMIZER],metrics=[dl_tf_loss])

2 frames
/usr/local/lib/python3.6/dist-packages/tensorboard/plugins/hparams/summary_v2.py in <listcomp>(.0)
    161       domain.update_hparam_info(info)
    162     hparam_infos.append(info)
--> 163   metric_infos = [metric.as_proto() for metric in metrics]
    164   experiment = api_pb2.Experiment(
    165       hparam_infos=hparam_infos,

AttributeError: 'Function' object has no attribute 'as_proto'

Would greatly appreciate any help. Thanks in advance!

asked Apr 13 '26 by Keren

1 Answer

I figured it out.

The METRIC_ACCURACY in the original code (which I renamed to METRIC_LOSS) is apparently just a tag name: I needed to pass my loss as the string 'dl_tf_loss', not as the function itself.
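In code, the fix is to register the metric under a string tag. A sketch reusing the names from the question (not verbatim from my notebook):

```python
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

HP_NUM_UNITS = hp.HParam('num_units', hp.Discrete([16, 32]))
HP_OPTIMIZER = hp.HParam('optimizer', hp.Discrete(['adam', 'sgd']))

# A string tag for the metric, NOT the loss function object itself.
METRIC_LOSS = 'dl_tf_loss'

with tf.summary.create_file_writer('logs/hparam_tuning').as_default():
    hp.hparams_config(
        hparams=[HP_NUM_UNITS, HP_OPTIMIZER],
        metrics=[hp.Metric(METRIC_LOSS, display_name='Loss')],
    )
```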

In the subsequent parts of the tuning I still have to write my fit call anyway, and that is where the actual loss function goes, as in my regular compile example above.
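Concretely, each training run then uses the real function in compile/fit and logs the resulting loss value under the same string tag. The toy model below is a placeholder for my unsupervised setup (input == target, single input), just to show the wiring:

```python
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

HP_NUM_UNITS = hp.HParam('num_units', hp.Discrete([16, 32]))
HP_OPTIMIZER = hp.HParam('optimizer', hp.Discrete(['adam', 'sgd']))

def dl_tf_loss(y_true, y_pred):
    # Stand-in for the real custom loss.
    return tf.reduce_mean(tf.square(y_true - y_pred))

def train_test_model(hparams, x):
    # Placeholder autoencoder-style model: input and target are identical.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(hparams[HP_NUM_UNITS], activation='relu'),
        tf.keras.layers.Dense(x.shape[-1]),
    ])
    # The actual loss function goes here, in compile, not in hparams_config.
    model.compile(optimizer=hparams[HP_OPTIMIZER],
                  loss=dl_tf_loss, metrics=[dl_tf_loss])
    history = model.fit(x, x, epochs=2, verbose=0)
    return history.history['loss'][-1]

def run(run_dir, hparams, x):
    with tf.summary.create_file_writer(run_dir).as_default():
        hp.hparams(hparams)  # record this run's hyperparameter values
        loss = train_test_model(hparams, x)
        # Log under the same string tag registered in hparams_config.
        tf.summary.scalar('dl_tf_loss', loss, step=1)
```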

I highly recommend this approach for tuning hyperparameters.

answered Apr 17 '26 by Keren
