In TF 1.12 or TF 2.0, is there going to be a replacement for the following import?
from tensorflow.contrib.training.python.training import hparam
I read that the contrib module will go away or be merged into core.
HParams is a thoughtful approach to configuration management for machine learning projects: it lets you externalize your hyperparameters into a configuration file, so you can reproduce experiments, iterate quickly, and reduce errors, all behind an approachable and easy-to-use API.
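Since `tensorflow.contrib` is removed in TF 2.x, the core idea of the old `hparam.HParams` object (typed defaults plus a `parse()` override from a string) can be sketched in plain Python. This is an illustrative stand-in, not a TensorFlow API; the class and field names are made up for the example:

```python
from dataclasses import dataclass


@dataclass
class HParamsSketch:
    """Illustrative stand-in for contrib's hparam.HParams; not a TF API."""
    learning_rate: float = 0.01
    num_layers: int = 2

    def parse(self, spec: str) -> "HParamsSketch":
        # Override defaults from a "name=value,name=value" string,
        # mirroring the spirit of HParams.parse().
        casts = {"learning_rate": float, "num_layers": int}
        for pair in spec.split(","):
            name, value = pair.split("=", 1)
            setattr(self, name, casts[name](value))
        return self


hp = HParamsSketch().parse("learning_rate=0.02,num_layers=3")
print(hp.learning_rate, hp.num_layers)  # 0.02 3
```

The string-based `parse` is what made the contrib class handy for passing hyperparameter overrides on the command line.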
The Keras Tuner is a library that helps you pick an optimal set of hyperparameters for your TensorFlow program. The process of selecting the right set of hyperparameters for your machine learning (ML) application is called hyperparameter tuning, or hypertuning. With Keras Tuner, you can do both data-parallel and trial-parallel distribution: you can use tf.distribute.Strategy to run each model on multiple GPUs, and you can also search over multiple different hyperparameter combinations in parallel on different workers.
In TF 2.0 there is a new API, tensorboard.plugins.hparams.api, that includes a class HParam. Usage of the new API is described in this guide: Hyperparameter Tuning with the HParams Dashboard.
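Following that guide, a short sketch of registering a hyperparameter experiment for the HParams dashboard looks like this (the hyperparameter names, ranges, and log directory are example choices):

```python
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

# Declare the hyperparameters and their domains.
HP_NUM_UNITS = hp.HParam("num_units", hp.Discrete([16, 32]))
HP_DROPOUT = hp.HParam("dropout", hp.RealInterval(0.1, 0.2))

# Log the experiment configuration so TensorBoard's HParams
# dashboard knows which hyperparameters and metrics to display.
with tf.summary.create_file_writer("logs/hparam_tuning").as_default():
    hp.hparams_config(
        hparams=[HP_NUM_UNITS, HP_DROPOUT],
        metrics=[hp.Metric("accuracy", display_name="Accuracy")],
    )
```

Inside each training run you then call `hp.hparams({...})` with the concrete values for that trial and log the resulting metrics, and TensorBoard renders the comparison table.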