I'm looking for a way to implement a learning rate range test as described here: https://arxiv.org/pdf/1506.01186.pdf .
My network is implemented using the Estimator API and I'd like to stick to that, but unfortunately I'm not able to force the Estimator to skip saving checkpoints. Do you know a way to simply run a single epoch of training without saving the checkpoints?
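For context, the range test in that paper sweeps the learning rate between a lower and an upper bound over one short training run and watches the loss. A minimal, framework-agnostic sketch of the schedule computation (the function name, exponential spacing, and bounds below are illustrative assumptions, not taken verbatim from the paper):

```python
def lr_range_test_schedule(lr_min, lr_max, num_steps):
    # Exponentially spaced learning rates from lr_min up to lr_max,
    # one value per training step of the range test.
    ratio = lr_max / lr_min
    return [lr_min * ratio ** (step / (num_steps - 1))
            for step in range(num_steps)]

# e.g. sweep 1e-5 -> 1e-1 over 100 steps, feed lrs[i] to the optimizer at step i
lrs = lr_range_test_schedule(1e-5, 1e-1, 100)
```

You would then plot loss against these learning rates and pick a value from the region where the loss still decreases steadily.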
According to the tf.estimator.RunConfig docs:
"If both save_checkpoints_steps and save_checkpoints_secs are None, then checkpoints are disabled."
Since save_checkpoints_steps already defaults to None, it is enough to also pass save_checkpoints_secs=None. So the code is the following:
run_config = tf.estimator.RunConfig(save_summary_steps=None,
                                    save_checkpoints_secs=None)
estimator = tf.estimator.Estimator(model_fn=model_fn, config=run_config)