 

Keras Training warm_start

Is it possible to continue training a Keras estimator with all the hyperparameters (including decreasing learning rate) and weights saved from previous epochs, as one does in scikit-learn with the warm_start parameter? Something like this:

estimator = KerasRegressor(build_fn=create_model, epochs=20, batch_size=40, warm_start=True)

Specifically, warm_start should do this:

warm_start : bool, optional, default False When set to True, reuse the solution of the previous call to fit as initialization, otherwise, just erase the previous solution.
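
For comparison, warm_start in scikit-learn works roughly like this (a minimal sketch using MLPRegressor as an illustrative estimator on synthetic data; the names and shapes are only for demonstration):

import numpy as np
from sklearn.neural_network import MLPRegressor

X = np.random.rand(100, 5)
y = np.random.rand(100)

reg = MLPRegressor(max_iter=20, warm_start=True)
reg.fit(X, y)  # trains for up to 20 iterations
reg.fit(X, y)  # continues from the previous weights instead of reinitializing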

Is there anything like that in Keras?

asked Jul 25 '17 by Nick

1 Answer

Yes, it's possible, but rather cumbersome. You need to use the train_on_batch function, which keeps all model parameters (including the optimizer state) between calls.

This is cumbersome because you have to split your dataset into batches yourself, and you also lose the ability to use Callbacks and the automatic progress bar. I hope this option will be added to the fit method in a future Keras version.
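
A minimal sketch of such a manual loop (create_model, X_train, y_train, epochs, and batch_size are assumed to be defined elsewhere; the names are purely illustrative):

import numpy as np

model = create_model()

for epoch in range(epochs):
    # Shuffle the sample indices at the start of each epoch
    indices = np.random.permutation(len(X_train))
    for start in range(0, len(X_train), batch_size):
        batch = indices[start:start + batch_size]
        loss = model.train_on_batch(X_train[batch], y_train[batch])

# The model keeps its weights and optimizer state, so running the loop
# again later continues training from where it left off (a warm start).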

answered Sep 30 '22 by Marcin Możejko