I have realized that I do not quite understand the difference between calling the __call__, call, or predict method of a Keras model.
For example, suppose we have a trained Keras model and we run the following code:
# After training.
y_pred_1 = model(X_new)
y_pred_2 = model.call(X_new)
y_pred_3 = model.predict(X_new)
I expected y_pred_1, y_pred_2, and y_pred_3 to all be the same, but it turned out that they are not.
Could you please explain to me the difference?
predict() returns the final output of the model, i.e. the predictions, while model.evaluate() returns the loss (and any compiled metrics). The loss is used to train the model via backpropagation; it is not the prediction itself.
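A minimal sketch of this distinction, using a hypothetical toy regression model compiled with a single MSE loss: predict() yields the model's outputs, evaluate() yields a scalar loss value.

```python
import numpy as np
import tensorflow as tf

# Toy model, assumed purely for illustration.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="sgd", loss="mse")

X = np.random.rand(8, 3).astype("float32")
y = np.random.rand(8, 1).astype("float32")

preds = model.predict(X, verbose=0)     # the model's outputs, shape (8, 1)
loss = model.evaluate(X, y, verbose=0)  # a single scalar: the MSE loss

print(preds.shape)  # (8, 1)
print(type(loss))   # <class 'float'>
```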
model.predict() is the method Keras provides for generating output predictions for the given input samples.
From the keras.Model documentation: to call a model on an input, always use the __call__() method, i.e. model(inputs), which relies on the underlying call() method. The input can be a tensor, or a dict/list/tuple of input tensors.
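To see how __call__ relates to call, here is a minimal subclassed model (a hypothetical example): you override call() with the forward pass, but invoke the model via model(inputs), which goes through __call__ and adds Keras bookkeeping around your call().

```python
import tensorflow as tf

# Hypothetical subclassed model: the forward pass lives in call().
class Doubler(tf.keras.Model):
    def call(self, inputs):
        return inputs * 2.0

model = Doubler()
x = tf.constant([[1.0, 2.0]])

# Preferred usage: model(x) invokes __call__, which runs call() internally.
y = model(x)
print(y.numpy())  # [[2. 4.]]
```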
Just to complement the answer, as I was also searching for this: when you need to specify the training flag of the model for the inference phase, such as model(X_new, training=False) when you have a batch normalization layer, for example, note that both predict and predict_on_batch already do that when they are executed. So model(X_new, training=False) and model.predict_on_batch(X_new) are equivalent.
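This equivalence can be checked directly. The sketch below assumes a toy model containing a BatchNormalization layer, whose behavior differs between training and inference mode; both calls run in inference mode, so the outputs should agree up to floating-point noise.

```python
import numpy as np
import tensorflow as tf

# Hypothetical model with a BatchNormalization layer (illustration only).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(4),
    tf.keras.layers.BatchNormalization(),
])

X_new = np.random.rand(5, 3).astype("float32")

y1 = model(X_new, training=False).numpy()  # explicit inference mode
y2 = model.predict_on_batch(X_new)         # inference mode by default

print(np.allclose(y1, y2, atol=1e-5))  # True
```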
The difference between predict and predict_on_batch is that the latter runs over a single batch, while the former runs over a dataset that is split into batches, with the results merged to produce the final NumPy array of predictions.
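The batching behavior can be sketched as follows (toy model assumed for illustration): predict() iterates over the data in batches and concatenates the per-batch results, while predict_on_batch() does a single forward pass on the whole array; for a deterministic model the final predictions match.

```python
import numpy as np
import tensorflow as tf

# Toy model, assumed for illustration.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(2),
])

X = np.random.rand(100, 3).astype("float32")

full = model.predict(X, batch_size=32, verbose=0)  # 4 internal batches
single = model.predict_on_batch(X)                 # one forward pass

print(full.shape, single.shape)                # (100, 2) (100, 2)
print(np.allclose(full, single, atol=1e-5))    # True
```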
Beyond the difference mentioned by @Dmitry Kabanov, the functions also generate different types of output: __call__ returns a Tensor, while predict and predict_on_batch return a numpy.ndarray. In addition, according to the documentation, __call__ is faster than predict for small-scale inputs, i.e., inputs that fit in one batch.
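The type difference is easy to verify with a toy model (assumed here for illustration): calling the model directly yields a TensorFlow tensor, while predict yields a NumPy array.

```python
import numpy as np
import tensorflow as tf

# Toy model, assumed for illustration.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(2),
])
X = np.random.rand(4, 3).astype("float32")

t = model(X)                      # returns a tf.Tensor
a = model.predict(X, verbose=0)   # returns a numpy.ndarray

print(tf.is_tensor(t))            # True
print(isinstance(a, np.ndarray))  # True
```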