What is an epoch in TensorFlow and what is it used for?
https://github.com/tensorflow/tensorflow/blob/754048a0453a04a761e112ae5d99c149eb9910dd/tensorflow/contrib/learn/python/learn/datasets/mnist.py#L173
An epoch consists of one full cycle through the training data. This usually takes many steps. As an example, if you have 2,000 images and use a batch size of 10, an epoch consists of 2,000 images / (10 images per step) = 200 steps.
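A minimal sketch of that arithmetic (the dataset size and batch size are just the illustrative numbers from the example above):

```python
import math

dataset_size = 2000   # training images, matching the example above
batch_size = 10       # images processed per training step

steps_per_epoch = math.ceil(dataset_size / batch_size)
print(steps_per_epoch)  # 200 -> one epoch takes 200 steps
```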
An epoch is a term used in machine learning that indicates the number of passes over the entire training dataset the learning algorithm has completed. Datasets are usually grouped into batches (especially when the amount of data is very large).
Epoch: an arbitrary cutoff, generally defined as "one pass over the entire dataset", used to separate training into distinct phases, which is useful for logging and periodic evaluation. When using validation_data or validation_split with the fit method of Keras models, evaluation will be run at the end of every epoch.
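That Keras behaviour can be seen in a small sketch. The model, data shapes, and hyperparameters below are invented for illustration; the point is only that `epochs` controls how many full passes over the training data are made, and that validation runs at the end of every epoch when `validation_split` is set:

```python
import numpy as np
import tensorflow as tf

# Placeholder data: 2,000 samples with 32 features each, binary labels.
x_train = np.random.rand(2000, 32).astype("float32")
y_train = np.random.randint(0, 2, size=(2000,)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# validation_split=0.2 holds out 400 samples, so each epoch runs
# 1,600 / 10 = 160 training steps, then evaluates on the held-out 400.
# epochs=5 means five full passes over the (non-held-out) training data.
model.fit(x_train, y_train, batch_size=10, epochs=5, validation_split=0.2)
```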
An epoch can be understood as the number of times the algorithm scans the entire dataset. For example, if we set epochs = 10, the algorithm will scan the entire dataset ten times. An iteration, by contrast, is the number of times a single batch is passed through the algorithm, as sketched below.
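A hedged sketch of the epoch/iteration distinction, with the training step left as a placeholder: the outer loop counts epochs (full scans of the data), the inner loop counts iterations (batches).

```python
import numpy as np

data = np.arange(100)        # stand-in training set of 100 samples
batch_size = 25
epochs = 10

iterations = 0
for epoch in range(epochs):                      # one full scan of the data per epoch
    np.random.shuffle(data)                      # typical per-epoch reshuffle
    for start in range(0, len(data), batch_size):
        batch = data[start:start + batch_size]
        # a real training step on `batch` would go here
        iterations += 1

print(epochs, iterations)  # 10 epochs, 10 * 4 batches = 40 iterations
```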
An epoch, in machine learning, is one complete processing of the entire training set by the learning algorithm.
The MNIST training set is composed of 55,000 samples. Once the algorithm has processed all 55,000 samples, one epoch has passed.
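For concreteness, a small sketch of that count; the batch size here is an assumption, not something from the question. (The mnist.py loader linked above keeps track of this itself with an epochs_completed counter, which appears to be what the referenced line increments once all samples have been served.)

```python
train_samples = 55000   # size of the MNIST training split mentioned above
batch_size = 100        # illustrative batch size, chosen for this sketch

batches_per_epoch = train_samples // batch_size
print(batches_per_epoch)  # 550 batches must be drawn to complete one epoch
```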
An epoch is a full iteration over the samples, and the number of epochs is how many times the algorithm runs over the full training set. The number of epochs directly affects the result of training: with too few epochs the optimizer may stop at a poor local minimum, while more epochs can reach a better local minimum, or in some cases the global minimum.