 

Batch_size in tensorflow? Understanding the concept

My question is simple and straightforward: what does the batch size specify while training and predicting with a neural network, and how can I visualize it to get a clear picture of how data is fed to the network?

Suppose I have an autoencoder

encoder = tflearn.input_data(shape=[None, 41])
encoder = tflearn.fully_connected(encoder, 41, activation='relu')

and my input is a CSV file with 41 features. As far as I understand it, the network will take the 41 features of each row and feed them to the 41 neurons of the first layer when my batch size is 1.

But when I increase the batch size to 100, how are the 41 features of each of the 100 samples in a batch fed to this network?

model.fit(test_set, test_labels_set, n_epoch=1, validation_set=(valid_set, valid_labels_set),
          run_id="auto_encoder", batch_size=100, show_metric=True, snapshot_epoch=False)

Will there be any normalization on the batches or some operations on them?

The number of epochs is the same in both cases.

WiLL_K asked Mar 24 '17

People also ask

What is Batch_size in Tensorflow?

The batch size is the number of samples processed before the model is updated. The number of epochs is the number of complete passes through the training dataset. The size of a batch must be greater than or equal to one, and less than or equal to the number of samples in the training dataset.
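The relationship between batch size, dataset size, and the number of weight updates per epoch can be sketched with a small helper (the function name and the 1000-sample dataset are illustrative, not from the original):

```python
import math

def updates_per_epoch(n_samples, batch_size):
    """Number of weight updates in one complete pass over the data.
    A partial final batch still triggers one update, hence the ceil."""
    return math.ceil(n_samples / batch_size)

# a hypothetical dataset of 1000 samples
print(updates_per_epoch(1000, 1))     # 1000 updates per epoch (stochastic)
print(updates_per_epoch(1000, 100))   # 10 updates per epoch
print(updates_per_epoch(1000, 1000))  # 1 update per epoch (full batch)
```

So a larger batch size means fewer, but individually more informative, updates per epoch.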

What is Batch_size in LSTM?

Batch size is the number of samples we send to the model at a time. In this example the batch size is 2, but you could use 4, 8, 16, 32, 64, etc., depending on available memory (typically a power of 2).

Why do we need batch size?

Batch size controls the accuracy of the estimate of the error gradient when training neural networks. Batch, Stochastic, and Minibatch gradient descent are the three main flavors of the learning algorithm. There is a tension between batch size and the speed and stability of the learning process.
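The three flavors differ only in how many samples are grouped per gradient step. A minimal sketch (the `sgd_epoch` helper and its `step` callback are hypothetical names, not from any library):

```python
import random

def sgd_epoch(data, batch_size, step):
    """One epoch of gradient descent, sketched.

    batch_size = 1          -> stochastic gradient descent
    batch_size = len(data)  -> batch gradient descent
    anything in between     -> minibatch gradient descent
    """
    random.shuffle(data)                    # visit samples in a new order each epoch
    for i in range(0, len(data), batch_size):
        batch = data[i:i + batch_size]      # the next `batch_size` samples
        step(batch)  # compute the gradient averaged over `batch`, then update the weights
```

With batch size 1 the gradient estimate is noisy but updates are frequent; with the full dataset the estimate is exact but there is only one update per epoch — hence the tension mentioned above.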

What is meant by batch size?

Batch size is a term used in machine learning that refers to the number of training examples utilized in one iteration. The batch size can be one of three options: batch mode, where the batch size equals the total dataset, making the iteration and epoch values equivalent; mini-batch mode, where the batch size is greater than one but less than the total dataset size; and stochastic mode, where the batch size is one.


1 Answer

The batch size is the number of samples you feed into your network at once. For your input layer you specify that you enter an unspecified (None) number of samples, with 41 values per sample.

The advantage of using None is that you can now train with batches of 100 samples at once (which gives a better gradient estimate), and predict with a batch of only one value (one sample for which you want a prediction).
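Concretely, a fully connected layer applies the same weights to every row of the batch, so a batch of 100 samples is just a (100, 41) matrix going through one matrix multiply. A sketch with NumPy standing in for the tflearn layer (the weights here are random placeholders, not a trained model):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(41, 41))  # weights of a 41-unit fully connected layer
b = np.zeros(41)               # biases

batch = rng.normal(size=(100, 41))    # 100 samples, 41 features each
out = np.maximum(batch @ W + b, 0)    # ReLU(XW + b), all 100 rows at once
print(out.shape)                      # (100, 41)

single = rng.normal(size=(1, 41))     # one sample at prediction time
print(np.maximum(single @ W + b, 0).shape)  # (1, 41)
```

Each row of the output corresponds to one input sample; nothing about the weights changes with the batch size.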

If you don't specify normalization per batch there is no normalization per batch ;)

Hope I explained it well enough! If you have more questions feel free to ask them!

rmeertens answered Oct 30 '22