tensorflow: what's the difference between tf.nn.dropout and tf.layers.dropout

Tags:

tensorflow

I'm quite confused about whether to use tf.nn.dropout or tf.layers.dropout.

Many MNIST CNN examples seem to use tf.nn.dropout, with keep_prob as one of the parameters.

But how does it differ from tf.layers.dropout? Is the rate parameter in tf.layers.dropout similar to keep_prob in tf.nn.dropout?

And generally speaking, does the difference between tf.nn.dropout and tf.layers.dropout apply to all other similar pairs of functions in tf.nn and tf.layers?

asked Jun 06 '17 by ntuty



2 Answers

A quick glance through tensorflow/python/layers/core.py and tensorflow/python/ops/nn_ops.py reveals that tf.layers.dropout is a wrapper for tf.nn.dropout.

The only differences between the two functions are:

  1. tf.nn.dropout has the parameter keep_prob: "Probability that each element is kept",
    while tf.layers.dropout has the parameter rate: "The dropout rate".
    Thus, keep_prob = 1 - rate.
  2. tf.layers.dropout has a training parameter: "Whether to return the output in training mode (apply dropout) or in inference mode (return the input untouched)." (See the sketch after this list.)
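
A minimal sketch of both differences, assuming the TF 1.x API (where both functions live); the tensor shape is made up for illustration:

    import tensorflow as tf

    x = tf.placeholder(tf.float32, shape=[None, 128])

    # tf.nn.dropout: keep_prob is the probability an element is KEPT,
    # so keep_prob=0.9 keeps 90% of units and drops 10%.
    nn_out = tf.nn.dropout(x, keep_prob=0.9)

    # tf.layers.dropout: rate is the probability an element is DROPPED,
    # so rate=0.1 is equivalent to keep_prob=0.9 above. The training
    # flag switches between applying dropout and passing x through.
    is_training = tf.placeholder(tf.bool)
    layers_out = tf.layers.dropout(x, rate=0.1, training=is_training)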
answered Sep 21 '22 by nikpod


The idea is the same; the parameters are slightly different. In tf.nn.dropout, keep_prob is the probability that each element is kept. In tf.layers.dropout, rate is the probability that each element is dropped, so rate=0.1 would drop out 10% of the input units.

So keep_prob = 1 - rate. tf.layers.dropout also takes a training parameter; a short sketch of its effect follows.
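
A hedged sketch (TF 1.x, illustrative values) of what the training flag does: with training=False the input is returned untouched, and in training mode surviving units are scaled by 1/(1 - rate):

    import tensorflow as tf

    x = tf.ones([4, 4])
    train_out = tf.layers.dropout(x, rate=0.5, training=True)   # dropout applied
    infer_out = tf.layers.dropout(x, rate=0.5, training=False)  # identity

    with tf.Session() as sess:
        print(sess.run(train_out))  # mix of 0.0 and 2.0 (survivors scaled by 1/(1-0.5))
        print(sess.run(infer_out))  # all ones: input returned untouched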

In general, just read the documentation for the functions you care about carefully and you will see the differences.

answered Sep 22 '22 by Salvador Dali