How to turn off dropout for testing in Tensorflow?

I am fairly new to Tensorflow and ML in general, so I apologize in advance for a (likely) trivial question.

I use the dropout technique to regularize my network, and it seems to work just fine. I would then like to run the trained network on some test data, like this:

    def Ask(self, image):
        return self.session.run(self.model, feed_dict={self.inputPh: image})

Obviously, it yields different results each time because dropout is still active. One solution I can think of is to create two separate models, one for training and one for actual later use of the network; however, building the model twice seems impractical to me.

What's the common approach to solving this problem?

asked Jul 07 '17 by G. Mesch


People also ask

How do I turn off dropout in Keras?

In Keras, dropout is disabled in test mode: Dropout layers are only active when the training flag is set, and model.predict() always runs without dropout. The predict functions do not accept a training flag, so to run inference with dropout enabled you have to call the model (or build your own function from the layers) with the training flag set yourself.
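A minimal sketch of this, assuming tf.keras (the layer sizes here are arbitrary): calling the model directly lets you choose the training flag per call.

    import numpy as np
    import tensorflow as tf

    # Minimal sketch (tf.keras): Dropout only fires when training=True.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation='relu', input_shape=(10,)),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(1),
    ])

    x = np.random.rand(4, 10).astype('float32')

    deterministic = model(x, training=False)  # dropout off, same as predict()
    stochastic = model(x, training=True)      # dropout on, e.g. for MC dropout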

What is Tensorflow dropout?

Dropout consists of randomly setting a fraction rate of input units to 0 at each update during training, which helps prevent overfitting. The units that are kept are scaled up by 1 / (1 - rate), so that their expected sum is unchanged between training time and inference time.
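The scaling is easy to verify in a few lines of NumPy. This is an illustrative sketch of inverted dropout, not the library's actual implementation:

    import numpy as np

    # Illustrative inverted dropout: kept units are scaled by 1 / (1 - rate)
    # so the expected sum matches the no-dropout case.
    def inverted_dropout(x, rate, rng=np.random.default_rng(0)):
        keep_mask = rng.random(x.shape) >= rate
        return x * keep_mask / (1.0 - rate)

    x = np.ones(100000)
    print(x.sum())                          # 100000.0
    print(inverted_dropout(x, 0.25).sum())  # ~100000 in expectation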

Is dropout used in training or testing?

Dropout is only used during training to make the network more robust to fluctuations in the training data. At test time, however, you want to use the full network in all its glory. In other words, you do not apply dropout to the test data or during inference in production.

How do I get rid of dropout Pytorch?

In general, if you want to deactivate your dropout layers, you should define them in the __init__ method using the nn.Dropout module; calling model.eval() then disables them automatically.
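A minimal PyTorch sketch of that advice (the layer sizes are arbitrary):

    import torch
    import torch.nn as nn

    # Registering dropout as an nn.Dropout module in __init__ lets
    # train()/eval() toggle it automatically.
    class Net(nn.Module):
        def __init__(self):
            super().__init__()
            self.fc = nn.Linear(10, 10)
            self.dropout = nn.Dropout(p=0.5)

        def forward(self, x):
            return self.dropout(torch.relu(self.fc(x)))

    net = Net()
    net.train()  # dropout active during training
    net.eval()   # dropout disabled for inference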


2 Answers

The easiest way is to feed the keep_prob parameter through a tf.placeholder_with_default:

    prob = tf.placeholder_with_default(1.0, shape=())
    layer = tf.nn.dropout(layer, prob)

This way, when you train you can set the parameter like this:

sess.run(train_step, feed_dict={prob: 0.5}) 

and when you evaluate, the default value of 1.0 is used, which disables dropout.
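Putting that together, a self-contained sketch, assuming TF 1.x-style graph mode via tf.compat.v1 (the layer sizes are arbitrary):

    import numpy as np
    import tensorflow.compat.v1 as tf

    tf.disable_v2_behavior()

    x = tf.placeholder(tf.float32, shape=(None, 4))
    keep_prob = tf.placeholder_with_default(1.0, shape=())

    hidden = tf.layers.dense(x, 8, activation=tf.nn.relu)
    hidden = tf.nn.dropout(hidden, keep_prob)  # keep_prob semantics (TF1)
    output = tf.layers.dense(hidden, 1)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        batch = np.random.rand(2, 4).astype('float32')
        # Training-style run: dropout active.
        sess.run(output, feed_dict={x: batch, keep_prob: 0.5})
        # Evaluation run: keep_prob falls back to its default of 1.0.
        sess.run(output, feed_dict={x: batch})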

answered by nessuno


With the new tf.estimator API, you specify a model function that builds the model differently depending on whether you are training or predicting, while still letting you reuse your model code. In your model function you would do something like:

    def model_fn(features, labels, mode):
        training = (mode == tf.estimator.ModeKeys.TRAIN)
        ...
        t = tf.layers.dropout(t, rate=0.25, training=training, name='dropout_1')
        ...

The mode argument is automatically passed depending on whether you call estimator.train(...) or estimator.predict(...).
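For completeness, a hedged sketch of the calling side; the input functions here are hypothetical placeholders, not part of the original answer:

    # Hypothetical wiring; train_input_fn / predict_input_fn are assumed to be
    # defined elsewhere and to return tf.data.Dataset objects.
    estimator = tf.estimator.Estimator(model_fn=model_fn)

    estimator.train(input_fn=train_input_fn)              # mode == TRAIN, dropout on
    preds = estimator.predict(input_fn=predict_input_fn)  # mode == PREDICT, dropout off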

answered by Jarno