 

DropConnect in TensorFlow

I'm looking for so-called "DropConnect" in TensorFlow. I know how to use "dropout" in a TensorFlow neural network, but I couldn't figure out which method implements "DropConnect"; if it isn't available, can anyone suggest how to implement it?
I've already tried dropout, weight decay, and early stopping, but I'm still suffering from overfitting. Is there a better solution for overfitting in TensorFlow?

asked May 10 '16 by lenhhoxung

1 Answer

DropConnect sets a randomly selected subset of the network's weights to zero on each training step (dropout, by contrast, zeroes activations). There is no ready-made op for it, but you can implement it yourself: zero out a random fraction of each layer's weights during training while keeping all of the weights as trainable variables, so every weight is still updated over the course of training, and use the full weight matrix at inference time.
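Below is a minimal sketch of that idea using the TF 1.x graph API (current when this question was asked). The layer name dropconnect_dense, the keep_prob argument, and the is_training placeholder are just names chosen for this example, not an official TensorFlow API. It reuses tf.nn.dropout on the weight matrix, which masks a random fraction of the weights and rescales the survivors by 1/keep_prob so the expected pre-activation matches the full weights used at test time.

    import tensorflow as tf

    def dropconnect_dense(x, n_in, n_out, keep_prob, is_training):
        """Fully connected layer with DropConnect: a random subset of the
        weights is zeroed on every training step; the full weight matrix
        is used at inference time."""
        W = tf.get_variable("W", [n_in, n_out],
                            initializer=tf.truncated_normal_initializer(stddev=0.1))
        b = tf.get_variable("b", [n_out], initializer=tf.zeros_initializer())

        # tf.nn.dropout applied to the *weights* (not the activations)
        # zeroes each weight with probability 1 - keep_prob and rescales
        # the surviving weights by 1/keep_prob.
        W_train = tf.nn.dropout(W, keep_prob)

        # Masked weights during training, full weights at test time.
        W_used = tf.cond(is_training, lambda: W_train, lambda: W)
        return tf.nn.relu(tf.matmul(x, W_used) + b)

    # Example wiring (shapes assumed for an MNIST-sized input):
    x = tf.placeholder(tf.float32, [None, 784])
    is_training = tf.placeholder(tf.bool)
    hidden = dropconnect_dense(x, 784, 256, keep_prob=0.5, is_training=is_training)

Gradients only flow through the weights that were kept on a given step, but since the mask changes every step, all weights get trained.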

Beyond that, you can also use classic techniques to reduce overfitting:

  1. Collect more data, or generate it with data augmentation (see the sketch after this list).
  2. Reduce the number of features.
  3. Reduce the complexity of the model.
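
For point 1, here is a tiny sketch of image augmentation using tf.image; the function name augment and the specific flip/brightness/contrast parameters are just example choices, so adjust them for your data.

    import tensorflow as tf

    def augment(image):
        """Apply random transformations to a single image tensor so the
        network sees a slightly different example every epoch."""
        image = tf.image.random_flip_left_right(image)
        image = tf.image.random_brightness(image, max_delta=0.1)
        image = tf.image.random_contrast(image, lower=0.9, upper=1.1)
        return image

If you feed data with tf.data, you can apply this per example, e.g. with dataset.map(augment), and skip it for the validation/test pipeline.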
answered Nov 04 '22 by Reet Awwsum