I'm looking for so-called "DropConnect" in TensorFlow. I know how to use "dropout" in a TensorFlow neural network, but I couldn't figure out which method implements "DropConnect". If it's not available, can anyone suggest how to implement it?
Anyway, I've tried "dropout", "weight decay" and "early stopping", but I'm still suffering from overfitting. Is there any better solution for overfitting in TensorFlow?
DropConnect sets a randomly selected subset of the network's weights to zero, rather than zeroing activations as dropout does. You can implement it by masking a random fraction of the weights in each layer on every training forward pass; on a given step, gradients flow only through the surviving weights, but since a fresh mask is drawn each step, all weights get updated over the course of training. At inference time the full, unmasked weight matrix is used.
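Here is a minimal NumPy sketch of the masking step described above (illustrative only; the function name `drop_connect` and the inverted scaling by `1/(1-drop_prob)` are my choices, not part of any TensorFlow API — in TensorFlow you would apply the same mask to a layer's weight tensor before the matmul):

```python
import numpy as np

def drop_connect(weights, drop_prob, rng):
    """Zero a random subset of weights (DropConnect).

    Survivors are scaled by 1/(1 - drop_prob) so the expected
    pre-activation is unchanged (inverted-dropout convention).
    Apply this before each training forward pass; use the full,
    unmasked weight matrix at inference time.
    """
    mask = rng.random(weights.shape) >= drop_prob  # True = keep weight
    return weights * mask / (1.0 - drop_prob)

rng = np.random.default_rng(0)
W = np.ones((4, 3))          # toy weight matrix
x = np.ones(4)               # toy input vector

W_dropped = drop_connect(W, drop_prob=0.5, rng=rng)
y = x @ W_dropped            # forward pass with the masked weights
```

With `drop_prob=0.5`, each surviving weight is scaled to 2.0 and the rest are exactly zero, so the expected value of each entry of `W_dropped` still equals the original weight.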
Moreover, beyond the dropout, weight decay, and early stopping you've already tried, classic remedies for overfitting include data augmentation, reducing model capacity (fewer layers or units), batch normalization, and simply gathering more training data.
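As one concrete example of data augmentation, here is a toy NumPy sketch of a random horizontal flip (the function name `augment` is mine; real pipelines would also use crops, rotations, color jitter, etc., e.g. via TensorFlow's image ops):

```python
import numpy as np

def augment(image, rng):
    """Randomly flip an image horizontally with probability 0.5.

    A simple data-augmentation step: it effectively enlarges the
    training set with label-preserving variants, which reduces
    overfitting. Illustrative sketch only.
    """
    if rng.random() < 0.5:
        return image[:, ::-1]  # reverse the column (width) axis
    return image

rng = np.random.default_rng(0)
img = np.arange(6).reshape(2, 3)  # toy 2x3 "image"
out = augment(img, rng)
```

Whether or not the flip fires, the output has the same shape and the same pixel values per row, just possibly in reversed order.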