I have seen the weights of neural networks initialized to random numbers, so I am curious: why can the weights of logistic regression be initialized to zeros?
In the case of neural networks, there are n neurons in each hidden layer. If you initialize the weights of every neuron to 0, then each neuron computes the same function, receives the same gradient during backpropagation, and so gets the same update:
Neurons a1 and a2 in the first layer will have identical weights no matter how long you iterate, because nothing ever breaks the symmetry between them. Random initialization is what makes the neurons start out different.
This is not a problem for logistic regression, which is simply y = sigmoid(Wx + b): there is only a single weight vector, so there is no symmetry between units to break, and gradient descent moves the weights away from zero as soon as the gradient is nonzero.
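A minimal NumPy sketch of both cases (the toy data, network size, and learning rate are my own choices for illustration): a one-hidden-layer network initialized to all zeros keeps its two hidden neurons identical forever, while logistic regression initialized to zeros trains without any problem.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 4 samples, 2 features (hypothetical example)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])

# --- Case 1: tiny network (2 inputs -> 2 hidden -> 1 output), all-zero init ---
y = np.array([0., 1., 1., 0.])
W1 = np.zeros((2, 2))   # input -> hidden weights
W2 = np.zeros(2)        # hidden -> output weights

for _ in range(100):
    h = sigmoid(X @ W1)               # hidden activations (identical columns)
    p = sigmoid(h @ W2)               # output probability
    dp = p - y                        # gradient of cross-entropy wrt logits
    dW2 = h.T @ dp
    dh = np.outer(dp, W2) * h * (1 - h)
    dW1 = X.T @ dh
    W1 -= 0.1 * dW1
    W2 -= 0.1 * dW2

# Both hidden neurons received identical gradients at every step,
# so their weight vectors never diverge: the symmetry is never broken.
print(np.allclose(W1[:, 0], W1[:, 1]))  # True

# --- Case 2: logistic regression, all-zero init ---
y2 = np.array([0., 0., 1., 1.])  # linearly separable labels
w = np.zeros(2)
b = 0.0
for _ in range(100):
    p = sigmoid(X @ w + b)
    grad = p - y2
    w -= 0.1 * (X.T @ grad)
    b -= 0.1 * grad.sum()

# The single weight vector moves away from zero and learns the labels.
print(w[0] > 0)  # True
```

The key point is visible in the gradients: with zero weights, the two columns of `W1` see the same inputs and the same backpropagated error, so `dW1` has identical columns at every iteration. Logistic regression has only one weight vector, so there is nothing to be symmetric with.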