
Step function versus Sigmoid function

I don't quite understand why a sigmoid function is seen as more useful (for neural networks) than a step function... hoping someone can explain this for me. Thanks in advance.

Jay Schauer asked Dec 26 '15 08:12


People also ask

Why sigmoid function is used instead of step function?

It depends on the problem you are dealing with. In case of simple binary classification, a step function is appropriate. Sigmoids can be useful when building more biologically realistic networks by introducing noise or uncertainty.

What is the difference between sigmoid and logistic function?

Sigmoid Function: A general mathematical function that has an S-shaped curve, or sigmoid curve, which is bounded, differentiable, and real. Logistic Function: A certain sigmoid function that is widely used in binary classification problems using logistic regression.

Why is sigmoid function better?

The main reason we use the sigmoid function is that its output lies in the range (0, 1). It is therefore especially useful for models that must predict a probability as their output: since probabilities also lie between 0 and 1, the sigmoid is a natural choice.
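As a minimal sketch of this bounded behaviour (the function name is illustrative, not from any particular library):

```python
import math

def sigmoid(x):
    """Logistic sigmoid: squashes any real input into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Outputs stay strictly between 0 and 1, so they can be read as probabilities.
print(sigmoid(-10))  # very close to 0
print(sigmoid(0))    # exactly 0.5
print(sigmoid(10))   # very close to 1
```

However large or small the input, the output never leaves (0, 1), which is what makes it usable as a probability estimate.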

What is the difference of step function with threshold function?

The step function is one of the simplest kinds of activation function. It compares the net input, say y, against a threshold value and activates the neuron only if y is greater than the threshold. In that sense a threshold function is just a step function whose jump sits at the chosen threshold rather than at zero.
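A minimal sketch of such a threshold-style step activation (the name `step` and the `threshold` parameter are illustrative):

```python
def step(y, threshold=0.0):
    """Step activation: the neuron fires (outputs 1) only if the net
    input y is strictly greater than the threshold; otherwise 0."""
    return 1 if y > threshold else 0

print(step(0.7))              # net input above the default threshold -> 1
print(step(-0.3))             # below the threshold -> 0
print(step(0.4, threshold=0.5))  # same input, higher threshold -> 0
```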

Why don't we use the step activation function?

It has a steep jump from 0 to 1, which may not fit the data well, and the function is not differentiable, so gradient-based training is impossible.

Why is the disadvantage of using the sigmoid function as an activation function?

Disadvantages of sigmoid functions: it is particularly prone to the vanishing gradient problem, and its output is not zero-centred.


2 Answers

The (Heaviside) step function is typically only useful within single-layer perceptrons, an early type of neural networks that can be used for classification in cases where the input data is linearly separable.

However, multi-layer neural networks or multi-layer perceptrons are of more interest because they are general function approximators and they are able to distinguish data that is not linearly separable.

Multi-layer perceptrons are trained using backpropagation. A requirement for backpropagation is a differentiable activation function, because backpropagation uses gradient descent on this function to update the network weights.

The Heaviside step function is non-differentiable at x = 0, and its derivative is 0 everywhere else. This means gradient descent cannot make progress in updating the weights, and backpropagation will fail.

The sigmoid or logistic function does not have this shortcoming and this explains its usefulness as an activation function within the field of neural networks.
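To make the gradient argument above concrete, here is a small sketch (helper names are illustrative) comparing the two derivatives. The sigmoid's derivative, s(x)(1 - s(x)), is nonzero for every input, so gradient descent always receives a signal; the step function's derivative is 0 everywhere it is defined, so no signal ever flows back:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative of the logistic sigmoid: s(x) * (1 - s(x)).
    # Strictly positive for every real x, with maximum 0.25 at x = 0.
    s = sigmoid(x)
    return s * (1.0 - s)

def step_grad(x):
    # Derivative of the Heaviside step: 0 everywhere except x = 0,
    # where it is undefined. Gradient descent gets no usable signal.
    return 0.0

for x in (-2.0, 0.5, 3.0):
    print(x, sigmoid_grad(x), step_grad(x))
```

Multiplying weight updates by `step_grad` always yields zero, which is exactly why backpropagation stalls with a step activation, while `sigmoid_grad` keeps the updates alive.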

Eric answered Sep 24 '22 11:09


It depends on the problem you are dealing with. In the case of simple binary classification, a step function is appropriate. Sigmoids can be useful when building more biologically realistic networks by introducing noise or uncertainty. Another, but completely different, use of sigmoids is for numerical continuation, i.e. when doing bifurcation analysis with respect to some parameter in the model. Numerical continuation is easier with smooth systems (and very tricky with non-smooth ones).

itsok-dontworry answered Sep 22 '22 11:09