 

Derivative of sigmoid


I'm creating a neural network using the backpropagation technique for learning.

I understand we need to find the derivative of the activation function used. I'm using the standard sigmoid function

f(x) = 1 / (1 + e^(-x)) 

and I've seen that its derivative is

dy/dx = f'(x) = f(x) * (1 - f(x)) 

This may be a daft question, but does this mean we have to pass x through the sigmoid function twice when evaluating the derivative, so that it expands to

dy/dx = f'(x) = (1 / (1 + e^(-x))) * (1 - (1 / (1 + e^(-x)))) 

or is it simply a matter of taking the already-calculated output of f(x), which is the output of the neuron, and substituting that value for f(x)?
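To make the two readings concrete, here is roughly what I mean (Python-style, just to illustrate the two options):

import math

def f(x):
    return 1.0 / (1.0 + math.exp(-x))

x = 0.5

# Option 1: evaluate the sigmoid again inside the derivative
df_expanded = f(x) * (1 - f(x))

# Option 2: reuse the output already computed during the forward pass
output = f(x)                      # this is the neuron's output
df_reused = output * (1 - output)  # same value as option 1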

asked May 16 '12 by rflood89

People also ask

Is sigmoid function differentiable?

Yes. The sigmoid function is strictly increasing and continuously differentiable on all of the real line.

What is the derivative of a sigmoid activation function with respect to the net input I?

The derivative of the sigmoid function is the sigmoid function times one minus itself.

What is the odds derivation of the sigmoid function?

On the far left of the sigmoid's plot, say at x = -10, the output barely changes as x changes, so the slope (derivative) there is nearly 0. Near the centre of the plot, by contrast, a small change in x produces a comparatively large change in sigmoid(x), so the derivative is at its largest there.

What will be the range of derivative of sigmoid activation function?

The sigmoid function is also called a squashing function as its domain is the set of all real numbers, and its range is (0, 1). Hence, if the input to the function is either a very large negative number or a very large positive number, the output is always between 0 and 1.
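As a rough numerical sketch of the derivative itself (not from the quoted answers): f(x) * (1 - f(x)) reaches its maximum of 0.25 at x = 0 and tends to 0 in both tails, so its range is (0, 0.25]. A quick check in Python:

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def dsigmoid(x):
    s = sigmoid(x)
    return s * (1 - s)

print(dsigmoid(-10))  # ~4.5e-05, nearly flat in the left tail
print(dsigmoid(0))    # 0.25, the steepest point
print(dsigmoid(10))   # ~4.5e-05, nearly flat in the right tail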


1 Answer

Dougal is correct. Just do

f = 1 / (1 + exp(-x))
df = f * (1 - f)
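A runnable sketch of the same idea in Python/NumPy (the example inputs are made up):

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

net_input = np.array([-2.0, 0.0, 3.0])  # example net inputs to a layer

# Forward pass: compute and store the neuron outputs.
out = sigmoid(net_input)

# Backward pass: reuse the stored outputs; no second call to sigmoid is needed.
dout = out * (1.0 - out)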
answered Nov 12 '22 by Bruno Kim