What exactly is gradient checking?

I am a beginner in Deep Learning and I came across the concept of 'Gradient Checking'.

I just want to know: what is it, and how can it help to improve the training process?

jaykay asked Nov 27 '17


1 Answer

Why do we need Gradient Checking?

Backpropagation as an algorithm has a lot of details and can be a little tricky to implement. One unfortunate property is that there are many ways to introduce subtle bugs into it, such that if you run it with gradient descent or some other optimization algorithm, everything can still appear to be working: your cost function J(θ) may decrease on every iteration even though the implementation is buggy.

You could end up with a neural network that has a higher error than a bug-free implementation would give, and never know that a subtle bug was the cause of the worse performance. So what can we do about this? Gradient checking is a technique that eliminates almost all of these problems.

What is Gradient Checking?

Gradient checking is a method for numerically checking the derivatives computed by your code to make sure that your implementation is correct. Carrying out this derivative-checking procedure significantly increases your confidence in the correctness of your code.

In short, gradient checking is a way of debugging your backprop implementation: it carries out the derivative-checking procedure, comparing the gradients that backprop computes against numerical approximations of the same derivatives.
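
Concretely, the usual check uses the two-sided (centered) difference approximation. For each parameter θ_i and a small ε (e.g. 10^-4):

    ∂J/∂θ_i ≈ (J(θ_1, ..., θ_i + ε, ..., θ_n) − J(θ_1, ..., θ_i − ε, ..., θ_n)) / (2ε)

If this numerical estimate and the analytic gradient from backprop agree up to a small relative error (on the order of 10^-7), the backprop implementation is very likely correct.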

How to implement Gradient Checking?

A minimal sketch of the procedure is given below.
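
As a rough illustration, here is a small NumPy sketch (assuming a cost function J that takes a flat parameter vector; the names numerical_gradient and gradient_check are illustrative, not from any particular library):

    import numpy as np

    def numerical_gradient(J, theta, eps=1e-4):
        # Two-sided difference approximation of dJ/dtheta_i for each parameter i.
        grad = np.zeros_like(theta)
        for i in range(theta.size):
            e = np.zeros_like(theta)
            e.flat[i] = eps
            grad.flat[i] = (J(theta + e) - J(theta - e)) / (2 * eps)
        return grad

    def gradient_check(J, analytic_grad, theta, eps=1e-4, tol=1e-7):
        # Relative difference between the numerical and analytic gradients;
        # a value around 1e-7 or smaller usually indicates a correct backprop.
        num = numerical_gradient(J, theta, eps)
        ana = analytic_grad(theta)
        diff = np.linalg.norm(num - ana) / (np.linalg.norm(num) + np.linalg.norm(ana))
        return diff <= tol, diff

    # Toy example: J(theta) = sum(theta^2), whose true gradient is 2*theta.
    ok, diff = gradient_check(lambda t: np.sum(t ** 2),
                              lambda t: 2 * t,
                              np.random.randn(5))
    print(ok, diff)

Note that gradient checking is expensive (two extra cost evaluations per parameter), so it is used only to verify the implementation and is turned off during actual training.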

Tushar Gupta answered Nov 26 '22