
Unit testing backpropagation neural network code

I am writing a backprop neural net mini-library from scratch and I need some help with writing meaningful automated tests. Up until now I have automated tests that verify that weight and bias gradients are calculated correctly by the backprop algorithm, but no test on whether the training itself actually works.
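For context, those gradient tests compare the analytic backprop gradients against central finite differences, roughly along these lines (the accessor names here are just placeholders for my own API, not anything standard):

```python
import numpy as np

def numerical_gradient(net, x, y, eps=1e-5):
    """Central finite-difference gradient of the loss w.r.t. every parameter."""
    params = net.get_flat_params()            # hypothetical accessor for all weights/biases
    grad = np.zeros_like(params)
    for i in range(len(params)):
        params[i] += eps
        net.set_flat_params(params)
        loss_plus = net.loss(x, y)
        params[i] -= 2 * eps
        net.set_flat_params(params)
        loss_minus = net.loss(x, y)
        params[i] += eps                       # restore the original value
        net.set_flat_params(params)
        grad[i] = (loss_plus - loss_minus) / (2 * eps)
    return grad

def test_backprop_matches_numerical_gradient(net, x, y):
    analytic = net.backprop_flat_gradient(x, y)   # hypothetical accessor
    numeric = numerical_gradient(net, x, y)
    assert np.allclose(analytic, numeric, atol=1e-6)
```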

The code I have up until now lets me do the following:

  • Define a neural net with any number of layers and any number of neurons per layer.
  • Use any activation function per layer.
  • Optionally use biases.
  • Layers can only be fully connected at the moment.
  • Training is backpropagation with plain gradient descent only.
  • Training requires train, validation and test sets (none of them can be empty at the moment).

Given all of this, what kind of automated test can I write to ensure that the training algorithm is implemented correctly? What function (sin, cos, exp, a quadratic, etc.) should I try to approximate? Over what range and how densely should I sample it? What architecture should the NN have?

Ideally, the function should be fairly simple to learn so the test wouldn't last very long (1-3 seconds), but also complicated enough to provide some degree of certainty that the algorithm is implemented correctly.

asked Feb 28 '15 by Paul Manta


1 Answer

I'm in the middle of doing something similar for my degree. What you are looking for is integration tests, not unit tests.

A unit test only tells you whether the code works the way you expect it to. To check whether the algorithm actually works, you should write integration tests in which you create your network with all of its real dependencies (no mocks).

Having created a network, you can simply try to train it. Simple mathematical functions are a good start. For higher-dimensional functions you can try e.g. the Rosenbrock function. It's very nice, as you can change its dimensionality with just one parameter. I've only used it for GA benchmarks, though.
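As a concrete illustration, an integration test that learns a 1-D function could look roughly like this. The `Network` class and its `train`/`predict` methods are stand-ins for whatever API your library exposes; adjust the names to your code:

```python
import numpy as np
from mynet import Network   # placeholder import for your own library

def test_learns_sine():
    rng = np.random.default_rng(0)                 # fixed seed keeps the test deterministic
    x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
    y = np.sin(x)
    idx = rng.permutation(len(x))
    train, val, test = idx[:140], idx[140:170], idx[170:]

    net = Network(layers=[1, 10, 1], activations=["tanh", "linear"])
    net.train(x[train], y[train],
              x[val], y[val],
              learning_rate=0.1, max_epochs=500)   # cap the epochs to bound the runtime

    mse = np.mean((net.predict(x[test]) - y[test]) ** 2)
    assert mse < 0.01                              # loose threshold so the test stays stable
```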

You can also test it on real data sets. I recommend the Iris dataset. It's quite small and easy to learn. If those tests pass, you can see that your network really works on real data, not just on a mathematical function. I personally find it comforting. :)

To make sure your tests do not run too long, set a reasonable maximum number of epochs. Also note that you want your tests to pass every time until you actually break something, so do not make them too hard to pass.

As far as I remember, I used 10 hidden neurons for the Iris dataset. Within about 5 epochs you should easily get at least 95% of the answers correct.
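Such a test could look roughly like the sketch below. I'm using scikit-learn only to fetch the data; the `Network` class and its `train`/`predict` methods are again placeholders for your own API, and the 4-10-3 architecture and sigmoid activations are just one reasonable choice:

```python
import numpy as np
from sklearn.datasets import load_iris         # used only to load the data
from mynet import Network                      # placeholder import for your own library

def test_learns_iris():
    data = load_iris()
    x, y = data.data, np.eye(3)[data.target]   # one-hot encode the 3 classes
    rng = np.random.default_rng(0)
    idx = rng.permutation(len(x))
    train, val, test = idx[:100], idx[100:125], idx[125:]

    net = Network(layers=[4, 10, 3], activations=["sigmoid", "sigmoid"])
    net.train(x[train], y[train],
              x[val], y[val],
              learning_rate=0.1, max_epochs=50)     # keep the runtime bounded

    preds = np.argmax(net.predict(x[test]), axis=1)
    accuracy = np.mean(preds == data.target[test])
    assert accuracy >= 0.95
```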

Such tests may take a few seconds, but it's good to have some. You do not have to run them every time. But if you do a massive refactoring and they still pass, you just do not have to worry. Trust me, I've been there.

answered Nov 15 '22 by Andrzej Gis