Gradient from Theano expression for filter visualization in Keras

For a ConvNet it can be interesting to find the norm-bounded input that maximizes the activation of a single convolutional filter, as a way to visualize the filters. I'd like to do this in the deep learning package Keras. This could be done using a black-box optimization algorithm with the code from the Keras FAQ:

# with a Sequential model (Keras 0.x API)
import theano

get_3rd_layer_output = theano.function([model.layers[0].input],
                                       model.layers[3].get_output(train=False))
layer_output = get_3rd_layer_output(X)

However, the optimization would be substantially easier if I had the gradient. How can I extract the gradient from the Theano expression and pass it to a Python optimization library such as SciPy?

asked Oct 30 '22 by pir
1 Answer

You can print out the gradient as described here and hand-code it into Scipy. You can also do the optimization in Theano - see this question.

However, probably the most straightforward approach is to create a function get_gradients() that uses theano.grad() to return the gradient of the filter activation with respect to the input, then call scipy.optimize.minimize with jac=get_gradients. According to the documentation:

jac : bool or callable, optional Jacobian (gradient) of objective function. [...] jac can also be a callable returning the gradient of the objective. In this case, it must accept the same arguments as fun.
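A minimal sketch of this pattern, using a simple quadratic objective with a hand-coded gradient standing in for the compiled Theano functions (the model, layer index, and variable names in the comments are assumptions based on the Keras 0.x API shown above, not tested code):

```python
import numpy as np
from scipy.optimize import minimize

# In the real setting, objective() and get_gradients() would wrap
# theano.function objects compiled from the Keras model, e.g. (assumed API):
#   activation = model.layers[3].get_output(train=False)[:, filter_idx].mean()
#   grad_expr  = theano.grad(activation, model.layers[0].input)
#   get_gradients = theano.function([model.layers[0].input], grad_expr)
# Here a quadratic stands in so the pattern is self-contained.
target = np.array([1.0, -2.0, 0.5])

def objective(x):
    # minimize the *negative* activation to maximize the activation;
    # here: squared distance to a fixed target
    return np.sum((x - target) ** 2)

def get_gradients(x):
    # analytic gradient of objective(); theano.grad() would supply this
    return 2.0 * (x - target)

# jac= accepts a callable taking the same arguments as the objective
result = minimize(objective, x0=np.zeros(3), jac=get_gradients,
                  method="L-BFGS-B")
```

For the filter-visualization use case you would additionally enforce the norm bound, e.g. via the `bounds=` argument or by projecting `x` after each step.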

answered Nov 09 '22 by 1''