 

Pruning in Keras

I'm trying to design a neural network in Keras with a priority on fast prediction, but reducing the number of layers and nodes per layer any further costs too much accuracy. I have noticed that a very large portion of my weights (>95%) are effectively zero. Is there a way to prune the dense layers in the hope of reducing prediction time?

Mirac7 asked Jan 31 '17


People also ask

What is pruning Keras?

Weight pruning is a model optimization technique: model weights are gradually zeroed out during training to achieve sparsity. The technique brings improvements via model compression.
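Gradual pruning is usually driven by a sparsity schedule that ramps the fraction of zeroed weights from a starting value up to a final target over a range of training steps. A minimal sketch of such a polynomial-decay schedule (the function name and parameters are illustrative, not a real Keras API):

```python
def polynomial_sparsity(step, begin_step, end_step,
                        initial_sparsity=0.0, final_sparsity=0.9, power=3):
    """Target sparsity at a given training step.

    Illustrative helper: ramps sparsity from initial_sparsity to
    final_sparsity between begin_step and end_step, so pruning is
    gentle early in training and reaches the final target at the end.
    """
    if step <= begin_step:
        return initial_sparsity
    if step >= end_step:
        return final_sparsity
    progress = (step - begin_step) / (end_step - begin_step)
    return final_sparsity + (initial_sparsity - final_sparsity) * (1 - progress) ** power

print(polynomial_sparsity(0, 0, 1000))     # 0.0
print(polynomial_sparsity(1000, 0, 1000))  # 0.9
```

During training, the current target sparsity would be recomputed each step and the smallest-magnitude weights masked out to match it.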

What is pruning a model?

Pruning is a model compression technique that lets a model be optimized for real-time inference on resource-constrained devices. Large-sparse models have been shown to outperform small-dense models across a variety of architectures.

What is the difference between pruning and dropout?

Dropout drops certain activations stochastically (i.e. a new random subset of them for any data passing through the model). Typically this is undone after training (although there is a whole theory about test-time-dropout). Pruning drops certain weights, i.e. permanently drops some parts deemed “uninteresting”.
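The distinction can be made concrete with a small NumPy sketch (variable names are illustrative): dropout applies a fresh random mask to activations on every pass, while pruning applies one fixed mask that permanently zeroes the smallest-magnitude weights.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))

# Dropout: a new random mask for each pass through the model.
keep = rng.random(w.shape) > 0.5
dropped = w * keep  # a different mask every call

# Pruning: a fixed mask that permanently removes the smallest weights.
threshold = np.quantile(np.abs(w), 0.75)  # prune the smallest 75%
mask = np.abs(w) >= threshold             # the same mask from now on
pruned = w * mask

print(np.mean(pruned == 0))  # 0.75
```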


1 Answer

Not a dedicated way :(

There's currently no easy (dedicated) way of doing this with Keras.

A discussion is ongoing at https://groups.google.com/forum/#!topic/keras-users/oEecCWayJrM.

You may also be interested in this paper: https://arxiv.org/pdf/1608.04493v1.pdf.
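Until a dedicated API exists, one manual workaround is to threshold a trained layer's weights yourself: pull the kernel with `layer.get_weights()`, zero the smallest entries, and write it back with `layer.set_weights()`. A NumPy sketch of the thresholding step, using the ~95% figure from the question as the target sparsity (the helper name is illustrative):

```python
import numpy as np

def prune_kernel(kernel, sparsity=0.95):
    """Zero out the smallest-magnitude entries of a weight matrix.

    Illustrative helper: `sparsity` is the fraction of weights to remove.
    In Keras this would be applied to layer.get_weights()[0] and written
    back with layer.set_weights().
    """
    flat = np.abs(kernel).ravel()
    k = int(np.ceil(sparsity * flat.size))
    if k == 0:
        return kernel.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    return np.where(np.abs(kernel) <= threshold, 0.0, kernel)

rng = np.random.default_rng(1)
kernel = rng.normal(size=(64, 32))
sparse = prune_kernel(kernel, sparsity=0.95)
print(np.mean(sparse == 0))  # fraction of zeros, at least 0.95
```

Note that zeroed weights only reduce prediction time if the runtime actually exploits sparsity (or you restructure the layer to be smaller); a dense matmul runs at the same speed regardless of how many entries are zero.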

grovina answered Sep 20 '22