 

What layers should I use for Keras?

I am building a sample project in Keras. The project is to identify the difference between cats and dogs. I found an example online with the model as such:

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Activation

model = Sequential()
# input_shape=(3, 150, 150) assumes the 'channels_first' image data format
model.add(Conv2D(32, (3, 3), input_shape=(3, 150, 150)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Conv2D(32, (3, 3)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Conv2D(64, (3, 3)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

My question is: how do people know which layers to use? Are there guidelines or rules of thumb for when to use Conv2D versus Conv1D versus another layer?

Asked Jan 04 '23 by Alexis

1 Answer

In short - they don't. Coming up with a good architecture is a major part of current deep learning research. There are some rules of thumb and intuitions, but mostly it comes down to experience, or to copying existing architectures that have been reported to work.

Very briefly (a small sketch follows the list):

  • convolutional layers are used when your data has spatial and/or temporal structure, e.g. images, videos, sound, etc.
  • pooling has similar use cases to convolutions: it still requires spatial and/or temporal structure (unless it is applied across the whole channel/dimension), and it provides a way of removing "details" (usually noise) and reducing the dimensionality of the signal
  • recurrent layers are used when your data has a sequential character
  • fully connected layers are needed to "force" a given output dimension (thus they are often used as the last layer), or when you do not know of any structure that can be exploited (since they are pretty much the most generic layers)
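For illustration only, here is a minimal sketch of one layer of each kind next to its typical use, assuming the standard keras package with a 'channels_last' image format; the layer sizes and input shapes are arbitrary placeholders, not recommendations:

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, LSTM, Dense, Flatten

# Spatial structure (e.g. 150x150 RGB images): convolution + pooling
vision = Sequential()
vision.add(Conv2D(32, (3, 3), activation='relu', input_shape=(150, 150, 3)))
vision.add(MaxPooling2D(pool_size=(2, 2)))  # removes "details", shrinks the signal
vision.add(Flatten())
vision.add(Dense(1, activation='sigmoid'))  # fully connected layer forces the output dimension

# Sequential structure (e.g. 100 timesteps of 8 features): recurrent layer
sequence = Sequential()
sequence.add(LSTM(16, input_shape=(100, 8)))
sequence.add(Dense(1, activation='sigmoid'))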

However, how to compose these layers, which hyperparameters to use, and how many layers to use are huge open research questions, and at the very beginning the best approach is to copy someone else's architecture and gain some experience/intuition about what does and does not work for the data you are working with.
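For instance, the following is a minimal sketch of reusing a published architecture instead of designing the convolutional layers by hand; it assumes keras.applications and the pretrained 'imagenet' weights are available, and a 150x150 RGB input in 'channels_last' format:

from keras.applications import VGG16
from keras.models import Sequential
from keras.layers import Flatten, Dense

# Published convolutional architecture reused as a feature extractor
base = VGG16(weights='imagenet', include_top=False, input_shape=(150, 150, 3))
base.trainable = False  # keep the pretrained filters fixed

model = Sequential()
model.add(base)  # a whole model can be added as a layer
model.add(Flatten())
model.add(Dense(1, activation='sigmoid'))  # cats vs. dogs: single probability
model.compile(optimizer='rmsprop', loss='binary_crossentropy', metrics=['accuracy'])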

Answered Jan 14 '23 by lejlot