How to have parallel convolutional layers in keras?


I am a little new to neural networks and Keras. I have some images of size 6*7, and the filter size is 15. I want to have several filters, train a convolutional layer separately on each, and then combine the results. I have looked at one example here:

model = Sequential()
model.add(Convolution2D(nb_filters, kernel_size[0], kernel_size[1],
                        border_mode='valid',
                        input_shape=input_shape))
model.add(Activation('relu'))
model.add(Convolution2D(nb_filters, kernel_size[0], kernel_size[1]))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=pool_size))
model.add(Dropout(0.25))
model.add(Flatten(input_shape=input_shape))
model.add(Dense(128))
model.add(Activation('relu'))
model.add(Dense(128))
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(nb_classes))
model.add(Activation('tanh'))

This model works with one filter. Can anybody give me some hints on how to modify the model to work with parallel convolutional layers?

Thanks

asked Apr 01 '17 by ida


People also ask

What is parallel convolution?

The training of convolutional neural networks with large inputs on GPUs is limited by the available GPU memory capacity. In this work, we describe spatially parallel convolutions, which sidestep the memory capacity limit of a single GPU by partitioning tensors along their spatial axes across multiple GPUs.
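A minimal NumPy sketch of that idea (my illustration, not from the quoted work): split the input along its height axis, give each partition a halo of kernel_height - 1 extra rows, convolve the partitions independently, and the stitched result matches a single 'valid' convolution of the whole input.

import numpy as np

def conv2d_valid(x, k):
    # naive 'valid' 2-D convolution (cross-correlation, as in CNNs)
    kh, kw = k.shape
    h, w = x.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

x = np.random.rand(8, 8)
k = np.random.rand(3, 3)

# output row i needs input rows i .. i+2, so each partition keeps a
# halo of kh - 1 = 2 rows beyond its share of the output
top    = conv2d_valid(x[0:5, :], k)   # produces output rows 0..2
bottom = conv2d_valid(x[3:8, :], k)   # produces output rows 3..5

assert np.allclose(np.vstack([top, bottom]), conv2d_valid(x, k))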

How do multiple convolutional layers work?

Convolutional layers are not only applied to input data, e.g. raw pixel values; they can also be applied to the output of other layers. The stacking of convolutional layers allows a hierarchical decomposition of the input.
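As a rough sketch (my addition; the 28x28x1 input shape and filter counts are arbitrary assumptions), each stacked layer sees a larger effective patch of the original image:

from keras.models import Sequential
from keras.layers import Conv2D

model = Sequential()
# first layer: each output unit sees a 3x3 patch of raw pixels
model.add(Conv2D(16, (3, 3), activation='relu', input_shape=(28, 28, 1)))
# second layer: operates on the first layer's feature maps, so each
# of its units effectively sees a 5x5 patch of the original input
model.add(Conv2D(32, (3, 3), activation='relu'))
model.summary()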

What is parallel convolutional neural network?

From the paper "Parallel Deep Convolutional Neural Network Training by Exploiting the Overlapping of Computation and Communication": training a Convolutional Neural Network (CNN) is a computationally intensive task whose parallelization has become critical in order to complete the training in an acceptable time.
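A toy sketch of that overlap (my illustration, not from the paper): as each layer's gradient becomes available during backprop, it is handed to a background thread that "all-reduces" it while earlier layers are still being computed, so communication hides behind computation.

from concurrent.futures import ThreadPoolExecutor
import time

def compute_layer_grad(layer):
    time.sleep(0.1)                # stand-in for backprop arithmetic
    return 'grad_%d' % layer

def all_reduce(grad):
    time.sleep(0.1)                # stand-in for inter-GPU communication
    return grad

with ThreadPoolExecutor(max_workers=1) as comm:
    pending = []
    for layer in reversed(range(4)):                    # backprop: last layer first
        grad = compute_layer_grad(layer)                # compute on the main thread
        pending.append(comm.submit(all_reduce, grad))   # communicate in background
    reduced = [f.result() for f in pending]

print(reduced)   # gradients arrive reduced without serializing compute and comm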


2 Answers

Here is an example of designing a network of parallel convolution and subsampling layers in Keras 2. I hope this resolves your problem.

import keras
from keras.layers import Input, Conv2D, MaxPooling2D, Flatten, Dense
from keras.models import Model
from keras.utils import plot_model   # requires pydot + graphviz

rows, cols = 100, 15
num_classes = 10   # assumption: set this to your number of output classes

def create_convnet(img_path='network_image.png'):
    input_shape = Input(shape=(rows, cols, 1))

    tower_1 = Conv2D(20, (100, 5), padding='same', activation='relu')(input_shape)
    tower_1 = MaxPooling2D((1, 11), strides=(1, 1), padding='same')(tower_1)

    tower_2 = Conv2D(20, (100, 7), padding='same', activation='relu')(input_shape)
    tower_2 = MaxPooling2D((1, 9), strides=(1, 1), padding='same')(tower_2)

    tower_3 = Conv2D(20, (100, 10), padding='same', activation='relu')(input_shape)
    tower_3 = MaxPooling2D((1, 6), strides=(1, 1), padding='same')(tower_3)

    merged = keras.layers.concatenate([tower_1, tower_2, tower_3], axis=1)
    merged = Flatten()(merged)

    out = Dense(200, activation='relu')(merged)
    out = Dense(num_classes, activation='softmax')(out)

    model = Model(input_shape, out)
    plot_model(model, to_file=img_path)
    return model

The plot_model image of this network shows the three convolution/pooling towers branching from the shared input and merging before the dense layers.
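A minimal usage sketch (my addition; the compile settings are assumptions, not part of the answer):

model = create_convnet()
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()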

answered Sep 20 '22 by durjoy


My approach is to create another model that defines all the parallel convolution and pooling operations and concatenates the parallel result tensors into a single output tensor. You can then add this parallel model graph to your sequential model just like a layer. Here is my solution; I hope it solves your problem.

# variable initialization
from keras.models import Model, Sequential
from keras.layers import (Input, Conv2D, MaxPooling2D, Concatenate,
                          Activation, Dropout, Flatten, Dense)

nb_filters = 100
kernel_size = {}
kernel_size[0] = [3, 3]
kernel_size[1] = [4, 4]
kernel_size[2] = [5, 5]
input_shape = (32, 32, 3)
pool_size = (2, 2)
nb_classes = 2
no_parallel_filters = 3

# create a separate model graph for parallel processing with different filter sizes
# apply 'same' padding so every branch produces an output tensor of the same size
# for concatenation, then concatenate all parallel outputs

inp = Input(shape=input_shape)
convs = []
for k_no in range(len(kernel_size)):
    conv = Conv2D(nb_filters, tuple(kernel_size[k_no]),
                  padding='same', activation='relu')(inp)
    pool = MaxPooling2D(pool_size=pool_size)(conv)
    convs.append(pool)

if len(kernel_size) > 1:
    out = Concatenate()(convs)
else:
    out = convs[0]

conv_model = Model(inputs=inp, outputs=out)

# add the created model graph to a sequential model
model = Sequential()
model.add(conv_model)        # add the model just like a layer
model.add(Conv2D(nb_filters, tuple(kernel_size[1])))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=pool_size))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(128))
model.add(Activation('relu'))
model.add(Dense(128))
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(nb_classes))
model.add(Activation('tanh'))

For more information, see this similar question: Combining the outputs of multiple models into one model

answered Sep 20 '22 by Nilesh Birari