
Keras : How to merge a dense layer and an embedding layer

I am using Keras and I am trying to concatenate two different layers into a single vector: the first part of the vector would hold the values of the first layer, and the rest would hold the values of the second layer. One of these layers is a Dense layer and the other is an Embedding layer.

I know how to merge two Embedding layers or two Dense layers, but I don't know how to merge an Embedding layer and a Dense layer (there is a dimensionality problem).

A simple example would be like this:

L_branch = Sequential()
L_branch.add(Dense(10, input_shape=(4,), activation='relu'))
L_branch.add(BatchNormalization())

R_branch = Sequential()
R_branch.add(Embedding(1000, 64, input_length=5))

final_branch.add(Merge([L_branch, R_branch], mode='concat'))

But this does not work, because layers with different output dimensionalities cannot be merged directly.
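To make the mismatch concrete, here is a minimal sketch (assuming the tf.keras functional API rather than the original Sequential code) of the two output shapes involved:

from tensorflow.keras.layers import Input, Dense, Embedding

dense_out = Dense(10, activation='relu')(Input(shape=(4,)))          # 2D output: (None, 10)
embed_out = Embedding(1000, 64)(Input(shape=(5,), dtype='int32'))    # 3D output: (None, 5, 64)

print(dense_out.shape)   # (None, 10)
print(embed_out.shape)   # (None, 5, 64)
# The two tensors have different ranks, so they cannot be concatenated as-is;
# the Embedding output must be flattened (or pooled) down to 2D first.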

PS: Sorry, English is not my native language; I hope you will understand my problem.

Best regards.

asked Jan 30 '17 by Ed Nio

People also ask

Is an embedding layer a dense layer?

An embedding layer is faster because it is essentially equivalent to a dense layer that makes simplifying assumptions: instead of multiplying a one-hot input by a weight matrix, it simply looks up the corresponding row of that matrix. A Dense layer would treat the one-hot input as ordinary values and perform the full matrix multiplication.
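A rough sketch of that equivalence (assuming tf.keras and a bias-free Dense layer sharing the same weight matrix; the names and sizes here are purely illustrative):

import numpy as np
import tensorflow as tf

vocab, dim = 5, 3
W = np.random.rand(vocab, dim).astype('float32')   # one shared weight matrix

emb = tf.keras.layers.Embedding(vocab, dim)
emb(tf.constant([0]))                              # build the layer once
emb.set_weights([W])

dense = tf.keras.layers.Dense(dim, use_bias=False)
dense(tf.zeros((1, vocab)))                        # build the layer once
dense.set_weights([W])

idx = tf.constant([2])
print(emb(idx).numpy())                            # row 2 of W, fetched by table lookup
print(dense(tf.one_hot(idx, depth=vocab)).numpy()) # same row, via full matrix multiply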

What is the purpose of the embedding layer, and what does its size parameter do in the model?

The Embedding layer converts each word into a fixed-length vector of a defined size. The resulting vector is dense, with real values instead of just 0s and 1s. Fixed-length word vectors let us represent words more effectively, with reduced dimensionality.
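For instance, a short sketch (assuming tf.keras; the vocabulary size and sentence are made up for illustration):

import tensorflow as tf

# Each word id in [0, 1000) is mapped to a dense 64-dimensional vector.
emb = tf.keras.layers.Embedding(input_dim=1000, output_dim=64)
word_ids = tf.constant([[4, 17, 250, 3, 999]])   # one sentence of 5 word indices
vectors = emb(word_ids)
print(vectors.shape)                             # (1, 5, 64): one 64-d vector per word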

What does the Concatenate layer do in Keras?

The Concatenate layer concatenates a list of inputs. It takes as input a list of tensors, all of the same shape except for the concatenation axis, and returns a single tensor that is the concatenation of all inputs.
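A brief sketch (assuming the tf.keras functional API) of Concatenate joining two tensors along the last axis:

import tensorflow as tf

a = tf.keras.Input(shape=(10,))
b = tf.keras.Input(shape=(20,))
merged = tf.keras.layers.Concatenate()([a, b])   # shapes match except on the concat axis
print(merged.shape)                              # (None, 30)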


1 Answer

Use a Flatten layer.

L_branch = Sequential()
L_branch.add(Dense(10, input_shape=(4,), activation='relu'))
L_branch.add(BatchNormalization())

R_branch = Sequential()
R_branch.add(Embedding(1000, 64, input_length=5))
R_branch.add(Flatten())  # <-- flattens (None, 5, 64) to (None, 320) so it matches the Dense branch's rank

final_branch = Sequential()  # <-- this container model was missing in the question
final_branch.add(Merge([L_branch, R_branch], mode='concat'))
answered Sep 21 '22 by Alexey Golyshev
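Note that Merge with mode='concat' belongs to the old Keras 1 Sequential API and is no longer available in recent Keras releases. A rough modern equivalent of the same Flatten-then-concatenate idea, written with the functional API (a sketch under that assumption, not the answerer's original code):

from tensorflow.keras.layers import Input, Dense, BatchNormalization, Embedding, Flatten, Concatenate
from tensorflow.keras.models import Model

left_in = Input(shape=(4,))
left = Dense(10, activation='relu')(left_in)
left = BatchNormalization()(left)                      # (None, 10)

right_in = Input(shape=(5,), dtype='int32')
right = Embedding(1000, 64)(right_in)                  # (None, 5, 64)
right = Flatten()(right)                               # (None, 320)

merged = Concatenate()([left, right])                  # (None, 330)
model = Model(inputs=[left_in, right_in], outputs=merged)
model.summary()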