With pretrained embeddings, we can pass them as the weights of Keras' Embedding layer. To use multiple embeddings, would specifying multiple Embedding layers be suitable? i.e.
from keras.layers import Embedding
from keras.models import Sequential

# two frozen embedding layers, each initialized from its own pretrained matrix
embedding_layer1 = Embedding(len(word_index) + 1,
                             EMBEDDING_DIM,
                             weights=[embedding_matrix_1],
                             input_length=MAX_SEQUENCE_LENGTH,
                             trainable=False)
embedding_layer2 = Embedding(len(word_index) + 1,
                             EMBEDDING_DIM,
                             weights=[embedding_matrix_2],
                             input_length=MAX_SEQUENCE_LENGTH,
                             trainable=False)

model = Sequential()
model.add(embedding_layer1)
model.add(embedding_layer2)
This, however, seems to sum them up and represent them as a single layer, which is not what I am after.
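For reference, a minimal sketch of what keeping the embeddings separate could look like with the Keras functional API, assuming the same variables as above (word_index, EMBEDDING_DIM, MAX_SEQUENCE_LENGTH, embedding_matrix_1, embedding_matrix_2): both embeddings look up the same integer sequence in parallel and are concatenated along the feature axis rather than stacked one after the other.

from keras.layers import Input, Embedding, concatenate
from keras.models import Model

sequence_input = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype='int32')

# each Embedding layer maps the same token indices through its own frozen matrix
embedded_1 = Embedding(len(word_index) + 1,
                       EMBEDDING_DIM,
                       weights=[embedding_matrix_1],
                       input_length=MAX_SEQUENCE_LENGTH,
                       trainable=False)(sequence_input)
embedded_2 = Embedding(len(word_index) + 1,
                       EMBEDDING_DIM,
                       weights=[embedding_matrix_2],
                       input_length=MAX_SEQUENCE_LENGTH,
                       trainable=False)(sequence_input)

# concatenate along the last axis: one 2*EMBEDDING_DIM vector per token;
# downstream layers (LSTM, Conv1D, ...) can then be stacked on top of "merged"
merged = concatenate([embedded_1, embedded_2])

model = Model(inputs=sequence_input, outputs=merged)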
I have come across the same issue. Is it because the Keras Embedding layer internally uses some kind of object (let's call it x_object) that gets initialized in the keras.backend global session K? Hence the second embedding layer throws an exception saying that the x_object name already exists in the graph and cannot be added again.
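One quick way to test that naming hypothesis is to give each Embedding layer an explicit, unique name via the standard name argument, so no auto-generated name can clash in the graph. The sketch below assumes the same variables as the question; the layer names are made up for illustration, and whether this changes the behaviour depends on the Keras version in use.

embedding_layer1 = Embedding(len(word_index) + 1,
                             EMBEDDING_DIM,
                             weights=[embedding_matrix_1],
                             input_length=MAX_SEQUENCE_LENGTH,
                             trainable=False,
                             name='pretrained_embedding_1')  # explicit name, no auto-naming
embedding_layer2 = Embedding(len(word_index) + 1,
                             EMBEDDING_DIM,
                             weights=[embedding_matrix_2],
                             input_length=MAX_SEQUENCE_LENGTH,
                             trainable=False,
                             name='pretrained_embedding_2')  # distinct from the first layer's name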