I am trying to implement the word2vec algorithm with Keras, but I am getting
ValueError: You called `set_weights(weights)` on layer "i2h" with a weight list of length 3418, but the layer was expecting 2 weights. Provided weights: [[ 0.07142857 0.07142857 0.07142857 ..., 0.0714...
when I try to set the weights for the shared matrix from the input to the hidden layer i2h:
from keras.layers import Input, Dense, merge
from keras.models import Model

class Word2Vec:
    def __init__(self, window_size, word_vectors):
        vocab_size = word_vectors.shape[0]
        embedding_size = word_vectors.shape[1]
        # shared input-to-hidden layer
        i2h = Dense(embedding_size, activation='linear', name='i2h')
        inputs = list()
        h_activations = list()
        for i in range(window_size):
            in_x = Input(shape=(vocab_size, 1), name='in_{:d}'.format(i))
            inputs.append(in_x)
            h_activation = i2h(in_x)
            h_activations.append(h_activation)
        # this is the call that raises the ValueError above
        i2h.set_weights(word_vectors)
        h = merge(h_activations, mode='ave')
        h2out = Dense(vocab_size, activation='softmax', name='out')(h)
        self.model = Model(input=inputs, output=[h2out])
        self.model.compile(optimizer='adam', loss='mse')
I don't quite understand how I can set this weight matrix.
I have also tried giving the Dense() layer an explicit input dimension:

i2h = Dense(embedding_size, input_dim=vocab_size, activation='linear', name='i2h')
i2h.set_weights(word_vectors)

but I am getting the same error.
How can I set the shared weights in this case?
I have faced a similar problem and found that the solution is to add the layer to a model first and only then invoke set_weights. So for your example, I propose moving the line i2h.set_weights(word_vectors) so that it comes after the line self.model = Model(input=inputs, output=[h2out]).
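
A minimal sketch of that ordering, using the same Keras 1.x API as your code, might look like the following (the helper name build_word2vec is just for illustration). Two assumptions beyond your original code: the inputs are given shape (vocab_size,) instead of (vocab_size, 1), so that the i2h kernel has shape (vocab_size, embedding_size) and matches word_vectors, and set_weights receives a list of two arrays (kernel and bias), which is what the "expecting 2 weights" part of the error message refers to.

import numpy as np
from keras.layers import Input, Dense, merge
from keras.models import Model

def build_word2vec(window_size, word_vectors):
    vocab_size, embedding_size = word_vectors.shape

    # shared input-to-hidden projection
    i2h = Dense(embedding_size, activation='linear', name='i2h')

    inputs = []
    h_activations = []
    for i in range(window_size):
        # assumption: flat one-hot inputs of shape (vocab_size,) so the
        # i2h kernel has shape (vocab_size, embedding_size)
        in_x = Input(shape=(vocab_size,), name='in_{:d}'.format(i))
        inputs.append(in_x)
        h_activations.append(i2h(in_x))

    h = merge(h_activations, mode='ave')
    out = Dense(vocab_size, activation='softmax', name='out')(h)

    model = Model(input=inputs, output=[out])

    # set the shared weights only after the model has been built;
    # a Dense layer stores two weight arrays (kernel and bias), so pass both
    i2h.set_weights([word_vectors, np.zeros(embedding_size)])

    model.compile(optimizer='adam', loss='mse')
    return model

You can confirm the expected structure with len(i2h.get_weights()), which should be 2: the kernel followed by the bias.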