I have seen code like the following:
embed_word = Embedding(params['word_voc_size'], params['embed_dim'],
                       weights=[word_embed_matrix],
                       input_length=params['word_max_size'],
                       trainable=False, mask_zero=True)
When I look at the documentation on the Keras website (https://faroit.github.io/keras-docs/2.1.5/layers/embeddings/),
I don't see a weights argument:
keras.layers.Embedding(input_dim, output_dim, embeddings_initializer='uniform', embeddings_regularizer=None, activity_regularizer=None, embeddings_constraint=None, mask_zero=False, input_length=None)
So I am confused: why can we use the weights argument when it is not defined in the Keras documentation?
My Keras version is 2.1.5. I hope someone can help me.
When working with text data, if the input to an Embedding layer is a batch of 2 sentences of 10 word indices each, and each word is mapped to a 7-dimensional vector, the output is a tensor of shape (2, 10, 7).
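A minimal sketch of that shape, assuming a made-up vocabulary size of 50 and random integer inputs:

import numpy as np
from keras.models import Sequential
from keras.layers import Embedding

# a batch of 2 sentences, each a sequence of 10 word indices
sentences = np.random.randint(0, 50, size=(2, 10))

model = Sequential()
# vocabulary of 50 words, each word mapped to a 7-dimensional vector
model.add(Embedding(input_dim=50, output_dim=7, input_length=10))

print(model.predict(sentences).shape)  # (2, 10, 7)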
An Embedding layer is faster because it is essentially a Dense layer with simplifying assumptions: instead of multiplying one-hot vectors by a weight matrix, it simply looks up the row corresponding to each integer index. A Dense layer would treat the same matrix as actual weights and perform the full matrix multiplication.
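To make that concrete, here is a rough NumPy sketch (the toy matrix and indices are made up): the embedding-style lookup picks rows directly, while the dense-style view one-hot encodes the indices and multiplies by the same matrix, giving the same result with far more work.

import numpy as np

# toy "embedding matrix": 5 vocabulary entries, 3 dimensions each
embed_matrix = np.arange(15, dtype=float).reshape(5, 3)
indices = np.array([0, 3, 3, 1])          # one sentence of 4 word indices

# Embedding-style: plain row lookup, no multiplication
lookup = embed_matrix[indices]

# Dense-style: one-hot encode, then matrix-multiply with the same weights
one_hot = np.eye(5)[indices]
matmul = one_hot.dot(embed_matrix)

print(np.allclose(lookup, matmul))        # True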
Keras' Embedding layer subclasses the Layer class (every Keras layer does this). The weights attribute is implemented in this base class, so every subclass allows setting it through a weights argument. That is also why you won't find it in the documentation or in the implementation of the Embedding layer itself.
You can check the base layer implementation here (Ctrl + F for 'weight').
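As a quick check (the sizes and the random matrix below are arbitrary stand-ins for a real pre-trained matrix), you can confirm that a matrix passed through weights ends up as the layer's initial weights:

import numpy as np
from keras.models import Sequential
from keras.layers import Embedding

vocab_size, embed_dim, max_len = 100, 7, 10
word_embed_matrix = np.random.rand(vocab_size, embed_dim).astype('float32')  # stand-in for pre-trained vectors

model = Sequential()
# 'weights' is not listed in the Embedding docs; it is popped and applied
# by the base Layer class once the layer is built
model.add(Embedding(vocab_size, embed_dim, input_length=max_len,
                    weights=[word_embed_matrix], trainable=False, mask_zero=True))

print(np.allclose(model.layers[0].get_weights()[0], word_embed_matrix))  # True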