 

Keras Embedding: where is the "weights" argument?

I have seen code like the following:

embed_word = Embedding(params['word_voc_size'], params['embed_dim'],
                       weights=[word_embed_matrix],
                       input_length=params['word_max_size'],
                       trainable=False, mask_zero=True)

When I look up the documentation on the Keras website (https://faroit.github.io/keras-docs/2.1.5/layers/embeddings/), I don't see a weights argument:

keras.layers.Embedding(input_dim, output_dim, embeddings_initializer='uniform', embeddings_regularizer=None, activity_regularizer=None, embeddings_constraint=None, mask_zero=False, input_length=None)

So I am confused: why can we use the weights argument when it is not defined in the Keras documentation?

My Keras version is 2.1.5. I hope someone can help me.

Johnny asked Dec 05 '18 07:12



1 Answer

Keras' Embedding layer subclasses the Layer class (every Keras layer does). The weights attribute is implemented in this base class, so every subclass allows setting it through a weights argument. This is also why you won't find it in the documentation or in the implementation of the Embedding layer itself.

You can check the base layer implementation here (Ctrl + F for 'weight').
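To make the mechanism concrete, here is a simplified sketch (not Keras's actual source, just an illustration of the inheritance pattern) of how a base Layer class can accept a generic weights keyword that a subclass like Embedding never declares itself, yet still honors:

```python
import numpy as np

class Layer:
    """Simplified stand-in for the Keras base Layer class."""
    def __init__(self, weights=None):
        # The base class stashes any user-supplied weights here;
        # subclasses inherit this behavior without declaring `weights`.
        self._initial_weights = weights
        self.trainable_weights = []

    def set_weights(self, weights):
        self.trainable_weights = [np.asarray(w) for w in weights]

class Embedding(Layer):
    def __init__(self, input_dim, output_dim, **kwargs):
        # `weights` is not named here; it passes through **kwargs
        # to the base class constructor.
        super().__init__(**kwargs)
        self.input_dim = input_dim
        self.output_dim = output_dim

    def build(self):
        # Default random initialization of the embedding matrix...
        self.set_weights([np.random.uniform(
            -0.05, 0.05, (self.input_dim, self.output_dim))])
        # ...which the base-class mechanism then overrides with the
        # user-supplied weights, if any were given.
        if self._initial_weights is not None:
            self.set_weights(self._initial_weights)

# Passing a pretrained matrix through the undocumented argument:
word_embed_matrix = np.ones((5, 3))
layer = Embedding(5, 3, weights=[word_embed_matrix])
layer.build()
```

In real Keras you can verify the same thing by calling `get_weights()` on the built layer and comparing it against the matrix you passed in.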

sdcbr answered Oct 05 '22 22:10