I need to feed variable-length sequences into my model. My model is Embedding + LSTM + Conv1D + MaxPooling + softmax. When I set mask_zero=True in the Embedding layer, compilation fails at the Conv1D layer. How can I pass the mask value to Conv1D, or is there another solution?
Masking is a way to tell sequence-processing layers that certain timesteps in an input are missing, and thus should be skipped when processing the data.
Note that setting mask_zero=True for the Embedding layer does not make it return a zero vector for padded positions. The output of the Embedding layer does not change: it still returns the embedding vector at index zero. What mask_zero=True adds is a mask that downstream layers can use to skip those timesteps.
The Masking class masks a sequence by using a mask value to skip timesteps. For each timestep in the input tensor (dimension #1 in the tensor), if all values in the input tensor at that timestep are equal to mask_value, then the timestep will be masked (skipped) in all downstream layers (as long as they support masking).
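To illustrate that point, here is a minimal sketch (the toy vocabulary, shapes, and values are assumptions for illustration) showing that index 0 still receives an ordinary embedding vector, while the mask produced by mask_zero=True is exposed separately through compute_mask:

import numpy as np
import tensorflow as tf

# Toy padded sequences; 0 is the padding index.
x = np.array([[3, 7, 0, 0],
              [2, 5, 9, 0]])

emb = tf.keras.layers.Embedding(input_dim=10, output_dim=4, mask_zero=True)
print(emb(x).shape)          # (2, 4, 4): index 0 is embedded like any other index
print(emb.compute_mask(x))   # [[True  True False False]
                             #  [True  True  True False]]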
The Masking layer expects every downstream layer to support masking, which is not the case of the Conv1D layer. Fortunately, there is another way to apply masking, using the Functional API:
from tensorflow.keras.layers import Input, Masking, Embedding, LSTM, Conv1D
from tensorflow.keras.models import Model

inputs = Input(...)
mask = Masking().compute_mask(inputs)  # <= Compute the mask
embed = Embedding(...)(inputs)
lstm = LSTM(...)(embed, mask=mask)     # <= Apply the mask explicitly; it is not propagated further
conv = Conv1D(...)(lstm)
...
model = Model(inputs=[inputs], outputs=[...])
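For reference, here is a fuller, hedged sketch of the same idea. The layer sizes, sequence length, and 5-class output are assumptions, and the per-timestep mask is built with a Lambda over tf.not_equal (one way to derive a mask from 2-D integer inputs); the LSTM receives the mask explicitly, and nothing is propagated on to Conv1D:

import tensorflow as tf
from tensorflow.keras import layers, Model

vocab_size, maxlen, num_classes = 10000, 50, 5   # assumed values

inputs = layers.Input(shape=(maxlen,), dtype="int32")

# Per-timestep mask: True wherever the token id is not the padding id (0).
mask = layers.Lambda(lambda t: tf.not_equal(t, 0))(inputs)

# Embedding without mask_zero=True, so no mask is attached to its output.
embed = layers.Embedding(vocab_size, 128)(inputs)

# The mask is passed explicitly to the LSTM (padded steps are skipped),
# but it is not propagated to the Conv1D layer below.
lstm = layers.LSTM(64, return_sequences=True)(embed, mask=mask)

conv = layers.Conv1D(64, kernel_size=3, activation="relu")(lstm)
pool = layers.GlobalMaxPooling1D()(conv)
outputs = layers.Dense(num_classes, activation="softmax")(pool)

model = Model(inputs=inputs, outputs=outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")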
The Conv1D layer does not support masking at this time; there is an open issue about it on the Keras repo.
Depending on the task, you might be able to get away with embedding the mask_value just like the other values in the sequence and applying global pooling (as you're doing now).
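As a minimal sketch of that fallback (layer sizes and class count are again assumptions): leave masking out entirely, let index 0 be embedded like any other token, and rely on the global max-pooling to largely ignore the padded positions:

import tensorflow as tf
from tensorflow.keras import layers, Model

inputs = layers.Input(shape=(None,), dtype="int32")        # variable-length, 0-padded batches
x = layers.Embedding(10000, 128)(inputs)                   # mask_zero left at its default (False)
x = layers.LSTM(64, return_sequences=True)(x)
x = layers.Conv1D(64, kernel_size=3, activation="relu")(x)
x = layers.GlobalMaxPooling1D()(x)
outputs = layers.Dense(5, activation="softmax")(x)
model = Model(inputs, outputs)                             # builds with no masking errors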