I am trying to use keras.layers.Permute(dims) from the Keras core layers.
According to the docs:

"dims: Tuple of integers. Permutation pattern, does not include the samples dimension. Indexing starts at 1. For instance, (2, 1) permutes the first and second dimension of the input."
And it gives the following example code:

model = Sequential()
model.add(Permute((2, 1), input_shape=(10, 64)))
# now: model.output_shape == (None, 64, 10)
# note: `None` is the batch dimension

My question is: what does this (2, 1) do? If my input features have 10 dimensions, and I need to change the order of the 1st, 3rd and 5th features to be (5, 1, 3), should I just use (5, 1, 3) as the value of the 'dims' parameter of this layer?
Just do a model.summary(). It will print all layers and their output shapes.
The Permute layer just switches the positions of the axes, and the dims argument tells Keras what you want the final positions to be. For example, if x is 4-dimensional (excluding the batch axis) with shape (None, 2, 4, 5, 8), where None is the batch size, and you specify dims = (3, 2, 1, 4), then the following four steps take place:

1. The new first axis is the old third axis (size 5).
2. The new second axis is the old second axis (size 4).
3. The new third axis is the old first axis (size 2).
4. The new fourth axis is the old fourth axis (size 8).

Remember, the indexing starts at 1, not 0: dimension zero is the batch axis and is never permuted. So the output of the Permute layer will have shape (None, 5, 4, 2, 8). The function np.moveaxis does similar things in NumPy.
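As a quick check of the shapes described above, here is a minimal sketch (assuming a standard Keras and NumPy install); the axis sizes are just the ones from the example:

import numpy as np
from keras.models import Sequential
from keras.layers import Permute

# Build the example from above: input shape (2, 4, 5, 8) excluding the batch axis.
model = Sequential()
model.add(Permute((3, 2, 1, 4), input_shape=(2, 4, 5, 8)))
model.summary()
# model.output_shape == (None, 5, 4, 2, 8)

# The same axis permutation in NumPy (axis 0 is the batch axis, so it stays first):
x = np.zeros((1, 2, 4, 5, 8))
y = np.transpose(x, (0, 3, 2, 1, 4))
print(y.shape)  # (1, 5, 4, 2, 8)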
For your example, dims should be equal to (5, 2, 1, 4, 3, 6, 7, 8, 9, 10).
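Here is a hedged sketch of that, assuming your input really has 10 non-batch axes; the axis sizes below are made up purely to make the swap visible:

from keras.models import Sequential
from keras.layers import Permute

# Hypothetical input with 10 non-batch axes; the sizes 11..20 are arbitrary.
model = Sequential()
model.add(Permute((5, 2, 1, 4, 3, 6, 7, 8, 9, 10),
                  input_shape=(11, 12, 13, 14, 15, 16, 17, 18, 19, 20)))
model.summary()
# Axes 1, 3 and 5 now appear in the order 5, 1, 3; the rest are unchanged:
# model.output_shape == (None, 15, 12, 11, 14, 13, 16, 17, 18, 19, 20)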