Keras multiply layer output with scalar

I have a layer output that I want to multiply by a scalar. I can do this with a Lambda layer, i.e.

sc_mult = Lambda(lambda x: x * 2)(layer)

which works fine. But if I want to use a different scalar for each example, I try to supply them as a second input with shape (Examples, 1):

input_scalar = Input(shape=(1,))

so my Lambda layer becomes

sc_mult = Lambda(lambda x: x * input_scalar)(layer)

But this now throws an error at train time. Note that 32 is the batch size and 128 is a spatial dimension of the layer input (and output); the layer output being multiplied by the scalar has shape (batch_size x 32 (filters in previous layer) x 128 (spatial dim) x 128 (spatial dim)).

GpuElemwise. Input dimension mis-match. Input 5 (indices start at 0) has shape[2] == 32, but the output's size on that axis is 128.

I assume I am not feeding the right shape in via the input layer, but can't work out why.
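
For what it's worth, the mismatch seems reproducible in plain NumPy (a sketch of my own, with the shapes assumed above), since broadcasting aligns shapes from the trailing axis, so the (Examples, 1) scalar array lines up against the spatial axes rather than the batch axis:

import numpy as np

layer_out = np.ones((32, 32, 128, 128))  # (batch, filters, spatial, spatial)
scalars = np.ones((32, 1))               # one scalar per example

try:
    layer_out * scalars
except ValueError as e:
    # operands could not be broadcast together with shapes (32,32,128,128) (32,1)
    print(e)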

asked Jan 14 '17 by Luke_radio

1 Answer

Not sure if it is useful to answer an old question, but maybe someone else has run into the same problem.

The issue is indeed the shape of your scalar versus the shape of your input (or x). You should reshape your scalar array to have as many dimensions as the tensor you're multiplying with, using np.reshape, e.g.:

from keras.models import Model
from keras.layers import Input, Lambda
import numpy as np

# inputs: a batch of 32 examples of shape (32, 128, 128), plus one scalar
# per example, reshaped so it can broadcast over the other three axes
X = np.ones((32, 32, 128, 128))
s = np.arange(32).reshape(-1, 1, 1, 1)  # 1 different scalar per batch example, reshaped
print(X.shape, s.shape)  # (32, 32, 128, 128) (32, 1, 1, 1)

# model: the per-example scalar is a second input; its shape (1, 1, 1)
# broadcasts against (32, 128, 128) inside the multiplication.
# Passing both tensors into the Lambda keeps them tracked by Keras,
# instead of capturing input_scalar from the enclosing scope.
input_X = Input(shape=(32, 128, 128))
input_scalar = Input(shape=(1, 1, 1))
sc_mult = Lambda(lambda t: t[0] * t[1])([input_X, input_scalar])
model = Model(inputs=[input_X, input_scalar], outputs=sc_mult)

out = model.predict([X, s])

Now out[0,:,:,:] is all zeros, out[1,:,:,:] is all ones, out[31,:,:,:] is all 31s, et cetera.
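
As a side note, the same broadcasting multiply can be written without a Lambda by using the built-in Multiply merge layer, assuming a Keras version whose merge layers support broadcasting (a sketch, not tested against every version):

from keras.models import Model
from keras.layers import Input, Multiply

input_X = Input(shape=(32, 128, 128))
input_scalar = Input(shape=(1, 1, 1))
# Multiply broadcasts the (1, 1, 1) per-example scalar across the feature input
sc_mult = Multiply()([input_X, input_scalar])
model = Model(inputs=[input_X, input_scalar], outputs=sc_mult)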

answered Oct 22 '22 by Torec