Apparently we can't use a tf.Variable as a boolean in an `if` statement; we have to use tf.cond instead. I wrote the following code to normalize my input data, and I get a confusing error. Where did I go wrong?
def global_contrast_normalize(X, scale=1., subtract_mean=True, use_std=False,
                              sqrt_bias=0., min_divisor=1e-8):
    mean = tf.reduce_mean(X, axis=1)
    if subtract_mean:
        X = X - mean[:, numpy.newaxis]  # Makes a copy.
    else:
        X = tf.copy.copy(X)
    if X.get_shape()[1] == 1:
        # ddof = 0
        mean, var = tf.nn.moments(X, axes=[1])
        normalizers = tf.sqrt(sqrt_bias + var) / scale
    else:
        normalizers = tf.sqrt(sqrt_bias + tf.reduce_sum(X ** 2, axis=1)) / scale
    Normalizers = tf.Variable(normalizers, 'float32')
    M = tf.Variable(min_divisor, 'float32')
    tf.cond(tf.less_equal(Normalizers, M),
            lambda: tf.assign(Normalizers, [1]),
            lambda: tf.assign(Normalizers, normalizers))
    X /= Normalizers[:, tf.newaxis]  # Does not make a copy.
    return X
The error:
in _call_cpp_shape_fn_impl raise ValueError(err.message)
ValueError: Shape must be rank 0 but is rank 1 for 'cond_11/Switch' (op: 'Switch') with input shapes: [1], [1].
The error is stating that the expected input is a scalar (rank 0), but it has a shape of [1]. Usually you can get around this by reshaping the input to a scalar value (using tf.reshape(Normalizers, [])).
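As a minimal illustration of the reshape workaround (run eagerly here for simplicity; the tensor name `t` is just for the example), tf.cond accepts a rank-0 predicate but rejects a shape-[1] tensor:

```python
import tensorflow as tf

# tf.cond's predicate must be a scalar (rank 0); a shape-[1] tensor
# triggers the "Shape must be rank 0 but is rank 1" error above.
t = tf.constant([3.0])        # rank 1, shape [1]
pred = tf.reshape(t, [])      # rank 0 scalar, valid as a predicate
result = tf.cond(tf.less_equal(pred, 5.0),
                 lambda: tf.constant(1.0),
                 lambda: tf.constant(0.0))
```
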
For this case, it looks like you want to conditionally set the values of Normalizers depending on whether they are <= M. tf.where does exactly that.
(Note: you don't need to convert normalizers or min_divisor to a tf.Variable.)
Example usage of tf.where:
def global_contrast_normalize(...):
    ...
    comparison = tf.less_equal(normalizers, min_divisor)
    normalizers = tf.where(comparison, tf.ones_like(normalizers), normalizers)
    X /= normalizers[:, tf.newaxis]
    return X
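Putting it together, here is a sketch of the whole function with tf.where replacing the tf.cond/tf.Variable machinery (run eagerly for illustration; the `use_std` branch from the original is omitted for brevity, and the mean subtraction uses `keepdims` instead of indexing with newaxis):

```python
import tensorflow as tf

def global_contrast_normalize(X, scale=1.0, subtract_mean=True,
                              sqrt_bias=0.0, min_divisor=1e-8):
    X = tf.cast(X, tf.float32)
    if subtract_mean:
        # Subtract each row's mean; keepdims keeps shape [n, 1] for broadcasting.
        X = X - tf.reduce_mean(X, axis=1, keepdims=True)
    # Per-row L2 normalizer, as in the question's else branch.
    normalizers = tf.sqrt(sqrt_bias + tf.reduce_sum(X ** 2, axis=1)) / scale
    # Replace near-zero normalizers with 1 instead of assigning to a Variable.
    normalizers = tf.where(tf.less_equal(normalizers, min_divisor),
                           tf.ones_like(normalizers),
                           normalizers)
    return X / normalizers[:, tf.newaxis]

# A row of constants has a zero normalizer, so it passes through unchanged,
# while a non-constant row is scaled to unit L2 norm.
out = global_contrast_normalize(tf.constant([[1.0, -1.0], [0.0, 0.0]]))
```
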