I have two embedding tensors A and B, which look like
[
[1,1,1],
[1,1,1]
]
and
[
[0,0,0],
[1,1,1]
]
What I want to do is calculate the squared L2 distance d(A, B) for each row.
First I did tf.square(tf.sub(lhs, rhs)) to get
[
[1,1,1],
[0,0,0]
]
and then I want to do a row-wise reduce which returns
[
3,
0
]
but tf.reduce_sum does not allow me to reduce by row. Any input would be appreciated. Thanks.
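For reference, the result I'm after is the same as this plain NumPy computation (just a sketch of the expected output, not the TensorFlow code I'm looking for):
import numpy as np

A = np.array([[1, 1, 1], [1, 1, 1]])
B = np.array([[0, 0, 0], [1, 1, 1]])

# Square the element-wise difference, then sum each row
d = np.sum(np.square(A - B), axis=1)
print(d)  # [3 0]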
Add the reduction_indices argument with a value of 1, e.g.:
tf.reduce_sum(tf.square(tf.sub(lhs, rhs)), 1)
That should produce the result you're looking for. Here is the documentation on reduce_sum().
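Note that in more recent TensorFlow versions tf.sub has been renamed to tf.subtract, and the dimension to reduce over is passed as axis, so the equivalent call (assuming lhs and rhs hold the two matrices) would be:
tf.reduce_sum(tf.square(tf.subtract(lhs, rhs)), axis=1)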
According to the TensorFlow documentation, reduce_sum is a function with the following signature:
tf.reduce_sum(input_tensor, axis=None, keep_dims=False, name=None, reduction_indices=None)
But reduction_indices has been deprecated, so it's better to use axis instead. If axis is not set, all dimensions are reduced.
As an example, this is taken from the documentation:
# 'x' is [[1, 1, 1]
# [1, 1, 1]]
tf.reduce_sum(x) ==> 6
tf.reduce_sum(x, 0) ==> [2, 2, 2]
tf.reduce_sum(x, 1) ==> [3, 3]
tf.reduce_sum(x, 1, keep_dims=True) ==> [[3], [3]]
tf.reduce_sum(x, [0, 1]) ==> 6
The above requirement can be written in this manner:
import numpy as np
import tensorflow as tf

# The two matrices from the question; the expected row-wise result is [3, 0]
a = np.array([[1, 1, 1], [1, 1, 1]])
b = np.array([[0, 0, 0], [1, 1, 1]])

xtr = tf.placeholder("float", [None, 3])
xte = tf.placeholder("float", [None, 3])

# Squared element-wise difference, summed along axis 1 (one value per row)
pred = tf.reduce_sum(tf.square(tf.subtract(xtr, xte)), 1)

# Initializing the variables
init = tf.global_variables_initializer()

# Launch the graph
with tf.Session() as sess:
    sess.run(init)
    nn_index = sess.run(pred, feed_dict={xtr: a, xte: b})
    print(nn_index)  # [3. 0.]
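If you are on TensorFlow 2.x (just an assumption; the snippet above uses the 1.x placeholder/Session API), the same computation can be run eagerly without a session. A minimal sketch:
import numpy as np
import tensorflow as tf

a = np.array([[1, 1, 1], [1, 1, 1]], dtype=np.float32)
b = np.array([[0, 0, 0], [1, 1, 1]], dtype=np.float32)

# Element-wise squared difference, then sum each row (axis=1)
dist = tf.reduce_sum(tf.square(tf.subtract(a, b)), axis=1)
print(dist.numpy())  # [3. 0.]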