I'm working on re-implementing this paper and the key operation is a bilinear tensor product. I hardly know what that means, but the paper has a nice little graphic, which I understand.
The key operation is e_1 * W * e_2, and I want to know how to implement it in tensorflow, because the rest should be easy.
Basically, given 3D tensor W, slice it into matrices, and for the j'th slice (a matrix), multiply it on each side by e_1 and e_2, resulting in a scalar, which is the jth entry in the resulting vector (the output of this operation).
So I want to perform a product of e_1, a d-dimensional vector, W, the d x d x k tensor, and e_2, another d-dimensional vector. Could this product be expressed concisely in TensorFlow as it is now, or would I have to define my own op somehow?
EARLIER EDITS
Why doesn't multiplying these tensors work, and is there some way to define it more explicitly so that it works?
>>> import tensorflow as tf
>>> tf.InteractiveSession()
>>> a = tf.ones([3, 3, 3])
>>> a.eval()
array([[[ 1., 1., 1.],
[ 1., 1., 1.],
[ 1., 1., 1.]],
[[ 1., 1., 1.],
[ 1., 1., 1.],
[ 1., 1., 1.]],
[[ 1., 1., 1.],
[ 1., 1., 1.],
[ 1., 1., 1.]]], dtype=float32)
>>> b = tf.ones([3, 1, 1])
>>> b.eval()
array([[[ 1.]],
[[ 1.]],
[[ 1.]]], dtype=float32)
>>>
The error message is
ValueError: Shapes TensorShape([Dimension(3), Dimension(3), Dimension(3)]) and TensorShape([Dimension(None), Dimension(None)]) must have the same rank
CURRENTLY
Turns out that multiplying two 3D tensors doesn't work with tf.matmul either, but tf.batch_matmul does. tf.batch_matmul will also handle a 3D tensor and a matrix. Then I tried a 3D tensor and a vector:
ValueError: Dimensions Dimension(3) and Dimension(1) are not compatible
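The error happens because a batched matmul treats both operands as (stacks of) matrices, so a bare vector has nothing to line up against. A minimal sketch of the workaround, using NumPy's batched `@` to mirror the TF behavior (shapes `d`, `k` and the name `v` are illustrative, not from the paper): promote the vector to a d x 1 matrix so it broadcasts across the k slices.

```python
import numpy as np

# A stack of k square d x d matrices, and a plain d-vector.
d, k = 3, 4
W = np.arange(d * d * k, dtype=np.float64).reshape(k, d, d)
v = np.ones(d)

# Batched matmul wants matrices on both sides, so reshape the vector
# into a d x 1 matrix; it then broadcasts across the k slices.
out = W @ v.reshape(d, 1)   # shape (k, d, 1)
out = out.squeeze(-1)       # back to (k, d)

print(out.shape)  # (4, 3)
```

The same reshape fixes the TF error: feed tf.batch_matmul a `[d, 1]` matrix instead of a `[d]` vector.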
tf.tensordot (also known as tensor contraction) sums the product of elements from a and b over the indices specified by axes; it corresponds to numpy.tensordot(a, b, axes). When a and b are matrices (order 2), the case axes=1 is equivalent to matrix multiplication.
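Since the TF op mirrors numpy.tensordot, here is a sketch of the bilinear product e_1 * W * e_2 written as two contractions (the names d, k, e1, e2, W are illustrative): contract e_1 against the first d axis of W, then e_2 against the remaining d axis, leaving a length-k vector.

```python
import numpy as np

d, k = 3, 4
rng = np.random.default_rng(0)
W = rng.standard_normal((d, d, k))
e1 = rng.standard_normal(d)
e2 = rng.standard_normal(d)

# Contract e1 with axis 0 of W, leaving shape (d, k),
# then contract e2 with the remaining d axis, leaving shape (k,).
result = np.tensordot(e1, W, axes=([0], [0]))
result = np.tensordot(e2, result, axes=([0], [0]))

# einsum expresses the whole bilinear product in one line.
check = np.einsum('i,ijk,j->k', e1, W, e2)
print(np.allclose(result, check))  # True
```

The einsum subscripts 'i,ijk,j->k' read off the math directly: sum over i and j, keep the slice index k.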
You can do this with a simple reshape. For the first of the two matrix multiplies, you have k*d length-d vectors to take dot products with.
This should be close:
temp = tf.matmul(E1, tf.reshape(Wddk, [d, d*k]))  # [1, d] x [d, d*k] -> [1, d*k]
result = tf.matmul(E2, tf.reshape(temp, [d, k]))  # [1, d] x [d, k]   -> [1, k]
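A quick check of the reshape trick, using NumPy's row-major reshape (which matches TF's) and comparing against a slice-by-slice loop; E1 and E2 are row vectors of shape [1, d], as the matmul calls assume, and all names are illustrative.

```python
import numpy as np

d, k = 3, 4
rng = np.random.default_rng(1)
Wddk = rng.standard_normal((d, d, k))
E1 = rng.standard_normal((1, d))
E2 = rng.standard_normal((1, d))

# The two-reshape trick: fold the trailing (d, k) axes into one, multiply,
# unfold, and multiply again.
temp = E1 @ Wddk.reshape(d, d * k)          # [1, d*k]
result = (E2 @ temp.reshape(d, k)).ravel()  # [k]

# Reference: compute e1^T W[:, :, j] e2 one slice at a time.
expected = np.array([E1[0] @ Wddk[:, :, j] @ E2[0] for j in range(k)])
print(np.allclose(result, expected))  # True
```

The trick works because row-major reshaping keeps element W[i, j, l] at flat column j*k + l, so the intermediate reshape [d, k] lines the j axis back up for the second multiply.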