I have previously asked this question, Create boolean mask on TensorFlow, about how to get a tensor with only certain indices set to 1 and the rest set to 0.
I thought the answer given by @MZHm would entirely solve my problem. However, the dense_shape argument of tf.SparseTensor only accepts lists, and I want to pass a shape that is inferred from the graph (from the shape of another tensor whose shape is variable). So in my specific case, I want to do something like this:
# The tensor from which the shape of the sparse tensor is to be inferred
reference_t = tf.zeros([32, 50, 11])
# The indices that will be 1
indices = [[0, 0],
           [3, 0],
           [5, 0],
           [6, 0]]
# Just setting all the values for the sparse tensor to be 1
values = tf.ones([reference_t.shape[-1]])
# The 2d shape I want the sparse tensor to have
sparse_2d_shape = [reference_t.shape[-2],
                   reference_t.shape[-1]]
st = tf.SparseTensor(indices, values, sparse_2d_shape)
From this I get the error:
TypeError: Expected int64, got Dimension(50) of type 'Dimension' instead.
How do I dynamically set the shape of a sparse tensor? Is there a better alternative to achieve what I'm aiming for?
Here is what you can do to get a dynamic shape: instead of the static .shape attribute, which yields Dimension objects, use tf.shape(), which returns the runtime shape as an int64 tensor:
import tensorflow as tf
import numpy as np

# Indices and values of the sparse tensor (indices must be int64)
indices = tf.constant([[0, 0], [1, 1]], dtype=tf.int64)
values = tf.constant([1, 1])

# A placeholder whose shape is only known at run time
dynamic_input = tf.placeholder(tf.float32, shape=[None, None])

# tf.shape returns the runtime shape as a tensor; request int64,
# which is the dtype tf.SparseTensor expects for dense_shape
s = tf.shape(dynamic_input, out_type=tf.int64)

st = tf.SparseTensor(indices, values, s)
st_ordered = tf.sparse_reorder(st)  # indices must be in row-major order
result = tf.sparse_tensor_to_dense(st_ordered)

sess = tf.Session()
An input with (dynamic) shape [5, 3]:
sess.run(result, feed_dict={dynamic_input: np.zeros([5, 3])})
Will output:
array([[1, 0, 0],
       [0, 1, 0],
       [0, 0, 0],
       [0, 0, 0],
       [0, 0, 0]], dtype=int32)
An input with (dynamic) shape [3, 3]:
sess.run(result, feed_dict={dynamic_input: np.zeros([3, 3])})
Will output:
array([[1, 0, 0],
       [0, 1, 0],
       [0, 0, 0]], dtype=int32)
So there you go... dynamic sparse shape.
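For readers on newer versions: tf.placeholder, tf.Session, tf.sparse_reorder, and tf.sparse_tensor_to_dense are TensorFlow 1.x APIs. Here is a minimal sketch of the same idea under TensorFlow 2.x (the helper name dense_mask is my own, not from the original answer), where the shape is taken from a reference tensor at run time via tf.shape:

```python
import tensorflow as tf  # assumes TensorFlow 2.x (eager execution)


def dense_mask(indices, dense_like):
    # Runtime shape of the reference tensor, as int64,
    # which is the dtype tf.sparse.SparseTensor expects
    shape = tf.shape(dense_like, out_type=tf.int64)
    # One value of 1 per index
    values = tf.ones([len(indices)], dtype=tf.int32)
    st = tf.sparse.SparseTensor(indices, values, shape)
    st = tf.sparse.reorder(st)  # indices must be sorted before densifying
    return tf.sparse.to_dense(st)


mask = dense_mask([[0, 0], [1, 1]], tf.zeros([5, 3]))
```

This produces the same [5, 3] mask as the first session run above, with ones at (0, 0) and (1, 1).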