I am playing around with the map_fn function, and noticed that it outputs a TensorArray, which should mean it is capable of outputting "jagged" tensors (where the inner tensors have different first dimensions).
I tried to see this in action with this code:
import tensorflow as tf
import numpy as np

NUM_ARRAYS = 1000
MAX_LENGTH = 1000

lengths = tf.placeholder(tf.int32)
tArray = tf.map_fn(lambda x: tf.random_normal((x,), 0, 1),
                   lengths,
                   dtype=tf.float32)  # Should return a TensorArray.

# startTensor = tf.random_normal((tf.reduce_sum(lengths),), 0, 1)
# tArray = tf.TensorArray(tf.float32, NUM_ARRAYS)
# tArray = tArray.split(startTensor, lengths)
# outArray = tArray.concat()

with tf.Session() as sess:
    outputArray, l = sess.run(
        [tArray, lengths],
        feed_dict={lengths: np.random.randint(MAX_LENGTH, size=NUM_ARRAYS)})
    print(outputArray.shape, l)
However, I got this error:
"TensorArray has inconsistent shapes. Index 0 has shape: [259] but index 1 has shape: [773]"
This of course comes as a surprise to me since I am under the impression that TensorArrays should be able to handle it. Am I wrong?
While the tf.map_fn() op does use tf.TensorArray objects internally, and a tf.TensorArray can hold objects of different size, this program won't work as-is, because tf.map_fn() converts its tf.TensorArray result back to a tf.Tensor by stacking the elements together, and it is this stacking operation that fails.
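You can reproduce the same failure directly with a tf.TensorArray (a minimal sketch, assuming TF 1.x): writing elements of different lengths is fine, but stack(), which is effectively what tf.map_fn() does to build its output tensor, is not:

import tensorflow as tf

ta = tf.TensorArray(tf.float32, size=2, infer_shape=False)
ta = ta.write(0, tf.random_normal((259,), 0, 1))  # element of length 259
ta = ta.write(1, tf.random_normal((773,), 0, 1))  # element of length 773

with tf.Session() as sess:
    # Fails with "TensorArray has inconsistent shapes. Index 0 has shape:
    # [259] but index 1 has shape: [773]"
    sess.run(ta.stack())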
You can, however, implement the tf.TensorArray-based computation yourself using the lower-level tf.while_loop() op instead:
lengths = tf.placeholder(tf.int32)
num_elems = tf.shape(lengths)[0]
init_array = tf.TensorArray(tf.float32, size=num_elems)

def loop_body(i, ta):
    # Write a random vector of length lengths[i] into slot i.
    return i + 1, ta.write(i, tf.random_normal((lengths[i],), 0, 1))

_, result_array = tf.while_loop(
    lambda i, ta: i < num_elems, loop_body, [0, init_array])
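To evaluate the jagged results, one option is to concatenate the TensorArray back into a single flat tensor (a minimal sketch continuing the snippet above, with NUM_ARRAYS and MAX_LENGTH assumed to be defined as in the question):

import numpy as np

flat = result_array.concat()  # all elements laid end to end in one 1-D tensor

with tf.Session() as sess:
    flat_val, lens = sess.run(
        [flat, lengths],
        feed_dict={lengths: np.random.randint(MAX_LENGTH, size=NUM_ARRAYS)})
    print(flat_val.shape, lens.sum())  # flat_val has sum(lens) elements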
Building upon mrry's answer, here are some more examples that can be run under TF 2.x:
import tensorflow as tf

# ================= example 1 ==================
num_elems = 5
# infer_shape=False lets the TensorArray hold elements of different shapes.
init_array = tf.TensorArray(tf.float32, size=num_elems, infer_shape=False)
lengths = tf.range(0, 5)

def loop_body(i, ta):
    return i + 1, ta.write(i, tf.random.normal((lengths[i],), 0, 1))

_, result_array = tf.while_loop(
    lambda i, ta: i < num_elems, loop_body, [0, init_array])

for i in range(num_elems):
    print(result_array.read(i))
# ================== example 2 ==================
# TensorArray whose size is only known at run time and whose elements
# do not necessarily have the same shape
ta = tf.TensorArray(tf.float32, size=0, dynamic_size=True, infer_shape=False)

# init ta with some mock data
ta = ta.write(0, 0.0)
ta = ta.write(1, 1.0)
ta = ta.write(2, tf.constant([2.0, 2.0]))

# loop body
def loop_body(i, t):
    val = t.read(i)
    # do something: here, double each element
    t = t.write(i, tf.multiply(2.0, val))
    return i + 1, t

# stop condition for the while loop
cond = lambda i, t: tf.less(i, t.size())

# results
i = tf.constant(0)
_, result_array = tf.while_loop(cond, loop_body, [i, ta])

for i in range(result_array.size()):
    print(result_array.read(i))
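As a further note, recent TF 2.x releases (2.3 or later, if I recall correctly) let tf.map_fn return a jagged result directly: describing the per-element output with a tf.RaggedTensorSpec collects the rows into a tf.RaggedTensor instead of stacking them into a dense tensor. A minimal sketch of the idea from the original question:

import tensorflow as tf

lengths = tf.constant([3, 1, 4])
ragged = tf.map_fn(
    lambda n: tf.random.normal((n,), 0, 1),
    lengths,
    fn_output_signature=tf.RaggedTensorSpec(shape=[None], dtype=tf.float32))
print(ragged)  # a tf.RaggedTensor with rows of length 3, 1 and 4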