Is there a good reason to use tf.concat instead of tf.stack? They seem very similar. Is it just to guarantee that the resulting tensor will have the same number of dimensions as the input list of tensors?
tf.concat: Concatenates tensors along one existing dimension.

tf.stack(values, axis=0, name='stack'), defined in tensorflow/python/ops/array_ops.py: Stacks a list of rank-R tensors into one rank-(R+1) tensor. It packs the list of tensors in values into a tensor with rank one higher than each tensor in values, by packing them along the axis dimension.

tf.unstack: Unpacks the given dimension of a rank-R tensor into rank-(R-1) tensors.
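To make the stack/unstack relationship concrete, here is a minimal sketch of the round trip; the tensor values and shapes are illustrative, not from the original post:

import tensorflow as tf

# A rank-2 tensor of shape [2, 3].
x = tf.constant([[1, 2, 3],
                 [4, 5, 6]])

# Unpacking along axis 0 yields two rank-1 tensors, each of shape [3].
parts = tf.unstack(x, axis=0)

# Stacking them back along axis 0 rebuilds the original rank-2 tensor.
y = tf.stack(parts, axis=0)  # shape [2, 3], equal to x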
Actually, I had misunderstood how tf.stack works: if the axis parameter is within the range of the existing dimensions, a new axis is inserted at that index.
Example:
import tensorflow as tf

t1 = tf.random_normal([1, 3])
t2 = tf.random_normal([1, 3])

tf.stack([t1, t2], axis=1).shape.as_list() == [1, 2, 3]  # stack inserts a new axis of size 2
tf.concat([t1, t2], axis=1).shape.as_list() == [1, 6]    # concat joins along an existing axis
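Note that tf.stack can also be expressed in terms of tf.concat by expanding each input first. A minimal sketch of that equivalence, using the same TF1-era tf.random_normal as the example above (in TF 2 the op is tf.random.normal):

import tensorflow as tf

t1 = tf.random_normal([1, 3])
t2 = tf.random_normal([1, 3])

# tf.stack inserts a new axis of size len(values) at the given index ...
stacked = tf.stack([t1, t2], axis=1)  # shape [1, 2, 3]

# ... which is equivalent to expanding each input to [1, 1, 3] and then
# concatenating the expanded tensors along that new axis.
expanded = [tf.expand_dims(t, axis=1) for t in (t1, t2)]
concatenated = tf.concat(expanded, axis=1)  # shape [1, 2, 3], same values as `stacked`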