For example, I have a tensor A = tf.Variable([a, b, c, d, e])
and through tf.tile() it can give a tensor like [a, b, c, d, e, a, b, c, d, e].
But I want to reform A into something like [a, a, b, b, c, c, d, d, e, e], where each element is duplicated in place.
What is the most efficient way (fewest operations) to achieve that, using differentiable ops?
You can achieve that using tf.tile. You pass it a list whose length equals the number of dimensions in the tensor to be replicated; each value in that list is how many times to replicate along the corresponding dimension.
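For instance, a minimal sketch of how each entry in the multiples list maps to a dimension (the values here are illustrative, not from the question):
import tensorflow as tf
x = tf.constant([[1, 2], [3, 4]])      # shape (2, 2)
y = tf.tile(x, multiples=[2, 3])       # repeat 2x along axis 0, 3x along axis 1
# y has shape (4, 6)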
The tf.clone() function is used to create a copy of a tensor: it creates a new tensor with the same shape and values as another tensor.
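Note that tf.clone() belongs to the TensorFlow.js API; a rough Python analogue (an assumption, not stated in the answer above) is tf.identity:
import tensorflow as tf
a = tf.constant([1, 2, 3])
b = tf.identity(a)   # new tensor with the same shape and values as a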
Use a Lambda layer to split a tensor of shape (64, 16, 16) into (64, 1, 1, 256) and then subset any indexes you need.
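A hedged sketch of that idea (the layer and the exact reshape are assumptions; the element count is preserved since 16 * 16 = 256):
import tensorflow as tf
# hypothetical Keras Lambda layer: reshape each sample's (64, 16, 16) block into (64, 1, 1, 256)
reshape_layer = tf.keras.layers.Lambda(lambda t: tf.reshape(t, (-1, 64, 1, 1, 256)))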
The tf.tile() function is used to create a tensor by repeating the input the number of times given by reps. Note: this function creates a new tensor by replicating the input reps times. For example, tiling [1, 2, 3, 4] by [3] produces [1, 2, 3, 4, 1, 2, 3, 4, 1, 2, 3, 4].
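A minimal Python sketch of that example:
import tensorflow as tf
t = tf.constant([1, 2, 3, 4])
tiled = tf.tile(t, multiples=[3])
# tiled == [1 2 3 4 1 2 3 4 1 2 3 4]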
You can do it by adding a dimension, tiling along that dimension, and removing it:
import tensorflow as tf
A = tf.constant([1, 2, 3, 4, 5])
B = tf.expand_dims(A, axis=-1)
C = tf.tile(B, multiples=[1,2])
D = tf.reshape(C, shape=[-1])
with tf.Session() as sess:
    print('A:\n{}'.format(A.eval()))
    print('B:\n{}'.format(B.eval()))
    print('C:\n{}'.format(C.eval()))
    print('D:\n{}'.format(D.eval()))
gives
A:
[1 2 3 4 5]
B: # Add inner dimension
[[1]
[2]
[3]
[4]
[5]]
C: # Tile along inner dimension
[[1 1]
[2 2]
[3 3]
[4 4]
[5 5]]
D: # Remove innermost dimension
[1 1 2 2 3 3 4 4 5 5]
Edit: as pointed out in the comments, using tf.stack allows you to specify the additional dimension on the fly:
F = tf.stack([A, A], axis=1)
F = tf.reshape(F, shape=[-1])
with tf.Session() as sess:
    print(F.eval())
[1 1 2 2 3 3 4 4 5 5]
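If your TensorFlow version provides tf.repeat (newer releases do), the same result can be obtained in a single differentiable op; a minimal sketch, assuming TF 2.x eager execution:
A = tf.constant([1, 2, 3, 4, 5])
print(tf.repeat(A, repeats=2).numpy())  # [1 1 2 2 3 3 4 4 5 5]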