Looping over a tensor

I am trying to process a tensor of variable size, the way one would in Python, something like:

# X is of shape [m, n]
for x in X:
    process(x)

I have tried to use tf.scan. The thing is that I want to process every sub-tensor, so I tried a nested scan, but I was unable to do it, because tf.scan works with an accumulator: if none is given, it takes the first entry of elems as the initializer, which I don't want. As an example, suppose I want to add one to every element of my tensor (this is just an example), and I want to process it element by element. If I run the code below, one is only added to part of the tensor, because scan treats the first sub-tensor as the initializer, along with the first element of every sub-tensor.

import numpy as np
import tensorflow as tf

batch_x = np.random.randint(0, 10, size=(5, 10))
x = tf.placeholder(tf.float32, shape=[None, 10])

def inner_loop(x_in):
    return tf.scan(lambda _, x_: x_ + 1, x_in)

outer_loop = tf.scan(lambda _, input_: inner_loop(input_), x, back_prop=True)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    rs = sess.run(outer_loop, feed_dict={x: batch_x})
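The accumulator behaviour I am describing can be reproduced in plain NumPy (a sketch of tf.scan's semantics, not TensorFlow code; the helper name `scan_like` is made up):

```python
import numpy as np

def scan_like(fn, elems, initializer=None):
    # Sketch of tf.scan's accumulator rule: with no initializer,
    # the first entry of elems becomes the starting accumulator
    # and is emitted unchanged.
    if initializer is None:
        acc, rest, out = elems[0], elems[1:], [elems[0]]
    else:
        acc, rest, out = initializer, elems, []
    for e in rest:
        acc = fn(acc, e)
        out.append(acc)
    return np.stack(out)

# The first row passes through untouched -- exactly the problem above.
print(scan_like(lambda _, e: e + 1, np.zeros((3, 4))))
```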

Any suggestions?

asked Apr 10 '17 by Mohamed Lakhal



2 Answers

To loop over a tensor you could try tf.unstack

Unpacks the given dimension of a rank-R tensor into rank-(R-1) tensors.

So adding 1 to each tensor would look something like:

import numpy as np
import tensorflow as tf

# tf.unstack needs the axis-0 size to be statically known, so use a fixed shape here
x = tf.placeholder(tf.float32, shape=(5, 10))
x_unpacked = tf.unstack(x)  # defaults to axis 0, returns a list of tensors

processed = [] # this will be the list of processed tensors
for t in x_unpacked:
    # do whatever
    result_tensor = t + 1
    processed.append(result_tensor)

output = tf.stack(processed)  # reassemble the processed rows into the original shape

with tf.Session() as sess:
    print(sess.run([output], feed_dict={x: np.zeros((5, 10))}))

Obviously you can further unpack each tensor from the list to process it, down to single elements. To avoid lots of nested unpacking though, you could maybe try flattening x with tf.reshape(x, [-1]) first, and then loop over it like

flattened_unpacked = tf.unstack(tf.reshape(x, [-1]))
for elem in flattened_unpacked:
    process(elem)

In this case elem is a scalar.
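A runnable end-to-end version of this flatten/process/reassemble pattern, sketched in plain NumPy (the per-element op is the same illustrative `+ 1`):

```python
import numpy as np

x = np.arange(6, dtype=np.float32).reshape(2, 3)

# Flatten, peel off each scalar (the NumPy analogue of tf.unstack),
# process it, then stack everything back into the original shape.
flattened = x.reshape(-1)
processed = [elem + 1 for elem in flattened]
result = np.stack(processed).reshape(x.shape)
```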

answered Sep 21 '22 by Dzjkb


Most of TensorFlow's built-in functions can be applied elementwise, so you can just pass a whole tensor into the function. Like:

outer_loop = inner_loop(x)
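For instance, a function built only from elementwise ops already handles a scalar and a full batch with the same code (a NumPy sketch; `my_func` is an illustrative name):

```python
import numpy as np

def my_func(t):
    # Composed of elementwise ops only, so it applies to a scalar,
    # a row, or a whole [m, n] batch without any explicit loop.
    return np.tanh(t) + 1

scalar_out = my_func(np.float32(0.0))   # works on a single element
batch_out = my_func(np.zeros((5, 10)))  # same code on the full tensor
```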

However, if you have some function that cannot be applied this way (it would be really interesting to see that function), you could use map_fn.

Say, your function simply adds 1 to every element of a tensor (or whatever):

inputs = tf.placeholder...

def my_elementwise_func(x):
    return x + 1

def recursive_map(inputs):
    # tf.shape() returns a tensor, so check the static rank instead
    if inputs.shape.ndims > 0:
        return tf.map_fn(recursive_map, inputs)
    else:
        return my_elementwise_func(inputs)

result = recursive_map(inputs)
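The same recursion can be checked quickly in plain NumPy (a sketch of the idea; `np.stack` plays the role of tf.map_fn reassembling its outputs):

```python
import numpy as np

def my_elementwise_func(x):
    return x + 1

def recursive_map(inputs):
    # Peel one dimension per recursion level until scalars remain,
    # then apply the elementwise function -- mirroring the tf.map_fn version.
    if np.ndim(inputs) > 0:
        return np.stack([recursive_map(row) for row in inputs])
    return my_elementwise_func(inputs)

print(recursive_map(np.zeros((2, 3))))
```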
answered Sep 20 '22 by Dmitriy Danevskiy