
Issue feeding a list into feed_dict in TensorFlow

I'm trying to pass a list into feed_dict, however I'm having trouble doing so. Say I have:

inputs = 10 * [tf.placeholder(tf.float32, shape=(batch_size, input_size))] 

where inputs is fed into some function whose outputs I want to compute. So to run this in TensorFlow, I created a session and ran the following:

sess.run(outputs, feed_dict = {inputs: data})  #data is my list of inputs, which is also of length 10 

but I get an error, TypeError: unhashable type: 'list'. However, I'm able to pass the data element-wise like so:

sess.run(outputs, feed_dict = {inputs[0]: data[0], ..., inputs[9]: data[9]})  

So I'm wondering if there's a way I can solve this issue. I've also tried to construct a dictionary (using a for loop), but this results in a dictionary with a single element, where the key is: tensorflow.python.framework.ops.Tensor at 0x107594a10

asked Nov 13 '15 by d-roy

2 Answers

There are two issues that are causing problems here:

The first issue is that the Session.run() call only accepts a small number of types as the keys of the feed_dict. In particular, lists of tensors are not supported as keys, so you have to put each tensor as a separate key.* One convenient way to do this is using a dictionary comprehension:

inputs = [tf.placeholder(...), ...]
data = [np.array(...), ...]
sess.run(y, feed_dict={i: d for i, d in zip(inputs, data)})
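Since the comprehension itself is plain Python, its behavior can be sketched without TensorFlow at all; the Placeholder class below is a hypothetical stand-in for a placeholder tensor, not a real TensorFlow object:

```python
# Plain-Python sketch of building a feed dict with zip + a dict comprehension.
# "Placeholder" is a hypothetical stand-in for a tf.placeholder tensor.
class Placeholder:
    pass

inputs = [Placeholder() for _ in range(3)]
data = [[1.0], [2.0], [3.0]]

# zip pairs each placeholder with its data positionally,
# so the i-th array is fed to the i-th placeholder.
feed_dict = {i: d for i, d in zip(inputs, data)}

assert len(feed_dict) == 3
assert feed_dict[inputs[0]] == [1.0]
```

Note that zip pairs by position, so the order of inputs and data must match.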

The second issue is that the 10 * [tf.placeholder(...)] syntax in Python creates a list with ten elements, where each element is the same tensor object (i.e. has the same name property, the same id property, and is reference-identical if you compare two elements from the list using inputs[i] is inputs[j]). This explains why, when you tried to create a dictionary using the list elements as keys, you ended up with a dictionary with a single element - because all of the list elements were identical.

To create 10 different placeholder tensors, as you intended, you should instead do the following:

inputs = [tf.placeholder(tf.float32, shape=(batch_size, input_size))
          for _ in xrange(10)]

If you print the elements of this list, you'll see that each element is a tensor with a different name.
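The difference between the two constructions can be observed in plain Python, independent of TensorFlow (the Tensor class here is just a hypothetical stand-in for a placeholder):

```python
# Hypothetical stand-in for a placeholder tensor.
class Tensor:
    pass

shared = 10 * [Tensor()]                  # ten references to ONE object
distinct = [Tensor() for _ in range(10)]  # ten separate objects

print(shared[0] is shared[9])      # True: same object
print(distinct[0] is distinct[9])  # False: different objects

# Using the elements as dict keys collapses the first list to one entry,
# which is exactly the single-element dictionary seen in the question.
print(len({t: None for t in shared}))    # 1
print(len({t: None for t in distinct}))  # 10
```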


EDIT: * You can now pass tuples as the keys of a feed_dict, because tuples are hashable and so can be used as dictionary keys.

answered Sep 16 '22 by mrry


Here is a correct example:

batch_size, input_size, n = 2, 3, 2  # in your case n = 10
x = tf.placeholder(tf.float32, shape=(n, batch_size, input_size))
y = tf.add(x, x)

data = np.random.rand(n, batch_size, input_size)

sess = tf.Session()
print sess.run(y, feed_dict={x: data})

And here is a strange thing I see in your approach. For some reason you use 10 * [tf.placeholder(...)], which gives you a list of ten (batch_size, input_size) placeholders. It's not clear why you do this, when you could just create one tensor of rank 3 (where the first dimension is 10).

Because you have a list of tensors (and not a single tensor), you cannot feed your data to that list directly (whereas in my case the data can be fed to the one rank-3 tensor).
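If the data already exists as a list of ten (batch_size, input_size) arrays, it can be stacked into a single rank-3 array before feeding. This sketch uses only NumPy, so the session call itself is omitted; the shapes are assumed to match the placeholder above:

```python
import numpy as np

batch_size, input_size, n = 2, 3, 10

# Ten separate (batch_size, input_size) arrays, as in the question...
pieces = [np.random.rand(batch_size, input_size) for _ in range(n)]

# ...stacked along a new leading axis into one rank-3 array, matching
# a placeholder of shape (n, batch_size, input_size).
data = np.stack(pieces)
print(data.shape)  # (10, 2, 3)
```

The resulting data array can then be fed as the single value for the rank-3 placeholder.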

answered Sep 18 '22 by Salvador Dali