
How to understand the term `tensor` in TensorFlow?

I am new to TensorFlow. While reading the existing documentation, I found the term tensor really confusing. To clear this up, I need answers to the following questions:

  1. What is the relationship between tensor and Variable, tensor vs. tf.constant, and tensor vs. tf.placeholder?
  2. Are they all types of tensors?
asked Jun 16 '16 by ZijunLost


People also ask

What does tensor mean in TensorFlow?

A tensor is a generalization of vectors and matrices to potentially higher dimensions. Internally, TensorFlow represents tensors as n-dimensional arrays of base datatypes. Each element in the Tensor has the same data type, and the data type is always known.
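For concreteness, a minimal sketch of tensors of different ranks (TF 1.x API; the names are illustrative):

import tensorflow as tf

scalar = tf.constant(3.0)                # rank 0, shape ()
vector = tf.constant([1.0, 2.0])         # rank 1, shape (2,)
matrix = tf.constant([[1, 2], [3, 4]])   # rank 2, shape (2, 2)

print(scalar.shape, vector.shape, matrix.shape)
print(matrix.dtype)  # all elements share a single dtype, here tf.int32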

Why is TensorFlow called tensor?

These multi-dimensional arrays are also called tensors. To run operations on the data set, you construct a computational graph similar to a flow chart that determines how data flows from one operation to the next. So it's called TensorFlow because you're defining how data or tensors will flow through the system.

How do you evaluate a tensor in TensorFlow?

The easiest way to evaluate the actual value of a Tensor object is to pass it to the Session.run() method, or to call Tensor.eval() when you have a default session (i.e. inside a with tf.Session(): block).
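For example, a minimal sketch showing both options (TF 1.x API):

import tensorflow as tf

t = tf.constant([1, 2, 3])

with tf.Session() as sess:
    print(sess.run(t))  # pass the Tensor to Session.run()
    print(t.eval())     # Tensor.eval() uses the default session inside the with block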

How do you define a tensor?

In mathematics, a tensor is an algebraic object that describes a multilinear relationship between sets of algebraic objects related to a vector space. Objects that tensors may map between include vectors and scalars, and even other tensors.


3 Answers

TensorFlow doesn't have first-class Tensor objects, meaning that there is no notion of a Tensor in the underlying graph that's executed by the runtime. Instead, the graph consists of op nodes connected to each other, representing operations. An operation allocates memory for its outputs, which are available on endpoints :0, :1, etc., and you can think of each of these endpoints as a Tensor. If you have a tensor corresponding to nodename:0, you can fetch its value as sess.run(tensor) or sess.run('nodename:0').

Execution granularity happens at the operation level, so the run method will execute the op, which will compute all of the endpoints, not just the :0 endpoint. It's possible to have an op node with no outputs (like tf.group), in which case there are no tensors associated with it. It is not possible to have tensors without an underlying op node.

You can examine what happens in the underlying graph by doing something like this:

import tensorflow as tf

tf.reset_default_graph()
value = tf.constant(1)
# Dump the GraphDef: you'll see a single op node named "Const"
print(tf.get_default_graph().as_graph_def())

So with tf.constant you get a single operation node, and you can fetch it using sess.run("Const:0") or sess.run(value).
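For instance (a minimal sketch; Const is the name TensorFlow assigns to the first constant node by default):

import tensorflow as tf

tf.reset_default_graph()
value = tf.constant(1)

with tf.Session() as sess:
    print(sess.run(value))      # fetch via the Python Tensor object
    print(sess.run("Const:0"))  # fetch via the endpoint name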

Similarly, value=tf.placeholder(tf.int32) creates a regular node with name Placeholder, and you could feed it as feed_dict={"Placeholder:0":2} or feed_dict={value:2}. You cannot feed and fetch a placeholder in the same session.run call, but you can see the result by attaching a tf.identity node on top and fetching that.
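A minimal sketch of that identity trick (Placeholder is TensorFlow's default name for the first placeholder node):

import tensorflow as tf

tf.reset_default_graph()
value = tf.placeholder(tf.int32)
echo = tf.identity(value)  # extra node so the fed value can also be fetched

with tf.Session() as sess:
    print(sess.run(echo, feed_dict={value: 2}))            # feed via the Python object
    print(sess.run(echo, feed_dict={"Placeholder:0": 2}))  # feed via the endpoint name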

For a Variable:

import tensorflow as tf

tf.reset_default_graph()
value = tf.Variable(tf.ones_initializer()(()))  # scalar variable initialized to 1
value2 = value + 3
# Dump the GraphDef: note the "Variable" and "Variable/read" nodes
print(tf.get_default_graph().as_graph_def())

You'll see that it creates two nodes, Variable and Variable/read; the :0 endpoint is a valid value to fetch on both of these nodes. However, Variable:0 has a special ref type, meaning it can be used as an input to mutating operations. The result of the Python call tf.Variable is a Python Variable object, and there's some Python magic to substitute Variable/read:0 or Variable:0 depending on whether mutation is necessary. Since most ops have only one endpoint, :0 is dropped. Another example is Queue: its close() method will create a new Close op node which connects to the Queue op. To summarize: operations on Python objects like Variable and Queue map to different underlying TensorFlow op nodes depending on usage.
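A minimal sketch of the two endpoints in action (continuing the snippet above; assign() mutates through the ref-typed Variable:0):

import tensorflow as tf

tf.reset_default_graph()
value = tf.Variable(tf.ones_initializer()(()))
assign_op = value.assign(5.0)  # mutation goes through the ref endpoint Variable:0

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run("Variable/read:0"))  # the read endpoint => 1.0
    sess.run(assign_op)
    print(sess.run(value))              # => 5.0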

For ops like tf.split or tf.nn.top_k, which create nodes with multiple endpoints, Python's session.run call automatically wraps the output in a tuple or collections.namedtuple of Tensor objects, which can be fetched individually.
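For example, with tf.nn.top_k (a minimal sketch):

import tensorflow as tf

x = tf.constant([1, 3, 2])
top = tf.nn.top_k(x, k=2)  # a namedtuple of two Tensors: values and indices

with tf.Session() as sess:
    result = sess.run(top)       # a namedtuple of numpy arrays
    print(result.values)         # => [3 2]
    print(result.indices)        # => [1 2]
    print(sess.run(top.values))  # each endpoint can also be fetched alone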

answered by Yaroslav Bulatov


From the glossary:

A Tensor is a typed multi-dimensional array. For example, a 4-D array of floating point numbers representing a mini-batch of images with dimensions [batch, height, width, channel].

Basically, all data in TensorFlow is a Tensor (hence the name):

  • placeholders are Tensors to which you can feed a value (with the feed_dict argument in sess.run())
  • Variables are Tensors which you can update (with var.assign()). Technically speaking, tf.Variable is not a subclass of tf.Tensor though
  • tf.constant is just the most basic Tensor, which contains a fixed value given when you create it

However, in the graph, every node is an operation, which can have Tensors as inputs or outputs.
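Putting the three kinds together, a minimal sketch (TF 1.x API):

import tensorflow as tf

p = tf.placeholder(tf.float32)  # fed at run time
v = tf.Variable(2.0)            # updatable state
c = tf.constant(3.0)            # fixed value

result = p + v + c  # an op node whose output is yet another Tensor

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(result, feed_dict={p: 1.0}))  # => 6.0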

answered by Olivier Moindrot


As already mentioned by others, yes, they are all tensors.

The way I understood them was to first visualize and understand 1D, 2D, 3D, 4D, 5D, and 6D tensors as in the picture below. (source: knoldus)

[figure: tensor-definition — 1D through 6D tensors visualized]

Now, in the context of TensorFlow, you can imagine a computation graph like the one below:

[figure: computation-graph]

Here, the ops take two tensors a and b as input; each tensor is multiplied with itself, and the results of these multiplications are then added to produce the result tensor t3. These multiplication and addition ops happen at the nodes of the computation graph.

And these tensors a and b can be constant tensors, Variable tensors, or placeholders. It doesn't matter, as long as they are of the same data type and have compatible shapes (or shapes broadcastable to one another) for the operations.
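A minimal sketch of that graph, here using a constant for a and a placeholder for b (the choice is illustrative; Variables would work the same way):

import tensorflow as tf

a = tf.constant([1.0, 2.0])
b = tf.placeholder(tf.float32, shape=[2])

t1 = tf.multiply(a, a)  # a squared, element-wise
t2 = tf.multiply(b, b)  # b squared, element-wise
t3 = tf.add(t1, t2)     # the result tensor

with tf.Session() as sess:
    print(sess.run(t3, feed_dict={b: [3.0, 4.0]}))  # => [10. 20.]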

answered by kmario23