What is the utility of `Tensor` (as opposed to `EagerTensor`) in TensorFlow 2.0?

In TensorFlow 2.0, the main "tensors" we see are in fact EagerTensors (`tensorflow.python.framework.ops.EagerTensor`, to be precise):

import tensorflow as tf

x = [[2.]]
m = tf.matmul(x, x)
type(m)
# returns tensorflow.python.framework.ops.EagerTensor

But in some cases we still encounter the symbolic `Tensor` object (`tensorflow.python.framework.ops.Tensor`), as in TF 1.x.
For example, in Keras:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
model.add(Dense(units=64, activation='relu', input_dim=100))
model.add(Dense(units=10, activation='softmax'))
type(model.outputs[0])
# returns tensorflow.python.framework.ops.Tensor

So, what is the use of these symbolic `tensorflow.python.framework.ops.Tensor` objects in TensorFlow 2.0?

  • In the TF library internals: Keras at least uses these tensors, but are they used elsewhere (anywhere that builds a graph, such as tf.function or tf.data.Dataset)?
  • In the API: is there an actual use for end-users of these?
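To check the first bullet empirically, a small sketch: functions passed to `tf.data` transformations such as `map` are traced into a graph, so the argument the function receives during tracing is a symbolic tensor with no concrete value (no `.numpy()` method), even in TF 2.x:

```python
import tensorflow as tf

# Functions passed to tf.data transformations are traced into a graph,
# so their argument is symbolic during tracing: shape and dtype are
# known, but there is no concrete value (and no .numpy() method).
def double(x):
    print("concrete during tracing?", hasattr(x, "numpy"))
    return x * 2

ds = tf.data.Dataset.range(3).map(double)
print(list(ds.as_numpy_iterator()))  # [0, 2, 4]
```

Iterating the dataset then yields ordinary concrete values, while the traced function only ever saw the symbolic placeholder.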
Phylliade asked Aug 26 '19

People also ask

What is the difference between Eagertensor and tensor?

EagerTensor represents a tensor whose value has been calculated in eager mode, whereas Tensor represents a tensor node in a graph that may not yet have been calculated.

How are tensors used in TensorFlow?

A tensor is a generalization of vectors and matrices to potentially higher dimensions. Internally, TensorFlow represents tensors as n-dimensional arrays of base datatypes. Each element in the tensor has the same data type, and the data type is always known.
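A quick illustration of those properties, using a rank-2 tensor (a matrix): every element shares one dtype, and shape and dtype are always known.

```python
import tensorflow as tf

# A rank-2 tensor (matrix): every element shares one dtype,
# and shape/dtype are always known attributes of the tensor.
t = tf.constant([[1, 2, 3], [4, 5, 6]])
print(t.shape)             # (2, 3)
print(t.dtype)             # <dtype: 'int32'>
print(tf.rank(t).numpy())  # 2
```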

What does `from_tensor_slices` do?

In NLP applications, you can use tensor slicing to perform word masking while training. For example, you can generate training data from a list of sentences by choosing a word index to mask in each sentence, taking the word out as a label, and then replacing the chosen word with a mask token.
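A minimal sketch of the masking step described above. The token ids and the `MASK_ID` constant here are illustrative, not from any particular vocabulary:

```python
import tensorflow as tf

# Hypothetical masking step: slice out the token at mask_index as the
# label, then rebuild the sentence with a mask id in its place.
sentence = tf.constant([101, 2023, 2003, 1037, 7953, 102])  # token ids (illustrative)
mask_index = 2
MASK_ID = 103  # illustrative mask token id

label = sentence[mask_index]  # the word taken out as the label
masked = tf.concat(
    [sentence[:mask_index], [MASK_ID], sentence[mask_index + 1:]], axis=0
)
print(label.numpy())   # 2003
print(masked.numpy())  # [ 101 2023  103 1037 7953  102]
```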

Why are TensorFlow tensors immutable?

Exactly. When you build a tensor, TensorFlow is essentially building a math model that computes whatever you want your output to be. Hence the immutability: the value of the tensor is not something you set directly, but rather a description of "what needs to be computed to produce the result the user asked for".
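Immutability is easy to see in practice: a tensor rejects item assignment, and mutable state instead goes through `tf.Variable`:

```python
import tensorflow as tf

t = tf.constant([1, 2, 3])
try:
    t[0] = 99  # Tensors are immutable: no item assignment
except TypeError as e:
    print("immutable:", e)

# For mutable state, use tf.Variable and its assign methods instead.
v = tf.Variable([1, 2, 3])
v[0].assign(99)
print(v.numpy())  # [99  2  3]
```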


1 Answer

In the TF library internals: Keras at least uses these tensors, but are they used elsewhere (anywhere that builds a graph, such as tf.function or tf.data.Dataset)?

Well, yes. Your instinct is correct here. EagerTensor represents a tensor whose value has been calculated in eager mode, whereas Tensor represents a tensor node in a graph that may not yet have been calculated.
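That distinction can be demonstrated with `tf.function`. During tracing, the function's argument is the symbolic graph node (no concrete value, so no `.numpy()` method); the value returned to the caller after the graph runs is a concrete EagerTensor:

```python
import tensorflow as tf

@tf.function
def f(x):
    # During tracing, x is a symbolic graph Tensor: shape and dtype are
    # known, but no value has been computed yet, so there is no .numpy().
    print("concrete during tracing?", hasattr(x, "numpy"))
    return tf.matmul(x, x)

out = f(tf.constant([[2.0]]))
# The returned value IS concrete (an EagerTensor) once the graph runs.
print(hasattr(out, "numpy"), out.numpy())  # True [[4.]]
```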

In the API: is there an actual use for end-users of these?

Well, on some level we use them all the time: we create Keras models, tf.data.Dataset pipelines, etc. But for the vast majority of use cases we don't instantiate or interact with the Tensor object directly, so you can usually treat it as an implementation detail internal to TensorFlow and not worry about the object type.
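A sketch of that typical end-user flow: the symbolic tensor in `model.outputs` is internal plumbing, while the value you actually work with, from calling the model on data, is a concrete EagerTensor (the exact symbolic class may vary across TF/Keras versions, but it never carries a concrete value):

```python
import tensorflow as tf

# Build a small model; Keras wires it up using symbolic tensors.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(4),
])

sym = model.outputs[0]        # symbolic: no concrete value, no .numpy()
out = model(tf.ones((2, 3)))  # concrete EagerTensor you actually use
print(hasattr(sym, "numpy"), hasattr(out, "numpy"))  # False True
```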

Stewart_R answered Oct 22 '22