In TensorFlow 2.0, the main "tensors" we see are in fact EagerTensors (tensorflow.python.framework.ops.EagerTensor, to be more precise):
import tensorflow as tf

x = [[2.]]
m = tf.matmul(x, x)
type(m)
# returns tensorflow.python.framework.ops.EagerTensor
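Because the multiplication runs immediately in eager mode, the result already holds a concrete value. A minimal sketch showing that the value can be pulled out right away with .numpy():

```python
import tensorflow as tf

x = [[2.0]]
m = tf.matmul(x, x)

# Eager tensors already carry a computed value, no session or graph needed.
print(m.numpy())  # [[4.]]
```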
But in some cases we get the symbolic Tensor object (tensorflow.python.framework.ops.Tensor), as in TF 1.x.
For example, in Keras:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
model = Sequential()
model.add(Dense(units=64, activation='relu', input_dim=100))
model.add(Dense(units=10, activation='softmax'))
type(model.outputs[0])
# returns tensorflow.python.framework.ops.Tensor
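To see the difference concretely, you can compare a symbolic Keras output tensor with an eager one. A minimal sketch building the same two-layer model with the functional API (tf.keras.Input instead of input_dim; in recent TF versions the symbolic output may be a KerasTensor rather than ops.Tensor, but the idea is the same: shape and dtype are known, the value is not):

```python
import tensorflow as tf

# Same architecture as above, built with the functional API.
inputs = tf.keras.Input(shape=(100,))
hidden = tf.keras.layers.Dense(64, activation='relu')(inputs)
outputs = tf.keras.layers.Dense(10, activation='softmax')(hidden)
model = tf.keras.Model(inputs, outputs)

sym = model.outputs[0]        # symbolic: a placeholder in a graph, no value yet
eager = tf.constant([[1.0]])  # eager: holds a concrete value right now

print(sym.shape, sym.dtype)   # shape/dtype are known even without a value
print(eager.numpy())          # only eager tensors can hand back their value
```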
So, what are the uses of these symbolic tensorflow.python.framework.ops.Tensor objects in TensorFlow?
In the TF library internals: Keras at least uses these tensors, but are they used in other places too (anything that builds a graph, like tf.function or tf.data.Dataset)?
Well, yes, your instinct is correct here. An EagerTensor represents a tensor whose value has been calculated in eager mode, whereas a Tensor represents a node in a graph that may not yet have been evaluated.
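You can watch this happen with tf.function: while TensorFlow traces the Python function into a graph, the argument is a symbolic Tensor with no concrete value. A minimal sketch:

```python
import tensorflow as tf

@tf.function
def square(x):
    # During tracing, x is a symbolic Tensor: shape and dtype are known,
    # but there is no value to inspect yet.
    print("traced with:", x)
    return x * x

result = square(tf.constant(3.0))  # tracing happens on the first call
print(result.numpy())              # the returned value is eager again
```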
In the API: is there an actual use for end-users of these?
Well, on some level we use them all the time: we create Keras models, tf.data.Dataset pipelines, etc. But for the vast majority of use cases we don't instantiate or interact with the Tensor object directly, so you probably just want to not worry about the object type and consider it an implementation detail internal to TensorFlow.
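tf.data is a good example of using symbolic tensors without ever touching them directly: the function passed to map() is traced into a graph, so its argument is symbolic, yet from the outside you only see ordinary values. A minimal sketch:

```python
import tensorflow as tf

ds = tf.data.Dataset.from_tensor_slices([1, 2, 3])

def double(x):
    # x is a symbolic Tensor here: map() traces this function into a graph.
    return x * 2

# From the caller's side, only concrete values are visible.
doubled = list(ds.map(double).as_numpy_iterator())
print(doubled)  # [2, 4, 6]
```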