The TensorFlow documentation states that a Variable can be used any place a Tensor can be used, and they seem to be fairly interchangeable. For example, if v is a Variable, then x = 1.0 + v becomes a Tensor.
What is the difference between the two, and when would I use one over the other?
The most important difference between Variables and Tensors is mutability. The values in a Variable object can be updated (e.g., with the assign() function), whereas the values of a Tensor object cannot be updated; you can only create a new Tensor object holding the new values.
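A minimal TF 2.x sketch of that difference (the values here are just for illustration):

```python
import tensorflow as tf

# A Variable holds mutable state: its contents can be changed in place.
v = tf.Variable([1.0, 2.0])
v.assign([3.0, 4.0])        # overwrite the stored values
v.assign_add([1.0, 1.0])    # in-place increment -> [4.0, 5.0]

# A Tensor is immutable: operations produce new Tensors instead.
t = tf.constant([1.0, 2.0])
t2 = t + 1.0                # t is unchanged; t2 is a brand-new Tensor
# t.assign([3.0, 4.0])      # would fail: Tensors have no assign() method
```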
A variable is a piece of state, a value that can be modified by running operations on it. In TensorFlow, variables are created with the Variable() constructor, which expects an initial value; that initial value can be a Tensor of any type and shape.
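For example, a short sketch of creating variables from different initial values (the names are only illustrative):

```python
import tensorflow as tf

# The initial value determines the variable's shape and dtype (by default).
w = tf.Variable(tf.random.normal([3, 2]), name="w")  # 3x2 float32 matrix
b = tf.Variable(tf.zeros([2]))                        # length-2 float32 vector
flag = tf.Variable(True)                              # scalar bool
```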
According to the official PyTorch documentation, both classes are a multi-dimensional matrix containing elements of a single data type and have the same API; almost any operation available on a Tensor can also be performed on a Variable. The difference between a Tensor and a Variable is that a Variable is a wrapper around a Tensor.
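A small sketch of that wrapper, with the caveat that torch.autograd.Variable is a legacy API that was merged into Tensor in PyTorch 0.4 (the values below are only illustrative):

```python
import torch
from torch.autograd import Variable  # legacy wrapper, merged into Tensor since PyTorch 0.4

t = torch.ones(2, 2)                  # plain Tensor
v = Variable(t, requires_grad=True)   # wrapped so autograd tracks operations on it
y = (v * 3).sum()
y.backward()                          # gradients accumulate in v.grad
print(v.grad)                         # a 2x2 tensor of 3s
```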
In TensorFlow 1.x graph code, tf.placeholder is used for input data, while tf.Variable is used to store state.
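A minimal sketch of that TF 1.x-style distinction, assuming the compat.v1 API is available (shapes and values are just examples):

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

x = tf.placeholder(tf.float32, shape=[None, 3])  # placeholder: fed with input data at run time
w = tf.Variable(tf.zeros([3, 1]))                # Variable: persistent, mutable state
y = tf.matmul(x, w)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())   # Variables must be initialized
    print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))
```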
It's true that a Variable can be used any place a Tensor can, but the key differences between the two are that a Variable maintains its state across multiple calls to run() and that a Variable's value can be updated during training, e.g., by backpropagation (it can also be saved, restored, etc., as per the documentation).
These differences mean that you should think of a Variable as representing your model's trainable parameters (for example, the weights and biases of a neural network), while a Tensor represents the data being fed into your model and the intermediate representations of that data as it passes through your model.
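To make that concrete, here is a small sketch of a TF 2.x training loop: the weights and bias are Variables that the optimizer updates, while the inputs and intermediate results are ordinary Tensors (all names and values are illustrative):

```python
import tensorflow as tf

w = tf.Variable(tf.random.normal([3, 1]))   # trainable parameters: Variables
b = tf.Variable(tf.zeros([1]))

x = tf.constant([[1.0, 2.0, 3.0]])          # input data: a Tensor
y_true = tf.constant([[10.0]])

opt = tf.keras.optimizers.SGD(learning_rate=0.01)
for _ in range(100):
    with tf.GradientTape() as tape:
        y_pred = tf.matmul(x, w) + b                    # intermediate Tensor
        loss = tf.reduce_mean((y_pred - y_true) ** 2)   # another Tensor
    grads = tape.gradient(loss, [w, b])
    opt.apply_gradients(zip(grads, [w, b]))             # only the Variables are updated
```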