Is .data still useful in PyTorch?

I'm new to PyTorch. I've read a lot of PyTorch code that heavily uses a tensor's .data member. But searching for .data in the official documentation and on Google turns up very little. I guess .data contains the data in the tensor, but when do we need it and when not?

Maybe asked Aug 08 '18


1 Answer

.data was an attribute of Variable (an object wrapping a Tensor with history tracking, e.g. for automatic update), not of Tensor itself. Actually, .data gave access to the Variable's underlying Tensor.

However, since PyTorch version 0.4.0, Variable and Tensor have been merged (into an updated Tensor structure), so .data disappeared along with the previous Variable object (well, Variable is still there for backward compatibility, but it is deprecated).
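Since the merge, a plain Tensor can track history by itself; no Variable wrapper is needed. A minimal sketch:

```python
import torch

# Tensors created with requires_grad=True track operations directly
x = torch.ones(3, requires_grad=True)
y = (x * x).sum()   # y carries the computation history
y.backward()        # populates x.grad with d(y)/d(x) = 2 * x
print(x.grad)       # tensor([2., 2., 2.])
```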


Paragraph from Release Notes for version 0.4.0 (I recommend reading the whole section about Variable/Tensor updates):

What about .data?

.data was the primary way to get the underlying Tensor from a Variable. After this merge, calling y = x.data still has similar semantics. So y will be a Tensor that shares the same data with x, is unrelated with the computation history of x, and has requires_grad=False.
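These semantics can be seen in a short sketch: y = x.data shares storage with x, but carries no history and no gradient requirement.

```python
import torch

x = torch.ones(3, requires_grad=True)
y = x.data              # shares storage with x, detached from the graph

print(y.requires_grad)  # False
y[0] = 42.0             # writes into x's storage as well
print(x[0].item())      # 42.0
```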

However, .data can be unsafe in some cases. Any changes on x.data wouldn't be tracked by autograd, and the computed gradients would be incorrect if x is needed in a backward pass. A safer alternative is to use x.detach(), which also returns a Tensor that shares data with requires_grad=False, but will have its in-place changes reported by autograd if x is needed in backward.
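The difference is easy to demonstrate with sigmoid, whose backward pass reuses its output: overwriting that output through .data goes unnoticed and yields silently wrong (zero) gradients, while the same in-place edit through .detach() makes autograd raise an error.

```python
import torch

def grads_after_overwrite(use_detach: bool):
    a = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
    out = a.sigmoid()          # sigmoid's backward needs the original `out`
    view = out.detach() if use_detach else out.data
    view.zero_()               # in-place change to the shared storage
    out.sum().backward()
    return a.grad

# .data: the change is invisible to autograd -> silently wrong gradients
print(grads_after_overwrite(use_detach=False))  # tensor([0., 0., 0.])

# .detach(): autograd notices the in-place edit and raises a RuntimeError
try:
    grads_after_overwrite(use_detach=True)
except RuntimeError as e:
    print("caught:", e)
```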

benjaminplanche answered Oct 01 '22