 

What's the purpose of torch.autograd.Variable?

I load features and labels from my training dataset. Both of them are originally numpy arrays, but I convert them to torch tensors using torch.from_numpy(features.copy()) and torch.tensor(labels.astype(np.bool)).
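A minimal sketch of that conversion, with placeholder data standing in for the real dataset:

import numpy as np
import torch

# hypothetical stand-ins for the real features and labels
features = np.random.rand(4, 3).astype(np.float32)
labels = np.array([0, 1, 1, 0])

features = torch.from_numpy(features.copy())
labels = torch.tensor(labels.astype(bool))  # plain bool; np.bool is a deprecated alias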

I also noticed that torch.autograd.Variable looks something like a placeholder in TensorFlow.

When training my network, I first tried

features = features.cuda()
labels = labels.cuda()

outputs = Config.MODEL(features)
loss = Config.LOSS(outputs, labels)

Then I tried

features = features.cuda()
labels = labels.cuda()

input_var = Variable(features)
target_var = Variable(labels)
outputs = Config.MODEL(input_var)
loss = Config.LOSS(outputs, target_var)

Both blocks train successfully, but I worry that there might be a subtle difference between them.

asked Aug 20 '19 by yujuezhao


People also ask

What does Variable() do in PyTorch?

A PyTorch Variable is a wrapper around a PyTorch Tensor, and represents a node in a computational graph. If x is a Variable then x.data is a Tensor giving its value, and x.grad is another Variable holding the gradient of x with respect to some scalar value.

How does PyTorch autograd work?

Autograd is a reverse automatic differentiation system. Conceptually, autograd records a graph of all the operations that created the data as you execute them, giving you a directed acyclic graph whose leaves are the input tensors and whose roots are the output tensors.
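A minimal sketch of that graph in action (the values are arbitrary):

import torch

# leaf tensor: requires_grad=True tells autograd to record operations on it
x = torch.tensor([2.0, 3.0], requires_grad=True)

# each operation adds a node to the DAG; y is the root
y = (x ** 2).sum()

# backward() walks the graph from the root back to the leaves
y.backward()

print(x.grad)  # tensor([4., 6.]), since d(sum(x^2))/dx = 2x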

Does PyTorch use Autograd?

torch.autograd is PyTorch's automatic differentiation engine that powers neural network training.

What is Autograd in Python?

Autograd is a Python package for automatic differentiation that can automatically differentiate Python and NumPy code. To install it: pip install autograd.
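Note that this autograd package is separate from torch.autograd. A minimal example of it on its own:

import autograd.numpy as np  # thinly wrapped NumPy
from autograd import grad

def f(x):
    return np.sin(x) * x

df = grad(f)    # df is a new function that computes f'(x)
print(df(1.0))  # x*cos(x) + sin(x) at x=1, roughly 1.38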




1 Answer

According to this question, you no longer need Variables to use PyTorch autograd.

Thanks to @skytree, we can make this even more explicit: Variables have been deprecated, i.e. you are not supposed to use them anymore.

Autograd automatically supports Tensors with requires_grad set to True.

And more importantly

Variable(tensor) and Variable(tensor, requires_grad) still work as expected, but they return Tensors instead of Variables.

This means that if your features and labels are tensors already (which they seem to be in your example), Variable(features) and Variable(labels) will just return the same tensors again.
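You can verify this directly; a small sketch with made-up tensors:

import torch
from torch.autograd import Variable

features = torch.randn(2, 3)    # stand-in for the real features
input_var = Variable(features)  # deprecated wrapper

print(type(input_var))                   # <class 'torch.Tensor'>, not a Variable
print(torch.equal(input_var, features))  # True: the wrapper adds nothing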

The original purpose of Variables was to be able to use automatic differentiation (Source):

Variables are just wrappers for the tensors so you can now easily auto compute the gradients.
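With Variables gone, the same automatic differentiation is requested by setting requires_grad on the tensor itself; a minimal sketch:

import torch

w = torch.randn(3, requires_grad=True)  # plain Tensor, tracked by autograd
loss = (w * w).sum()
loss.backward()  # gradients computed automatically
print(w.grad)    # equals 2 * w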

answered Oct 21 '22 by Florian Blume