As of PyTorch 0.4 this question is no longer valid. In 0.4, Tensors and Variables were merged.
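For context, on PyTorch 0.4 or later the distinction below simply disappears: every tensor can participate in autograd via the `requires_grad` flag, so the multiplication that fails in the question works directly. A minimal sketch (values taken from the question's example):

```python
import torch

# Since PyTorch 0.4, torch.Tensor and torch.autograd.Variable are one class;
# gradient tracking is controlled per-tensor with requires_grad.
x = torch.tensor([[1., 2.], [3., 4.]], requires_grad=True)
y = torch.tensor([[5., 6.], [7., 8.]])  # plain tensor, no gradients tracked

z = x * y           # element-wise multiplication, no Variable wrapping needed
z.sum().backward()  # gradients flow back to x
# x.grad equals y, since d(x * y)/dx = y element-wise
```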
How can I perform element-wise multiplication between a variable and a tensor in PyTorch? Multiplying two tensors works fine, as does multiplying a variable by a scalar. But when I attempt element-wise multiplication between a variable and a tensor I get:
XXXXXXXXXXX in mul
assert not torch.is_tensor(other)
AssertionError
For example, when running the following:
import torch
x_tensor = torch.Tensor([[1, 2], [3, 4]])
y_tensor = torch.Tensor([[5, 6], [7, 8]])
x_variable = torch.autograd.Variable(x_tensor)
print(x_tensor * y_tensor)
print(x_variable * 2)
print(x_variable * y_tensor)
I would expect the first and last print statements to show similar results. The first two multiplications work as expected, with the error coming up in the third. I have attempted the aliases of * in PyTorch (i.e. x_variable.mul(y_tensor), torch.mul(y_tensor, x_variable), etc.).
It seems that element-wise multiplication between a tensor and a variable is not supported given the error and the code which produces it. Is this correct? Or is there something I'm missing? Thank you!
Yes, you are correct. Element-wise multiplication (like most other operations) is only supported for Tensor * Tensor or Variable * Variable, but not for Tensor * Variable.
To perform your multiplication above, wrap your Tensor as a Variable that doesn't require gradients. The additional overhead is insignificant.
y_variable = torch.autograd.Variable(y_tensor, requires_grad=False)
x_variable * y_variable # returns Variable
But only use Variables if you actually need automatic differentiation through a graph; otherwise, perform the operation on the Tensors directly, as you did in your question.
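Putting the fix together with the question's script, a complete corrected version might look like the following (using the pre-0.4 `Variable` API from the question; on 0.4+ the wrapper is a deprecated no-op that returns a plain tensor):

```python
import torch

x_tensor = torch.Tensor([[1, 2], [3, 4]])
y_tensor = torch.Tensor([[5, 6], [7, 8]])

# Wrap both operands as Variables; y does not need gradients,
# so the graph only tracks operations for x.
x_variable = torch.autograd.Variable(x_tensor)
y_variable = torch.autograd.Variable(y_tensor, requires_grad=False)

result = x_variable * y_variable  # Variable * Variable: supported
print(result)
```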