I have a list of tensors, all of the same shape, and I would like to sum the entire list along an axis.
Does torch.cumsum perform this operation along a dim? If so, does it require the list to be converted into a single tensor first and then summed over?
You don't need cumsum; sum is your friend. And yes, you should first combine the list into a single tensor with stack or cat, depending on your needs, something like this:
import torch

my_list = [torch.randn(3, 5), torch.randn(3, 5)]
# stack -> (2, 3, 5); the first sum collapses the list dim -> (3, 5);
# the second sum then reduces along dim 0 of each tensor -> (5,)
result = torch.stack(my_list, dim=0).sum(dim=0).sum(dim=0)
print(result.shape)  # torch.Size([5])
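For comparison, here is a small sketch of the two combining options mentioned above (the shapes are illustrative): stack adds a new dim, so summing over it gives the elementwise sum of the list, while cat joins the tensors along an existing dim before reducing.

```python
import torch

my_list = [torch.randn(3, 5), torch.randn(3, 5)]

# Option 1: stack adds a new dim 0 -> (2, 3, 5);
# summing over it yields the elementwise sum of the list -> (3, 5)
elementwise = torch.stack(my_list, dim=0).sum(dim=0)
print(elementwise.shape)  # torch.Size([3, 5])

# Option 2: cat joins along an existing dim -> (6, 5);
# summing over dim 0 then mixes rows across list elements -> (5,)
joined = torch.cat(my_list, dim=0)
col_sums = joined.sum(dim=0)
print(joined.shape, col_sums.shape)  # torch.Size([6, 5]) torch.Size([5])
```

Use stack when you want the list summed elementwise (preserving each tensor's shape), and cat only when you actually want the tensors glued end to end before reducing.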