I have a matrix like
A = [ 1 2 4
      2 3 1
      3 1 2 ]
and I would like to calculate its cumulative sum by row and by column, that is, I want the result to be
B = [ 1  3  7
      3  8 13
      6 12 19 ]
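For reference, the example matrix can be reproduced in R like this (just a setup sketch; byrow = TRUE keeps the rows in the order written above):

# Example matrix from above; byrow = TRUE so the rows read as written
A <- matrix(c(1, 2, 4,
              2, 3, 1,
              3, 1, 2), nrow = 3, byrow = TRUE)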
Any ideas of how to do this in R in a fast way (possibly using the function cumsum)? I have huge matrices.
Thanks!
A one-liner:
t(apply(apply(A, 2, cumsum), 1, cumsum))
The underlying observation is that you can first compute the cumulative sums down the columns and then take the cumulative sums of that intermediate matrix along the rows.
Note: apply(..., 1, cumsum) returns its row-wise results as columns, so the final matrix has to be transposed.
Your example:
> apply(A, 2, cumsum)
     [,1] [,2] [,3]
[1,]    1    2    4
[2,]    3    5    5
[3,]    6    6    7
> t(apply(apply(A, 2, cumsum), 1, cumsum))
     [,1] [,2] [,3]
[1,]    1    3    7
[2,]    3    8   13
[3,]    6   12   19
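If you need this in more than one place, the two passes can be wrapped into a small helper function. This is just a sketch; cumsum2d is my own name, not a base R function:

# Two-dimensional cumulative sum: cumsum down the columns, then along the
# rows; apply(..., 1, ...) returns its results as columns, hence the final t().
cumsum2d <- function(M) {
  t(apply(apply(M, 2, cumsum), 1, cumsum))
}

B <- cumsum2d(A)   # same result as the one-liner above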
About performance: I have no idea how well this approach scales to big matrices. Complexity-wise it should be close to optimal, and apply is usually not that bad performance-wise either.
Now I was getting curious: which approach is the better one? A short benchmark:
> A <- matrix(runif(1000*1000, 1, 500), 1000)
>
> system.time(
+ B <- t(apply(apply(A, 2, cumsum), 1, cumsum))
+ )
   user  system elapsed
  0.082   0.011   0.093
>
> system.time(
+ C <- lower.tri(diag(nrow(A)), diag = TRUE) %*% A %*% upper.tri(diag(ncol(A)), diag = TRUE)
+ )
   user  system elapsed
  1.519   0.016   1.530
Thus: apply outperforms matrix multiplication by a factor of roughly 15. (Just for comparison: MATLAB needed 0.10719 seconds.) The results are not really surprising, as the apply version runs in O(n^2), while the matrix multiplication needs approximately O(n^2.7) operations. Thus, whatever low-level optimizations matrix multiplication offers are lost once n is big enough.
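As a quick sanity check (outside the timing), the two approaches should agree up to floating-point rounding, which is why a tolerance-based comparison is used:

# B (apply version) and C (matrix-multiplication version) compute the same
# sums in a different order, so compare with a tolerance rather than ==.
all.equal(B, C)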