I have no idea why something like this should be slow:
steps <- 500
samples <- 100000
s_0 <- 2.1
r <- .02
sigma <- .2
k <- 1.9
# one column per simulated path, one row per time step
at <- matrix(nrow = steps + 1, ncol = samples)
at[1, ] <- s_0
for (j in 1:samples)
{
  for (i in 2:(steps + 1))
  {
    # advance path j by a single Gaussian increment
    at[i, j] <- at[i - 1, j] + sigma * sqrt(.0008) * rnorm(1)
  }
}
I tried to rewrite this using sapply, but it was still awful from a performance standpoint.
Am I missing something here? This would take seconds in C++, or even in the bloated C#.
R can vectorize certain operations. In your case you can get rid of the outer loop with the following change, which draws a whole row of increments at once with rnorm(samples) instead of calling rnorm(1) once per path:
for (i in 2:(steps + 1))
{
  at[i, ] <- at[i - 1, ] + sigma * sqrt(.0008) * rnorm(samples)
}
According to system.time, the original version takes 6.83 s for samples = 1000, while the modified one takes 0.09 s.
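Such a measurement is just the loop wrapped in system.time, e.g.:

system.time({
  for (i in 2:(steps + 1))
  {
    at[i, ] <- at[i - 1, ] + sigma * sqrt(.0008) * rnorm(samples)
  }
})

You can also drop both loops entirely: each path is just s_0 plus the cumulative sum of its increments, so every increment can be drawn in one rnorm call and accumulated per column. A minimal sketch in base R (apply still iterates over the columns, but cumsum itself runs at C speed):

# draw every increment for every path in a single call
increments <- matrix(sigma * sqrt(.0008) * rnorm(steps * samples),
                     nrow = steps, ncol = samples)
# each column is one path: a row of s_0 followed by s_0 plus the running sums
at <- rbind(rep(s_0, samples), s_0 + apply(increments, 2, cumsum))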