I'm running a Monte Carlo simulation for a Simulink model with a MATLAB script that looks more or less like this:
model = 'modelName';
load_system(model)
for ii = 1 : numberOfMC
    % Some set_param...
    % Some values are set
    sim(model);
    results{ii, 1} = numberOfMC;
    % etc...
end
close_system(model,0);
As the number of Monte Carlo trials increases, the time of a single simulation increases as well, so the runtime grows roughly like n^2 instead of linearly.
Is there a simple explanation for that, and is there a solution that keeps the runtime linear?
Thank you!
EDIT:
When I split my simulation into 6 batches and run them in series, the sum of the simulation times is far less than when I run the entire simulation in one shot.
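For reference, the batched version I'm comparing against looks roughly like this (the batch size computation, the tic/toc timing and storing simOut are illustrative placeholders, not my exact script):

model = 'modelName';
batchSize = ceil(numberOfMC / 6);
results = cell(numberOfMC, 1);
for batchStart = 1 : batchSize : numberOfMC
    load_system(model)
    tBatch = tic;
    for ii = batchStart : min(batchStart + batchSize - 1, numberOfMC)
        % Some set_param... (same parameter sweep as above)
        simOut = sim(model);
        results{ii, 1} = simOut;
    end
    fprintf('batch starting at trial %d took %.1f s\n', batchStart, toc(tBatch));
    close_system(model, 0);
end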
You can change the start time and stop time for the simulation by entering new values in the Start time and Stop time fields. The default start time is 0.0 seconds and the default stop time is 10.0 seconds. Simulation time and actual clock time are not the same.
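For scripted runs, the same start and stop times can also be set programmatically; a minimal sketch (using the 'StartTime' and 'StopTime' model parameters, with 'modelName' as a placeholder):

model = 'modelName';
load_system(model)
set_param(model, 'StartTime', '0.0');   % values are passed as character strings
set_param(model, 'StopTime',  '10.0');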
One strategy to reduce the variance of the Monte-Carlo estimate is to attempt to develop a corresponding estimate based instead on a sequence of variates Xi which have desirable correlations, resulting in cancellations in the sum that yield a smaller effective variance for the estimate.
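As a plain-MATLAB illustration of that idea (antithetic variates, assuming we estimate E[g(Z)] for standard normal Z; the choice g = exp is only an example), pairing each draw z with -z makes the two halves of the sum negatively correlated:

g = @(z) exp(z);                                % integrand, example only
n = 1e5;
z = randn(n, 1);
plainEstimate      = mean(g(z));                % ordinary Monte Carlo
antitheticEstimate = mean((g(z) + g(-z)) / 2);  % antithetic pairing of z and -z
fprintf('plain: %.4f  antithetic: %.4f\n', plainEstimate, antitheticEstimate);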
As it seems there is a limit to what one can do without feedback from the asker, I will just post my comment as an answer:
My bet would be memory issues. To rule this out, see whether the problem still occurs if you don't store the results in the first place; simply remove this line:
results{ii, 1} = numberOfMC;
Also confirm that you don't have other growing variables, and that you don't accidentally make the input more complex as you go. It is probably not relevant, but does the time also increase like this if you do all simulations in reverse order? Or if you do the full number of iterations, but each time with exactly the same input?
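A minimal way to check both points at once could be a loop that stores nothing except the wall-clock time of each iteration (the timing variables here are my own, not from the original script):

model = 'modelName';
load_system(model)
iterTime = zeros(numberOfMC, 1);
for ii = 1 : numberOfMC
    % Same set_param calls as before, or deliberately the exact same input
    tIter = tic;
    sim(model);
    iterTime(ii) = toc(tIter);   % keep only a scalar per iteration
end
close_system(model, 0);
plot(iterTime), xlabel('iteration'), ylabel('seconds')

If the per-iteration time stays flat here, the slowdown comes from the growing results cell array (or some other accumulating state in the workspace), not from sim() itself.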