This may be more of a mathematical question than a programming one. In JS I wanted a function that returns a random integer in an interval, let's say 1-6, and this is what I found:
// Returns a random integer between min and max (both inclusive)
// Using Math.round() instead of Math.floor() would give a non-uniform distribution!
function getRandomInt(min, max) {
  return Math.floor(Math.random() * (max - min + 1)) + min;
}
I feel guilty if I copy and paste this into my code. I don't understand it: why do we subtract min from max, add 1, multiply the result by Math.random(), and then add min? I tried several numbers manually on paper and it works just fine! But I don't understand why!
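Rather than checking values by hand, a quick brute-force sketch (assuming a JavaScript runtime such as Node.js; the loop count is arbitrary) confirms the formula only ever produces integers in the requested range:

```javascript
// Same function as above, repeated here so the snippet is self-contained.
function getRandomInt(min, max) {
  return Math.floor(Math.random() * (max - min + 1)) + min;
}

// Roll a six-sided die many times and record which faces appear.
const seen = new Set();
for (let i = 0; i < 100000; i++) {
  const roll = getRandomInt(1, 6);
  // Every result must be an integer and lie inside 1..6.
  if (!Number.isInteger(roll) || roll < 1 || roll > 6) {
    throw new Error(`out-of-range value: ${roll}`);
  }
  seen.add(roll);
}
// With this many rolls, all six faces will almost certainly show up,
// which also demonstrates that both endpoints 1 and 6 are reachable.
```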
Assuming you already understand the behaviour of Math.floor and Math.random, here's the rest step by step:
- Math.random() ↝ a random number between 0 (inclusive) and 1 (exclusive)
- Math.random() * max ↝ a random number between 0 (inclusive) and max (exclusive)
- Math.floor(Math.random() * max) ↝ a random integer between 0 (incl.) and max (excl.)
- Math.floor(Math.random() * (max - min)) + min ↝ a random integer between min (incl.) and max (excl.)
- Math.floor(Math.random() * ((max + 1) - min)) + min ↝ a random integer between min (incl.) and max+1 (excl.), i.e. between min and max, both inclusive