It feels like I am missing something obvious here. This has been asked a number of times - and the answer usually boils down to:
var num = 4.5;
num % 1 === 0; // false - 4.5 is a decimal
But this fails for
var num = 1.0; // or 2.0, 3.0, ...
num % 1 // 0
Unfortunately, these don't work either:
num.toString() // "1"
typeof num // "number"
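For what it's worth, Number.isInteger (ES2015) can't tell them apart either, since the literal notation is already gone by the time the value exists:
Number.isInteger(1);   // true
Number.isInteger(1.0); // true - same value, so same result
Number.isInteger(4.5); // false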
I am writing a JavaScript color parsing library, and I want to process input differently depending on whether it is given as 1.0 or as 1. In my case, 1.0 really means 100% and 1 means 1. Otherwise, both rgb 1 1 1 and rgb 1 255 255 will be parsed as rgb 255 255 255, since right now I am taking anything <= 1 to mean a ratio.
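To make the collision concrete, here is a minimal sketch of the heuristic I mean (parseChannel is just an illustrative name, not part of my actual library):
// Treat any channel value <= 1 as a ratio of 255.
function parseChannel(token) {
  var value = parseFloat(token);
  return value <= 1 ? Math.round(value * 255) : value;
}

parseChannel('1');   // 255 - but here the plain integer 1 was intended
parseChannel('1.0'); // 255 - here 100% (i.e. 255) really is intended
parseChannel('255'); // 255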
Those numbers aren't actually decimals or integers. They're all floats. The only real difference between 1 and 1.0 is the notation that was used to create floats of equal values.
1 === 1.0; // true
parseInt('1') == parseInt('1.0'); // true
parseFloat('1') === parseFloat('1.0'); // true
parseInt('1') === parseFloat('1'); // true
// etc...
Also, to demonstrate that they are really the same underlying data type:
typeof(1); // 'number'
typeof(1.0); // 'number'
Also, note that 'number' isn't ambiguous in JavaScript the way it would be in some other languages, because numbers are always floats.
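So if the 1.0-versus-1 distinction matters, it has to be captured before the string is converted to a number. A minimal sketch of one way to do that (looksLikeRatio is a made-up helper, not anything built in):
// Inspect the raw token: a decimal point signals ratio notation.
function looksLikeRatio(token) {
  return token.indexOf('.') !== -1;
}

looksLikeRatio('1.0'); // true  -> treat as 100%, i.e. 255
looksLikeRatio('1');   // false -> treat as the plain integer 1
looksLikeRatio('0.5'); // true  -> treat as 50%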