I'd like to know if there is an easy way of determining the maximum number of characters required to print a decimal int.
I know <limits.h> contains definitions like INT_MAX that give the maximum value an int can hold, but that is not what I want.
I'd like to be able to do something like:
int get_int( void )
{
    char draft[ MAX_CHAR_OF_A_DECIMAL_INT ];
    fgets( draft, sizeof( draft ), stdin );
    return strtol( draft, NULL, 10 );
}
But how do I find the value of MAX_CHAR_OF_A_DECIMAL_INT in a portable, low-overhead way?
Thanks!
Who would read such a number? A double (typically) has only about 15 significant decimal digits; everything beyond that would just be a long run of leading or trailing zeros. You can print far more than 15 decimal digits, but only about 15 of them actually carry significance.
If you assume CHAR_BIT is 8 (required on POSIX, so a safe assumption for any code targeting POSIX systems as well as any other mainstream system like Windows), a cheap safe formula is 3*sizeof(int)+2. If not, you can make it 3*sizeof(int)*CHAR_BIT/8+2, or there's a slightly simpler version.
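As a minimal sketch of how the question's get_int could use this bound (assuming CHAR_BIT is 8; the headers, the fgets failure check, and the choice to return 0 on failure are additions for illustration, not part of the original snippet):

#include <stdio.h>
#include <stdlib.h>

/* 3 decimal digits per 8-bit byte, plus 1 for a possible sign and 1 for the null terminator */
#define MAX_CHAR_OF_A_DECIMAL_INT ( 3 * sizeof( int ) + 2 )

int get_int( void )
{
    char draft[ MAX_CHAR_OF_A_DECIMAL_INT ];

    if ( fgets( draft, sizeof( draft ), stdin ) == NULL )
        return 0;   /* EOF or read error; real error handling is up to the caller */

    return (int) strtol( draft, NULL, 10 );
}

Since sizeof(int) is an integer constant expression, the array above is an ordinary fixed-size array, not a VLA.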
In case you're interested in the reason this works: sizeof(int) is essentially a logarithm of INT_MAX (roughly log base 2^CHAR_BIT), and conversion between logarithms of different bases (e.g. to base 10) is just multiplication. In particular, 3 is an integer approximation/upper bound on log base 10 of 256. The +2 is to account for a possible sign and null termination.
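For example, on a typical platform with 32-bit int and CHAR_BIT of 8 (an assumption for this illustration), the formula gives 3*4+2 = 14, comfortably above the 12 characters actually needed for "-2147483648" plus its terminating null. A small sketch that compares the bound with what the current platform really needs:

#include <stdio.h>
#include <limits.h>

int main( void )
{
    char buf[ 3 * sizeof( int ) + 2 ];

    /* INT_MIN is the worst case: it carries both a sign and the most digits.
       snprintf returns the number of characters needed, excluding the null. */
    int needed = snprintf( buf, sizeof( buf ), "%d", INT_MIN );

    printf( "bound: %zu, actually needed (incl. null): %d\n",
            sizeof( buf ), needed + 1 );
    return 0;
}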