#include <stdio.h>
#include <string.h>
int main()
{
printf("%d\n",sizeof("S\065AB"));
printf("%d\n",sizeof("S65AB"));
printf("%d\n",sizeof("S\065\0AB"));
printf("%d\n",sizeof("S\06\05\0AB"));
printf("%d\n",sizeof("S6\05AB"));
printf("%d\n",sizeof("\0S65AB"));
return 0;
}
Output:
5
6
6
7
6
7
http://ideone.com/kw23IV
Can anyone explain this behaviour of character string literals?
I am using GCC on Debian 7.4.
The size of a string literal is the number of characters in it, including the trailing null byte that is added. If there are embedded nulls in the string, they do not stop the count; they are counted like any other character. It is unrelated to strlen(), except that if the literal contains no embedded nulls, strlen(s) == sizeof(s) - 1.
printf("%zu\n", sizeof("S\065AB")); // 5: '\065' is a single character
printf("%zu\n", sizeof("S65AB")); // 6
printf("%zu\n", sizeof("S\065\0AB")); // 6: '\065' is a single character
printf("%zu\n", sizeof("S\06\05\0AB")); // 7: '\06' and '\05' are single chars
printf("%zu\n", sizeof("S6\05AB")); // 6: '\05' is a single character
printf("%zu\n", sizeof("\0S65AB")); // 7
Note that '\377' is a valid octal constant, equivalent to '\xFF' or 255. You can use them in strings, too. The value '\0' is only a special case of a more general octal constant.
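For example (a quick sketch; the cast to unsigned char just keeps the printed value the same whether plain char is signed or not):

#include <stdio.h>

int main(void)
{
    printf("%d %d\n", (unsigned char)'\377', (unsigned char)'\xFF');  /* 255 255 */
    printf("%zu\n", sizeof("\377AB"));  /* 4: '\377' is one character, plus 'A', 'B' and the null */
    return 0;
}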
Note that sizeof evaluates to a value of type size_t, and the correct length modifier for size_t in C99 and C11 is z; since size_t is unsigned, u is more appropriate than d, hence the "%zu\n" format that I used.
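If you have to support a pre-C99 compiler that lacks the z modifier, the usual workaround (not part of the original code, just a sketch) is to cast to a known unsigned type:

#include <stdio.h>

int main(void)
{
    size_t n = sizeof("S\065AB");
    printf("%zu\n", n);                 /* C99/C11: %zu matches size_t */
    printf("%lu\n", (unsigned long)n);  /* pre-C99 fallback: cast to unsigned long */
    return 0;
}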
A literal character string is an array of exactly the size needed to hold all the characters plus an extra terminating zero byte. So, "hello" has type char[6] and sizeof yields 6.
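You can see that directly (a minimal sketch):

#include <stdio.h>
#include <string.h>

int main(void)
{
    char copy[] = "hello";            /* initialised from the literal: type char[6] */
    printf("%zu\n", sizeof "hello");  /* 6 */
    printf("%zu\n", sizeof copy);     /* 6 */
    printf("%zu\n", strlen(copy));    /* 5: strlen() does not count the terminating '\0' */
    return 0;
}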