This self-contained program, when run from the Windows command line (i.e. test.exe > test.txt), emits an extra byte for reasons I cannot understand.
#include <stdio.h>

int main() {
    int N = 50;
    for (int i = 0; i < N; ++i)
        printf("%c%c%c", (int)(i / (float)N * 255), 0, 255);
}
One would expect test.txt to have 150 bytes, but it has 151. Looking at it with a hex editor you can see this:
0000 ff05 00ff 0d0a 00ff 0f00 ff14 00ff
1900 ff1e 00ff 2300 ff28 00ff 2d00 ff33
00ff 3800 ff3d 00ff 4200 ff47 00ff 4c00
ff51 00ff 5600 ff5b 00ff 6000 ff66 00ff
6b00 ff70 00ff 7500 ff7a 00ff 7f00 ff84
00ff 8900 ff8e 00ff 9300 ff99 00ff 9e00
ffa3 00ff a800 ffad 00ff b200 ffb7 00ff
bc00 ffc1 00ff c600 ffcc 00ff d100 ffd6
00ff db00 ffe0 00ff e500 ffea 00ff ef00
fff4 00ff f900 ff
The third iteration of the loop seems to be the culprit, where four bytes are emitted instead of three: 0d0a00ff. I cannot for the life of me figure out why this would happen. I compiled this with Visual Studio 2015, in case that matters.
On the third iteration (i = 2), (int)(2 / (float)50 * 255) is 10, which is the ASCII code of the newline character (LF). Because stdout is opened in text mode on Windows, the runtime translates that byte to its usual CRLF (0d 0a) representation, adding the extra byte.
The only way I know of to avoid this behaviour on Windows is to switch stdout to binary mode with _setmode(_fileno(stdout), _O_BINARY) from <io.h> and <fcntl.h> (the portable freopen trick is limited on Windows), but seeing as you want to write binary data, using the text-formatting function printf and standard output is perhaps not the ideal approach anyway.