I'm trying to write a simple program that outputs, in hex, the first 16 kilobytes of a binary file (a Game Boy ROM) in 16-bit chunks. During the for loop the program invariably segfaults, but it always segfaults at a different point in the array. Here is the code:
#include <stdio.h>
#include <stdint.h>

int main ()
{
    uint16_t buffer[8000];
    FILE* ROM = fopen("rom.gb", "rb");
    if (ROM == NULL)
    {
        printf("Error");
        fclose(ROM);
        return 1;
    }
    fread(buffer, sizeof(buffer), 1, ROM);
    int i;
    for(i = 0; i < sizeof(buffer); ++i)
    {
        if (buffer[i] < 16)
        {
            printf("000%x ", buffer[i]);
        }
        else if (buffer[i] < 256)
        {
            printf("00%x ", buffer[i]);
        }
        else if (buffer[i] < 4096)
        {
            printf("0%x ", buffer[i]);
        }
        else
        {
            printf("%x ", buffer[i]);
        }
    }
    fclose(ROM);
    return 0;
}
Before I switched from char to uint16_t (since the Game Boy has a 16-bit address space) this did not occur, and in fact if I include the declaration
unsigned char buffer2[16000];
next to the declaration of the first buffer I get the expected output. So my questions are: why would adding an unused variable stop the program from segfaulting, and how can I avoid having to declare a huge array that is never used in the program?
The problem is in this line:
for(i = 0; i < sizeof(buffer); ++i)
sizeof(buffer) is the size of the array in bytes (16000, since buffer holds 8000 two-byte uint16_t elements), not the number of elements, so the loop indexes up to buffer[15999] and reads far past the end of the array. That out-of-bounds read is undefined behavior, which is why the crash point varies from run to run and why declaring an adjacent unused buffer2 appears to "fix" it: the stray accesses then land in buffer2's storage instead of unmapped memory. If you want the number of elements, use
i < (sizeof(buffer) / sizeof(buffer[0]))
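Applied to the original program, a minimal corrected sketch might look like the following (it keeps the rom.gb file name from the question). Note that %04x replaces the manual zero-padding chain, looping over the count returned by fread avoids printing uninitialized elements if the file is shorter than 16 KB, and fclose() is not called in the error branch because passing NULL to fclose() is itself undefined behavior:

#include <stdio.h>
#include <stdint.h>

int main (void)
{
    uint16_t buffer[8000];

    FILE* ROM = fopen("rom.gb", "rb");
    if (ROM == NULL)
    {
        printf("Error\n");
        return 1;               /* nothing was opened, so nothing to fclose() */
    }

    /* fread returns the number of elements actually read. */
    size_t count = fread(buffer, sizeof(buffer[0]), 8000, ROM);

    size_t i;
    for (i = 0; i < count; ++i) /* element count, not byte count */
    {
        printf("%04x ", buffer[i]);  /* %04x zero-pads to 4 hex digits */
    }

    fclose(ROM);
    return 0;
}

With the loop bounded by the element count there is no need for the unused buffer2 at all; the original version only "worked" by accident because the out-of-bounds reads happened to stay inside that extra array.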