Segmentation fault without defining an unused array

I'm trying to write a simple program that prints, in hex, the first 16 kilobytes of a binary file (a Game Boy ROM) in 16-bit chunks. However, during the for loop the program invariably segfaults, though it crashes at a different point in the array each time. Here is the code:

#include <stdio.h>
#include <stdint.h>

int main ()
{
    uint16_t buffer[8000];
    FILE* ROM = fopen("rom.gb", "rb");
    if (ROM == NULL)
    {
        printf("Error");
        fclose(ROM);
        return 1;
    }
    fread(buffer, sizeof(buffer), 1, ROM);
    int i;
    for(i = 0; i < sizeof(buffer); ++i)
    {
        if (buffer[i] < 16)
        {
            printf("000%x ", buffer[i]);
        }
        else if (buffer[i] < 256)
        {
            printf("00%x ", buffer[i]);
        }
        else if (buffer[i] < 4096)
        {
            printf("0%x ", buffer[i]);
        }
        else
        {
            printf("%x ", buffer[i]);
        }
    }
    fclose(ROM);
    return 0; 
}

Before I changed from char to uint16_t (since the Game Boy has a 16-bit address space) this did not occur, and in fact if I include the declaration

unsigned char buffer2[16000]; 

next to the declaration of the first buffer, I get the expected output. So my questions are: why would adding an unused variable stop the program from segfaulting, and how can I avoid having to declare a huge array that is never used anywhere in the program?

Xerxes asked Jan 05 '15

1 Answer

In this line:

for(i = 0; i < sizeof(buffer); ++i)

sizeof(buffer) is the size of the array in bytes (16000 here, since each uint16_t occupies two bytes), not the number of elements, so the loop runs far past the end of the array and reads memory it doesn't own. If you want the number of elements, use

i < (sizeof(buffer) / sizeof(buffer[0]))
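
For example, a minimal corrected loop (a sketch, keeping the rest of your program unchanged) might look like this; the %04x format is just an optional tidy-up that zero-pads to four hex digits, replacing the if/else chain from the question:

size_t i;
for (i = 0; i < sizeof(buffer) / sizeof(buffer[0]); ++i)  /* 8000 elements, not 16000 bytes */
{
    printf("%04x ", buffer[i]);  /* each 16-bit value, zero-padded to 4 hex digits */
}

As for why the unused buffer2 seemed to help: reading past the end of buffer is undefined behaviour, and adding another 16000-byte array most likely just changed the stack layout so the out-of-bounds reads happened to land in memory the process could still access.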

David Ranieri answered Oct 06 '22