
Experimenting with C - Why can't I allocate and use 2GB of memory?

Tags: c, windows, malloc

I continue experimenting with C. I have this program that allows you to decide how much RAM you want to eat.

#include <stdio.h>
#include <stdlib.h>

char * eatRAM(void)
{
    unsigned long long toEat;
    unsigned long long i;
    float input;
    char * pMemory = NULL;
    int megaByte = 1048576;    /* bytes in one megabyte */

    puts("How much RAM do you want to eat? (in Mega Bytes)");
    puts("NOTE: If you want to eat more RAM than you have available\nin your system, the program will crash");
    printf("\n>> MB: ");
    scanf("%f", &input);

    toEat = (unsigned long long)(input * megaByte);
    pMemory = malloc(toEat);

    if(pMemory != NULL)
    {
        printf("\n\nEating in total: %llu Bytes\n", toEat);
        puts("Check your task manager!\n");

        /* Touch every byte so the OS actually commits the pages. */
        for(i = 0; i < toEat; i++)
        {
            pMemory[i] = 'x';
        }
    }
    else
    {
        puts("\nSeems like that amount of memory couldn't be allocated :( \n");
    }
    return pMemory;
}
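
For completeness, a minimal main that could be appended after the function above (it relies on the includes already shown there). The pause is only there so you can watch the Task Manager before the memory is released:

int main(void)
{
    char * pMemory = eatRAM();

    if (pMemory != NULL)
    {
        puts("Press Enter to release the memory and exit.");
        getchar();          /* consumes the newline left behind by scanf */
        getchar();          /* waits for the actual key press */
        free(pMemory);
    }
    return 0;
}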

UPDATED QUESTION:

The thing is: if I enter, for example, 1024 MB, it works, and I can see in the Task Manager that it is using 1 GB of RAM. Even if I enter 1500 MB it works.

But if I enter 2048 MB it says

Seems like that amount of memory couldn't be allocated :(

and the same happens even if I enter 1756 MB.

Remember, I'm new to C. Maybe I'm omitting something important about how the OS lets me access memory. What could it be?
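
One way to see where the limit sits on a given machine is to binary-search for the largest single malloc that still succeeds. This probe is only a sketch added for illustration, not part of the original program:

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t lo = 0;          /* largest size known to succeed */
    size_t hi = SIZE_MAX;   /* a size that is sure to fail */

    /* Narrow the interval down to 1 MB precision. */
    while (hi - lo > 1024 * 1024)
    {
        size_t mid = lo + (hi - lo) / 2;
        void * p = malloc(mid);

        if (p != NULL)
        {
            free(p);
            lo = mid;
        }
        else
        {
            hi = mid;
        }
    }

    printf("Largest single malloc: about %llu MB\n",
           (unsigned long long)(lo / (1024 * 1024)));
    return 0;
}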

Asked by Juan Bonnett on Nov 29 '15


1 Answer

A 32-bit process on Windows has a 2-gigabyte address space available by default, the bottom half of the full pow(2, 32) address space; the top 2 GB is used by the operating system. Since just about nobody runs a 32-bit OS anymore, you can get 4 GB when you link your program with /LARGEADDRESSAWARE.
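
You can ask Windows how much user-mode address space your process actually got. A minimal sketch using GetSystemInfo (this little test program is an addition for illustration, not part of the question):

#include <stdio.h>
#include <windows.h>

int main(void)
{
    SYSTEM_INFO si;
    GetSystemInfo(&si);

    /* The user-mode address range for this process. A plain 32-bit build
       typically ends just under 2 GB; with /LARGEADDRESSAWARE on a 64-bit
       OS it should end near 4 GB. */
    printf("User address space: %p .. %p\n",
           si.lpMinimumApplicationAddress,
           si.lpMaximumApplicationAddress);
    return 0;
}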

That 2 GB of VM space needs to be shared by code and data. Your program typically loads at 0x00400000, and any operating system DLLs you use (like kernel32.dll and ntdll.dll) have high load addresses (beyond 0x7F000000). At the very least, the startup thread's stack and the default process heap are created before your program starts running; their addresses are generally unpredictable.
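
You can get a rough feel for that layout by printing a few addresses from your own process. The exact numbers depend on the build and on ASLR, so treat this as a sketch only:

#include <stdio.h>
#include <stdlib.h>

static int someGlobal;            /* static data */

static void someFunction(void)    /* code */
{
}

int main(void)
{
    int onStack;                      /* startup thread's stack */
    void * onHeap = malloc(16);       /* CRT allocation, ultimately heap memory */

    printf("code  : %p\n", (void *)&someFunction);
    printf("data  : %p\n", (void *)&someGlobal);
    printf("stack : %p\n", (void *)&onStack);
    printf("heap  : %p\n", onHeap);

    free(onHeap);
    return 0;
}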

Your program will be subjected to shrink-wrapped viral attacks on almost any OS install: DLLs get injected to provide "services" like anti-malware and cloud storage. The load addresses of those DLLs are unpredictable. The same goes for any DLLs that you linked with yourself and that are implicitly loaded when your program starts. Few programmers pay attention to the preferred base address and leave it at the default, 0x10000000. You can see these DLLs in the debugger's Modules window. Such DLLs often have their own CRT and tend to create their own heaps.
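
If you don't want to fire up the debugger, you can also dump the loaded modules and their base addresses from code. A rough sketch using the Psapi functions (link with psapi.lib; error handling kept to a minimum):

#include <stdio.h>
#include <windows.h>
#include <psapi.h>

int main(void)
{
    HMODULE mods[1024];
    DWORD needed = 0;
    HANDLE me = GetCurrentProcess();

    if (EnumProcessModules(me, mods, sizeof(mods), &needed))
    {
        DWORD count = needed / sizeof(HMODULE);
        DWORD i;

        for (i = 0; i < count; i++)
        {
            MODULEINFO mi;
            char name[MAX_PATH];

            if (GetModuleInformation(me, mods[i], &mi, sizeof(mi)) &&
                GetModuleBaseNameA(me, mods[i], name, sizeof(name)))
            {
                /* Base address, image size, module name. */
                printf("%p  %8lu KB  %s\n",
                       mi.lpBaseOfDll,
                       (unsigned long)(mi.SizeOfImage / 1024),
                       name);
            }
        }
    }
    return 0;
}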

Allocations you make yourself, particularly very large ones that won't come from the low-fragmentation heap, need to find address space in the holes left between existing code and data allocations. If you can get 1500 MB, then your VM space is pretty clean. In general you'll get into trouble beyond 650 MB, and that figure quickly drops once the program has been running for a while and has fragmented the VM space. Allocation failures are almost always caused by the OS not being able to find a big enough hole, not by running out of VM; the sum of the holes can be considerably larger than your failed allocation request.
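
You can measure those holes directly with VirtualQuery, which walks the address space one region at a time. A minimal sketch that reports the largest free region, i.e. roughly the biggest single allocation you could still hope for:

#include <stdio.h>
#include <windows.h>

int main(void)
{
    MEMORY_BASIC_INFORMATION mbi;
    unsigned char * addr = NULL;
    SIZE_T largest = 0;

    /* Walk the user address space region by region; stop when
       VirtualQuery fails past the end of it. */
    while (VirtualQuery(addr, &mbi, sizeof(mbi)) == sizeof(mbi))
    {
        if (mbi.State == MEM_FREE && mbi.RegionSize > largest)
            largest = mbi.RegionSize;

        addr = (unsigned char *)mbi.BaseAddress + mbi.RegionSize;
    }

    printf("Largest free region: %llu MB\n",
           (unsigned long long)(largest / (1024 * 1024)));
    return 0;
}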

These details are rapidly becoming a folk tale; there are very few remaining reasons to still target x86. Target x64 and address space fragmentation won't be a problem for the next 20 years: it is very hard to fragment 8 terabytes of VM, and there is lots of headroom to grow beyond that.

So it should be obvious why you can't get 2048 MB: you can never get all of it. Get further insight from the SysInternals VMMap utility, which shows you how the VM space is carved up. Mark Russinovich's blog post and book give lots of background.

Answered by Hans Passant on Nov 14 '22