I have written a program which works on a huge set of data. My CPU and OS (Ubuntu) are both 64-bit and I have 4 GB of RAM. Using top (the %MEM field), I saw that the process's memory consumption went up to around 87%, i.e. 3.4+ GB, and then it got killed.
I then checked how much memory a process is allowed to use with ulimit -m, which comes out as "unlimited".
Now, since both the OS and CPU are 64-bit and a swap partition also exists, the OS should have used virtual memory, i.e. [ >3.4 GB + y GB from swap space ] in total, and the process should only have been killed if it required even more than that.
So my question is: why was the process killed at around 3.4 GB instead of being allowed to use the swap space? Please suggest.
It's not only the data size that could be the reason. For example, run ulimit -a and check the maximum stack size. Have you got a kill reason? Set ulimit -c 20000 to get a core file; it will show you the reason when you examine it with gdb.
Check with file and ldd that your executable is indeed 64-bit.
Check also the resource limits. From inside the process, you could use the getrlimit system call (and setrlimit to change them, when possible). From a bash shell, try ulimit -a. From a zsh shell, try limit.
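For illustration, here is a minimal C++ sketch (my own, assuming a Linux system) that queries a few of these limits with getrlimit; RLIMIT_AS, RLIMIT_DATA, and RLIMIT_STACK are the ones most likely to matter for a large process:

```cpp
#include <sys/resource.h>   // getrlimit, struct rlimit, RLIMIT_*
#include <cstdio>

// Print the soft limit for one resource, or "unlimited".
static void show(const char* name, int resource) {
    struct rlimit rl;
    if (getrlimit(resource, &rl) != 0) {
        std::perror(name);
        return;
    }
    if (rl.rlim_cur == RLIM_INFINITY)
        std::printf("%s: soft limit unlimited\n", name);
    else
        std::printf("%s: soft limit %llu bytes\n", name,
                    (unsigned long long)rl.rlim_cur);
}

int main() {
    show("RLIMIT_AS    (address space)", RLIMIT_AS);
    show("RLIMIT_DATA  (data segment)",  RLIMIT_DATA);
    show("RLIMIT_STACK (stack)",         RLIMIT_STACK);
    return 0;
}
```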
Check also that your process indeed eats the memory you believe it consumes. If its pid is 1234, you could try pmap 1234. From inside the process you could read /proc/self/maps or /proc/1234/maps (which you can also read from a terminal). There are also /proc/self/smaps or /proc/1234/smaps, /proc/self/status or /proc/1234/status, and other files inside /proc/self/ ...
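As a small sketch (not part of the original answer), a process can read its own /proc/self/status and print just the memory-related Vm* lines, which is often the quickest way to see VmSize and VmRSS from inside the program:

```cpp
#include <fstream>
#include <iostream>
#include <string>

int main() {
    // /proc/self/status is a Linux-specific pseudo-file; each line is "Field:\tvalue".
    std::ifstream status("/proc/self/status");
    std::string line;
    while (std::getline(status, line)) {
        // Keep only the virtual-memory fields (VmPeak, VmSize, VmRSS, VmData, ...).
        if (line.compare(0, 2, "Vm") == 0)
            std::cout << line << '\n';
    }
    return 0;
}
```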
Check with free that you have the memory (and the swap space) you believe. You can add some temporary swap space with swapon /tmp/someswapfile (and use mkswap to initialize it first).
I was routinely able, a few months (and a couple of years) ago, to run a 7 GB process (a huge cc1 compilation) under GNU/Linux/Debian/Sid/AMD64 on a machine with 8 GB of RAM.
And you could try with a tiny test program, which e.g. allocates several memory chunks of e.g. 32 MB each with malloc. Don't forget to write some bytes inside each chunk (at least at every megabyte), as sketched below.
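A possible version of such a test program (a sketch; the 32 MB chunk size and the one-byte-per-megabyte touch are just the values suggested above):

```cpp
#include <cstdio>
#include <cstdlib>

int main() {
    const std::size_t chunk = 32u * 1024u * 1024u;  // 32 MB per allocation
    const std::size_t step  = 1024u * 1024u;        // touch one byte per megabyte
    std::size_t count = 0;

    for (;;) {
        char* p = static_cast<char*>(std::malloc(chunk));
        if (p == nullptr)
            break;                                  // allocation refused: stop here
        for (std::size_t off = 0; off < chunk; off += step)
            p[off] = 1;                             // force the kernel to commit the pages
        ++count;
        std::printf("allocated %zu chunks (~%zu MB)\n", count, count * 32);
    }
    std::printf("malloc returned NULL after %zu chunks (~%zu MB)\n", count, count * 32);
    return 0;
}
```

Note that with Linux's default overcommit settings the failure may not show up as malloc returning NULL: touching the pages can instead get the process killed by the kernel's OOM killer, which would be consistent with what you observed in top.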
Standard C++ containers like std::map or std::vector are rumored to consume more memory than we usually think.
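To make that concrete, here is a small sketch (my own illustration) using a counting allocator to see how many bytes per element a std::vector and a std::map actually request; on a typical 64-bit implementation each std::map node carries tree pointers and bookkeeping in addition to the key/value pair:

```cpp
#include <cstddef>
#include <iostream>
#include <map>
#include <vector>

// Bytes currently held by containers that use CountingAllocator.
static std::size_t g_live_bytes = 0;

template <class T>
struct CountingAllocator {
    using value_type = T;
    CountingAllocator() = default;
    template <class U> CountingAllocator(const CountingAllocator<U>&) {}
    T* allocate(std::size_t n) {
        g_live_bytes += n * sizeof(T);
        return static_cast<T*>(::operator new(n * sizeof(T)));
    }
    void deallocate(T* p, std::size_t n) {
        g_live_bytes -= n * sizeof(T);
        ::operator delete(p);
    }
};
template <class A, class B>
bool operator==(const CountingAllocator<A>&, const CountingAllocator<B>&) { return true; }
template <class A, class B>
bool operator!=(const CountingAllocator<A>&, const CountingAllocator<B>&) { return false; }

int main() {
    const std::size_t n = 1000000;

    {
        std::vector<int, CountingAllocator<int>> v;
        for (std::size_t i = 0; i < n; ++i) v.push_back(static_cast<int>(i));
        std::cout << "vector<int>:  " << g_live_bytes / n << " bytes per element\n";
    }

    {
        std::map<int, int, std::less<int>,
                 CountingAllocator<std::pair<const int, int>>> m;
        for (std::size_t i = 0; i < n; ++i) m[static_cast<int>(i)] = 0;
        std::cout << "map<int,int>: " << g_live_bytes / n << " bytes per element\n";
    }
    return 0;
}
```

On a typical 64-bit implementation the vector stays close to the 4-byte payload (plus spare capacity), while the map often uses several times the payload size per element because of the per-node overhead.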
Buy more RAM if needed. It is quite cheap these days.