What does the VIRT column in this htop output mean? It appears in red and bold.
I'm running an Elasticsearch cluster with 4 nodes and indexing massive amounts of data.
It shows green health all the time.
This is a DigitalOcean droplet with 4 GB of RAM and 2 CPUs. I'm setting the heap size to 2 GB (-Xms and -Xmx).
Is this just RAM overhead?
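For reference, this is roughly how the heap ends up pinned on an Elasticsearch 1.x node; a minimal sketch, assuming the stock bin/elasticsearch wrapper (which turns ES_HEAP_SIZE into matching -Xms/-Xmx flags) and the /root/elasticsearch install path shown in the pmap output below:

export ES_HEAP_SIZE=2g                      # becomes -Xms2g -Xmx2g on the java command line
/root/elasticsearch/bin/elasticsearch -d    # -d = run daemonized

The htop capture from the node follows.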
  1  [|||||                                    9.4%]   Tasks: 26, 122 thr; 2 running
  2  [|||                                      4.3%]   Load average: 0.25 0.47 0.65
  Mem[||||||||||||||||||||||||||||||   2592/3954MB]   Uptime: 2 days, 01:05:57
  Swp[                                       0/0MB]

  PID USER     PRI  NI  VIRT   RES   SHR S CPU% MEM%   TIME+   Command
 9629 root      20   0 16.2G 2516M 97360 S 12.0 63.6 27h02:30  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
21564 root      20   0 16.2G 2516M 97360 S  4.0 63.6  1:12.17  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9644 root      20   0 16.2G 2516M 97360 S  1.0 63.6 47:39.34  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
20451 root      20   0 25808  2020  1208 R  0.0  0.0 25:19.48  htop
 9654 root      20   0 16.2G 2516M 97360 S  1.0 63.6  5:43.32  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9651 root      20   0 16.2G 2516M 97360 S  0.0 63.6  6:34.53  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9653 root      20   0 16.2G 2516M 97360 S  1.0 63.6  1:46.23  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
21565 root      20   0 16.2G 2516M 97360 S  0.0 63.6  1:12.48  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
21563 root      20   0 16.2G 2516M 97360 S  0.0 63.6  1:11.12  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
21472 root      20   0 16.2G 2516M 97360 S  0.0 63.6  1:15.85  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9652 root      20   0 16.2G 2516M 97360 S  0.0 63.6  5:40.13  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
21562 root      20   0 16.2G 2516M 97360 S  0.0 63.6  1:10.93  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9631 root      20   0 16.2G 2516M 97360 S  0.0 63.6 1h19:18   /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9632 root      20   0 16.2G 2516M 97360 S  0.0 63.6 1h19:19   /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9633 root      20   0 16.2G 2516M 97360 S  0.0 63.6 7h19:13   /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9850 root      20   0 16.2G 2516M 97360 S  0.0 63.6 18:16.00  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9634 root      20   0 16.2G 2516M 97360 S  0.0 63.6 1h29:54   /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9722 root      20   0 16.2G 2516M 97360 S  0.0 63.6 50:24.42  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9703 root      20   0 16.2G 2516M 97360 S  0.0 63.6 4h25:50   /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9704 root      20   0 16.2G 2516M 97360 S  0.0 63.6 4h26:01   /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9662 root      20   0 16.2G 2516M 97360 S  0.0 63.6  1:19.60  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9669 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:08.13  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9665 root      20   0 16.2G 2516M 97360 S  0.0 63.6  6:39.74  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
  385 syslog    20   0  243M  1696   872 S  0.0  0.0  0:13.37  rsyslogd -c5
  397 syslog    20   0  243M  1696   872 S  0.0  0.0  0:05.58  rsyslogd -c5
 9640 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:24.60  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9647 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:10.90  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9635 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:38.98  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
  950 root      20   0 15988   664   488 S  0.0  0.0  0:21.99  /usr/sbin/irqbalance
 9645 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:19.28  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9700 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:10.25  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
19171 root      20   0 86232  2316  1364 S  0.0  0.1  0:10.56  sshd: root@pts/0
 9648 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:09.37  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9639 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:32.61  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9642 root      20   0 16.2G 2516M 97360 S  0.0 63.6  3:04.59  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9643 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:37.35  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9649 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:55.92  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9650 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:15.67  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9706 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:51.00  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9705 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:38.98  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9667 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:08.34  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9646 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:10.10  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9668 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:07.83  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
 9666 root      20   0 16.2G 2516M 97360 S  0.0 63.6  0:08.01  /usr/bin/java -Xms256m -Xmx1g -Xss256k -
  398 syslog    20   0  243M  1696   872 S  0.0  0.0  0:01.40  rsyslogd -c5
  632 root      20   0 50044  1648  1036 S  0.0  0.0  0:02.10  /usr/sbin/sshd -D
    1 root      20   0 24300  1644   740 S  0.0  0.0  0:01.80  /sbin/init
I can buy more machines, but I don't know which is better: more machines or more RAM on the existing ones.
UPDATE: here are the first and last lines of the pmap output for one of the processes:
root@es2:~# pmap -d 3589
3589:   /usr/bin/java -Xms256m -Xmx1g -Xss256k -Djava.awt.headless=true -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+HeapDumpOnOutOfMemoryError -Delasticsearch -Des.foreground=yes -Des.path.home=/root/elasticsearch -cp :/root/elasticsearch/lib/elasticsearch-1.1.1.jar:/root/elasticsearch/lib/*:/root/elasticsearch/lib/sigar/* -Xmx2g -Xms2g -Des.node.name=es2 -Des.node.data=true -Des.node.master=false org.elasticsearch.bootstrap.Elasticsearch
Address           Kbytes Mode  Offset           Device    Mapping
0000000000400000       4 r-x-- 0000000000000000 0fd:00000 java
0000000000600000       4 r---- 0000000000000000 0fd:00000 java
0000000000601000       4 rw--- 0000000000001000 0fd:00000 java
000000000188d000     132 rw--- 0000000000000000 000:00000   [ anon ]
0000000775a00000 2118400 rw--- 0000000000000000 000:00000   [ anon ]
00000007f6ec0000   37696 rw--- 0000000000000000 000:00000   [ anon ]
00000007f9390000  111040 rw--- 0000000000000000 000:00000   [ anon ]
00007f6cb0132000    4872 r--s- 0000000000000000 0fd:00000 _8d.fdt
00007f6cb05f4000    5792 r--s- 0000000000000000 0fd:00000 _8d_es090_0.tim
00007f6cb0b9c000    6208 r--s- 0000000000000000 0fd:00000 _8c.fdt
00007f6cb11ac000    6548 r--s- 0000000000000000 0fd:00000 _a0.fdt
00007f6cb1811000    7648 r--s- 0000000000000000 0fd:00000 _a0_es090_0.tim
00007f6cb1f89000    5356 r--s- 0000000000000000 0fd:00000 _78.fdt
00007f6cb24c4000    6236 r--s- 0000000000000000 0fd:00000 _78_es090_0.tim
....             .......  ....  (huge text)  ....  ......  ..
00007f703dc6e000       4 rw--- 0000000000007000 0fd:00000 librt-2.15.so
00007f703dc6f000      84 r-x-- 0000000000000000 0fd:00000 libgcc_s.so.1
00007f703dc84000    2044 ----- 0000000000015000 0fd:00000 libgcc_s.so.1
00007f703de83000       4 r---- 0000000000014000 0fd:00000 libgcc_s.so.1
00007f703de84000       4 rw--- 0000000000015000 0fd:00000 libgcc_s.so.1
00007f703de85000    1004 r-x-- 0000000000000000 0fd:00000 libm-2.15.so
00007f703df80000    2044 ----- 00000000000fb000 0fd:00000 libm-2.15.so
00007f703e17f000       4 r---- 00000000000fa000 0fd:00000 libm-2.15.so
00007f703e180000       4 rw--- 00000000000fb000 0fd:00000 libm-2.15.so
00007f703e181000     904 r-x-- 0000000000000000 0fd:00000 libstdc++.so.6.0.16
00007f703e263000    2044 ----- 00000000000e2000 0fd:00000 libstdc++.so.6.0.16
00007f703e462000      32 r---- 00000000000e1000 0fd:00000 libstdc++.so.6.0.16
00007f703e46a000       8 rw--- 00000000000e9000 0fd:00000 libstdc++.so.6.0.16
00007f703e46c000      84 rw--- 0000000000000000 000:00000   [ anon ]
00007f703e481000   10988 r-x-- 0000000000000000 0fd:00000 libjvm.so
00007f703ef3c000    2048 ----- 0000000000abb000 0fd:00000 libjvm.so
00007f703f13c000     608 r---- 0000000000abb000 0fd:00000 libjvm.so
00007f703f1d4000     140 rw--- 0000000000b53000 0fd:00000 libjvm.so
00007f703f1f7000     180 rw--- 0000000000000000 000:00000   [ anon ]
00007f703f224000      88 r-x-- 0000000000000000 0fd:00000 libz.so.1.2.3.4
00007f703f23a000    2044 ----- 0000000000016000 0fd:00000 libz.so.1.2.3.4
00007f703f439000       4 r---- 0000000000015000 0fd:00000 libz.so.1.2.3.4
00007f703f43a000       4 rw--- 0000000000016000 0fd:00000 libz.so.1.2.3.4
00007f703f43b000      96 r-x-- 0000000000000000 0fd:00000 libpthread-2.15.so
00007f703f453000    2044 ----- 0000000000018000 0fd:00000 libpthread-2.15.so
00007f703f652000       4 r---- 0000000000017000 0fd:00000 libpthread-2.15.so
00007f703f653000       4 rw--- 0000000000018000 0fd:00000 libpthread-2.15.so
00007f703f654000      16 rw--- 0000000000000000 000:00000   [ anon ]
00007f703f658000       8 r-x-- 0000000000000000 0fd:00000 libdl-2.15.so
00007f703f65a000    2048 ----- 0000000000002000 0fd:00000 libdl-2.15.so
00007f703f85a000       4 r---- 0000000000002000 0fd:00000 libdl-2.15.so
00007f703f85b000       4 rw--- 0000000000003000 0fd:00000 libdl-2.15.so
00007f703f85c000    1748 r-x-- 0000000000000000 0fd:00000 libc-2.15.so
00007f703fa11000    2048 ----- 00000000001b5000 0fd:00000 libc-2.15.so
00007f703fc11000      16 r---- 00000000001b5000 0fd:00000 libc-2.15.so
00007f703fc15000       8 rw--- 00000000001b9000 0fd:00000 libc-2.15.so
00007f703fc17000      20 rw--- 0000000000000000 000:00000   [ anon ]
00007f703fc1c000      52 r-x-- 0000000000000000 0fd:00000 libjli.so
00007f703fc29000    2044 ----- 000000000000d000 0fd:00000 libjli.so
00007f703fe28000       4 r---- 000000000000c000 0fd:00000 libjli.so
00007f703fe29000       4 rw--- 000000000000d000 0fd:00000 libjli.so
00007f703fe2a000     136 r-x-- 0000000000000000 0fd:00000 ld-2.15.so
00007f703fe4c000      32 r--s- 000000000005a000 0fd:00000 lucene-codecs-4.7.2.jar
00007f703fe54000      16 r--s- 0000000000085000 0fd:00000 localedata.jar
00007f703fe58000       8 r--s- 0000000000012000 0fd:00000 zipfs.jar
00007f703fe5a000      12 r--s- 0000000000032000 0fd:00000 sunjce_provider.jar
00007f703fe5d000      16 r--s- 000000000003b000 0fd:00000 sunpkcs11.jar
00007f703fe61000     556 rw--- 0000000000000000 000:00000   [ anon ]
00007f703feec000      72 rw--- 0000000000000000 000:00000   [ anon ]
00007f703fefe000     220 rw--- 0000000000000000 000:00000   [ anon ]
00007f703ff35000      40 rw--- 0000000000000000 000:00000   [ anon ]
00007f703ff3f000     148 rw--- 0000000000000000 000:00000   [ anon ]
00007f703ff64000     580 rw--- 0000000000000000 000:00000   [ anon ]
00007f703fff5000      32 rw-s- 0000000000000000 0fd:00000 3279
00007f703fffd000      12 ----- 0000000000000000 000:00000   [ anon ]
00007f7040000000     268 rw--- 0000000000000000 000:00000   [ anon ]
00007f7040043000       4 r--s- 0000000000008000 0fd:00000 lucene-memory-4.7.2.jar
00007f7040044000       8 r--s- 000000000000f000 0fd:00000 pulse-java.jar
00007f7040046000       4 r--s- 0000000000007000 0fd:00000 java-atk-wrapper.jar
00007f7040047000       4 r--s- 0000000000002000 0fd:00000 dnsns.jar
00007f7040048000       4 rw--- 0000000000000000 000:00000   [ anon ]
00007f7040049000       4 r---- 0000000000000000 000:00000   [ anon ]
00007f704004a000       8 rw--- 0000000000000000 000:00000   [ anon ]
00007f704004c000       4 r---- 0000000000022000 0fd:00000 ld-2.15.so
00007f704004d000       8 rw--- 0000000000023000 0fd:00000 ld-2.15.so
00007fff9dcbe000     132 rw--- 0000000000000000 000:00000   [ stack ]
00007fff9ddfe000       8 r-x-- 0000000000000000 000:00000   [ anon ]
ffffffffff600000       4 r-x-- 0000000000000000 000:00000   [ anon ]
ffffffffff600000       4 r-x-- 0000000000000000 000:00000   [ anon ]
mapped: 17052740K    writeable/private: 2523144K    shared: 13562836K
root@es2:~#
VIRT stands for the virtual size of a process, which is the sum of memory it is actually using, memory it has mapped into itself (for instance the video card's RAM for the X server), files on disk that have been mapped into it (most notably shared libraries), and memory shared with other processes.
VIRT (Virtual Memory Size in KiB): Depicts the total amount of virtual memory used by the task. Virtual memory includes all code, data, and shared libraries. It also includes pages that have been swapped out and pages that have been mapped but not used.
VIRT represents how much memory the program is able to access at the present moment. RES stands for the resident size, which is an accurate representation of how much actual physical memory a process is consuming. (This also corresponds directly to the %MEM column.)
VIRT is the total memory the process has access to: shared memory, mapped pages, swapped-out pages, etc. RES is the total physical memory, shared or private, that the process is actually using.
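To see the two numbers side by side for one of the Elasticsearch processes, here is a quick sketch using standard procps tools; PID 9629 is taken from the htop capture above, so substitute your own:

ps -o pid,vsz,rss,cmd -p 9629              # VSZ corresponds to htop's VIRT, RSS to RES (both in KiB)
grep -E 'VmSize|VmRSS' /proc/9629/status   # the same two figures straight from the kernel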
I would stop worrying about the VIRT thing.
See also: Virtual Memory Usage from Java under Linux, too much memory used.
I suspect the top page is allocated for vsyscall purposes (see "What are vdso and vsyscall?"). That means that while your process has a 16 GB range of virtual memory mapped, it certainly isn't using anything like that much physical memory. Go by the RES entry.
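If you want to see what actually makes up the 16.2G of VIRT, you can total the mappings by type. A rough sketch, assuming the same pmap -d format shown in the question (size in KB in the second field, anonymous mappings labelled "[ anon ]"), using the PID 3589 from your pmap run:

pmap -d 3589 | awk '
  $1 ~ /^[0-9a-f]+$/ {                                  # keep only the mapping rows
      if ($0 ~ /\[ anon \]|\[ stack \]/) anon += $2     # heap, thread stacks, JVM internals
      else                               file += $2     # mmapped files: Lucene segments, .so, .jar
  }
  END { printf "anonymous: %d MB   file-backed: %d MB\n", anon/1024, file/1024 }'

Most of it should turn out to be file-backed Lucene segment files (the "shared: 13562836K" in your pmap summary), which the kernel can drop or page in at will, so they are not occupying RAM in the sense the question worries about.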
Regarding whether you need more machines or more memory: is it slow? You would need to do more forensics to find where your bottlenecks are, but the system in that htop capture doesn't appear to be under much pressure.
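If you do want to check for pressure before buying anything, here are a few quick, read-only probes; a sketch, assuming sysstat is installed for iostat and an Elasticsearch 1.x node listening on localhost:9200:

vmstat 1 5                                     # swapping, run queue, CPU wait at the OS level
iostat -x 1 5                                  # per-device disk utilisation and wait times
curl -s 'localhost:9200/_cat/nodes?v'          # quick per-node heap and load overview
curl -s 'localhost:9200/_nodes/stats?pretty'   # JVM heap, GC, thread pools, merges per node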