
Insufficient space for shared memory file when I try to run nutch generate command

Tags: java, jvm, nutch

I have been running nutch crawling commands for the past 3 weeks, and now I get the error below when I try to run any nutch command:

Java HotSpot(TM) 64-Bit Server VM warning: Insufficient space for shared memory file: /tmp/hsperfdata_user/27050 Try using the -Djava.io.tmpdir= option to select an alternate temp location.

Error: Could not find or load main class ___.tmp.hsperfdata_user.27055

How do I solve this issue?

peter asked Jan 12 '13

2 Answers

Yes, this really is an issue with the space available on the volume your /tmp is mounted on. If you are running this on EC2, or any cloud platform, attach a new volume and mount /tmp on that. If running locally, there is no option besides cleaning up to make more room.
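For the local cleanup route, a minimal sketch (the `-mtime` threshold is an arbitrary example; only delete hsperfdata files whose owning JVM has exited):

```shell
# See what is consuming /tmp, largest first.
du -sh /tmp/* 2>/dev/null | sort -rh | head

# Stale JVM performance-counter files live under /tmp/hsperfdata_<user>/<pid>.
# They are safe to remove once the JVM that created them is gone.
find /tmp -path '*hsperfdata*' -type f -mtime +1 2>/dev/null
```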

Run df -h to see the percentage used and the space available on each volume mounted on your instance. You will see something like:

Filesystem            Size  Used Avail Use% Mounted on
/dev/xvda1            7.9G  7.9G     0 100% /
tmpfs                  30G     0   30G   0% /dev/shm
/dev/xvda3             35G  1.9G   31G   6% /var
/dev/xvda4             50G   44G  3.8G  92% /opt
/dev/xvdb             827G  116G  669G  15% /data/1
/dev/xvdc             827G  152G  634G  20% /data/2
/dev/xvdd             827G  149G  637G  19% /data/3
/dev/xvde             827G  150G  636G  20% /data/4
cm_processes           30G   22M   30G   1% /var/run/cloudera-scm-agent/process

In this dump the root filesystem (where /tmp lives by default) is at 100% use with 0 bytes available, which is exactly when this error starts appearing.
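If you cannot free space on /, you can instead point the JVM's temp directory at a roomier volume, as the warning itself suggests. A sketch, assuming a stock bin/nutch launcher (which prepends NUTCH_OPTS to the java command line); the directory path is only an example:

```shell
# Point java.io.tmpdir at a volume with free space (path is an example).
ALT_TMP="${ALT_TMP:-$HOME/nutch-tmp}"
mkdir -p "$ALT_TMP"

# Stock bin/nutch passes NUTCH_OPTS through to the JVM.
export NUTCH_OPTS="-Djava.io.tmpdir=$ALT_TMP"
```

After exporting this, rerun the nutch generate command in the same shell.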

Kingz answered Sep 22 '22


I think the temporary location that was being used has filled up. Try using some other location. Also check the number of free inodes in each partition and clear up some space: a filesystem can fail writes with free bytes remaining if its inodes are exhausted.

EDIT: There is no need to change /tmp at the OS level. We want nutch and hadoop to use some other location for storing their temp files. Look at this question for how to do that: What should be hadoop.tmp.dir ?
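For reference, hadoop.tmp.dir is configured in core-site.xml. A minimal sketch; the property name is real Hadoop configuration, but the path is only an example of a volume with space:

```xml
<!-- core-site.xml: redirect Hadoop's scratch space off the full volume.
     The value below is an example path, not a required location. -->
<property>
  <name>hadoop.tmp.dir</name>
  <value>/data/1/hadoop-tmp</value>
</property>
```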

Tejas Patil answered Sep 23 '22