Using Stanford CoreNLP - Java heap space

I am trying to use the coreference module of the Stanford CoreNLP pipeline, but I end up with a java.lang.OutOfMemoryError (Java heap space). I have already increased the heap size (via Run -> Run Configurations -> VM Arguments in Eclipse) and set the VM arguments to -Xmx3g -Xms1g. I even tried -Xmx12g -Xms4g, but that didn't help either. I'm using Eclipse Juno on OS X 10.8.5 with Java 1.6 on a 64-bit machine. Does anyone have an idea what else I could try?
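
A quick way to verify that such flags actually reach the JVM is to query the effective maximum heap at runtime (a minimal sketch using the standard Runtime API):

// Prints the maximum heap size the JVM was actually started with, in bytes.
// With -Xmx3g this should print a value close to 3 * 1024 * 1024 * 1024.
System.out.println(Runtime.getRuntime().maxMemory());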

I'm using the example code from the website (http://nlp.stanford.edu/software/corenlp.shtml):

import java.util.List;
import java.util.Map;
import java.util.Properties;

import edu.stanford.nlp.dcoref.CorefChain;
import edu.stanford.nlp.dcoref.CorefCoreAnnotations.CorefChainAnnotation;
import edu.stanford.nlp.ling.CoreAnnotations.NamedEntityTagAnnotation;
import edu.stanford.nlp.ling.CoreAnnotations.PartOfSpeechAnnotation;
import edu.stanford.nlp.ling.CoreAnnotations.SentencesAnnotation;
import edu.stanford.nlp.ling.CoreAnnotations.TextAnnotation;
import edu.stanford.nlp.ling.CoreAnnotations.TokensAnnotation;
import edu.stanford.nlp.ling.CoreLabel;
import edu.stanford.nlp.pipeline.Annotation;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;
import edu.stanford.nlp.semgraph.SemanticGraph;
import edu.stanford.nlp.semgraph.SemanticGraphCoreAnnotations.CollapsedCCProcessedDependenciesAnnotation;
import edu.stanford.nlp.trees.Tree;
import edu.stanford.nlp.trees.TreeCoreAnnotations.TreeAnnotation;
import edu.stanford.nlp.util.CoreMap;

// Build a pipeline with tokenization, sentence splitting, POS tagging,
// lemmatization, NER, parsing and deterministic coreference (dcoref).
Properties props = new Properties();
props.put("annotators", "tokenize, ssplit, pos, lemma, ner, parse, dcoref");
StanfordCoreNLP pipeline = new StanfordCoreNLP(props);

String text = "Stanford University is located in California. It is a great university";

// Create an empty Annotation with just the text and run all annotators on it.
Annotation document = new Annotation(text);
pipeline.annotate(document);

List<CoreMap> sentences = document.get(SentencesAnnotation.class);
for (CoreMap sentence : sentences) {
  for (CoreLabel token : sentence.get(TokensAnnotation.class)) {
    String word = token.get(TextAnnotation.class);          // token text
    String pos = token.get(PartOfSpeechAnnotation.class);   // POS tag
    String ne = token.get(NamedEntityTagAnnotation.class);  // NER label
  }

  Tree tree = sentence.get(TreeAnnotation.class);           // parse tree
  SemanticGraph dependencies = sentence.get(CollapsedCCProcessedDependenciesAnnotation.class);
}

// Coreference chains for the whole document, keyed by chain id.
Map<Integer, CorefChain> graph = document.get(CorefChainAnnotation.class);
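
For completeness, once annotation succeeds the chains in graph could be inspected roughly like this (a hedged sketch: CorefChain.getRepresentativeMention(), getMentionsInTextualOrder() and the CorefMention.mentionSpan field are what I'd expect in this CoreNLP release, so treat the exact accessor names as assumptions):

// Sketch: walk each coreference chain and print its mentions.
for (Map.Entry<Integer, CorefChain> entry : graph.entrySet()) {
  CorefChain chain = entry.getValue();
  // Representative mention, e.g. "Stanford University" for the chain
  // that also contains "It".
  System.out.println("Chain " + entry.getKey() + ": " + chain.getRepresentativeMention());
  for (CorefChain.CorefMention mention : chain.getMentionsInTextualOrder()) {
    System.out.println("  mention: " + mention.mentionSpan);
  }
}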

However, when I run the pipeline, I get the following error:

Adding annotator tokenize
Adding annotator ssplit
Adding annotator pos
Reading POS tagger model from edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger ... done [0.9 sec].
Adding annotator lemma
Adding annotator ner
Loading classifier from edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz ... done [3.1 sec].
Initializing JollyDayHoliday for sutime
Reading TokensRegex rules from edu/stanford/nlp/models/sutime/defs.sutime.txt
Reading TokensRegex rules from edu/stanford/nlp/models/sutime/english.sutime.txt
Jan 9, 2014 10:39:37 AM edu.stanford.nlp.ling.tokensregex.CoreMapExpressionExtractor appendRules
INFO: Ignoring inactive rule: temporal-composite-8:ranges
Reading TokensRegex rules from edu/stanford/nlp/models/sutime/english.holidays.sutime.txt
Adding annotator dcoref
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.lang.String.substring(String.java:1939)
at java.lang.String.subSequence(String.java:1972)
at java.util.regex.Pattern.split(Pattern.java:1002)
at java.lang.String.split(String.java:2292)
at java.lang.String.split(String.java:2334)
at edu.stanford.nlp.dcoref.Dictionaries.loadGenderNumber(Dictionaries.java:382)
at edu.stanford.nlp.dcoref.Dictionaries.<init>(Dictionaries.java:553)
at edu.stanford.nlp.dcoref.Dictionaries.<init>(Dictionaries.java:463)
at edu.stanford.nlp.dcoref.SieveCoreferenceSystem.<init>(SieveCoreferenceSystem.java:282)
at edu.stanford.nlp.pipeline.DeterministicCorefAnnotator.<init>(DeterministicCorefAnnotator.java:52)
at edu.stanford.nlp.pipeline.StanfordCoreNLP$11.create(StanfordCoreNLP.java:775)
at edu.stanford.nlp.pipeline.AnnotatorPool.get(AnnotatorPool.java:81)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.construct(StanfordCoreNLP.java:260)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:127)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:123)
at extraction.BaselineApproach.main(BaselineApproach.java:88)
asked Jan 09 '14 by Isabelle Augenstein

1 Answer

It seems the problem wasn't Stanford CoreNLP or Java, but Eclipse. Here's what I tried:

  • Checked whether the arguments actually get passed through to the VM by printing System.out.println(Runtime.getRuntime().maxMemory());
  • Checked which arguments Eclipse really uses by looking at the running processes and their arguments (ps -al).

It then turned out that Eclipse wasn't using the VM settings I specified:

  • Runtime.getRuntime().maxMemory() reported 530186240 (roughly 512 MB) instead of 3203792896 (roughly 3 GB).
  • Eclipse passed the arguments in the wrong order: -Xms1g -Xmx3g -Xmx256m -Xmx512m instead of -Xmx256m -Xmx512m -Xms1g -Xmx3g. When a JVM flag is repeated, the last occurrence wins, so the heap was capped at 512 MB, which explains why my settings had no effect.

I then tried the following to fix it:

  • Overriding the default settings rather than the settings for a specific application (see http://help.eclipse.org/juno/index.jsp?topic=%2Forg.eclipse.pde.doc.user%2Fguide%2Ftools%2Flaunchers%2Farguments.htm). That didn't work.
  • Manually editing the eclipse.ini file to add -vmargs -Xms1g -Xmx3g (see the sketch after this list). That didn't work either.
  • Re-installing Eclipse. Now everything works again: I can set default settings and override them for specific applications.
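
For reference, the eclipse.ini edit from the second fix attempt would look roughly like this (exact contents vary by Eclipse version and platform; the one firm rule is that -vmargs must come last, with each JVM argument on its own line, because everything after it is handed to the VM):

-vmargs
-Xms1g
-Xmx3g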

answered Nov 03 '22 by Isabelle Augenstein