I'm trying to use Stanford CoreNLP as a library in my Java program, with IntelliJ as the IDE. To test the library, I wrote this code:
import edu.stanford.nlp.pipeline.StanfordCoreNLP;
import java.util.Properties;

/**
 * Created by Benjamin on 15/5/4.
 */
public class SentimentAnaTest {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("annotators", "tokenize, ssplit, pos, lemma, ner, parse, dcoref");
        StanfordCoreNLP pipeline = new StanfordCoreNLP(props);
    }
}
and it fails with this error:
Adding annotator tokenize
TokenizerAnnotator: No tokenizer type provided. Defaulting to PTBTokenizer.
Adding annotator ssplit
Adding annotator pos
Exception in thread "main" java.lang.RuntimeException: edu.stanford.nlp.io.RuntimeIOException: Unrecoverable error while loading a tagger model
at edu.stanford.nlp.pipeline.AnnotatorFactories$4.create(AnnotatorFactories.java:292)
at edu.stanford.nlp.pipeline.AnnotatorPool.get(AnnotatorPool.java:85)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.construct(StanfordCoreNLP.java:289)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:126)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:122)
at SentimentAnaTest.main(SentimentAnaTest.java:12)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: edu.stanford.nlp.io.RuntimeIOException: Unrecoverable error while loading a tagger model
at edu.stanford.nlp.tagger.maxent.MaxentTagger.readModelAndInit(MaxentTagger.java:770)
at edu.stanford.nlp.tagger.maxent.MaxentTagger.<init>(MaxentTagger.java:298)
at edu.stanford.nlp.tagger.maxent.MaxentTagger.<init>(MaxentTagger.java:263)
at edu.stanford.nlp.pipeline.POSTaggerAnnotator.loadModel(POSTaggerAnnotator.java:97)
at edu.stanford.nlp.pipeline.POSTaggerAnnotator.<init>(POSTaggerAnnotator.java:77)
at edu.stanford.nlp.pipeline.AnnotatorImplementations.posTagger(AnnotatorImplementations.java:59)
at edu.stanford.nlp.pipeline.AnnotatorFactories$4.create(AnnotatorFactories.java:290)
... 10 more
Caused by: java.io.IOException: Unable to resolve "edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger" as either class path, filename or URL
at edu.stanford.nlp.io.IOUtils.getInputStreamFromURLOrClasspathOrFileSystem(IOUtils.java:481)
at edu.stanford.nlp.tagger.maxent.MaxentTagger.readModelAndInit(MaxentTagger.java:765)
... 16 more
I read a solution here but couldn't figure out what the problem was, because I added the library from the Maven Central Repository, which already included "stanford-corenlp-3.5.2-models.jar".
From the error message it seemed like the program was trying to load a file from edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger, so I downloaded the tagger file from here and put it into /edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger. However, it still didn't work.
Can someone tell me what this error is about and help me solve it? Thank you!
I had the same problem. I fixed it by declaring both the code jar and the models jar as dependencies:
<dependencies>
    <dependency>
        <groupId>edu.stanford.nlp</groupId>
        <artifactId>stanford-corenlp</artifactId>
        <version>3.5.2</version>
    </dependency>
    <dependency>
        <groupId>edu.stanford.nlp</groupId>
        <artifactId>stanford-corenlp</artifactId>
        <version>3.5.2</version>
        <classifier>models</classifier>
    </dependency>
</dependencies>
or for SBT:
"edu.stanford.nlp" % "stanford-corenlp" % "3.5.2",
"edu.stanford.nlp" % "stanford-corenlp" % "3.5.2" classifier "models",
The tagger file has to be placed in your project root:

project
-- src --> SentimentAnaTest
-- english-left3words/english-left3words-distsim.tagger

Tested in an Eclipse project.
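If you go the local-file route, you can also point the POS annotator at the model explicitly through the pos.model property instead of relying on the classpath lookup. A minimal sketch, assuming the tagger file sits at the relative path shown above:

Properties props = new Properties();
props.setProperty("annotators", "tokenize, ssplit, pos");
// Resolve the tagger from the file system (relative to the working
// directory, here the project root) rather than from the models jar.
props.setProperty("pos.model", "english-left3words/english-left3words-distsim.tagger");
StanfordCoreNLP pipeline = new StanfordCoreNLP(props);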