I have been trying to use the neuralcoref library (state-of-the-art coreference resolution based on neural nets and spaCy). I am using Ubuntu 16.04, Python 3.7.3 with conda 1.9.7, and spaCy 2.2.4.
My code (from https://spacy.io/universe/project/neuralcoref):
import spacy
import neuralcoref
nlp = spacy.load('en_core_web_sm')
neuralcoref.add_to_pipe(nlp)
doc1 = nlp('My sister has a dog. She loves him.')
print(doc1._.coref_clusters)
doc2 = nlp('Angela lives in Boston. She is quite happy in that city.')
for ent in doc2.ents:
    print(ent._.coref_cluster)
When I run it, I get these warnings:
/home/daniel/anaconda3/lib/python3.7/importlib/_bootstrap.py:219: RuntimeWarning: spacy.morphology.Morphology size changed, may indicate binary incompatibility. Expected 104 from C header, got 112 from PyObject
return f(*args, **kwds)
/home/daniel/anaconda3/lib/python3.7/importlib/_bootstrap.py:219: RuntimeWarning: spacy.vocab.Vocab size changed, may indicate binary incompatibility. Expected 96 from C header, got 104 from PyObject
return f(*args, **kwds)
/home/daniel/anaconda3/lib/python3.7/importlib/_bootstrap.py:219: RuntimeWarning: spacy.tokens.span.Span size changed, may indicate binary incompatibility. Expected 72 from C header, got 80 from PyObject
return f(*args, **kwds)
I have tried to downgrade spaCy to 2.1.0 as suggested by this link:
conda config --append channels conda-forge
conda install spacy=2.1.0
However, I am not able to install it:
PackagesNotFoundError: The following packages are not available from current channels:
- spacy=2.1.0
Current channels:
- https://conda.anaconda.org/conda-forge/linux-64
- https://conda.anaconda.org/conda-forge/noarch
- https://repo.anaconda.com/pkgs/main/linux-64
- https://repo.anaconda.com/pkgs/main/noarch
- https://repo.anaconda.com/pkgs/r/linux-64
- https://repo.anaconda.com/pkgs/r/noarch
To search for alternate channels that may provide the conda package you're
looking for, navigate to
https://anaconda.org
and use the search bar at the top of the page.
How can I solve this issue without downgrading? Is there a newer, updated version of neuralcoref?
For neuralcoref to work, you need to use spaCy version 2.1.0 and Python version 3.7. That is the only combination that neuralcoref works with on Ubuntu 16.04 and on Mac.
python -m venv ./venv
source ./venv/bin/activate
python -m pip install spacy==2.1.0
python -m pip install neuralcoref
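To verify the setup, a minimal check along these lines should work (my own sketch, assuming you also download a model first with python -m spacy download en_core_web_sm):
import spacy
import neuralcoref

print(spacy.__version__)  # should print 2.1.0

nlp = spacy.load('en_core_web_sm')
neuralcoref.add_to_pipe(nlp)
doc = nlp('My sister has a dog. She loves him.')
print(doc._.has_coref)        # True if at least one coreference cluster was found
print(doc._.coref_clusters)   # e.g. [My sister: [My sister, She], a dog: [a dog, him]]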
Hope this helps.
After running your code above, I get the following output:
[My sister: [My sister, She], a dog: [a dog, him]]
Angela: [Angela, She]
Boston: [Boston, that city]
Do precisely what Raqib has said. I used Google Colab, so skip step (1) if you are also using Google Colab. Run the following commands:
1) Create a new environment (change myenv to whatever name you want to give the environment):
conda create --name myenv
List your environments and activate the new one:
conda info --envs
conda activate myenv
2) Then install Python 3.7 in that environment:
!apt-get install python3.7
3) Install spaCy and neuralcoref in the supported versions:
!pip install spacy==2.1.0
!pip install neuralcoref
!pip install https://github.com/explosion/spacy-models/releases/download/en_core_web_lg-2.1.0/en_core_web_lg-2.1.0.tar.gz
import pandas as pd
import re
import spacy
import neuralcoref
import en_core_web_lg
# load the large English model and add neuralcoref to its pipeline
nlp = en_core_web_lg.load()
neuralcoref.add_to_pipe(nlp)
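As a quick illustration of what that pipeline gives you (my own example, not part of the original answer), you can inspect the clusters and the pronoun-resolved text:
text = 'Angela lives in Boston. She is quite happy in that city.'
doc = nlp(text)
print(doc._.coref_clusters)   # expected: [Angela: [Angela, She], Boston: [Boston, that city]]
print(doc._.coref_resolved)   # expected: 'Angela lives in Boston. Angela is quite happy in Boston.'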