How can I extract noun phrases from text using spaCy?
I am not referring to part of speech tags.
In the documentation I cannot find anything about noun phrases or regular parse trees.
Noun chunks are “base noun phrases” – flat phrases that have a noun as their head. You can think of noun chunks as a noun plus the words describing the noun – for example, “the lavish green grass” or “the world's largest tech fund”. To get the noun chunks in a document, simply iterate over Doc.noun_chunks.
spaCy is designed specifically for production use and helps you build applications that process and “understand” large volumes of text. It can be used to build information extraction or natural language understanding systems, or to pre-process text for deep learning.
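Concretely, iterating over Doc.noun_chunks with a current spaCy release might look like this (a sketch assuming the en_core_web_sm model is installed; chunk.root is the head noun mentioned above):

import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("The lavish green grass grew around the world's largest tech fund.")

# Each chunk is a Span; chunk.root is the noun that heads the phrase.
for chunk in doc.noun_chunks:
    print(chunk.text, "->", chunk.root.text)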
if val in ('NN', 'NNS', 'NNPS', 'NNP'):
    print(text, "is a noun.")
else:
    print(text, "is not a noun.")
If you want base NPs, i.e. NPs without coordination, prepositional phrases or relative clauses, you can use the noun_chunks iterator on the Doc and Span objects:
>>> from spacy.en import English
>>> nlp = English()
>>> doc = nlp(u'The cat and the dog sleep in the basket near the door.')
>>> for np in doc.noun_chunks:
...     print(np.text)
...
The cat
the dog
the basket
the door
If you need something else, the best way is to iterate over the words of the sentence and consider the syntactic context to determine whether the word governs the phrase-type you want. If it does, yield its subtree:
from spacy.symbols import *

np_labels = set([nsubj, nsubjpass, dobj, iobj, pobj])  # Probably others too

def iter_nps(doc):
    for word in doc:
        if word.dep in np_labels:
            yield word.subtree
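A quick usage sketch for the generator above (assuming a modern pipeline such as en_core_web_sm; word.subtree yields Token objects, so they are sliced back into a Span here):

import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp(u'The cat and the dog sleep in the basket near the door.')

for subtree in iter_nps(doc):
    tokens = list(subtree)                      # subtree is an iterator of Token objects
    span = doc[tokens[0].i : tokens[-1].i + 1]  # contiguous slice covering the subtree
    print(span.text)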
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp('Bananas are an excellent source of potassium.')

for np in doc.noun_chunks:
    print(np.text)
'''
Bananas
an excellent source
potassium
'''
for word in doc:
    print('word.dep:', word.dep, ' | ', 'word.dep_:', word.dep_)
'''
word.dep: 429 | word.dep_: nsubj
word.dep: 8206900633647566924 | word.dep_: ROOT
word.dep: 415 | word.dep_: det
word.dep: 402 | word.dep_: amod
word.dep: 404 | word.dep_: attr
word.dep: 443 | word.dep_: prep
word.dep: 439 | word.dep_: pobj
word.dep: 445 | word.dep_: punct
'''
from spacy.symbols import *
np_labels = set([nsubj, nsubjpass, dobj, iobj, pobj])
print('np_labels:', np_labels)
'''
np_labels: {416, 422, 429, 430, 439}
'''
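Those numbers are spaCy's internal integer IDs for the labels: word.dep holds the ID and word.dep_ holds the string, which is why the membership test word.dep in np_labels works. A quick check (the exact ID values can differ between spaCy versions):

from spacy.symbols import nsubj

print(nsubj)                    # the integer ID behind the label, e.g. 429
print(doc[0].dep, doc[0].dep_)  # doc[0] is 'Bananas': the same ID plus the string 'nsubj'
print(doc[0].dep == nsubj)      # True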
For background on yield vs. return, see https://www.geeksforgeeks.org/use-yield-keyword-instead-return-keyword-python/
def iter_nps(doc):
    for word in doc:
        if word.dep in np_labels:
            yield word.dep_
iter_nps(doc)
'''
<generator object iter_nps at 0x7fd7b08b5bd0>
'''
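Calling iter_nps only builds a generator; it has to be consumed (with list(), a for loop, etc.) before any values appear:

print(list(iter_nps(doc)))
'''
['nsubj', 'pobj']
'''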
## Modified method:
def iter_nps(doc):
    for word in doc:
        if word.dep in np_labels:
            print(word.text, word.dep_)
iter_nps(doc)
'''
Bananas nsubj
potassium pobj
'''
doc = nlp('BRCA1 is a tumor suppressor protein that functions to maintain genomic stability.')
for np in doc.noun_chunks:
    print(np.text)
'''
BRCA1
a tumor suppressor protein
genomic stability
'''
iter_nps(doc)
'''
BRCA1 nsubj
that nsubj
stability dobj
'''
You can also get the nouns from a sentence like this:
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("When Sebastian Thrun started working on self-driving cars at "
          "Google in 2007, few people outside of the company took him "
          "seriously. “I can tell you very senior CEOs of major American "
          "car companies would shake my hand and turn away because I wasn’t "
          "worth talking to,” said Thrun, in an interview with Recode earlier "
          "this week.")
# the text is from the spaCy website

for x in doc:
    if x.pos_ in ("NOUN", "PROPN", "PRON"):
        print(x)
# this prints the nouns, proper nouns and pronouns
If you want to specify more exactly which kind of noun phrase to extract, you can use textacy's matches function. You can pass any combination of POS tags. For example,
textacy.extract.matches(doc, "POS:ADP POS:DET:? POS:ADJ:? POS:NOUN:+")
will return spans made of a preposition, optionally a determiner and/or adjective, and one or more nouns.
textacy is built on top of spaCy, so the two work well together.
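A minimal sketch of how that call might be used (assuming textacy is installed; note that in textacy 0.11+ the function is named textacy.extract.token_matches):

import spacy
import textacy.extract

nlp = spacy.load("en_core_web_sm")
doc = nlp("Bananas are an excellent source of potassium.")

# Matches a preposition, an optional determiner and/or adjective, then one or more nouns,
# which for this sentence should print something like "of potassium".
for span in textacy.extract.matches(doc, "POS:ADP POS:DET:? POS:ADJ:? POS:NOUN:+"):
    print(span.text)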