Is there any way to do Named Entity Recognition with a self-trained model in TensorFlow? There is a word2vec implementation, but I could not find the 'classic' POS or NER tagger.
Thanks for your help!
You can adapt the Sequence-to-Sequence model for NER tagging. Your training text provides the source vocabulary/sequences fed to the encoder:
Yesterday afternoon , Mike Smith drove to New York .
and your BIO/BILOU NER tags are the target vocabulary/sequences fed to the decoder:
O O O B_PER I_PER O O B_LOC I_LOC O
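For reference, here is a minimal sketch (plain Python; the function name and span format are illustrative, not from any particular library) of how labeled entity spans can be converted into BIO tags like the ones above:

```python
def spans_to_bio(tokens, spans):
    """Convert (start, end, label) entity spans into BIO tags.

    spans: list of (start_index, end_index_exclusive, label) tuples
           over the token positions.
    """
    tags = ["O"] * len(tokens)
    for start, end, label in spans:
        tags[start] = "B_" + label          # first token of the entity
        for i in range(start + 1, end):
            tags[i] = "I_" + label          # continuation tokens
    return tags

tokens = "Yesterday afternoon , Mike Smith drove to New York .".split()
spans = [(3, 5, "PER"), (7, 9, "LOC")]      # Mike Smith / New York
print(" ".join(spans_to_bio(tokens, spans)))
# O O O B_PER I_PER O O B_LOC I_LOC O
```

Each (source sentence, tag sequence) pair then becomes one training example for the encoder-decoder.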
or instead use POS tags to the decoder for POS tagging:
NN NN , NNP NNP VBD TO NNP NNP .
[IMHO using a deep learning approach usually eliminates the need for POS tagging as an intermediate step, unless you specifically need those features as an output for something.]
You would probably want to switch off the word embeddings for the decoder.
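To make that asymmetry concrete, here is a hedged sketch (plain Python; the helper names are made up for illustration) of building the two vocabularies. The encoder side grows with the corpus, while the decoder side stays tiny, which is why fixed one-hot vectors can stand in for learned tag embeddings:

```python
def build_vocab(sequences):
    # Assign an integer id to every distinct symbol, in order of first appearance.
    vocab = {}
    for seq in sequences:
        for sym in seq:
            if sym not in vocab:
                vocab[sym] = len(vocab)
    return vocab

def one_hot(index, size):
    # A fixed one-hot vector: a simple stand-in for a learned tag embedding.
    return [1.0 if i == index else 0.0 for i in range(size)]

source = ["Yesterday afternoon , Mike Smith drove to New York .".split()]
target = ["O O O B_PER I_PER O O B_LOC I_LOC O".split()]

word_vocab = build_vocab(source)  # grows with the corpus (often tens of thousands of types)
tag_vocab = build_vocab(target)   # stays tiny: here only 5 distinct tags

print(len(word_vocab), len(tag_vocab))
print(one_hot(tag_vocab["B_PER"], len(tag_vocab)))
```

With so few decoder symbols, a trainable embedding matrix on the decoder buys little, which is the motivation for switching it off.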
This well-known paper applies sequence-to-sequence models to syntactic parsing, which has some similarities to the POS and NER tasks: Grammar as a Foreign Language