I am using Google Colab, and the following import fails:
from bert.tokenization import FullTokenizer
I get this error:
ModuleNotFoundError: No module named 'bert.tokenization'
I tried to install the package by running:
!pip install --upgrade bert
Any idea how to resolve this error?
Found it: the package on PyPI is named bert-tensorflow, not bert:
!pip install bert-tensorflow
For anyone hitting this problem with TensorFlow 2.0 and the bert-for-tf2 library: I found that some files were missing after installing with pip3. I've posted my solution here:
https://github.com/google-research/bert/issues/638#issuecomment-592488730
Install:
pip install bert-for-tf2
Then import:
from bert import bert_tokenization
BertTokenizer = bert_tokenization.FullTokenizer
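For context on what FullTokenizer actually does: it splits text on whitespace and punctuation, then applies greedy longest-match WordPiece against the model's vocabulary. Here is a minimal self-contained sketch of that WordPiece step (not the library's code; the tiny vocabulary is hypothetical, real BERT vocabs have ~30k entries):

```python
# Sketch of greedy longest-match WordPiece, the algorithm FullTokenizer
# applies after basic tokenization. Toy vocab for illustration only.
def wordpiece(word, vocab, unk="[UNK]"):
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        cur = None
        # Try the longest remaining substring first, shrinking until a match.
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub  # non-initial pieces carry the "##" prefix
            if sub in vocab:
                cur = sub
                break
            end -= 1
        if cur is None:
            return [unk]  # no piece matched: the whole word is unknown
        pieces.append(cur)
        start = end
    return pieces

vocab = {"un", "##aff", "##able", "play", "##ing"}
print(wordpiece("unaffable", vocab))  # → ['un', '##aff', '##able']
print(wordpiece("playing", vocab))    # → ['play', '##ing']
```

This is why out-of-vocabulary words still tokenize into known subword IDs instead of failing outright.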