Is it possible to generate text from OpenAI GPT-2 using TensorFlow.js?
If not, what is the limitation, e.g. the model format or something else?
GPT-2 pre-training and text generation, implemented in TensorFlow 2.0, with distributed training on multiple GPUs.
If you intend to fine-tune GPT-2, I recommend installing TensorFlow version 1.15.
TensorFlow.js is a library for machine learning in JavaScript. Develop ML models in JavaScript, and use ML directly in the browser or in Node.js.
Trained on 40 GB of text data, GPT-2 is a very large model containing a massive amount of compressed knowledge from a cross-section of the internet. GPT-2 has many potential use cases: for example, it can be used to predict the probability of a sentence, which in turn can be used for text autocorrection.
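As a rough illustration of that sentence-scoring idea (assuming the Hugging Face transformers package and TensorFlow 2.x are installed; the file and function names here are just placeholders), the model's language-modelling loss on a sentence can serve as its score:
score.py
import tensorflow as tf
from transformers import TFGPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = TFGPT2LMHeadModel.from_pretrained("gpt2")

def avg_log_likelihood(sentence):
    # Tokenize the sentence and let the model predict each token from the
    # previous ones; passing labels=input_ids makes it return the LM loss.
    input_ids = tokenizer(sentence, return_tensors="tf")["input_ids"]
    outputs = model(input_ids, labels=input_ids)
    # The loss is the negative log-likelihood per token, so values closer to
    # zero mean the model finds the sentence more plausible.
    return -float(tf.reduce_mean(outputs.loss))

print(avg_log_likelihood("The cat sat on the mat."))
print(avg_log_likelihood("Mat the on sat cat the."))  # expected to score lower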
I don't see any reason why not, other than perhaps some operation in GPT-2 that is not supported by TensorFlow.js.
I don't know how to do it, but here's a nice starting point:
install.sh
python3 -m pip install -q git+https://github.com/huggingface/transformers.git
python3 -m pip install tensorflow
save.py
from transformers import TFGPT2LMHeadModel, GPT2Tokenizer
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
# add the EOS token as PAD token to avoid warnings
model = TFGPT2LMHeadModel.from_pretrained("gpt2", pad_token_id=tokenizer.eos_token_id)
model.save("./test_gpt2")
That will give you a SavedModel. Now you can try to figure out the input and output nodes, and use tensorflowjs_converter
to try to convert it. Pointer: https://www.tensorflow.org/js/tutorials/conversion/import_saved_model.
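As a rough sketch of that last step (assuming save.py above has been run; the script name and the converter's output directory are just placeholders), you can inspect the SavedModel's signatures to find the input and output names:
inspect.py
import tensorflow as tf

# Load the SavedModel produced by save.py and list its serving signatures;
# each signature reports its input and output tensor specs.
loaded = tf.saved_model.load("./test_gpt2")
print("signatures:", list(loaded.signatures.keys()))
for name, fn in loaded.signatures.items():
    print(name)
    print("  inputs: ", fn.structured_input_signature)
    print("  outputs:", fn.structured_outputs)

# The conversion itself is then attempted from the shell, roughly along these
# lines (see the TensorFlow.js conversion tutorial linked above):
#   tensorflowjs_converter --input_format=tf_saved_model ./test_gpt2 ./tfjs_model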