One way is to download the model from tensorflow_hub each time, like the following:
import tensorflow as tf
import tensorflow_hub as hub
hub_url = "https://tfhub.dev/google/tf2-preview/nnlm-en-dim128/1"
embed = hub.KerasLayer(hub_url)
embeddings = embed(["A long sentence.", "single-word", "http://example.com"])
print(embeddings.shape, embeddings.dtype)
I want to download the file once and reuse it, without downloading it again each time.
You can use the hub.load() method to load a TF Hub module. Also, the docs say,

Currently this method is fully supported only with TensorFlow 2.x and with modules created by calling tensorflow.saved_model.save(). The method works in both eager and graph modes.
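As an aside on hub.load: the tensorflow_hub caching documentation describes a TFHUB_CACHE_DIR environment variable that redirects the download cache to a directory of your choice. A minimal sketch, assuming the variable is set before tensorflow_hub resolves any handle (the path below is a placeholder, not a required location):

```python
import os

# Assumption: when TFHUB_CACHE_DIR points at a persistent directory,
# tensorflow_hub stores the module there on the first download and
# reuses those files on later runs instead of downloading again.
# "/path/to/tfhub_cache" is a placeholder; use any writable directory
# that survives between runs.
os.environ["TFHUB_CACHE_DIR"] = "/path/to/tfhub_cache"

# import tensorflow_hub as hub   # import AFTER setting the variable
# embed = hub.KerasLayer("https://tfhub.dev/google/tf2-preview/nnlm-en-dim128/1")
```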
The hub.load method has an argument handle. The supported types of module handles are:

1. Smart URL resolvers such as tfhub.dev, e.g. https://tfhub.dev/google/nnlm-en-dim128/1.

2. A directory on a file system supported by TensorFlow containing module files. This may be a local directory (e.g. /usr/local/mymodule) or a Google Cloud Storage bucket (gs://mymodule).

3. A URL pointing to a TGZ archive of a module, e.g. https://example.com/mymodule.tar.gz.

To avoid downloading the model on every run, use the 2nd or the 3rd option: keep a local copy of the module and pass its path as the handle.