
How to use TPU in Google Colab

Google Colab offers TPUs as a hardware accelerator under the Runtime settings. I found an example, How to use TPU, in the official TensorFlow GitHub repository, but the example did not work on Google Colaboratory. It got stuck on the following line:

tf.contrib.tpu.keras_to_tpu_model(model, strategy=strategy)

When I print the available devices on Colab, it returns [] for the TPU accelerator. Does anyone know how to use the TPU on Colab?
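For reference, a minimal way to inspect the runtime's devices and the TPU address (a sketch assuming TensorFlow 1.x, as in the example above; COLAB_TPU_ADDR is the environment variable Colab sets for TPU runtimes):

import os

from tensorflow.python.client import device_lib

# List the devices TensorFlow can see in this process. With a TPU
# runtime this shows only CPU devices, because the TPU lives on a
# separate worker rather than in the local process.
print(device_lib.list_local_devices())

# Colab exposes the TPU worker's address via an environment variable;
# if this prints None, the runtime type is not set to TPU.
print(os.environ.get('COLAB_TPU_ADDR'))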


asked Sep 27 '18 by Amir

People also ask

Is TPU available in Google Colab?

TPUs are Google's custom-developed application-specific integrated circuits (ASICs) used to accelerate machine learning workloads. They are available through Google Colab, the TPU Research Cloud, and Cloud TPU.

Is a TPU faster than a GPU in Colab?

The number of TPU cores available in Colab notebooks is currently 8. Takeaways: from the observed training times, the TPU takes considerably longer to train than the GPU when the batch size is small, but as the batch size increases, TPU performance becomes comparable to that of the GPU.
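One way to see why batch size matters: the global batch is sharded evenly across the cores, so a small batch leaves each core with little work per step. A back-of-the-envelope sketch (the numbers here are hypothetical, not from the answer):

# Illustrative only: the global batch is split evenly across TPU cores,
# so each core sees batch_size / num_cores examples per step.
NUM_TPU_CORES = 8       # Colab TPUs currently expose 8 cores
per_core_batch = 128    # hypothetical per-core batch size
global_batch = per_core_batch * NUM_TPU_CORES
print(global_batch)     # 1024 examples per training step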




1 Answer

Here's a Colab-specific TPU example: https://colab.research.google.com/github/tensorflow/tpu/blob/master/tools/colab/shakespeare_with_tpu_and_keras.ipynb

The key lines are those that connect to the TPU itself:

import os

import tensorflow as tf

# This address identifies the TPU we'll use when configuring TensorFlow.
TPU_WORKER = 'grpc://' + os.environ['COLAB_TPU_ADDR']

...

tpu_model = tf.contrib.tpu.keras_to_tpu_model(
    training_model,
    strategy=tf.contrib.tpu.TPUDistributionStrategy(
        tf.contrib.cluster_resolver.TPUClusterResolver(TPU_WORKER)))

(Unlike a GPU, use of the TPU requires an explicit connection to the TPU worker. So, you'll need to tweak your training and inference definitions in order to observe a speedup.)
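For context, a minimal end-to-end sketch of that flow, assuming the same TF 1.x contrib API; the model, data, and hyperparameters here are placeholder stand-ins, not from the notebook:

import os

import numpy as np
import tensorflow as tf

# Placeholder data and model; the linked notebook trains a Shakespeare
# text model, this is just a minimal shape-compatible stand-in.
x_train = np.random.random((1024, 784)).astype(np.float32)
y_train = np.random.randint(10, size=(1024,))

training_model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation='relu', input_shape=(784,)),
    tf.keras.layers.Dense(10, activation='softmax'),
])

# Connect to the TPU worker and rewrite the Keras model to run on it.
TPU_WORKER = 'grpc://' + os.environ['COLAB_TPU_ADDR']
tpu_model = tf.contrib.tpu.keras_to_tpu_model(
    training_model,
    strategy=tf.contrib.tpu.TPUDistributionStrategy(
        tf.contrib.cluster_resolver.TPUClusterResolver(TPU_WORKER)))

# The TPU-era Keras integration expects a tf.train optimizer.
tpu_model.compile(
    optimizer=tf.train.AdamOptimizer(learning_rate=1e-3),
    loss='sparse_categorical_crossentropy',
    metrics=['sparse_categorical_accuracy'])

# Training steps now execute on the TPU worker.
tpu_model.fit(x_train, y_train, batch_size=1024, epochs=1)

# Weights can be saved and loaded into a CPU/GPU copy for inference.
tpu_model.save_weights('/tmp/tpu_model_weights.h5')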

answered Oct 20 '22 by Bob Smith