I am trying to use Google Cloud's TPU from Colab. I was able to do so with TensorFlow by following the tutorial.
Does anybody know if it is possible to make use of the TPUs using PyTorch? If so, how can I do it? Do you have any examples?
Check out our repository pytorch/xla where you can start training PyTorch models on TPUs.
Also, you can even use free TPUs on Colab with PyTorch with these Colab notebooks.
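As a rough idea of what the torch_xla workflow looks like, here is a minimal single-core training-step sketch. It assumes torch and torch_xla are installed (e.g. in a Colab TPU runtime); the model and data are placeholders, and the sketch falls back to CPU when torch_xla is unavailable so it can run anywhere:

```python
# Sketch: one training step on a TPU core via torch_xla.
# Assumes a torch_xla install (e.g. Colab TPU runtime); model/data are toys.
import torch
import torch.nn as nn

try:
    import torch_xla.core.xla_model as xm
    device = xm.xla_device()       # the TPU device exposed by XLA
    on_tpu = True
except ImportError:
    device = torch.device("cpu")   # fallback so the sketch runs anywhere
    on_tpu = False

model = nn.Linear(10, 2).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# placeholder batch
x = torch.randn(8, 10, device=device)
y = torch.randint(0, 2, (8,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
if on_tpu:
    # on TPU, xm.optimizer_step materializes the lazily-built XLA graph
    xm.optimizer_step(optimizer, barrier=True)
else:
    optimizer.step()
```

The key differences from ordinary PyTorch are moving tensors to the XLA device and calling `xm.optimizer_step`, which triggers execution of the lazily traced graph on the TPU.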
As of today, PyTorch Lightning allows you to run PyTorch code on TPUs with minimal changes (you will need the XLA library installed). From their demo notebook on Colab:
from pytorch_lightning import Trainer

# CoolSystem is the LightningModule defined in the demo notebook
model = CoolSystem()

# most basic trainer, uses good defaults
trainer = Trainer(num_tpu_cores=8)  # train on all 8 TPU cores
trainer.fit(model)
Currently, it's not possible to use Cloud TPU with PyTorch, since Cloud TPU is designed specifically for TensorFlow.
But, according to this product news posted three days ago on the Google Cloud blog, "engineers on Google’s TPU team are actively collaborating with core PyTorch developers to connect PyTorch to Cloud TPUs".