Colab offers free TPUs. It's easy to see how many cores you are given, but is it possible to see how much memory is available per core?
As far as I know, we don't have a TensorFlow op or similar for accessing memory info, though XRT does expose it. In the meantime, would something like the following snippet work?
import os
from tensorflow.python.profiler import profiler_client

# The TPU profiler service listens on port 8466 rather than the worker port 8470.
tpu_profile_service_address = os.environ['COLAB_TPU_ADDR'].replace('8470', '8466')
# Arguments: service address, sampling duration in ms, monitoring level (1 or 2).
print(profiler_client.monitor(tpu_profile_service_address, 100, 2))
Output looks like:
Timestamp: 22:23:03
TPU type: TPU v2
Utilization of TPU Matrix Units (higher is better): 0.000%
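For completeness, the XRT route mentioned above can be reached through PyTorch/XLA. This is a minimal sketch, assuming torch_xla is installed in the Colab runtime and that its xm.get_memory_info helper is available in your version:

import torch_xla.core.xla_model as xm

# Query the XRT runtime for memory stats on the first TPU device.
device = xm.xla_device()
# Returns a dict with free/total device memory in kilobytes,
# e.g. {'kb_free': ..., 'kb_total': ...}
print(xm.get_memory_info(device))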
A TPU v2 has 8 GB of HBM per core and a TPU v3 has 16 GB of HBM per core (https://cloud.google.com/tpu).
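If you want a rough total, you can multiply that per-core figure by the number of cores. A minimal sketch using the standard TF 2.x cluster-resolver APIs (the 8 GB constant below assumes the Colab TPU is a v2, as reported in the profiler output above):

import tensorflow as tf

# On Colab this picks up the TPU address from the environment.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

cores = tf.config.list_logical_devices('TPU')
print('TPU cores:', len(cores))
# Assumes TPU v2: 8 GB of HBM per core.
print('Approximate total HBM:', len(cores) * 8, 'GB')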