 

Missing XLA configuration when running pytorch/xla

I am trying to use a GCP TPU with PyTorch/XLA. I am using a VM with the debian-9-torch-xla-v20200818 image. I start the TPU and confirm it is running with ctpu status, which shows that both the CPU and TPU are up. I then activate the torch-xla-nightly environment, but when I run this simple code:

import torch
import torch_xla
import torch_xla.core.xla_model as xm

dev = xm.xla_device()
t1 = torch.ones(3, 3, device = dev)
print(t1)

this error comes up:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/anaconda3/envs/torch-xla-nightly/lib/python3.6/site-packages/torch_xla/core/xla_model.py", line 231, in xla_device
    devkind=devkind if devkind is not None else None)
  File "/anaconda3/envs/torch-xla-nightly/lib/python3.6/site-packages/torch_xla/core/xla_model.py", line 136, in get_xla_supported_devices
    xla_devices = _DEVICES.value
  File "/anaconda3/envs/torch-xla-nightly/lib/python3.6/site-packages/torch_xla/utils/utils.py", line 32, in value
    self._value = self._gen_fn()
  File "/anaconda3/envs/torch-xla-nightly/lib/python3.6/site-packages/torch_xla/core/xla_model.py", line 18, in <lambda>
    _DEVICES = xu.LazyProperty(lambda: torch_xla._XLAC._xla_get_devices())
RuntimeError: tensorflow/compiler/xla/xla_client/computation_client.cc:274 : Missing XLA configuration

I have tried everything, but nothing seems to work.

asked Aug 19 '20 by flybrain

1 Answer

Take a look at this link, as it seems to pertain to the issue. You may not have set up XRT_TPU_CONFIG:

(vm)$ export XRT_TPU_CONFIG="tpu_worker;0;$TPU_IP_ADDRESS:8470"

Follow the instructions here and you should be fine.
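If you prefer to set it from Python rather than the shell, a minimal sketch is below. The IP address 10.0.0.2 is a placeholder; substitute the internal IP of your TPU as reported by ctpu status. The key point is that the variable must be set in the process environment before torch_xla initializes its XLA client.

import os

# Point XRT at the TPU worker before importing torch_xla.
# 10.0.0.2 is a placeholder; use the IP shown by `ctpu status`.
os.environ["XRT_TPU_CONFIG"] = "tpu_worker;0;10.0.0.2:8470"

import torch
import torch_xla.core.xla_model as xm

dev = xm.xla_device()  # should now resolve to a TPU-backed XLA device
print(torch.ones(3, 3, device=dev))

Exporting the variable in the shell (as above) works just as well, since the Python process inherits it.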

answered Sep 16 '22 by Cam