I am trying to use a TPU with pytorch_xla, but I get an ImportError from _XLAC.
!curl https://raw.githubusercontent.com/pytorch/xla/master/contrib/scripts/env-setup.py -o pytorch-xla-env-setup.py
!python pytorch-xla-env-setup.py --version $VERSION
import torch_xla
import torch_xla.core.xla_model as xm
ImportError Traceback (most recent call last)
<ipython-input-60-6a19e980152f> in <module>()
----> 1 import torch_xla
2 import torch_xla.core.xla_model as xm
/usr/local/lib/python3.6/dist-packages/torch_xla/__init__.py in <module>()
39 import torch
40 from .version import __version__
---> 41 import _XLAC
42
43 _XLAC._initialize_aten_bindings()
ImportError: /usr/local/lib/python3.6/dist-packages/_XLAC.cpython-36m-x86_64-linux-gnu.so: undefined symbol: _ZN2at6native6einsumENSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEN3c108ArrayRefINS_6TensorEEE
PyTorch uses Cloud TPUs just like it uses CPU or CUDA devices, as the next few cells will show. Each core of a Cloud TPU is treated as a separate PyTorch device, and tensors can be transferred between CPU and TPU.
PyTorch runs on XLA devices, like TPUs, with the torch_xla package. This document describes how to run your models on these devices.
Google has reported that the TPU is 15x to 30x faster than contemporary GPUs and CPUs on production AI applications that use neural network inference.
Combining PyTorch and Google's cloud-based Colab notebook environment can be a good solution for building neural networks with free access to GPUs and TPUs.
curl https://raw.githubusercontent.com/pytorch/xla/master/contrib/scripts/env-setup.py -o pytorch-xla-env-setup.py
python pytorch-xla-env-setup.py --version 20200325
Something like:
export XRT_TPU_CONFIG="tpu_worker;0;$TPU_IP_ADDRESS:8470"
Or:
export COLAB_TPU_ADDR="10.16.26.36:8676"
Here is the detailed description: https://github.com/pytorch/xla/blob/master/README.md and an example: https://cloud.google.com/tpu/docs/tutorials/transformer-pytorch
Also, here is the Google Colab notebook created by the PyTorch team; I just tested it and it works without any changes: https://colab.research.google.com/github/pytorch/xla/blob/master/contrib/colab/getting-started.ipynb
This notebook will show you how to:
- Install PyTorch/XLA on Colab, which lets you use PyTorch with TPUs.
- Run basic PyTorch functions on TPUs.
- Run PyTorch modules and autograd on TPUs (see the sketch after this list).
- Run PyTorch networks on TPUs.
You may want to follow one of those examples and try to reproduce the problem. Good luck!
Please try this:
!pip uninstall -y torch
!pip install torch==1.8.2+cpu -f https://download.pytorch.org/whl/lts/1.8/torch_lts.html
!pip install -q cloud-tpu-client==0.10 https://storage.googleapis.com/tpu-pytorch/wheels/torch_xla-1.8-cp37-cp37m-linux_x86_64.whl
import torch_xla
It worked for me.
Source: googlecolab/colabtools#2237