I have been using TensorRT and TensorFlow-TRT to accelerate the inference of my DL algorithms.
Then I heard of Trax and JAX. Both seem to accelerate DL, but I am having a hard time understanding them. Can anyone explain them in simple terms?
Trax is a deep learning framework created by Google and used extensively by the Google Brain team. It is an alternative to TensorFlow and PyTorch for implementing off-the-shelf state-of-the-art deep learning models, for example Transformers, BERT, etc., principally in the field of Natural Language Processing.

Trax is built on top of TensorFlow and JAX. JAX is an enhanced and optimised version of NumPy. The important distinction between JAX and NumPy is that JAX uses a compiler library called XLA (Accelerated Linear Algebra), which lets your NumPy-style code run on GPUs and TPUs rather than only on the CPU, as plain NumPy does, thus speeding up computation.
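To make this concrete, here is a minimal sketch of the NumPy/JAX relationship: the same function written once with plain NumPy and once with `jax.numpy`, which mirrors NumPy's API. Wrapping the JAX version in `jax.jit` compiles it with XLA, so on a machine with a GPU or TPU it runs on the accelerator (it falls back to CPU otherwise). The function itself is just an illustrative mean-squared-error computation, not anything from Trax.

```python
import numpy as np
import jax.numpy as jnp  # drop-in, NumPy-like API backed by XLA
from jax import jit

def mse_np(pred, target):
    # Plain NumPy: always executes eagerly on the CPU.
    return np.mean((pred - target) ** 2)

@jit  # XLA-compile; runs on GPU/TPU when available, CPU otherwise
def mse_jax(pred, target):
    # Same code, but using jax.numpy instead of numpy.
    return jnp.mean((pred - target) ** 2)

pred = np.array([1.0, 2.0, 3.0])
target = np.array([1.0, 2.0, 4.0])

print(float(mse_np(pred, target)))                          # 0.333...
print(float(mse_jax(jnp.array(pred), jnp.array(target))))   # same value, XLA-compiled
```

Because `jax.numpy` keeps NumPy's interface, porting existing NumPy code to JAX is often just a matter of swapping the import; the speedup comes from XLA compilation and accelerator execution, not from rewriting your algorithms.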