Does anyone know what would be a good library for computing linear algebra on Android (SVD, QR, LU, least-squares, inverse, etc.)?
Conventional linear algebra libraries are implemented in layers. The Basic Linear Algebra Subprograms (BLAS) form the bottom layer, and the Linear Algebra Package (LAPACK) is built on top of BLAS. The interfaces of these two layers were standardized back in the 1990s, and hardware vendors usually provide implementations tuned for their own architectures. LAPACK provides the operations you mentioned (SVD, QR, LU, least-squares, inverse, etc.). In recent years, more user-friendly linear algebra libraries have emerged (e.g. Armadillo, Eigen), which offer friendlier APIs on top of, or alongside, the conventional BLAS and LAPACK libraries.
JBLAS is a Java library built on top of traditional BLAS/LAPACK, and JAMA is a LAPACK-like library implemented in pure Java. Neither library actually targets Android, but since Android programming is usually done in Java, we can make them work on Android. However, we should not expect much performance out of these implementations. My argument is that performance is a key factor: since you are invoking a library instead of writing the code yourself, you might as well pick a fast one, and high performance usually translates into lower energy consumption on Android devices. A short sketch of using JBLAS follows below.
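For instance, here is a minimal sketch of calling JBLAS for least-squares, SVD, and a pseudo-inverse. It assumes the JBLAS jar and its native libraries are available for your device's ABI; the class and variable names are just for illustration.

```java
import org.jblas.DoubleMatrix;
import org.jblas.Singular;
import org.jblas.Solve;

public class JblasExample {
    public static void main(String[] args) {
        // 3x2 over-determined system A x = b
        DoubleMatrix A = new DoubleMatrix(new double[][] {
                {1.0, 2.0},
                {3.0, 4.0},
                {5.0, 7.0}
        });
        DoubleMatrix b = new DoubleMatrix(new double[] {1.0, 2.0, 3.0});

        // Least-squares solution (delegates to a native LAPACK routine)
        DoubleMatrix x = Solve.solveLeastSquares(A, b);
        System.out.println("least-squares x = " + x);

        // Full SVD: returns {U, S, V}, where S is a vector of singular values
        DoubleMatrix[] usv = Singular.fullSVD(A);
        System.out.println("singular values = " + usv[1]);

        // Moore-Penrose pseudo-inverse
        DoubleMatrix pinvA = Solve.pinv(A);
        System.out.println("pinv(A) = " + pinvA);
    }
}
```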
While the above linear algebra libraries usually target desktop CPUs (e.g. x86 architectures running Linux/Windows/macOS), experts are now making progress toward providing full-stack support on mobile platforms as well (e.g. ARM running Android).
I just noticed that Qualcomm has released its own BLAS-like library, the Snapdragon Math Library, which runs on Qualcomm's customized ARM architectures. With the top-level LAPACK linked against it, these linear algebra operations (SVD, QR, LU, least-squares, inverse, etc.) can be implemented on Android with high performance.
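For illustration only, a hedged sketch of what the Java side of such an integration might look like. The library name (`qsml_lapack_jni`) and the native method are hypothetical placeholders for whatever JNI bridge you build over LAPACK and the vendor BLAS; nothing here is an official Qualcomm API.

```java
public final class NativeLinAlg {
    static {
        // Hypothetical JNI library that wraps LAPACK routines linked against
        // a vendor BLAS such as the Snapdragon Math Library.
        System.loadLibrary("qsml_lapack_jni");
    }

    /**
     * Hypothetical binding: returns the singular values of an m-by-n matrix
     * stored in column-major order, computed on the native side (e.g. by
     * calling LAPACK's dgesvd).
     */
    public static native double[] singularValues(double[] a, int m, int n);

    private NativeLinAlg() {}
}
```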
More recently, with the rapid development of deep learning, a number of neural network packages like NNPACK have become popular. Under the hood, they are linear algebra libraries with low-level, high-performance implementations of the primitives used in neural network layers.
JAMA works fairly well.
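For example, a minimal, self-contained sketch of using JAMA's documented Matrix API for SVD, QR, least-squares, and inverse (assuming the JAMA jar is on your classpath; the class and variable names below are mine):

```java
import Jama.Matrix;
import Jama.QRDecomposition;
import Jama.SingularValueDecomposition;

public class JamaExample {
    public static void main(String[] args) {
        // 3x2 over-determined system A x = b
        Matrix A = new Matrix(new double[][] {
                {4.0, 1.0},
                {2.0, 3.0},
                {0.0, 1.0}
        });
        Matrix b = new Matrix(new double[][] {{1.0}, {2.0}, {3.0}});

        // SVD: singular values of A
        SingularValueDecomposition svd = A.svd();
        double[] sigma = svd.getSingularValues();
        System.out.println("largest singular value = " + sigma[0]);

        // QR decomposition: A = Q R
        QRDecomposition qr = A.qr();
        qr.getR().print(10, 6);

        // Least-squares solution of A x = b (JAMA uses QR when A has more rows than columns)
        Matrix x = A.solve(b);
        x.print(10, 6);

        // Inverse of the (square) normal-equations matrix A^T A
        Matrix inv = A.transpose().times(A).inverse();
        inv.print(10, 6);
    }
}
```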