
Story behind Derived to MatrixBase<Derived> conversion

Tags:

c++

eigen

eigen3

What happens when you pass a matrix object to a function as a MatrixBase reference? I don't understand what is really going on behind the scenes.

An example function code would be:

#include <Eigen/Core>
#include <iostream>

using namespace Eigen;

template <typename Derived>
void print_size(const MatrixBase<Derived>& b)
{
  std::cout << "size (rows, cols): " << b.size() << " (" << b.rows()
            << ", " << b.cols() << ")" << std::endl;
  std::cout << sizeof(b) << std::endl;
}

int main() {
    Matrix<float, 2, 2> m;
    m << 0.0, 0.1,
         0.2, 0.3;

    print_size(m);
    std::cout << sizeof(m) << std::endl;
}

It gives the following output:

size (rows, cols): 4 (2, 2)
1
16

Where does the 16 vs. 1 difference come from?

And also why would a conversion be necessary?

Thanks in advance!

OnurA asked Dec 23 '22

1 Answer

sizeof is evaluated at compile time, so it is concerned with the declared (static) type of objects. b is of type MatrixBase<Derived> (ignoring the reference, just like sizeof does), which is most likely an empty base class, and hence has size 1.

m, on the other hand, is of type Matrix<float, 2, 2>, which stores its 4 floats inline and hence has size 16 (4 × 4 bytes) on your platform.

I've created a live example demonstrating this behaviour of sizeof.

Angew is no longer proud of SO answered Dec 25 '22