 

Why do the eigenvectors & eigenvalues in LDA become zero?

I'd like to implement fast PLDA (Probabilistic Linear Discriminant Analysis) in OpenCV. At this LINK, fast PLDA has been implemented in MATLAB and Python. One of the building blocks of PLDA is LDA. I've written the following code to implement LDA in OpenCV:

int LDA_dim = 120;

// Load data
FileStorage fs("newStorageFile.yml", FileStorage::READ);

// Read data
Mat train_data, train_labels;
fs["train_data"] >> train_data;
fs["train_labels"] >> train_labels;

// LDA
if (LDA_dim > 0)
{
    LDA lda(LDA_dim);
    lda.compute(train_data, train_labels);

    // compute eigenvectors
    Mat eigenvectors = lda.eigenvectors();
}

I converted the database introduced in the above link from .mat to .yml; the result is newStorageFile.yml, which I've uploaded here. train_data has 650 rows and 600 columns, and train_labels has 650 rows and 1 column. I don't know why the eigenvectors and eigenvalues come out as zero. Please help me fix this code.

For completeness, here is the code that converts the data from .mat to .yml:

function matlab2opencv( variable, fileName, flag)

[rows cols] = size(variable);

% Beware of Matlab's linear indexing
variable = variable';

% Write mode as default
if ( ~exist('flag','var') )
    flag = 'w';
end

if ( ~exist(fileName,'file') || flag == 'w' )
    % New file or write mode specified
    file = fopen( fileName, 'w');
    fprintf( file, '%%YAML:1.0\n');
else
    % Append mode
    file = fopen( fileName, 'a');
end

% Write variable header
fprintf( file, '    %s: !!opencv-matrix\n', inputname(1));
fprintf( file, '        rows: %d\n', rows);
fprintf( file, '        cols: %d\n', cols);
fprintf( file, '        dt: f\n');
fprintf( file, '        data: [ ');

% Write variable data
for i=1:rows*cols
    fprintf( file, '%.6f', variable(i));
    if (i == rows*cols), break, end
    fprintf( file, ', ');
    if mod(i+1,4) == 0
        fprintf( file, '\n            ');
    end
end

fprintf( file, ']\n');

fclose(file);
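To verify what actually ends up in the YAML file, a minimal check could look like this (my own sketch; it only assumes the file and variable names used above). Note that the writer emits dt: f, so OpenCV should load the matrices as 32-bit floats (CV_32F):

#include <opencv2/core.hpp>
#include <iostream>
using namespace cv;

int main()
{
    FileStorage fs("newStorageFile.yml", FileStorage::READ);
    Mat train_data, train_labels;
    fs["train_data"]   >> train_data;
    fs["train_labels"] >> train_labels;

    // Expected: 650x600 and 650x1, depth CV_32F (because the writer uses dt: f)
    std::cout << "train_data:   " << train_data.rows   << "x" << train_data.cols
              << "  depth=" << train_data.depth()   << std::endl;
    std::cout << "train_labels: " << train_labels.rows << "x" << train_labels.cols
              << "  depth=" << train_labels.depth() << std::endl;
    return 0;
}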

Edit 1) I've also tried LDA with a small sample that I generated myself:

Mat train_data = (Mat_<double>(3, 3) << 25, 45, 44, 403, 607, 494, 2900, 5900, 2200);
Mat train_labels = (Mat_<int>(3, 1) << 1, 2, 3);

LDA lda(LDA_dim);
lda.compute(train_data, train_labels);

// compute eigenvectors
Mat_<double> eigenvectors = lda.eigenvectors();
Mat_<double> eigenvalues = lda.eigenvalues();
cout << eigenvectors << endl << eigenvalues;

but I got the same result: the eigenvalues and eigenvectors are zero (see: eigenvector and eigenvalue).
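For comparison, here is a minimal, self-contained variant of that test (a sketch of my own; the data values and the choice of two samples per class are assumptions, not from the original post, and it assumes an OpenCV version where cv::LDA lives in the core module). With C classes, LDA yields at most C - 1 discriminant directions, so num_components is set to 2 here:

#include <opencv2/core.hpp>
#include <iostream>
using namespace cv;
using std::cout; using std::endl;

int main()
{
    // Toy data: 3 classes, 2 samples per class (values are made up)
    Mat train_data = (Mat_<double>(6, 3) <<
          25,   45,   44,
          30,   50,   40,
         403,  607,  494,
         410,  600,  480,
        2900, 5900, 2200,
        2950, 5800, 2300);
    Mat train_labels = (Mat_<int>(6, 1) << 1, 1, 2, 2, 3, 3);

    // With 3 classes, LDA produces at most 3 - 1 = 2 non-trivial components
    LDA lda(2);
    lda.compute(train_data, train_labels);

    cout << "eigenvectors:\n" << lda.eigenvectors() << endl;
    cout << "eigenvalues:\n"  << lda.eigenvalues()  << endl;
    return 0;
}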

asked Oct 30 '17 by Saeid

People also ask

Why is it necessary to calculate the eigenvectors and eigenvalues?

Eigenvectors and eigenvalues help us understand linear transformations in a much simpler way, which is why we compute them. Eigenvectors are directions along which a linear transformation acts simply, by stretching or compressing. Eigenvalues are the factors by which that stretch or compression occurs.
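As a small illustration (my own example, in OpenCV to match the rest of the question): for a symmetric matrix, cv::eigen returns exactly these stretch factors and directions.

#include <opencv2/core.hpp>
#include <iostream>
using namespace cv;

int main()
{
    // A symmetric 2x2 matrix: it stretches by 4 along (1,1) and by 2 along (1,-1)
    Mat A = (Mat_<double>(2, 2) << 3, 1,
                                   1, 3);
    Mat eigenvalues, eigenvectors;      // eigenvectors are stored as rows
    eigen(A, eigenvalues, eigenvectors);

    std::cout << "eigenvalues:\n"  << eigenvalues  << "\n"
              << "eigenvectors:\n" << eigenvectors << std::endl;
    return 0;
}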

Why are eigenvectors important in machine learning?

Decomposing a matrix in terms of its eigenvalues and its eigenvectors gives valuable insights into the properties of the matrix. Certain matrix calculations, like computing the power of the matrix, become much easier when we use the eigendecomposition of the matrix.
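A brief sketch of that point (my own example; it assumes a symmetric matrix so that cv::eigen applies, and cv::eigen stores the eigenvectors as the rows of E, so A = E^T * diag(lambda) * E):

#include <opencv2/core.hpp>
#include <iostream>
using namespace cv;

int main()
{
    Mat A = (Mat_<double>(2, 2) << 3, 1,
                                   1, 3);

    Mat evals, evecs;                 // evecs holds the eigenvectors as rows
    eigen(A, evals, evecs);

    // A^3 via the eigendecomposition: A^3 = E^T * diag(evals^3) * E
    Mat evals3;
    pow(evals, 3, evals3);
    Mat A3 = evecs.t() * Mat::diag(evals3) * evecs;

    std::cout << "A^3 (eigendecomposition):\n" << A3 << "\n"
              << "A^3 (direct product):\n"     << A * A * A << std::endl;
    return 0;
}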

Why are eigenvectors used in PCA?

The eigenvectors and eigenvalues of a covariance (or correlation) matrix represent the “core” of a PCA: The eigenvectors (principal components) determine the directions of the new feature space, and the eigenvalues determine their magnitude.
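To tie this back to OpenCV (a minimal sketch of my own, not part of the original question): cv::PCA exposes exactly these two quantities.

#include <opencv2/core.hpp>
#include <iostream>
using namespace cv;

int main()
{
    // Five 3-D samples, one per row (values are made up)
    Mat data = (Mat_<double>(5, 3) <<
        2.5, 2.4, 0.5,
        0.5, 0.7, 1.1,
        2.2, 2.9, 0.4,
        1.9, 2.2, 0.6,
        3.1, 3.0, 0.2);

    // PCA on the covariance of `data`; keep 2 principal components
    PCA pca(data, Mat(), PCA::DATA_AS_ROW, 2);

    std::cout << "eigenvectors (directions of the new feature space):\n"
              << pca.eigenvectors << "\n"
              << "eigenvalues (their magnitudes):\n"
              << pca.eigenvalues << std::endl;
    return 0;
}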


1 Answer

It is because of floating-point imprecision that the eigenvalues get close to zero.
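One way to check that claim (a minimal sketch of my own, assuming an already-computed cv::LDA object named lda; the helper name and the tolerance are arbitrary): print the eigenvalues at full precision and compare them against a small tolerance, to see whether they are exactly zero or merely tiny.

#include <opencv2/core.hpp>
#include <cmath>
#include <iomanip>
#include <iostream>
using namespace cv;

// Prints each eigenvalue at full precision and flags near-zero entries.
void inspectEigenvalues(const LDA& lda, double tol = 1e-12)
{
    Mat_<double> evals = lda.eigenvalues();
    for (int i = 0; i < evals.rows; ++i)
    {
        double v = evals(i, 0);
        std::cout << std::setprecision(17) << v
                  << (std::abs(v) < tol ? "  (~ zero)" : "") << std::endl;
    }
}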

answered Sep 27 '22 by Vineetha Vijayan