I have to read some .mat data files from C++. I read through the documentation, but I would like to know how to handle the data in a clean and elegant way, e.g. using std::vector (the .mat files are of modest size, 10 MB to 1 GB, but memory issues should still be taken seriously).
My function is something like:
#include <stdio.h>
#include "mat.h"
#include <vector>

int matread(const char *file, const std::vector<double>& pdata_v) {
    MATFile *pmat = matOpen(file, "r");
    if (pmat == NULL) {
        printf("Error opening file %s\n", file);
        return 1;
    }
    mxArray *pdata = matGetVariable(pmat, "LocalDouble");
    // pdata -> pdata_v
    mxDestroyArray(pdata); // clean up
    matClose(pmat);
    return 0;
}
So the question is: how can I copy the data from mxArray *pdata into the vector pdata_v efficiently and safely?
Here is an example of using the MAT-API:
#include "mat.h"
#include <iostream>
#include <vector>
void matread(const char *file, std::vector<double>& v)
{
// open MAT-file
MATFile *pmat = matOpen(file, "r");
if (pmat == NULL) return;
// extract the specified variable
mxArray *arr = matGetVariable(pmat, "LocalDouble");
if (arr != NULL && mxIsDouble(arr) && !mxIsEmpty(arr)) {
// copy data
mwSize num = mxGetNumberOfElements(arr);
double *pr = mxGetPr(arr);
if (pr != NULL) {
v.reserve(num); //is faster than resize :-)
v.assign(pr, pr+num);
}
}
// cleanup
mxDestroyArray(arr);
matClose(pmat);
}
int main()
{
std::vector<double> v;
matread("data.mat", v);
for (size_t i=0; i<v.size(); ++i)
std::cout << v[i] << std::endl;
return 0;
}
First we build the standalone program, and create some test data as a MAT-file:
>> mex -client engine -largeArrayDims test_mat.cpp
>> LocalDouble = magic(4)
LocalDouble =
16 2 3 13
5 11 10 8
9 7 6 12
4 14 15 1
>> save data.mat LocalDouble
Now we run the program:
C:\> test_mat.exe
16
5
9
4
2
11
7
14
3
10
6
15
13
8
12
1
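As an aside, if the variable names in the file are not known up front, the MAT-API also provides matGetDir to list a file's contents before extracting anything. A minimal sketch (the function name listVars is just for illustration):

#include "mat.h"
#include <cstdio>

int listVars(const char *file)
{
    MATFile *pmat = matOpen(file, "r");
    if (pmat == NULL) return 1;

    int ndir;
    char **dir = matGetDir(pmat, &ndir); // array of variable names
    if (dir != NULL) {
        for (int i = 0; i < ndir; ++i)
            printf("%s\n", dir[i]);
        mxFree(dir); // matGetDir allocates the list; release it with mxFree
    }
    matClose(pmat);
    return 0;
}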
Here's another idea. If you're allergic to bare pointers in C++ code (nothing wrong with them, by the way), you could wrap the bare pointer in a Boost or C++11 smart pointer with a deleter that calls mxDestroyArray() when the pointer goes out of scope. That way you don't need a copy, nor does your user code need to know how to correctly deallocate.
#include <memory>
#include "mat.h"

typedef std::shared_ptr<mxArray> mxSmartPtr;

mxSmartPtr readMATarray(MATFile *pmat, const char *varname)
{
    mxSmartPtr pdata(matGetVariable(pmat, varname),
                     mxDestroyArray); // set deleter
    return pdata;
}

int some_function() {
    mxSmartPtr pdata = readMATarray(pmat, "LocalDouble");
    ...
    // pdata goes out of scope, and mxDestroyArray is automatically called
}
Idea taken from here: http://www.boost.org/doc/libs/1_56_0/libs/smart_ptr/sp_techniques.html#incomplete
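If shared ownership is not needed, the same idea works with std::unique_ptr and a custom deleter; a minimal C++11 sketch (the alias name mxUniquePtr and function name readMATarrayUnique are just illustrative):

#include <memory>
#include "mat.h"

// Deleter that hands the mxArray back to the MATLAB memory manager.
struct MxArrayDeleter {
    void operator()(mxArray *p) const { mxDestroyArray(p); }
};

using mxUniquePtr = std::unique_ptr<mxArray, MxArrayDeleter>;

mxUniquePtr readMATarrayUnique(MATFile *pmat, const char *varname)
{
    return mxUniquePtr(matGetVariable(pmat, varname));
}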
You can first get the data pointer of the mxArray *pdata and then copy the data to vector<double> pdata_v:

mwSize numOfData = mxGetNumberOfElements(pdata);
double *ptr = (double *) mxGetData(pdata);
pdata_v.resize(numOfData);
memcpy(&pdata_v[0], ptr, numOfData*sizeof(double));
ps1: Pay extra attention to the fact that, in MATLAB, matrices are stored in column-major order. So if pdata stores [1 2 3; 4 5 6], pdata_v will hold 1 4 2 5 3 6.
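If you need the elements in row-major order instead, you can transpose during the copy; a small sketch under the assumption that pdata is a non-empty 2-D double matrix:

mwSize rows = mxGetM(pdata);
mwSize cols = mxGetN(pdata);
double *src = mxGetPr(pdata); // column-major storage
std::vector<double> rowMajor(rows * cols);
for (mwSize r = 0; r < rows; ++r)
    for (mwSize c = 0; c < cols; ++c)
        rowMajor[r * cols + c] = src[c * rows + r]; // element (r,c)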
ps2: Change const vector<double>& pdata_v to vector<double>& pdata_v if you want to change its contents.
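Putting the snippet and both notes together, a self-contained sketch of this approach might look like the following (the helper name copyToVector is hypothetical):

#include "mat.h"
#include <vector>
#include <cstring> // memcpy

// Sketch: copy all elements of a double mxArray into a vector.
// Remember the result is in MATLAB's column-major order.
void copyToVector(mxArray *pdata, std::vector<double>& pdata_v)
{
    mwSize numOfData = mxGetNumberOfElements(pdata);
    pdata_v.resize(numOfData);
    std::memcpy(pdata_v.data(), mxGetPr(pdata), numOfData * sizeof(double));
}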