Serializing OpenCV Mat_<Vec3f>

I'm working on a robotics research project where I need to serialize 2D matrices of 3D points: basically, each pixel is a 3-vector of floats. These pixels are stored in an OpenCV matrix, and they need to be sent over inter-process communication and saved to files to be processed on multiple computers. I'd like to serialize them in an endian/architecture-independent, space-efficient way, as quickly as possible. cv::imencode would be perfect here, except that it only works on 8-bit and 16-bit elements, and we don't want to lose any precision. The files don't need to be human-readable (although we do that now to ensure data portability, and it's incredibly slow). Are there best practices for this, or elegant ways to do it?

Thanks!

asked Nov 13 '10 by btown

3 Answers

The earlier answers are good, but they won't work for non-continuous matrices which arise when you want to serialize regions of interest (among other things). Also, it is unnecessary to serialize elemSize() because this is derived from the type value.

Here's some code that works regardless of continuity (with includes and namespace):

#pragma once

#include <boost/archive/text_oarchive.hpp>
#include <boost/archive/text_iarchive.hpp>
#include <boost/serialization/utility.hpp>
#include <boost/serialization/array.hpp> // for boost::serialization::make_array
#include <opencv2/opencv.hpp>

namespace boost {
namespace serialization {

template<class Archive>
void serialize(Archive &ar, cv::Mat& mat, const unsigned int)
{
    int cols, rows, type;
    bool continuous;

    if (Archive::is_saving::value) {
        cols = mat.cols; rows = mat.rows; type = mat.type();
        continuous = mat.isContinuous();
    }

    ar & cols & rows & type & continuous;

    if (Archive::is_loading::value)
        mat.create(rows, cols, type);

    if (continuous) {
        // Continuous storage: serialize the whole pixel buffer in one call.
        const size_t data_size = rows * cols * mat.elemSize();
        ar & boost::serialization::make_array(mat.ptr(), data_size);
    } else {
        // Non-continuous storage (e.g. an ROI): serialize row by row.
        const size_t row_size = cols * mat.elemSize();
        for (int i = 0; i < rows; i++) {
            ar & boost::serialization::make_array(mat.ptr(i), row_size);
        }
    }
}

} // namespace serialization
} // namespace boost
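
As a quick illustration of the non-continuous case, here's a rough, untested usage sketch of my own (the header above is assumed to be saved as mat_serialization.hpp, a file name I've made up, and roi.txt is just an example): an ROI view is written to a text archive and read back.

#include <fstream>
#include "mat_serialization.hpp"   // the header above (hypothetical file name)

int main()
{
    // A 640x480 image of 3-float pixels, then a 100x100 ROI view of it.
    cv::Mat_<cv::Vec3f> img(480, 640, cv::Vec3f(0.f, 0.f, 0.f));
    cv::Mat roi = img(cv::Rect(10, 10, 100, 100));   // non-continuous view

    {
        std::ofstream ofs("roi.txt");
        boost::archive::text_oarchive oa(ofs);
        oa << roi;                                    // takes the row-by-row branch
    }

    cv::Mat restored;
    std::ifstream ifs("roi.txt");
    boost::archive::text_iarchive ia(ifs);
    ia >> restored;                                   // 100x100, CV_32FC3
    return 0;
}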
answered by user1520427

Edit: Christoph Heindl has commented on this post with a link to his blog where he has improved on this serialisation code. Highly recommended!

http://cheind.wordpress.com/2011/12/06/serialization-of-cvmat-objects-using-boost/

--

For whoever it may benefit: some code to serialize a cv::Mat with boost::serialization.
I haven't tested it with multi-channel data, but everything should work fine.

#include <iostream>
#include <fstream>
#include <boost/archive/binary_oarchive.hpp>
#include <boost/archive/binary_iarchive.hpp>
#include <boost/serialization/split_free.hpp>
#include <boost/serialization/vector.hpp>
#include <opencv2/opencv.hpp>

using namespace cv;
using namespace std;

BOOST_SERIALIZATION_SPLIT_FREE(Mat)
namespace boost {
namespace serialization {

    /*** Mat ***/
    template<class Archive>
    void save(Archive & ar, const Mat& m, const unsigned int version)
    {
      size_t elemSize = m.elemSize(), elemType = m.type();

      ar & m.cols;
      ar & m.rows;
      ar & elemSize;
      ar & elemType; // element type.
      size_t dataSize = m.cols * m.rows * m.elemSize();

      //cout << "Writing matrix data rows, cols, elemSize, type, datasize: (" << m.rows << "," << m.cols << "," << m.elemSize() << "," << m.type() << "," << dataSize << ")" << endl;

      // Write the pixel buffer one byte at a time (simple, but slower than a
      // single block write for large matrices).
      for (size_t dc = 0; dc < dataSize; ++dc) {
          ar & m.data[dc];
      }
    }

    template<class Archive>
    void load(Archive & ar, Mat& m, const unsigned int version)
    {
        int cols, rows;
        size_t elemSize, elemType;

        ar & cols;
        ar & rows;
        ar & elemSize;
        ar & elemType;

        m.create(rows, cols, elemType);
        size_t dataSize = m.cols * m.rows * elemSize;

        //cout << "reading matrix data rows, cols, elemSize, type, datasize: (" << m.rows << "," << m.cols << "," << m.elemSize() << "," << m.type() << "," << dataSize << ")" << endl;

        // Read the pixel buffer back one byte at a time into the freshly created matrix.
        for (size_t dc = 0; dc < dataSize; ++dc) {
            ar & m.data[dc];
        }
    }

}
}

Now, a Mat can be serialized and deserialized as follows:

    void saveMat(Mat& m, string filename) {
        // Open the stream in binary mode; the binary archive requires it on Windows.
        ofstream ofs(filename.c_str(), ios::binary);
        boost::archive::binary_oarchive oa(ofs);
        //boost::archive::text_oarchive oa(ofs);
        oa << m;
    }

    void loadMat(Mat& m, string filename) {
        ifstream ifs(filename.c_str(), ios::binary);
        boost::archive::binary_iarchive ia(ifs);
        //boost::archive::text_iarchive ia(ifs);
        ia >> m;
    }

I've used the binary_oarchive and binary_iarchive here to keep the size of the serialized data down. The binary format isn't portable between platforms, but if portability is needed the text_oarchive/text_iarchive can be used instead.

answered by TinkerTank

You could use boost::serialization for that. It's heavily optimized and is pretty easy to integrate.

Possible speed-ups for your case include serializing each matrix's data as a single raw binary block (see boost::serialization::make_binary_object in <boost/serialization/binary_object.hpp>) and disabling object tracking for the type (BOOST_CLASS_TRACKING with track_never).
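
A rough sketch of what that could look like for a continuous cv::Mat (my own illustration, not code from this answer; an ROI would additionally need the row-by-row handling shown above):

#include <boost/serialization/binary_object.hpp> // boost::serialization::make_binary_object
#include <boost/serialization/tracking.hpp>      // BOOST_CLASS_TRACKING
#include <opencv2/opencv.hpp>

namespace boost {
namespace serialization {

template<class Archive>
void serialize(Archive &ar, cv::Mat& mat, const unsigned int)
{
    int cols = mat.cols, rows = mat.rows, type = mat.type();
    ar & cols & rows & type;

    if (Archive::is_loading::value)
        mat.create(rows, cols, type);

    // One raw block for the whole pixel buffer instead of element-wise writes.
    ar & make_binary_object(mat.data, mat.total() * mat.elemSize());
}

} // namespace serialization
} // namespace boost

// Skip object-tracking bookkeeping for cv::Mat; every instance is written inline.
BOOST_CLASS_TRACKING(cv::Mat, boost::serialization::track_never)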

Also, you can experiment with adding compression to your serialization routines to save space (and time, if the data compresses easily). This can be implemented with boost::iostreams, for example.
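
For illustration, here's a rough sketch of gzip compression with boost::iostreams wrapped around the binary archives. The function names saveMatCompressed/loadMatCompressed are my own; it assumes one of the cv::Mat serialization overloads from the answers above is in scope, and that you link against boost_iostreams and zlib.

#include <fstream>
#include <string>
#include <boost/archive/binary_oarchive.hpp>
#include <boost/archive/binary_iarchive.hpp>
#include <boost/iostreams/filtering_stream.hpp>
#include <boost/iostreams/filter/gzip.hpp>
#include <opencv2/opencv.hpp>

// Write a cv::Mat through a gzip filter so the serialized data is compressed on disk.
void saveMatCompressed(const cv::Mat& m, const std::string& filename)
{
    std::ofstream ofs(filename.c_str(), std::ios::binary);
    boost::iostreams::filtering_ostream out;
    out.push(boost::iostreams::gzip_compressor());
    out.push(ofs);
    boost::archive::binary_oarchive oa(out);
    oa << m;   // the filter chain compresses transparently
}              // filtering_ostream flushes the gzip footer before ofs closes

// Read it back by mirroring the chain with a gzip_decompressor.
void loadMatCompressed(cv::Mat& m, const std::string& filename)
{
    std::ifstream ifs(filename.c_str(), std::ios::binary);
    boost::iostreams::filtering_istream in;
    in.push(boost::iostreams::gzip_decompressor());
    in.push(ifs);
    boost::archive::binary_iarchive ia(in);
    ia >> m;
}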

answered by Yippie-Ki-Yay