Serialization of cv::Mat objects using Boost

Update 2012/01/07: As pointed out by Paul, I updated the code fragments to include the missing template arguments.

While working on our real-time 3D reconstruction project ReconstructMe, we found that our unit tests based on single camera images and depth values did not cover all use cases. We decided to stream the complete depth and color values of many frames to disk, so we can base our tests on recorded video data. Here’s how we approached that.

Our class libraries at work are written in C++, heavily based on Boost. We’ve written all our serialization code using Boost.Serialization and that’s why we wanted to use that for serializing OpenCV cv::Mat as well.

Basic Boost.Serialization Support

The first thing you need to do in order to enable serialization support for cv::Mat objects is to provide load and save functionality for that particular type. There is an entry at Stack Overflow which served as a starting point for our implementation. It provides save and load in a generic way for cv::Mat.

We modified the code to use boost::serialization::make_array instead of the custom for-loop for performance reasons. Here is an optimized version:


// file: cvmat_serialization.h

#include <opencv2/opencv.hpp>
#include <boost/serialization/split_free.hpp>
#include <boost/serialization/vector.hpp>

BOOST_SERIALIZATION_SPLIT_FREE(::cv::Mat)
namespace boost {
  namespace serialization {

    /** Serialization support for cv::Mat */
    template<class Archive>
    void save(Archive & ar, const ::cv::Mat& m, const unsigned int version)
    {
      size_t elem_size = m.elemSize();
      size_t elem_type = m.type();

      ar & m.cols;
      ar & m.rows;
      ar & elem_size;
      ar & elem_type;

      const size_t data_size = m.cols * m.rows * elem_size;
      ar & boost::serialization::make_array(m.ptr(), data_size);
    }

    /** Serialization support for cv::Mat */
    template<class Archive>
    void load(Archive & ar, ::cv::Mat& m, const unsigned int version)
    {
      int cols, rows;
      size_t elem_size, elem_type;

      ar & cols;
      ar & rows;
      ar & elem_size;
      ar & elem_type;

      m.create(rows, cols, elem_type);

      size_t data_size = m.cols * m.rows * elem_size;
      ar & boost::serialization::make_array(m.ptr(), data_size);
    }

  }
}

To measure performance, we calculated the theoretical bandwidth for our application. Per frame we need to save colors (3 channels, 1 byte per channel) and depth values (1 channel, 2 bytes per channel), both exposed as cv::Mat objects. The resolution of the sensor is 640×480 pixels, which makes a total of 1.464 MB per frame. The sensor provides 30 frames per second, so the theoretical bandwidth is roughly 43.94 MB/s.
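
For reference, here is that calculation spelled out as a small self-contained program (a sketch only; it treats 1 MB as 1024×1024 bytes, which is how the 1.464 figure above comes about):

#include <iostream>

int main()
{
  // 640x480 pixels, 3 bytes color + 2 bytes depth per pixel
  const double frame_bytes = 640.0 * 480.0 * (3.0 * 1.0 + 1.0 * 2.0); // 1,536,000 bytes
  const double frame_mb    = frame_bytes / (1024.0 * 1024.0);         // ~1.464 MB per frame
  const double bandwidth   = frame_mb * 30.0;                         // ~43.94 MB/s at 30 FPS

  std::cout << frame_mb << " MB/frame, " << bandwidth << " MB/s" << std::endl;
  return 0;
}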

Using the code above, a recording application can be written in a few lines of code.


#include <boost/archive/binary_oarchive.hpp>
#include <boost/archive/binary_iarchive.hpp>
#include <boost/test/unit_test.hpp> // BOOST_AUTO_TEST_CASE
#include <fstream>
#include <conio.h>                  // _kbhit(), Windows-only

#include "cvmat_serialization.h"

BOOST_AUTO_TEST_CASE(record)
{

  cv::Mat depths, colors;

  std::ofstream ofs("matrices.bin", std::ios::out | std::ios::binary);

  { // use scope to ensure archive goes out of scope before stream

    boost::archive::binary_oarchive oa(ofs);

    while (!_kbhit()) {
      // ... grab from sensor into depths and colors
      oa << depths << colors;
    }

  }

  ofs.close();
}

The above code performs at roughly 29 FPS, reaching an effective bandwidth of 43.4 MB/s. The file size of the archive after 100 frames (~ 3.7 secs) has grown to 450 MB, which is slightly too big for our purposes.

Compressing Streams

We decided to add compression to reduce the resulting file size. The compression is stream-based, using zlib with a predefined window size, which integrates nicely with Boost.Iostreams. The compression-enabled recording application requires only a few modifications to the application presented above.


#include <boost/archive/binary_oarchive.hpp>
#include <boost/archive/binary_iarchive.hpp>
#include <boost/iostreams/filtering_streambuf.hpp>
#include <boost/iostreams/filter/zlib.hpp>
#include <boost/test/unit_test.hpp> // BOOST_AUTO_TEST_CASE
#include <fstream>
#include <conio.h>                  // _kbhit(), Windows-only

#include "cvmat_serialization.h"

BOOST_AUTO_TEST_CASE(record)
{
  namespace io = boost::iostreams;

  cv::Mat depths, colors;

  std::ofstream ofs("matrices.bin", std::ios::out | std::ios::binary);

  { // use scope to ensure archive and filtering stream buffer go out of scope before stream
    io::filtering_streambuf<io::output> out;
    out.push(io::zlib_compressor(io::zlib::best_speed));
    out.push(ofs);

    boost::archive::binary_oarchive oa(out);

    while (!_kbhit()) {
      // ... grab from sensor into depths and colors
      oa << depths << colors;
    }
  }

  ofs.close();
}

The above code performs at roughly 13 FPS, reaching an effective bandwidth of 20.1 MB/s. The file size of the archive after 100 frames (~ 7.6 secs) is 214 MB, which fits our needs perfectly.

Where to go from here? Compression can definitely be optimized, both in terms of speed and archive size, by exploiting temporal coherence between frames. Julius Kammerl developed a real-time point cloud compression scheme using temporal coherence. Here’s a short video about his work:

Additionally, we’d like to encode matrices (cv::imencode) before flushing them to the output stream. If CvFileStorage offers binary serialization, it’s worth a look as well.
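
As a rough illustration of the imencode idea (a sketch only, assuming a lossless PNG encoding is acceptable; the helper name save_encoded is made up for this example), one could compress each matrix into a byte buffer and serialize that buffer instead, since std::vector is already handled by boost/serialization/vector.hpp:

#include <opencv2/opencv.hpp>
#include <boost/archive/binary_oarchive.hpp>
#include <boost/serialization/vector.hpp>
#include <vector>

// Sketch: PNG-encode a matrix and stream the compressed buffer instead of the raw pixels.
void save_encoded(boost::archive::binary_oarchive &oa, const cv::Mat &m)
{
  std::vector<uchar> buf;
  cv::imencode(".png", m, buf); // lossless; handles 8-bit color and 16-bit depth images
  oa << buf;                    // std::vector serialization comes with Boost.Serialization
}

Reading it back would decode the deserialized buffer with cv::imdecode.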

Reading from the Archive

One last thing to note: since Boost.Serialization has, to my knowledge, no real streaming support (the number of objects to serialize is usually known beforehand), one needs to apply some tricks when reading the serialized archive back in. Boost.Serialization throws a boost::archive::archive_exception once you read past the end of the input stream. The following function attempts to read the next object and swallows the exception if it is due to the input stream reaching EOF.


// Try to read the next object from the archive. The stream parameter is unused here,
// but kept so the signature matches the call sites below.
template<class Archive, class Stream, class Obj>
bool try_stream_next(Archive &ar, const Stream &s, Obj &o)
{
  bool success = false;

  try {
    ar >> o;
    success = true;
  } catch (const boost::archive::archive_exception &e) {
    if (e.code != boost::archive::archive_exception::input_stream_error) {
      throw;
    }
  }

  return success;
}

Finally, a sample application to read the archive back in goes something along these lines:

#include <boost/archive/binary_oarchive.hpp>
#include <boost/archive/binary_iarchive.hpp>
#include <boost/iostreams/filtering_streambuf.hpp>
#include <boost/iostreams/filter/zlib.hpp>
#include <boost/test/unit_test.hpp> // BOOST_AUTO_TEST_CASE
#include <fstream>

#include "cvmat_serialization.h"

BOOST_AUTO_TEST_CASE(replay)
{
  namespace io = boost::iostreams;

  cv::Mat depths, colors;

  std::ifstream ifs("matrices.bin", std::ios::in | std::ios::binary);

  {
    io::filtering_streambuf<io::input> in;
    in.push(io::zlib_decompressor());
    in.push(ifs);

    boost::archive::binary_iarchive ia(in);

    bool cont = true;
    while (cont)
    {
      cont = try_stream_next(ia, ifs, depths) &&
             try_stream_next(ia, ifs, colors);
      // do something with matrices ...
    }

  }

  ifs.close();
}

On my machine the replay application runs at roughly 36 FPS, reaching an effective bandwidth of 52.8 MB/s.

I hope you found the information useful.

19 thoughts on “Serialization of cv::Mat objects using Boost”

  1. Hi Christoph,

    That helped a lot! I was fiddling around with Boost serialization and zlib compression but failed to get it right. How did you integrate zlib? Did you compile Boost with zlib support?

    sl
    Steve

  2. Hi Christoph!

    Thanks a lot for this post, it helped. Is it possible that you forgot to put io::output and io::input into the template argument of the filtering_streambuf?

    I had to do

    io::filtering_streambuf<io::input> in;

    and

    io::filtering_streambuf<io::output> out;

    to get it working in my case.
    cheers
    Paul

  3. Hi Christoph!

    Great post! It has helped us a lot.
    I was wondering if you could help us, or guide us, with serialization of a vector object. Is it possible?
    Thanks again!

    Gabriel

  4. Hi Christoph!

    Great post! It has helped us a lot.
    I was wondering if you could help us, or guide us, with serialization of a vector of cv::Mat objects. Is it possible?
    Thanks again!

    Gabriel

  5. You were right! It was giving me a headache, because I included std::vector instead of the Boost vector.

    Thank you again!

    Gabriel

  6. At 1.464 MB per frame, storing the 100 frames, say, using raw binary, should take 146.4 MB, but you say that after 100 frames, the archive has grown to 450 MB. Is the boost serialization library adding a ~3x size overhead (without compression)?

  7. Thanks so much for this! Saved me many hours of research. Just a little adjustment. I had to add <class Archive> after the template keyword in your example so it looks like this:

    template<class Archive>
    void save(Archive & ar, const ::cv::Mat& m, const unsigned int version)

    Other than that, worked like a charm.

  8. Oh, I see what’s happening. WordPress is removing the “class Archive” section after the template keyword. I guess it’s good for you to know about it. You may or may not want to escape that part of the code. Maybe brackets are needed. Not sure. Thanks a lot for it!

  9. Pingback: How to Detect and Track Object With OpenCV

  10. Great article. I hadn’t noticed make_array for my own solution.

    Only I’m using cv::Mat::step to estimate the data size.

    const size_t data_size = m.rows * m.step;

    Then step has to be included in the serialization as well. A problem with your solution might occur when padding is used (a padding-safe variant is sketched after these comments).

  11. Hey, great article, definitely.
    But I have a problem: if I serialize more than one Mat (always the same Mat), I can only deserialize/load one of them; loading the next items in the bin file does not work, there is an exception because the dimensions (rows, cols) are not right.
    During serialization, I always do this for every kind of Mat element:
    boost::archive::binary_oarchive oa(std::ostream);
    Can that be the problem? Should I refer to the binary oarchive only once?

    Regards

  12. I have a suggestion for you. You can use an XML archive with the following code. Of course a binary archive is possible too.

    /** Serialization support for cv::Mat */
    template<class Archive>
    void save(Archive & ar, const ::cv::Mat& m, const unsigned int version)
    {
      int cols = m.cols;
      int rows = m.rows;
      size_t elem_size = m.elemSize();
      size_t elem_type = m.type();
      ar & BOOST_SERIALIZATION_NVP(cols);
      ar & BOOST_SERIALIZATION_NVP(rows);
      ar & BOOST_SERIALIZATION_NVP(elem_size);
      ar & BOOST_SERIALIZATION_NVP(elem_type);
      const size_t data_size = m.cols * m.rows * elem_size;

      boost::serialization::binary_object data(m.data, data_size);
      ar & BOOST_SERIALIZATION_NVP(data);
    }

    /** Serialization support for cv::Mat */
    template<class Archive>
    void load(Archive & ar, ::cv::Mat& m, const unsigned int version)
    {
      int cols, rows;
      size_t elem_size, elem_type;
      ar & BOOST_SERIALIZATION_NVP(cols);
      ar & BOOST_SERIALIZATION_NVP(rows);
      ar & BOOST_SERIALIZATION_NVP(elem_size);
      ar & BOOST_SERIALIZATION_NVP(elem_type);
      m.create(rows, cols, elem_type);
      size_t data_size = m.cols * m.rows * elem_size;

      boost::serialization::binary_object data(m.data, data_size);
      ar & BOOST_SERIALIZATION_NVP(data);
    }

    } // namespace serialization
    } // namespace boost
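
Following up on the padding remark in comment 10: here is a minimal, padding-safe sketch of the save function that streams the matrix row by row, so the row padding of non-continuous matrices is never written to the archive (the name save_mat_rows is made up for this sketch):

/** Padding-safe save: serialize row by row so padding bytes are skipped */
template<class Archive>
void save_mat_rows(Archive & ar, const ::cv::Mat& m, const unsigned int version)
{
  size_t elem_size = m.elemSize();
  size_t elem_type = m.type();

  ar & m.cols;
  ar & m.rows;
  ar & elem_size;
  ar & elem_type;

  const size_t row_bytes = m.cols * elem_size; // payload bytes per row, without padding
  for (int r = 0; r < m.rows; ++r) {
    ar & boost::serialization::make_array(m.ptr(r), row_bytes);
  }
}

The load function shown earlier keeps working unchanged, because cv::Mat::create allocates a continuous matrix.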
