Sensors (May 2015)

Multi-Scale Pixel-Based Image Fusion Using Multivariate Empirical Mode Decomposition

  • Naveed ur Rehman,
  • Shoaib Ehsan,
  • Syed Muhammad Umer Abdullah,
  • Muhammad Jehanzaib Akhtar,
  • Danilo P. Mandic,
  • Klaus D. McDonald-Maier

DOI
https://doi.org/10.3390/s150510923
Journal volume & issue
Vol. 15, no. 5
pp. 10923 – 10947

Abstract

A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding the input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized, respectively, by a single intrinsic mode function (IMF) containing multiple scales and by same-indexed IMFs from different input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales across multiple channels, thus enabling their comparison at the pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including principal component analysis (PCA), the discrete wavelet transform (DWT) and the non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically significant performance differences.
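To illustrate the fusion step the abstract describes, the sketch below assumes the MEMD decomposition has already been computed, so that the k-th IMF of every input image carries the same frequency scale (the alignment MEMD provides). The per-pixel "largest absolute response wins" selection rule used here is a common illustrative choice, not necessarily the exact rule the paper employs; the function name `fuse_aligned_imfs` and the array layout are likewise assumptions for this sketch.

```python
import numpy as np

def fuse_aligned_imfs(imfs_per_image):
    """Fuse images from scale-aligned IMFs (illustrative sketch).

    imfs_per_image: array of shape (n_images, n_scales, H, W).
    Because MEMD aligns scales across channels, IMFs with the same
    index are directly comparable pixel by pixel. At each scale, the
    pixel from the image with the strongest absolute IMF response is
    kept; summing the fused scales reconstructs the fused image.
    """
    imfs = np.asarray(imfs_per_image, dtype=float)
    # For each scale and pixel, find which image responds strongest.
    winner = np.argmax(np.abs(imfs), axis=0)            # (n_scales, H, W)
    # Gather the winning pixel values at every scale.
    fused_scales = np.take_along_axis(imfs, winner[None], axis=0)[0]
    # Sum over scales to obtain the fused image.
    return fused_scales.sum(axis=0)                     # (H, W)
```

With two images and two scales, if image 0 dominates the first scale and image 1 dominates the second, the fused result is the sum of those two winning layers, pixel by pixel.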

Keywords