IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2013)

Multi-Modal and Multi-Temporal Data Fusion: Outcome of the 2012 GRSS Data Fusion Contest

  • Christian Berger,
  • Michael Voltersen,
  • Robert Eckardt,
  • Jonas Eberle,
  • Thomas Heyer,
  • Nesrin Salepci,
  • Sören Hese,
  • Christiane Schmullius,
  • Junyi Tao,
  • Stefan Auer,
  • Richard Bamler,
  • Ken Ewald,
  • Michael Gartley,
  • John Jacobson,
  • Alan Buswell,
  • Qian Du,
  • Fabio Pacifici

DOI: https://doi.org/10.1109/JSTARS.2013.2245860
Journal volume & issue: Vol. 6, No. 3, pp. 1324–1340

Abstract

The 2012 Data Fusion Contest organized by the Data Fusion Technical Committee (DFTC) of the IEEE Geoscience and Remote Sensing Society (GRSS) aimed at investigating the potential of multi-modal/multi-temporal fusion of very high spatial resolution (VHR) imagery. Three different types of data sets, including spaceborne multi-spectral, spaceborne synthetic aperture radar (SAR), and airborne light detection and ranging (LiDAR) data collected over the downtown San Francisco area, were distributed during the Contest. This paper highlights the three awarded research contributions, which investigate (i) a new metric to assess urban density (UD) from multi-spectral and LiDAR data, (ii) simulation-based techniques to jointly use SAR and LiDAR data for image interpretation and change detection, and (iii) radiosity methods to improve surface reflectance retrievals of optical data in complex illumination environments. In particular, they demonstrate the usefulness of LiDAR data when fused with optical or SAR data. We believe these investigations will stimulate further research in these areas.
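
The awarded urban density metric itself is defined in the paper; purely as an illustration of the kind of optical/LiDAR fusion described above, the sketch below computes a hypothetical per-cell built-up score from a vegetation index and a LiDAR-derived normalized digital surface model (nDSM). The function name, thresholds, and the 30 m height cap are assumptions for illustration only, not the contest winners' formulation.

```python
# Minimal sketch (not the authors' metric): a grid-cell "urban density" proxy
# fusing a multi-spectral vegetation index with a LiDAR nDSM.
import numpy as np

def urban_density_proxy(red, nir, ndsm, cell=50, ndvi_thresh=0.3, height_thresh=2.0):
    """Return a coarse per-cell urban density score in [0, 1].

    red, nir : 2-D reflectance bands from the multi-spectral image
    ndsm     : 2-D LiDAR nDSM (height above ground, metres), co-registered
    cell     : aggregation window size in pixels (assumed square cells)
    """
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
    non_vegetated = ndvi < ndvi_thresh          # likely built-up or bare surfaces
    elevated = ndsm > height_thresh             # likely buildings rather than ground

    h, w = ndvi.shape
    rows, cols = h // cell, w // cell
    density = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            win = np.s_[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            built_fraction = np.mean(non_vegetated[win] & elevated[win])
            mean_height = np.mean(ndsm[win][elevated[win]]) if elevated[win].any() else 0.0
            # Combine horizontal coverage and vertical extent; the 30 m cap is arbitrary.
            density[i, j] = built_fraction * min(mean_height / 30.0, 1.0)
    return density
```

Horizontal coverage and mean building height are combined multiplicatively so that tall, densely built cells score highest; an operational UD metric, like the one presented in the paper, would be calibrated against reference data rather than fixed ad hoc thresholds.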

Keywords