Remote Sensing (Oct 2024)

High-Precision Disparity Estimation for Lunar Scene Using Optimized Census Transform and Superpixel Refinement

  • Zhen Liang,
  • Hongfeng Long,
  • Zijian Zhu,
  • Zifei Cao,
  • Jinhui Yi,
  • Yuebo Ma,
  • Enhai Liu,
  • Rujin Zhao

DOI: https://doi.org/10.3390/rs16213930
Journal volume & issue: Vol. 16, No. 21, p. 3930

Abstract


High-precision lunar scene 3D data are essential for lunar exploration and the construction of scientific research stations. Currently, most existing data from orbital imagery offer resolutions of only 0.5–2 m, which is inadequate for tasks requiring centimeter-level precision. To overcome this, our research focuses on using in situ stereo vision systems for finer 3D reconstructions directly from the lunar surface. However, the scarcity and homogeneity of available lunar surface stereo datasets, combined with the Moon’s unique conditions—such as variable lighting from low albedo, sparse surface textures, and extensive shadow occlusions—pose significant challenges to the effectiveness of traditional stereo matching techniques. To address the dataset gap, we propose a method using Unreal Engine 4 (UE4) for high-fidelity physical simulation of lunar surface scenes, generating high-resolution images under realistic and challenging conditions. Additionally, we propose an optimized cost calculation method based on Census transform and color intensity fusion, along with a multi-level superpixel disparity optimization, to improve matching accuracy under harsh lunar conditions. Experimental results demonstrate that the proposed method exhibits exceptional robustness and accuracy on our soon-to-be-released multi-scene lunar dataset, effectively addressing issues related to special lighting conditions, weak textures, and shadow occlusion, ultimately enhancing disparity estimation accuracy.
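The cost-construction idea described in the abstract, a Census transform fused with a color intensity term, belongs to the general AD-Census family of matching costs. The sketch below is a minimal illustrative version of such a fused cost in Python/NumPy, not the authors' optimized formulation: the window size, fusion weights (lam_census, lam_color), grayscale conversion, and the exponential fusion rule are assumptions for illustration, and the multi-level superpixel refinement stage is not shown.

```python
import numpy as np

def census_transform(gray, window=5):
    """Census transform: for each pixel, build a bit string encoding whether each
    neighbor in a (window x window) patch is darker than the center pixel."""
    h, w = gray.shape
    r = window // 2
    bits = np.zeros((h, w), dtype=np.uint32)  # 5x5 window -> 24 neighbor bits fit in uint32
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            # neighbor[y, x] = gray[y + dy, x + dx]; out-of-bounds neighbors copy the
            # center value so their comparison bit is 0 (a simple border convention)
            neighbor = gray.copy()
            ys, ye = max(0, -dy), h - max(0, dy)
            xs, xe = max(0, -dx), w - max(0, dx)
            neighbor[ys:ye, xs:xe] = gray[ys + dy:ye + dy, xs + dx:xe + dx]
            bits = (bits << 1) | (neighbor < gray).astype(np.uint32)
    return bits

def hamming(a, b):
    """Per-pixel Hamming distance between two Census bit strings."""
    x = np.bitwise_xor(a, b)
    count = np.zeros(x.shape, dtype=np.uint8)
    for _ in range(32):
        count += (x & 1).astype(np.uint8)
        x = x >> 1
    return count

def fused_cost(left_rgb, right_rgb, d, lam_census=30.0, lam_color=10.0):
    """Matching cost at disparity d: Census Hamming distance on grayscale, fused with
    the mean absolute color difference via robust exponential weighting (AD-Census style).
    The weights lam_census and lam_color are illustrative values, not the paper's."""
    gray_l = left_rgb.astype(np.float64).mean(axis=2)
    gray_r = right_rgb.astype(np.float64).mean(axis=2)
    cl, cr = census_transform(gray_l), census_transform(gray_r)

    # Shift the right view by d pixels so that right column (x - d) aligns with left column x;
    # np.roll wraps at the border, a simplification acceptable for a sketch.
    cr_shift = np.roll(cr, d, axis=1)
    rgb_r_shift = np.roll(right_rgb.astype(np.float64), d, axis=1)

    c_census = hamming(cl, cr_shift).astype(np.float64)
    c_color = np.abs(left_rgb.astype(np.float64) - rgb_r_shift).mean(axis=2)

    # Map each term to [0, 1) before summing so neither dominates the fused cost.
    return (1.0 - np.exp(-c_census / lam_census)) + (1.0 - np.exp(-c_color / lam_color))
```

In a full pipeline, this cost would be evaluated over the disparity search range to build a cost volume, aggregated, and then refined (in the paper's case, with a multi-level superpixel optimization); a simple winner-takes-all over the cost volume already yields a coarse disparity map for testing.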

Keywords