IEEE Access (Jan 2023)

Deep Sparse Depth Completion Using Multi-Affinity Matrix

  • Wei Zhao,
  • Cheolkon Jung,
  • Jaekwang Kim

DOI
https://doi.org/10.1109/ACCESS.2023.3295133
Journal volume & issue
Vol. 11
pp. 78251–78261

Abstract

Image-guided depth completion aims to generate dense depth maps from sparse depth maps guided by their corresponding color (RGB) images. In this paper, we propose deep sparse depth completion using a multi-affinity matrix. Recently, spatial propagation networks (SPNs) have been used to refine depth maps obtained by initial depth completion. However, they reuse the same affinity matrix across multiple iterations, which limits performance and does not effectively capture the relationship between adjacent pixels. Thus, we replace it with a multi-affinity matrix that represents the relationship between an output pixel and its neighboring ones. Moreover, deep neural networks can effectively fuse features from two different modalities based on confidence maps. Inspired by dynamic guided filtering, we use a convolutional spatial propagation network (CSPN) to fuse multi-modal features at multiple stages. When training the color branch of the proposed network, we adopt supervised learning that constrains the output features of all layers in the decoder. Since spatially varying features are required for multi-modal feature fusion, the proposed network produces adaptive affinity features using a single decoder. Experimental results on the KITTI and NYU-v2 datasets show that the multi-affinity matrix captures the dependency between pixels and predicts accurate depth values by computing the weights between neighboring pixels. The proposed network based on the multi-affinity matrix achieves state-of-the-art performance in terms of root mean square error (RMSE) and mean absolute error (MAE).
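
The sketch below is not the authors' implementation; it is a minimal PyTorch illustration, under assumed shapes and a 3x3 neighborhood, of the propagation scheme the abstract describes: instead of reusing one fixed affinity matrix in every iteration, each refinement step is given its own affinity ("multi-affinity"), and valid sparse measurements are re-injected after every step. The function names and the way affinities are supplied are hypothetical.

import torch
import torch.nn.functional as F


def cspn_step(depth, affinity):
    """One CSPN-style step: each pixel becomes a weighted sum of its
    3x3 neighbors, with weights taken from the normalized affinity.

    depth:    (B, 1, H, W) current depth estimate
    affinity: (B, 9, H, W) raw affinities for the 3x3 neighborhood
    """
    # Normalize so the 9 weights at each pixel sum to 1 (abs-sum normalization).
    weights = affinity / (affinity.abs().sum(dim=1, keepdim=True) + 1e-8)
    # Gather the 3x3 neighborhood of every pixel into shape (B, 9, H, W).
    b, _, h, w = depth.shape
    neighbors = F.unfold(depth, kernel_size=3, padding=1).view(b, 9, h, w)
    # Weighted aggregation over the neighborhood.
    return (weights * neighbors).sum(dim=1, keepdim=True)


def refine(initial_depth, sparse_depth, affinities):
    """Iterative refinement with a different affinity per iteration.

    affinities: list of (B, 9, H, W) tensors, one per propagation step
                (the "multi-affinity" case; a plain SPN would reuse one).
    Pixels with valid sparse measurements are re-injected after each step.
    """
    valid = (sparse_depth > 0).float()
    depth = initial_depth
    for affinity in affinities:  # a new affinity matrix at every iteration
        depth = cspn_step(depth, affinity)
        depth = valid * sparse_depth + (1.0 - valid) * depth
    return depth

In practice the per-iteration affinities would be predicted by the network's decoder from the RGB and depth features; here they are simply passed in as a list to keep the example self-contained.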

Keywords