Journal of Taiyuan University of Technology (Taiyuan Ligong Daxue Xuebao), Sep 2023

Dust Image Depth Prediction Based on Feature Sparsity

  • Huimin JIA,
  • Yuanyu WANG

DOI
https://doi.org/10.16355/j.tyut.1007-9432.2023.05.013
Journal volume & issue
Vol. 54, no. 5
pp. 853 – 860

Abstract


Purposes To address the low accuracy of single-image depth prediction in dusty environments, a dust-image depth prediction network based on sparse input features is proposed.

Methods First, exploiting the relationship between the direct transmission rate of a dust image and its depth information, a depth prediction network is designed to obtain an initial depth map. Using the color attenuation prior, sparse depth features of the dust image are then extracted from this estimated depth map. The sparse depth features and the dust image together serve as the two inputs to the depth prediction network, which follows an encoder-decoder framework in which the two inputs are encoded by separate networks. The dust image is encoded by a residual network, which mitigates gradient vanishing and accuracy degradation in deep networks; this design preserves accuracy while keeping inference speed under control. The sparse depth features are encoded by a sparse convolutional network with a fused channel attention mechanism, which assigns larger weights to effective feature maps and smaller weights to invalid or less effective ones. The resulting outputs are channel-fused and fed into the decoder. The decoder combines deconvolution with multi-scale upsampling in its upsampling module and uses bicubic upsampling to better reconstruct dense depth information. The minimum absolute value (L1) loss and the structural similarity (SSIM) loss are adopted as edge-preserving loss functions.

Conclusions Experimental results on the NYU-Depth-v2 dataset show that the method effectively predicts the depth information of dust images: the average relative error is reduced to 0.054, the root mean square error is reduced to 0.610, and the accuracy at δ<1.25 reaches 0.967.
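The edge-preserving loss described above combines an L1 (minimum absolute value) term with an SSIM term. A minimal NumPy sketch of such a combined loss is shown below; the global (non-windowed) SSIM, the constants `c1`/`c2`, and the weighting `alpha` are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def l1_loss(pred, target):
    # Mean absolute error, i.e. the minimum absolute value loss.
    return np.mean(np.abs(pred - target))

def ssim(x, y, c1=0.01 ** 2, c2=0.03 ** 2):
    # Global SSIM over the whole depth map (a simplification; practical
    # implementations typically compute SSIM over local windows).
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )

def edge_preserving_loss(pred, target, alpha=0.85):
    # alpha is an assumed mixing weight between the SSIM and L1 terms.
    # (1 - SSIM)/2 maps similarity in [-1, 1] to a loss in [0, 1].
    return alpha * (1.0 - ssim(pred, target)) / 2.0 + (1.0 - alpha) * l1_loss(pred, target)
```

For identical prediction and target maps the SSIM term equals 1 and the L1 term equals 0, so the combined loss is 0; any deviation in structure or absolute depth values increases the loss.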

Keywords