Sensors (Jun 2024)

Depth-Guided Bilateral Grid Feature Fusion Network for Dehazing

  • Xinyu Li,
  • Zhi Qiao,
  • Gang Wan,
  • Sisi Zhu,
  • Zhongxin Zhao,
  • Xinnan Fan,
  • Pengfei Shi,
  • Jin Wan

DOI
https://doi.org/10.3390/s24113589
Journal volume & issue
Vol. 24, no. 11
p. 3589

Abstract


In adverse foggy weather conditions, captured images are degraded by natural environmental factors, resulting in reduced contrast and diminished visibility. Traditional image dehazing methods typically rely on prior knowledge, but their efficacy diminishes in practical, complex environments. Deep learning methods have shown promise in single-image dehazing tasks but often struggle to fully exploit depth and edge information, leading to blurred edges and incomplete dehazing. To address these challenges, this paper proposes a depth-guided bilateral grid feature fusion dehazing network. The network extracts depth information through a dedicated module, derives bilateral grid features via a U-Net, uses the depth information to guide the sampling of the bilateral grid features, reconstructs the features with a dedicated module, and finally estimates the dehazed image through two convolutional layers and a residual connection with the original image. Experimental results on public datasets demonstrate the effectiveness of the proposed method, which successfully removes fog while preserving image details.
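
The abstract outlines a pipeline of depth estimation, bilateral grid prediction, depth-guided grid sampling, feature reconstruction, and residual refinement. The following PyTorch sketch illustrates how such a pipeline could be wired together; the depth module, the stand-in for the U-Net backbone, the grid resolution, and all layer sizes are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a depth-guided bilateral grid dehazing pipeline (assumed shapes/sizes).
import torch
import torch.nn as nn
import torch.nn.functional as F


class DepthGuidedDehazer(nn.Module):
    def __init__(self, grid_depth=8, grid_feats=16):
        super().__init__()
        # Placeholder depth-estimation module: predicts a 1-channel depth map in [0, 1].
        self.depth_net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1), nn.Sigmoid())
        # Placeholder for the U-Net: predicts a bilateral grid with grid_depth bins,
        # each holding grid_feats feature channels, at 1/8 spatial resolution.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=8, padding=1), nn.ReLU(),
            nn.Conv2d(64, grid_depth * grid_feats, 3, padding=1))
        self.grid_depth, self.grid_feats = grid_depth, grid_feats
        # Feature reconstruction followed by two conv layers estimating a residual.
        self.reconstruct = nn.Sequential(
            nn.Conv2d(grid_feats, 32, 3, padding=1), nn.ReLU())
        self.head = nn.Sequential(
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1))

    def forward(self, hazy):
        b, _, h, w = hazy.shape
        depth = self.depth_net(hazy)                       # (B, 1, H, W)
        grid = self.backbone(hazy)                         # (B, D*C, H/8, W/8)
        grid = grid.view(b, self.grid_feats, self.grid_depth,
                         grid.shape[-2], grid.shape[-1])   # (B, C, D, h', w')
        # Depth-guided slicing: sample the 3D grid at (x, y, depth) for each pixel.
        ys, xs = torch.meshgrid(
            torch.linspace(-1, 1, h, device=hazy.device),
            torch.linspace(-1, 1, w, device=hazy.device), indexing="ij")
        xs = xs.expand(b, h, w)
        ys = ys.expand(b, h, w)
        zs = depth.squeeze(1) * 2 - 1                      # map depth to [-1, 1]
        coords = torch.stack([xs, ys, zs], dim=-1).unsqueeze(1)   # (B, 1, H, W, 3)
        sliced = F.grid_sample(grid, coords, align_corners=True)  # (B, C, 1, H, W)
        feats = self.reconstruct(sliced.squeeze(2))
        # Residual connection with the original hazy image.
        return hazy + self.head(feats)


if __name__ == "__main__":
    model = DepthGuidedDehazer()
    out = model(torch.rand(1, 3, 64, 64))
    print(out.shape)  # torch.Size([1, 3, 64, 64])
```

The key step is the depth-guided slice: instead of using pixel luminance as the guidance map (as in classic bilateral grid upsampling), the predicted depth selects which grid bin each pixel samples, which is one plausible reading of "depth information to guide the sampling of bilateral grid features" in the abstract.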

Keywords