IEEE Access (Jan 2024)

GFHANet: Global Feature Hybrid Attention Network for Salient Object Detection in Side-Scan Sonar Images

  • Shen-Ao Yuan,
  • Zhen Wang,
  • Fu-Lin He,
  • Shan-Wen Zhang,
  • Zheng-Yang Zhao

DOI
https://doi.org/10.1109/ACCESS.2024.3463804
Journal volume & issue
Vol. 12
pp. 155943–155957

Abstract


With the wide application of deep learning to image processing, salient object detection (SOD) in underwater sonar images has become an important research topic. However, owing to interference from complex underwater environments and seabed reverberation noise, existing deep learning methods suffer from insufficient feature representation in the SOD task for side-scan sonar (SSS) images. During down-sampling, these methods may lose the relationship between the global and local information of the sonar image, which degrades detection of object contour structure. To address this issue, we propose a novel Global Feature Hybrid Attention Network (GFHANet) designed specifically for SOD in SSS images. We construct a dual-encoder structure based on global feature extraction and a self-attention mechanism to capture the global structural information of sonar images. An adaptive hybrid attention mechanism (AHAM) reconstructs channel and spatial features effectively, and a global enhancement module (GEM) fuses the global and spatial features from the dual encoder to compensate for information loss. Comparative experiments on an underwater sonar dataset against 10 state-of-the-art methods show that GFHANet achieves superior performance in underwater sonar salient object detection, with a mean absolute error (MAE) of 2.35% and a structural similarity (SM) of 74.25%. The source code and models are available at https://github.com/darkseid-arch/GFHANet-4SOD.
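To illustrate the kind of channel-then-spatial feature gating the abstract describes, here is a minimal sketch in NumPy. The paper's exact AHAM formulation is not given on this page, so the gating below is a generic CBAM-style assumption (global average pooling for the channel gate, channel-wise mean for the spatial gate), not the authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hybrid_attention(feat):
    """Apply channel attention, then spatial attention, to a (C, H, W)
    feature map. Illustrative only; the learned projections of a real
    attention module are omitted for brevity."""
    # Channel gate: global average pooling gives one descriptor per channel.
    ch_desc = feat.mean(axis=(1, 2))          # shape (C,)
    ch_gate = sigmoid(ch_desc)                # per-channel weight in (0, 1)
    feat = feat * ch_gate[:, None, None]
    # Spatial gate: channel-wise mean gives one descriptor per pixel.
    sp_desc = feat.mean(axis=0)               # shape (H, W)
    sp_gate = sigmoid(sp_desc)                # per-pixel weight in (0, 1)
    return feat * sp_gate[None, :, :]

x = np.random.rand(8, 16, 16).astype(np.float32)  # toy feature map
y = hybrid_attention(x)
```

Because both gates lie in (0, 1), the output is an elementwise reweighting of the input with the same shape, which is what lets such a module be dropped between encoder stages without changing tensor dimensions.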

Keywords