ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences (Dec 2023)

SHIP DETECTION IN COSMO-SKYMED SAR IMAGERY USING A NOVEL CNN-BASED DETECTOR: A CASE STUDY FROM THE SUEZ CANAL

  • T. Saleh,
  • S. Holail,
  • X. Weng,
  • X. Xiao,
  • G.-S. Xia

DOI: https://doi.org/10.5194/isprs-annals-X-1-W1-2023-715-2023
Journal volume & issue: Vol. X-1-W1-2023, pp. 715–722

Abstract

The Suez Canal, strategically located as the shortest international sea route, plays a crucial role in facilitating the transportation of goods between Asia and Europe. However, traffic disruptions within the Canal pose a serious threat to global trade, as evidenced by the recent incident of the container ship Ever Given, which ran aground on March 23, 2021. This event blocked the Canal completely for six days, leaving a fleet of ships waiting to pass through. It highlights the need to monitor the Canal to prevent similar disruptions in the future. In this paper, we propose a CNN-based attention-guided self-learning framework for ship detection from 3 m high-resolution COSMO-SkyMed SAR imagery of the Egyptian Suez Canal acquired in April 2021. We introduce a self-learning augmented segmentation (SLAS) technique that augments the dataset with new ship samples by pseudo-labeling unlabeled data. We also present an Attention-guided Feature Refinement (AFR) module to extract more discriminative semantic features and contextual information, especially for ships of varying sizes in SAR images. Finally, the refined features from the AFR module are fed into a Region Proposal Network (RPN) to generate a set of proposal anchors, which a Deep Detection Network (DDN) then uses for ship classification and localization. Our experimental results demonstrate that the proposed method outperforms current state-of-the-art detection models in terms of detection accuracy, particularly in complex coastal scenes, reaching up to 87% mean average precision (mAP).
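The abstract does not specify the internals of the AFR module, but attention-guided feature refinement in detection backbones commonly follows a squeeze-and-excitation pattern: pool each channel globally, pass the result through a small bottleneck, and use a sigmoid gate to reweight the channels. A minimal numpy sketch of that generic pattern (the function name, weight shapes, and reduction ratio are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention_refine(feat, w1, w2):
    """Refine a (C, H, W) feature map with channel attention.

    Squeeze: global average pool over H, W  -> (C,)
    Excite:  bottleneck MLP with ReLU, then sigmoid gate -> (C,)
    Scale:   reweight each channel of the input feature map.
    NOTE: illustrative sketch only; not the paper's AFR module.
    """
    squeezed = feat.mean(axis=(1, 2))          # per-channel statistics, (C,)
    hidden = np.maximum(0.0, w1 @ squeezed)    # ReLU bottleneck, (C // r,)
    gate = sigmoid(w2 @ hidden)                # per-channel weights in (0, 1)
    return feat * gate[:, None, None]          # broadcast gate over H, W

# Toy example: 8 channels, 4x4 spatial grid, reduction ratio r = 2
rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 4, 4))
w1 = rng.standard_normal((4, 8)) * 0.1         # squeeze weights (C//r, C)
w2 = rng.standard_normal((8, 4)) * 0.1         # excite weights (C, C//r)
refined = channel_attention_refine(feat, w1, w2)
print(refined.shape)  # (8, 4, 4)
```

Because the gate lies in (0, 1), refinement only attenuates channels; in a detector such gated feature maps would then feed the RPN for anchor generation.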