IEEE Access (Jan 2019)

Aurora Image Search With Saliency Deep Features

  • Xi Yang
  • Nannan Wang
  • Bin Song
  • Xinbo Gao

DOI
https://doi.org/10.1109/ACCESS.2019.2917723
Journal volume & issue
Vol. 7
pp. 65996–66006

Abstract


Although convolutional neural networks have achieved impressive performance in image search, most existing methods are designed for natural images captured with normal lenses and free of anamorphic distortion. In practice, large numbers of images are collected with circular fisheye lenses to obtain a wider field of view (FOV), especially in natural science research. This paper presents a novel image search method with saliency deep features for such images, in particular the aurora images used in solar-terrestrial space research. Our method exploits the advanced Mask R-CNN framework to extract semantic features. To utilize the unique physical characteristics of aurora and focus on the most informative local regions, we present a saliency proposal network (SPN) to replace the region proposal network (RPN). In our SPN, unlike the conventional rectangular gridding scheme, the proposed anchors exhibit spherical distortion determined by the imaging principle and magnetic information. In addition, instead of being horizontally aligned, our anchor boxes are oriented perpendicular to the physical magnetic meridian, which ensures that they enclose auroral structures within minimal areas. We perform extensive experiments on a large aurora image dataset, and the results demonstrate the superiority of the proposed method over state-of-the-art methods in both search accuracy and efficiency.
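
For illustration only, the following is a minimal sketch (not the authors' released code) of how meridian-oriented, fisheye-aware anchors of the kind described in the abstract might be generated. The optical-center location, the radial approximation of the magnetic meridian direction, the base anchor sizes, and the distortion-based shrink factor are all assumptions made here for clarity; the paper's SPN derives its anchors from the actual imaging principle and magnetic information.

# Hypothetical sketch of meridian-oriented, fisheye-aware anchor generation.
# All constants and the meridian/distortion models below are assumptions.
import numpy as np

def generate_spn_anchors(img_size=440, stride=16, base_sizes=(32, 64, 128)):
    """Return anchors as rows of (cx, cy, w, h, angle), angle in radians."""
    cx0 = cy0 = img_size / 2.0            # fisheye optical center (assumed)
    fov_radius = img_size / 2.0           # circular FOV radius (assumed)
    anchors = []
    for y in np.arange(stride / 2, img_size, stride):
        for x in np.arange(stride / 2, img_size, stride):
            r = np.hypot(x - cx0, y - cy0)
            if r > fov_radius:            # skip locations outside the circular FOV
                continue
            # Approximate the magnetic meridian by the radial direction from the
            # optical center; orient each anchor perpendicular to it.
            meridian_angle = np.arctan2(y - cy0, x - cx0)
            anchor_angle = meridian_angle + np.pi / 2
            # Radial shrink factor standing in for the spherical distortion of
            # the fisheye projection: anchors get smaller toward the FOV edge.
            shrink = 0.5 + 0.5 * np.cos(0.5 * np.pi * r / fov_radius)
            for s in base_sizes:
                anchors.append((x, y, s * shrink, 0.5 * s * shrink, anchor_angle))
    return np.asarray(anchors)

if __name__ == "__main__":
    a = generate_spn_anchors()
    print(a.shape)                        # (number of anchors, 5)

Each row describes a rotated box rather than an axis-aligned one, reflecting the abstract's claim that orienting anchors perpendicular to the magnetic meridian lets them enclose auroral structures within minimal areas.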

Keywords