IEEE Access (Jan 2021)

Semi-Global Context Network for Semantic Correspondence

  • Ho-Jun Lee,
  • Hong Tae Choi,
  • Sung Kyu Park,
  • Ho-Hyun Park

DOI: https://doi.org/10.1109/ACCESS.2020.3046845
Journal volume & issue: Vol. 9, pp. 2496–2507

Abstract


Estimating semantic correspondence between pairs of images can be challenging due to intra-class variation, background clutter, and repetitive patterns. This paper proposes a convolutional neural network (CNN) that learns rich semantic representations containing the global semantic context, enabling semantic correspondence estimation that is robust to intra-class variation and repetitive patterns. We introduce a global context fused feature representation that efficiently exploits the global semantic context when estimating semantic correspondence, as well as a semi-global self-similarity feature that reduces the distraction caused by background clutter when capturing the global semantic context. The proposed network is trained end-to-end with a weakly supervised loss, which requires only weak supervision in the form of image-pair annotations. This weakly supervised loss is supplemented with a historical averaging loss to train the network effectively. Our approach decreases running time by a factor of more than four, reduces the training memory requirement by a factor of three, and produces competitive or superior results relative to previous approaches on the PF-PASCAL, PF-WILLOW, and TSS benchmarks.
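To make the self-similarity idea mentioned in the abstract concrete, the sketch below computes a simple local self-similarity descriptor from a CNN feature map: each spatial position is described by its cosine similarities to the positions in a surrounding window. This is only a minimal illustration of the general technique; the function name, the window size, and the exact neighborhood shape are assumptions and are not taken from the paper's semi-global formulation.

import torch
import torch.nn.functional as F

def local_self_similarity(feat: torch.Tensor, window: int = 11) -> torch.Tensor:
    """Illustrative local self-similarity descriptor (not the paper's exact method).

    feat: CNN feature map of shape (B, C, H, W).
    Returns a (B, window*window, H, W) tensor of cosine similarities between
    each position and the positions in its window x window neighborhood.
    """
    b, c, h, w = feat.shape
    feat = F.normalize(feat, dim=1)          # unit-length channel vectors
    pad = window // 2
    # Extract the window x window neighborhood around every position.
    neigh = F.unfold(feat, kernel_size=window, padding=pad)      # (B, C*win*win, H*W)
    neigh = neigh.view(b, c, window * window, h * w)
    center = feat.view(b, c, 1, h * w)
    sim = (neigh * center).sum(dim=1)        # cosine similarity per neighbor
    return sim.view(b, window * window, h, w)

# Example: descriptors for a hypothetical backbone feature map.
feats = torch.randn(2, 256, 32, 32)
desc = local_self_similarity(feats)          # (2, 121, 32, 32)

Because the descriptor encodes only relative similarities within an image, it is less sensitive to appearance differences between instances, which is the motivation the abstract gives for using self-similarity to suppress background-clutter distraction.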

Keywords