Frontiers in Artificial Intelligence (Apr 2024)

Expandable-RCNN: toward high-efficiency incremental few-shot object detection

  • Yiting Li,
  • Sichao Tian,
  • Haiyue Zhu,
  • Yeying Jin,
  • Keqing Wang,
  • Jun Ma,
  • Cheng Xiang,
  • Prahlad Vadakkepat

DOI: https://doi.org/10.3389/frai.2024.1377337
Journal volume & issue: Vol. 7

Abstract


This study addresses the challenging incremental few-shot object detection (iFSOD) problem toward online adaptive detection. iFSOD aims to learn novel categories sequentially, with detection eventually performed over all learned categories; moreover, only a few training samples are available for each sequentially arriving novel class. We propose an efficient yet simple framework, Expandable-RCNN, as a solution to the iFSOD problem, which allows new classes to be added online and sequentially with zero retraining of the base network. We achieve this by adapting Faster R-CNN to the few-shot learning scenario with two components that address overfitting and category bias. First, an IoU-aware weight imprinting strategy directly determines the classifier weights for incremental novel classes and the background class without any training, avoiding the notorious overfitting issue in few-shot learning. Second, because this zero-retraining imprinting approach can introduce undesired category bias in the classifier, we develop a bias-correction module for iFSOD, the group soft-max layer (GSL), which efficiently calibrates the imprinted classifier's biased predictions to improve classification performance on the few-shot classes while preventing catastrophic forgetting. Extensive experiments on MS-COCO show that our method significantly outperforms the state-of-the-art method ONCE by 5.9 points on commonly encountered few-shot classes.
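To make the two ideas in the abstract concrete, below is a minimal sketch, not the authors' code, of (a) imprinting a classifier weight for a novel class from a handful of RoI features and (b) a group-wise soft-max over separate class groups. It assumes a PyTorch-style detection head; the exact IoU weighting, background handling, and group partitioning used in Expandable-RCNN are not given in the abstract, so the function names and details here are illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def imprint_class_weight(roi_features, roi_ious):
    """Imprint one novel-class weight from few-shot RoI features,
    weighting each RoI by its IoU with the ground-truth box
    (an assumed form of 'IoU-aware' imprinting)."""
    w = (roi_ious.unsqueeze(1) * roi_features).sum(dim=0)  # IoU-weighted sum
    return F.normalize(w, dim=0)                           # unit-norm prototype


def group_softmax(logits, groups):
    """Apply soft-max independently within each class group so that
    imprinted novel classes do not compete directly with the
    well-trained base classes (illustrative grouping)."""
    probs = torch.zeros_like(logits)
    for idx in groups:                      # idx: LongTensor of class indices
        probs[:, idx] = F.softmax(logits[:, idx], dim=1)
    return probs


# Toy usage: imprint one 5-shot class and score query RoIs group-wise.
feat_dim, n_base = 1024, 60
novel_feats = F.normalize(torch.randn(5, feat_dim), dim=1)    # 5 support RoIs
novel_ious = torch.tensor([0.9, 0.8, 0.75, 0.6, 0.55])
w_novel = imprint_class_weight(novel_feats, novel_ious)

# Existing classifier: base classes (0..n_base-1) plus background (n_base).
classifier = F.normalize(torch.randn(n_base + 1, feat_dim), dim=1)
classifier = torch.cat([classifier, w_novel.unsqueeze(0)], dim=0)  # expand

query = F.normalize(torch.randn(8, feat_dim), dim=1)          # 8 query RoIs
logits = query @ classifier.t()                                # cosine scores
groups = [torch.arange(0, n_base),                 # base-class group
          torch.tensor([n_base, n_base + 1])]      # background + novel group
scores = group_softmax(logits, groups)
```

In this sketch, expanding the classifier is just a weight concatenation with no gradient updates, which is what enables zero retraining; the grouped normalization then keeps the (typically lower-magnitude) imprinted logits from being suppressed by the base-class logits.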
