Symmetry (Aug 2019)

MILDMS: Multiple Instance Learning via DD Constraint and Multiple Part Similarity

  • Chao Wen,
  • Zhan Li,
  • Jian Qu,
  • Qingchen Fan,
  • Aiping Li

DOI
https://doi.org/10.3390/sym11091080
Journal volume & issue
Vol. 11, no. 9
p. 1080

Abstract


As a subject area of symmetry, multiple instance learning (MIL) is a special form of weakly supervised learning in which the label is attached to the bag rather than to the instances it contains. The difficulty of MIL lies in the incomplete label information at the instance level. To address this problem, we propose a novel method that combines diverse density (DD) and multiple part similarity for multiple instance learning, named MILDMS. First, we model the optimization of target concepts with a DD function constraint over the positive and negative instance spaces, which greatly improves robustness to label noise. Next, we combine the positive and negative instances in each bag (represented by hand-crafted and convolutional neural network features) with multiple part similarities to construct an MIL kernel. We evaluate the proposed approach on the MUSK datasets; the results on MUSK1 (91.9%) and MUSK2 (92.2%) show that our method is comparable to other MIL algorithms. To further demonstrate generality, we also report results on PASCAL VOC 2007 and 2012 (46.5% and 42.2%) and on COREL (78.6%), where our method significantly outperforms state-of-the-art algorithms, including both deep and non-deep MIL methods.
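The DD constraint referenced in the abstract builds on the classic diverse density of Maron and Lozano-Pérez: a candidate target concept scores highly when it lies close to at least one instance in every positive bag and far from all instances in negative bags. A minimal numpy sketch of that standard noisy-or formulation (not the paper's full MILDMS optimization, whose constraint terms and kernel construction are given in the article itself) might look like this:

```python
import numpy as np

def diverse_density(t, positive_bags, negative_bags):
    """Noisy-or diverse density of a candidate concept t.

    t: (d,) candidate concept point in instance space.
    positive_bags / negative_bags: lists of (n_i, d) arrays of instances.
    Returns a score in [0, 1]; higher means t better explains the bag labels.
    """
    def bag_prob(bag):
        # Pr(t is the concept | bag) under the noisy-or model:
        # 1 - prod_j (1 - exp(-||x_ij - t||^2))
        d2 = np.sum((bag - t) ** 2, axis=1)
        return 1.0 - np.prod(1.0 - np.exp(-d2))

    dd = 1.0
    for bag in positive_bags:
        dd *= bag_prob(bag)          # reward proximity to every positive bag
    for bag in negative_bags:
        dd *= 1.0 - bag_prob(bag)    # penalize proximity to any negative instance
    return dd
```

A point near the shared instance of the positive bags scores close to 1, while a point sitting on a negative instance scores 0; MILDMS constrains its target-concept search with a function of this kind over both instance spaces.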

Keywords