IEEE Access (Jan 2020)
A New Localization Objective for Accurate Fine-Grained Affordance Segmentation Under High-Scale Variations
Abstract
Fine-grained affordance segmentation of object parts can greatly benefit robotics and scene-understanding applications. In this work, we propose an instance-segmentation framework that accurately localizes the functionality and affordance of individual object parts. We build on the standard Mask R-CNN framework and introduce two modifications to its localization objective that can lead to improved part detection and affordance segmentation results. Specifically, we identify two problems with the conventional IOU-based regression loss: (a) small boxes, which are especially relevant for fine-grained detection, carry a higher risk of being ignored during optimization, and (b) the IOU is constant for non-overlapping candidates, so no supervision signal is available to drive the loss down. To address these limitations, we propose a novel Angular Intersection Over Larger (AIOL) measure. Our experiments show consistent improvements over baselines and state-of-the-art localization loss functions on the fine-grained affordance segmentation task.
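Limitation (b) can be seen directly for two axis-aligned boxes: once a predicted box no longer overlaps the ground truth, its IOU is 0 regardless of how far apart the boxes are, so an IOU-based loss yields a zero gradient. The PyTorch sketch below illustrates this; it is not the paper's implementation, and the (x1, y1, x2, y2) box format and the `iou` helper are assumptions made for the example.

```python
import torch

def iou(box_a, box_b):
    """IOU of two axis-aligned boxes given as (x1, y1, x2, y2) tensors."""
    inter_w = torch.clamp(torch.min(box_a[2], box_b[2]) - torch.max(box_a[0], box_b[0]), min=0)
    inter_h = torch.clamp(torch.min(box_a[3], box_b[3]) - torch.max(box_a[1], box_b[1]), min=0)
    inter = inter_w * inter_h
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

gt = torch.tensor([0.0, 0.0, 10.0, 10.0])

# A non-overlapping prediction: the IOU is 0 and stays 0 however far the box
# drifts, so the IOU-based loss provides no gradient to pull it toward gt.
pred = torch.tensor([20.0, 20.0, 30.0, 30.0], requires_grad=True)
loss = 1.0 - iou(pred, gt)
loss.backward()
print(loss.item())  # 1.0
print(pred.grad)    # tensor([0., 0., 0., 0.])
```

A small prediction far from its target thus receives the same zero localization gradient as a large one; this is the kind of situation the proposed AIOL measure is intended to address.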
Keywords