IEEE Access (Jan 2020)
A Deep Quadruplet Network for Local Descriptor Learning
Abstract
Generating discriminative features for local patches is a fundamental task in computer vision, aimed at matching local patches across images. Recently, local patch descriptors learned by deep convolutional neural networks (CNNs) with a triplet loss have achieved promising performance. In this paper, we design a quadruplet loss that achieves better results than existing pairwise-loss and triplet-loss methods. Our loss is inspired by the idea of a uniform distribution of descriptors: it separates non-matching examples using hard-sampled non-matching pairs within a batch, while simultaneously using randomly sampled non-matching examples to keep the distances of non-matching pairs close to a uniform distribution. A compact descriptor named QuadrupletNet is obtained by combining the proposed quadruplet loss with the L2Net CNN architecture. In our experiments, QuadrupletNet outperforms triplet-loss methods on the Brown and HPatches datasets when trained on the same training set. The pre-trained QuadrupletNet is publicly available.
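To make the loss design concrete, the sketch below combines a triplet-style margin term on a hard in-batch non-matching pair with a second margin term on a randomly sampled non-matching example. The margin values and the exact form of the random-negative term are hypothetical illustrations, not the paper's formulation:

```python
import numpy as np

def quadruplet_loss(anchor, positive, hard_neg, rand_neg,
                    margin_hard=1.0, margin_rand=0.5):
    """Illustrative quadruplet loss on descriptor vectors.

    Combines a triplet-style term on the hardest in-batch non-matching
    pair with a second term on a randomly sampled non-matching example
    (margins are hypothetical, not the paper's exact values).
    """
    d = lambda x, y: float(np.linalg.norm(x - y))
    d_pos = d(anchor, positive)  # matching-pair distance
    # Push the hard negative at least `margin_hard` farther than the positive.
    hard_term = max(0.0, d_pos - d(anchor, hard_neg) + margin_hard)
    # Also push a randomly sampled negative away, nudging non-matching
    # distances toward a uniform spread rather than only the hardest pair.
    rand_term = max(0.0, d_pos - d(positive, rand_neg) + margin_rand)
    return hard_term + rand_term

# Toy 2-D descriptors: perfect match, well-separated negatives -> zero loss.
a = np.array([1.0, 0.0])
p = np.array([1.0, 0.0])
n_hard = np.array([0.0, 1.0])
n_rand = np.array([-1.0, 0.0])
print(quadruplet_loss(a, p, n_hard, n_rand))  # 0.0
```

In a real training batch the hard negative would be mined as the closest non-matching descriptor to the anchor, while the random negative is drawn uniformly from the remaining non-matching examples.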
Keywords