Applied Sciences (Nov 2021)

TDCMR: Triplet-Based Deep Cross-Modal Retrieval for Geo-Multimedia Data

  • Jiagang Song,
  • Yunwu Lin,
  • Jiayu Song,
  • Weiren Yu,
  • Leyuan Zhang

DOI
https://doi.org/10.3390/app112210803
Journal volume & issue
Vol. 11, no. 22
p. 10803

Abstract


Massive multimedia data with geographical information (geo-multimedia) are collected and stored on the Internet due to the wide application of location-based services (LBS). Discovering the high-level semantic relationships between geo-multimedia data and constructing an efficient index are crucial for large-scale geo-multimedia retrieval. To address this challenge, this paper proposes a deep cross-modal hashing framework for geo-multimedia retrieval, termed Triplet-based Deep Cross-Modal Retrieval (TDCMR), which utilizes deep neural networks and an enhanced triplet constraint to capture high-level semantics. In addition, a novel hybrid index, called TH-Quadtree, is developed by combining cross-modal binary hash codes with a quadtree to support high-performance search. Extensive experiments on three commonly used benchmarks show the superior performance of the proposed method.
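The hybrid-index idea in the abstract — spatial filtering with a quadtree, then ranking candidates by Hamming distance between binary hash codes — can be sketched as follows. This is a minimal, hypothetical illustration of the general technique, not the paper's TH-Quadtree implementation; the node capacity, code width, and all names are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

CAPACITY = 4  # illustrative max items per leaf before subdividing

# Each stored item: (x, y, binary hash code as an int, object id)
Item = Tuple[float, float, int, str]

@dataclass
class Node:
    x0: float; y0: float; x1: float; y1: float
    items: List[Item] = field(default_factory=list)
    children: Optional[List["Node"]] = None

def hamming(a: int, b: int) -> int:
    """Hamming distance between two binary hash codes."""
    return bin(a ^ b).count("1")

def _child(n: Node, x: float, y: float) -> Node:
    """Pick the quadrant child containing point (x, y)."""
    mx, my = (n.x0 + n.x1) / 2, (n.y0 + n.y1) / 2
    return n.children[(2 if y >= my else 0) + (1 if x >= mx else 0)]

def insert(n: Node, x: float, y: float, code: int, ident: str) -> None:
    if n.children is not None:
        insert(_child(n, x, y), x, y, code, ident)
        return
    n.items.append((x, y, code, ident))
    if len(n.items) > CAPACITY and n.x1 - n.x0 > 1e-9:
        mx, my = (n.x0 + n.x1) / 2, (n.y0 + n.y1) / 2
        n.children = [Node(n.x0, n.y0, mx, my), Node(mx, n.y0, n.x1, my),
                      Node(n.x0, my, mx, n.y1), Node(mx, my, n.x1, n.y1)]
        pending, n.items = n.items, []
        for px, py, c, i in pending:  # redistribute into children
            insert(n, px, py, c, i)

def query(n: Node, rect: Tuple[float, float, float, float],
          qcode: int, k: int) -> List[Tuple[int, str]]:
    """Spatially filter by rect, then rank survivors by Hamming distance."""
    rx0, ry0, rx1, ry1 = rect
    if n.x1 < rx0 or n.x0 > rx1 or n.y1 < ry0 or n.y0 > ry1:
        return []  # node region disjoint from query window
    out: List[Tuple[int, str]] = []
    if n.children is not None:
        for c in n.children:
            out.extend(query(c, rect, qcode, k))
    else:
        out = [(hamming(code, qcode), ident)
               for x, y, code, ident in n.items
               if rx0 <= x <= rx1 and ry0 <= y <= ry1]
    return sorted(out)[:k]

# Usage sketch with toy 8-bit codes
root = Node(0.0, 0.0, 100.0, 100.0)
insert(root, 10, 10, 0b10101010, "img1")
insert(root, 12, 11, 0b10101011, "txt1")
insert(root, 90, 90, 0b01010101, "img2")
top = query(root, (0, 0, 50, 50), 0b10101010, 2)
```

The spatial predicate prunes whole subtrees cheaply, so the (comparatively expensive) Hamming ranking only touches candidates inside the query window.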

Keywords