IEEE Access (Jan 2024)

An Automated Mechanism for Retrospective Event Discovery via Visual Query Responses

  • Pranita P. Deshmukh,
  • Shanmugam Poonkuntran

DOI
https://doi.org/10.1109/ACCESS.2024.3419153
Journal volume & issue
Vol. 12
pp. 142320 – 142330

Abstract


Consider the modern smartphone user who, while traveling or attending an occasion, may snap hundreds of pictures. Over a few years, this can grow into a collection of tens of thousands of photos and extensive video footage, all capturing precious memories of past events such as workshops, short-term training programs, faculty development programs, conferences, annual social gatherings, and more. People often turn to these personal photos and videos to recollect fragments of their memories of those events. The real challenge, however, lies in pinpointing the specific photos or videos related to a particular event within such a vast collection. This paper introduces a novel system that tackles this problem by enabling automatic past-event search, leveraging a visual query response dataset. The proposed framework, Event-based Focal Visual-Content Text Attention (EFVCTA), forms the foundation of the system, which performs automatic past-event search using visual query response (VQA) and can extract information about past events from photos of training programs, workshops, conferences, annual social gatherings, and other activities conducted in academic institutions. The paper also presents an evaluation of the proposed EFVCTA-based system. To our knowledge, no such system currently exists for extracting information about past events from photos of training programs.
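The abstract does not detail the EFVCTA architecture, but the general pattern it names (text-guided attention over visual content, used to rank photos against an event query) can be sketched. Below is a minimal, hypothetical Python sketch assuming each photo is represented by per-region feature vectors from some visual encoder and the query by a text embedding; the function names (attention_scores, score_photo, search_events) and the random stand-in features are illustrative only and are not taken from the paper.

```python
import numpy as np

def attention_scores(query_vec, region_feats):
    """Softmax attention of the text query over a photo's region features."""
    logits = region_feats @ query_vec          # (num_regions,)
    logits -= logits.max()                     # numerical stability
    weights = np.exp(logits)
    return weights / weights.sum()

def score_photo(query_vec, region_feats):
    """Score a photo by cosine similarity between the query and the
    attention-pooled visual representation of the photo."""
    weights = attention_scores(query_vec, region_feats)
    pooled = weights @ region_feats            # (dim,)
    denom = np.linalg.norm(pooled) * np.linalg.norm(query_vec) + 1e-8
    return float(pooled @ query_vec / denom)

def search_events(query_vec, photo_library, top_k=5):
    """Rank photos in the library by relevance to the event query."""
    ranked = sorted(photo_library.items(),
                    key=lambda kv: score_photo(query_vec, kv[1]),
                    reverse=True)
    return ranked[:top_k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim, regions = 64, 9
    # Hypothetical library: photo id -> per-region feature matrix,
    # as produced by some visual encoder (not specified in the abstract).
    library = {f"photo_{i:04d}": rng.standard_normal((regions, dim))
               for i in range(100)}
    query = rng.standard_normal(dim)  # stand-in for an encoded text query
    for photo_id, _ in search_events(query, library):
        print(photo_id)
```

In this toy setup the attention step mimics the "focal" idea of weighting the image regions most relevant to the query before scoring; the actual model presumably learns these encoders and attention weights from the visual query response dataset.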

Keywords