IEEE Access (Jan 2023)

Event Camera-Based Pupil Localization: Facilitating Training With Event-Style Translation of RGB Faces

  • Daehyun Kang,
  • Dongwoo Kang

DOI
https://doi.org/10.1109/ACCESS.2023.3343152
Journal volume & issue
Vol. 11
pp. 142304 – 142316

Abstract


This paper presents an approach to pupil tracking using event cameras. Our method comprises two primary processes: RGB-to-event image domain translation and pupil localization with event cameras. First, we convert conventional RGB images into event-like images with our adaptive StyleFlow algorithm, which generates images that closely match the characteristics and appearance of real event camera output. Next, we perform pupil localization by applying the RetinaFace algorithm, trained with our cross-modal learning strategy on a mixed dataset of RGB images and the newly translated event-like images. Evaluated on real event camera data, our approach achieves a face detection accuracy of 99.4% and a pupil alignment accuracy of 97.2%, exceeding previous deep learning-based methods trained only on conventional RGB images. These results demonstrate the promising potential of event camera-based pupil tracking and point toward future applications in vehicular systems and augmented reality heads-up displays (AR HUDs).
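As a rough illustration of the first stage, the sketch below approximates RGB-to-event translation with a thresholded log-intensity difference between consecutive frames. This is a generic event-frame simulation for intuition only, not the paper's adaptive StyleFlow model (which is a learned style translation); the function name, threshold value, and encoding are placeholders chosen for this example.

    import numpy as np

    def rgb_to_event_frame(prev_rgb, curr_rgb, threshold=0.15):
        """Approximate an event-like frame from two consecutive RGB frames.

        NOTE: simplified stand-in for illustration; the paper uses a learned
        adaptive StyleFlow translation, not this hand-crafted rule.
        Returns a uint8 image: 255 = ON event, 0 = OFF event, 128 = no event.
        """
        def log_intensity(rgb):
            # Event cameras respond to per-pixel log-brightness changes.
            gray = rgb.astype(np.float32).mean(axis=2) / 255.0
            return np.log(gray + 1e-3)

        diff = log_intensity(curr_rgb) - log_intensity(prev_rgb)

        event = np.full(diff.shape, 128, dtype=np.uint8)   # background
        event[diff >  threshold] = 255                      # ON events
        event[diff < -threshold] = 0                        # OFF events
        return event

    if __name__ == "__main__":
        # Synthetic frames stand in for a real RGB video stream.
        prev = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
        curr = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
        ev = rgb_to_event_frame(prev, curr)
        print(ev.shape, np.unique(ev))

In the paper's pipeline, event-like images produced by the translation step are pooled with the original RGB images so that a single RetinaFace-style detector is trained across both modalities; the sketch above only covers the frame-level translation idea.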

Keywords