Applied Sciences (Jul 2023)

Real-Time Information Fusion System Implementation Based on ARM-Based FPGA

  • Yu-Hsiang Tsai,
  • Yung-Jhe Yan,
  • Meng-Hsin Hsiao,
  • Tzu-Yi Yu,
  • Mang Ou-Yang

DOI
https://doi.org/10.3390/app13148497
Journal volume & issue
Vol. 13, no. 14
p. 8497

Abstract

In this study, an information fusion system displayed fused information on a transparent display by considering the relationships among the display, the background exhibit, and the user’s gaze direction. We used an ARM-based field-programmable gate array (FPGA) to perform the virtual–real fusion of this system and evaluated its execution speed. The ARM-based FPGA used Intel® RealSense™ D435i depth cameras to capture depth and color images of an observer and an exhibit. The image data were received on the ARM side and fed to the FPGA side for real-time object detection. The FPGA accelerated the convolutional neural network computations used to recognize observers and exhibits. In addition, a module implemented on the FPGA was developed for rapid registration between the color and depth images. This module calculated the size and position of the information shown on the transparent display from the pixel coordinates and depth values of the human eye and the exhibit. A personal computer with an RTX 2060 GPU performed the information fusion in approximately 47 ms, whereas the ARM-based FPGA accomplished it in 25 ms; thus, the ARM-based FPGA performed the fusion roughly 1.8 times faster than the computer.
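
As a rough illustration of the last geometric step described above (placing and sizing the overlay from the eye and exhibit coordinates), the following Python sketch intersects the eye–exhibit line of sight with an assumed display plane. The pinhole intrinsics, display-plane depth, and sample coordinates are illustrative assumptions, not values from the paper, and this is not the authors' implementation.

    # Hedged sketch: where on a transparent display an overlay should appear so
    # it lines up with an exhibit as seen from the observer's eye.
    # All constants below are assumptions for illustration only.
    import numpy as np

    FX, FY, CX, CY = 615.0, 615.0, 320.0, 240.0  # assumed pinhole intrinsics (pixels)
    DISPLAY_Z = 0.60                             # assumed display plane depth (m, camera frame)

    def deproject(u, v, depth_m):
        """Pixel coordinates plus depth -> 3-D point in the camera frame (pinhole model)."""
        x = (u - CX) / FX * depth_m
        y = (v - CY) / FY * depth_m
        return np.array([x, y, depth_m])

    def fuse(eye_px, eye_depth, obj_px, obj_depth, obj_size_m):
        """Return the overlay centre on the display plane and a scale factor for its size."""
        eye = deproject(*eye_px, eye_depth)
        obj = deproject(*obj_px, obj_depth)
        # Intersect the eye->exhibit ray with the plane z = DISPLAY_Z.
        t = (DISPLAY_Z - eye[2]) / (obj[2] - eye[2])
        hit = eye + t * (obj - eye)
        # Apparent size shrinks with the ratio of eye-to-display over eye-to-exhibit distance.
        scale = np.linalg.norm(hit - eye) / np.linalg.norm(obj - eye)
        return hit[:2], obj_size_m * scale

    # Hypothetical sample values: observer 0.45 m from the camera, exhibit 1.8 m away.
    centre, size = fuse(eye_px=(300, 220), eye_depth=0.45,
                        obj_px=(360, 250), obj_depth=1.80, obj_size_m=0.30)
    print(centre, size)

In the paper this computation runs on the FPGA fabric after color–depth registration; the sketch only conveys the similar-triangles geometry behind sizing and positioning the overlay.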

Keywords