Systems (Dec 2023)

Enhancing Multi-Modal Perception and Interaction: An Augmented Reality Visualization System for Complex Decision Making

  • Liru Chen,
  • Hantao Zhao,
  • Chenhui Shi,
  • Youbo Wu,
  • Xuewen Yu,
  • Wenze Ren,
  • Ziyi Zhang,
  • Xiaomeng Shi

DOI
https://doi.org/10.3390/systems12010007
Journal volume & issue
Vol. 12, no. 1
p. 7

Abstract

Visualization systems play a crucial role in industry, education, and research by offering valuable insights and enhancing decision making. These systems represent complex workflows and data in a visually intuitive manner, facilitating better understanding, analysis, and communication of information. This paper explores the potential of augmented reality (AR) visualization systems to enhance multi-modal perception and interaction for complex decision making. The proposed system combines the physicality and intuitiveness of the real world with the immersive and interactive capabilities of AR. By integrating physical objects and virtual elements, the system lets users engage in natural, intuitive interactions that leverage multiple sensory modalities. Specifically, it incorporates vision, touch, eye-tracking, and sound as multi-modal interaction methods to further improve the user experience. This multi-modal nature allows users to perceive and interact in a more holistic and immersive manner. The software and hardware engineering of the proposed system is elaborated in detail, and the system’s architecture and preliminary function-testing results are also presented. The findings aim to help visualization system designers, researchers, and practitioners explore and harness the capabilities of this integrated approach, ultimately leading to more engaging and immersive user experiences across application domains.

Keywords