IEEE Access (Jan 2022)

Mobile Augmented Reality Based on Multimodal Inputs for Experiential Learning

  • Nurhazarifah Che Hashim
  • Nazatul Aini Abd Majid
  • Haslina Arshad
  • Harwati Hashim
  • Zaid Abdi Alkareem Alyasseri

DOI
https://doi.org/10.1109/ACCESS.2022.3193498
Journal volume & issue
Vol. 10
pp. 78953–78969

Abstract

Mobile devices have been harnessed as a platform for augmented reality (AR), opening up opportunities to combine multiple inputs, known as multimodal inputs. There has been little research into multimodal inputs that combine emotion, speech and markers in a mobile AR learning environment. This study proposes a framework for a mobile AR learning system that combines three multimodal inputs, namely emotion, image-based markers and speech, in order to determine how such a combination can enhance the learning experience. The proposed framework integrates the multimodal inputs using a decision tree and develops them into a four-phase learning system based on Kolb's experiential learning model. To evaluate the system, 38 primary-school students were divided into two groups for a vocabulary-learning experiment. Quantitative findings showed better results for learning effectiveness, mental load, engagement, competency and challenge when the three multimodal inputs (speech, marker and emotion) were combined. The proposed multimodal framework can therefore serve as a guideline for developing multimodal AR applications for learning environments by integrating multimodal inputs with Kolb's experiential learning model.
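The abstract does not specify the structure of the decision tree, so the sketch below is only a minimal, hypothetical illustration of how such a tree could fuse the three inputs (marker, speech and emotion) into a single AR feedback action for a vocabulary task. All names here, including MultimodalInput, decide_feedback, the emotion labels and the feedback actions, are assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MultimodalInput:
    marker_id: Optional[str]    # image-based marker detected by the camera
    speech_word: Optional[str]  # word recognized from the learner's speech
    emotion: Optional[str]      # e.g. "happy", "neutral", "confused"

def decide_feedback(inp: MultimodalInput, target_word: str) -> str:
    """Hand-written decision tree (illustrative rules only) that fuses
    the three inputs into one AR feedback action."""
    if inp.marker_id is None:
        # No marker in view: ask the learner to point the camera at a card.
        return "prompt_scan_marker"
    if inp.speech_word is None:
        # Marker found but nothing spoken yet: prompt for pronunciation.
        return "prompt_say_word"
    if inp.speech_word.lower() == target_word.lower():
        # Correct answer: adapt the response to the detected emotion.
        return "show_3d_model_and_praise" if inp.emotion == "happy" else "show_3d_model"
    # Wrong answer: a confused learner gets an extra hint before retrying.
    return "replay_audio_hint" if inp.emotion == "confused" else "ask_retry"

# Example: marker scanned, correct word spoken, learner looks happy.
print(decide_feedback(MultimodalInput("apple_card", "apple", "happy"), "apple"))
```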

Keywords