Sensors (Nov 2018)

EmoTour: Estimating Emotion and Satisfaction of Users Based on Behavioral Cues and Audiovisual Data

  • Yuki Matsuda,
  • Dmitrii Fedotov,
  • Yuta Takahashi,
  • Yutaka Arakawa,
  • Keiichi Yasumoto,
  • Wolfgang Minker

DOI
https://doi.org/10.3390/s18113978
Journal volume & issue
Vol. 18, no. 11
p. 3978

Abstract

With the spread of smart devices, people can obtain a variety of information about their surrounding environment thanks to sensing technologies. To design more context-aware systems, the psychological context of users (e.g., emotional status) is a substantial factor for providing useful information at an appropriate time. As a typical use case that has a high demand for context awareness but has not yet been widely addressed, we focus on the tourism domain. In this study, we aim to estimate the emotional status and satisfaction level of tourists during sightseeing from their unconscious, natural actions. As tourist actions, behavioral cues (eye and head/body movement) and audiovisual data (facial/vocal expressions) were collected during sightseeing with an eye-gaze tracker, physical-activity sensors, and a smartphone. We then derived high-level features, e.g., head tilt and footsteps, from the behavioral cues. We also used existing databases of emotionally rich interactions to train emotion-recognition models and applied them in a cross-corpus fashion to generate emotional-state predictions for the audiovisual data. Finally, the features from the several modalities were fused to estimate the emotion of tourists during sightseeing. To evaluate our system, we conducted experiments with 22 tourists in two touristic areas located in Germany and Japan. The results confirmed the feasibility of estimating both the emotional status and satisfaction level of tourists. In addition, we found that the features effective for emotion and satisfaction estimation differ among tourists with different cultural backgrounds.
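The abstract describes fusing features from several modalities (behavioral cues and audiovisual predictions) into a single emotion estimate. As a rough illustration of one common approach to such fusion, the following is a minimal sketch of decision-level (late) fusion, assuming each per-modality model outputs class probabilities over a shared label set. The label set, modality names, and weights here are hypothetical and not taken from the paper.

```python
import numpy as np

# Illustrative emotion labels; the paper's actual label set may differ.
EMOTIONS = ["negative", "neutral", "positive"]

def fuse_predictions(modality_probs, weights=None):
    """Fuse per-modality class-probability vectors by a normalized
    weighted average and return the winning label plus fused vector."""
    probs = np.asarray(modality_probs, dtype=float)  # (n_modalities, n_classes)
    if weights is None:
        weights = np.ones(len(probs))
    w = np.asarray(weights, dtype=float)
    fused = (w[:, None] * probs).sum(axis=0) / w.sum()
    return EMOTIONS[int(fused.argmax())], fused

# Hypothetical outputs from three modality-specific models.
label, fused = fuse_predictions(
    [[0.2, 0.5, 0.3],   # e.g., behavioral-cue model
     [0.1, 0.3, 0.6],   # e.g., audio model
     [0.2, 0.2, 0.6]],  # e.g., visual model
    weights=[1.0, 1.0, 1.0],
)
print(label)  # "positive" for these example inputs
```

Feature-level (early) fusion, where raw feature vectors are concatenated before classification, is an equally plausible design; the abstract alone does not specify which variant the authors used.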

Keywords