Sensors (Feb 2024)
Gesture-Based Interactions: Integrating Accelerometer and Gyroscope Sensors in the Use of Mobile Apps
Abstract
This study investigates the feasibility and usefulness of accelerometer and gyroscope sensors for gesture-based interactions in the mobile app user experience. The core of this approach lies in introducing a dynamic and intuitive interaction model built on the device's motion sensors. An Android app was developed for this purpose and designed for use in controlled experiments. Methodologically, it serves as a stand-alone tool that captures both quantitative variables (task completion time, recorded automatically) and qualitative variables (user behavior, collected via post-task questionnaires). The app comprises a set of modules with two levels each, presented in randomized order to minimize potential learning effects, allowing users to complete both sensor-based and traditional touch-based scenarios. Preliminary results with 22 participants reveal that tasks involving sensor-based interactions tend to take longer to complete than their touch-based counterparts. Remarkably, in the post-task questionnaires many participants nonetheless rated sensor-based interactions more favorably than touch-based ones. This apparent discrepancy between objective completion times and subjective user perceptions calls for further in-depth exploration of the factors influencing user experience, including potential learning curves, cognitive load, and task complexity. This study contributes to the evolving landscape of mobile app user experience, highlighting the potential of integrating device sensors and gesture-based interactions into everyday mobile usage.
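As a rough illustration of the kind of sensor integration described in the abstract (a sketch, not the authors' actual implementation), the following Kotlin snippet shows how an Android app might register accelerometer and gyroscope listeners and automatically time a task; the class name, gesture-handling helpers, and sampling rate are hypothetical assumptions.

```kotlin
// Minimal sketch (assumed, not the study's code): register the accelerometer
// and gyroscope via Android's SensorManager and record task completion time.
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.os.SystemClock

class GestureSensorRecorder(context: Context) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private var taskStartMs = 0L

    // Called when a task level (sensor-based or touch-based) begins.
    fun startTask() {
        taskStartMs = SystemClock.elapsedRealtime()
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
        sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
    }

    // Called when the task is completed; returns elapsed time in milliseconds.
    fun stopTask(): Long {
        sensorManager.unregisterListener(this)
        return SystemClock.elapsedRealtime() - taskStartMs
    }

    override fun onSensorChanged(event: SensorEvent) {
        when (event.sensor.type) {
            // event.values holds acceleration (m/s^2) or angular speed (rad/s)
            // on the x, y, z axes; a real app would map these readings to
            // gestures such as tilting or rotating the device.
            Sensor.TYPE_ACCELEROMETER -> handleTilt(event.values)
            Sensor.TYPE_GYROSCOPE -> handleRotation(event.values)
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) { /* unused here */ }

    private fun handleTilt(values: FloatArray) { /* hypothetical gesture mapping */ }
    private fun handleRotation(values: FloatArray) { /* hypothetical gesture mapping */ }
}
```

Under these assumptions, the elapsed time returned by `stopTask()` would correspond to the automatically captured quantitative variable mentioned above, while the qualitative data would still come from the post-task questionnaires.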
Keywords