IEEE Access (Jan 2019)
Eliciting Contact-Based and Contactless Gestures With Radar-Based Sensors
Abstract
Radar sensing technologies offer new opportunities for gesturally interacting with a smart environment by capturing microgestures via a chip embedded in a wearable device worn on the wrist or a finger, such as a smartwatch or a ring. Such microgestures are performed at a very small distance from the device, whether they are contact-based, such as on the skin, or contactless. As this category of microgestures remains largely unexplored, this paper reports the results of a gesture elicitation study conducted with twenty-five participants, who expressed their preferred user-defined gestures for interacting with a radar-based sensor on nineteen referents representing frequent Internet-of-Things tasks. The $25 \times 19=475$ initially elicited gestures were clustered into four categories of microgestures (micro, motion, combined, and hybrid) and thirty-one classes of distinct gesture types, producing a consensus set of the nineteen most preferred microgestures. In a confirmatory study, twenty new participants selected gestures from this classification for thirty referents representing tasks of various orders; they reached a high rate of agreement and did not identify any new gestures. This classification of radar-based gestures provides researchers and practitioners with a broader basis for exploring gestural interactions with radar-based sensors, such as for hand gesture recognition.
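(The "rate of agreement" mentioned above is typically quantified in gesture elicitation studies with an agreement-rate measure; the abstract does not state which formulation the authors adopted, so the widely used measure of Vatavu and Wobbrock (2015) is reproduced here only as a reference point, not as the paper's own method:

$$
\mathit{AR}(r) \;=\; \frac{|P|}{|P|-1}\sum_{P_i \subseteq P}\left(\frac{|P_i|}{|P|}\right)^{2} \;-\; \frac{1}{|P|-1},
$$

where $P$ is the multiset of gesture proposals elicited for referent $r$ and the $P_i$ are its groups of identical proposals. A value of $1$ indicates that all participants proposed the same gesture for $r$; a value of $0$ indicates no two proposals agreed.)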
Keywords