MATEC Web of Conferences (Jan 2018)
A dataset of head and eye gaze during dyadic interaction task for modeling robot gaze behavior
Abstract
This work presents a dataset of human head and eye gaze acquired with Pupil Labs gaze-tracking glasses and an OptiTrack motion capture system. The dataset contains recordings of adult subjects in a dyadic interaction task. During the experiment, each subject is asked to pick up an object and, depending on randomly assigned instructions, either to place it on the table in front of them or to hand it to the person sitting across the table. If the object is handed over, the second person takes it and places it on the table in front of them. The dataset is intended for modeling human gaze behavior during interaction with another human and for implementing this model in a robot controller for dyadic human-robot interaction.