Data in Brief (Apr 2021)

HANDS: an RGB-D dataset of static hand-gestures for human-robot interaction

  • Cristina Nuzzi,
  • Simone Pasinetti,
  • Roberto Pagani,
  • Gabriele Coffetti,
  • Giovanna Sansoni

Journal volume & issue: Vol. 35, p. 106791

Abstract

The HANDS dataset has been created for human-robot interaction research and is composed of spatially and temporally aligned RGB and Depth frames. It contains 12 static single-hand gestures, performed with both the right and the left hand, and 3 static two-hands gestures, for a total of 29 unique classes. Five actors (two females and three males) were recorded performing the gestures, each against a different background and under different lighting conditions. For each actor, 150 RGB frames and their corresponding 150 Depth frames were collected per gesture, for a total of 2400 RGB frames and 2400 Depth frames per actor.

Data has been collected using a Kinect v2 camera, intrinsically calibrated to spatially align the RGB data to the Depth data. The temporal alignment has been performed offline using MATLAB, pairing frames with a maximum temporal distance of 66 ms.

This dataset has been used in [1] and is freely available at http://dx.doi.org/10.17632/ndrczc35bt.1.
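For illustration, the offline temporal alignment described above can be approximated as a nearest-timestamp matching step between the RGB and Depth streams, keeping only pairs closer than 66 ms. The sketch below is a minimal Python approximation under that assumption (the authors used MATLAB, and their exact procedure is not given here); the function name and the example timestamps are hypothetical.

    # Illustrative sketch (not the authors' MATLAB code): pair each RGB frame
    # with the nearest Depth frame by timestamp, keeping pairs within 66 ms.
    from bisect import bisect_left

    MAX_GAP_S = 0.066  # maximum allowed temporal distance (66 ms)

    def align_frames(rgb_times, depth_times):
        """Return (rgb_index, depth_index) pairs whose timestamps differ by <= 66 ms.

        rgb_times, depth_times: timestamps in seconds, sorted ascending.
        """
        pairs = []
        for i, t in enumerate(rgb_times):
            j = bisect_left(depth_times, t)
            # Candidate depth frames: the one just before and just after t.
            candidates = [k for k in (j - 1, j) if 0 <= k < len(depth_times)]
            if not candidates:
                continue
            best = min(candidates, key=lambda k: abs(depth_times[k] - t))
            if abs(depth_times[best] - t) <= MAX_GAP_S:
                pairs.append((i, best))
        return pairs

    # Example with hypothetical timestamps (seconds):
    rgb_t = [0.000, 0.033, 0.066, 0.100]
    depth_t = [0.010, 0.050, 0.180]
    print(align_frames(rgb_t, depth_t))  # [(0, 0), (1, 1), (2, 1), (3, 1)]

In this sketch an RGB frame is simply dropped when no Depth frame falls within the 66 ms window; whether the original MATLAB procedure handled such frames this way is an assumption.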

Keywords