IEEE Access (Jan 2019)

Enabling Reproducible Research in Sensor-Based Transportation Mode Recognition With the Sussex-Huawei Dataset

  • Lin Wang,
  • Hristijan Gjoreski,
  • Mathias Ciliberto,
  • Sami Mekki,
  • Stefan Valentin,
  • Daniel Roggen

DOI
https://doi.org/10.1109/ACCESS.2019.2890793
Journal volume & issue
Vol. 7
pp. 10870 – 10891

Abstract

Transportation and locomotion mode recognition from multimodal smartphone sensors is useful for providing just-in-time, context-aware assistance. However, the field is currently held back by the lack of standardized datasets, recognition tasks, and evaluation criteria. Recognition methods are often tested on ad hoc datasets acquired for one-off recognition problems and with differing choices of sensors, which prevents systematic comparative evaluation of methods within and across research groups. Our goal is to address these issues by: 1) introducing a publicly available, large-scale dataset for transportation and locomotion mode recognition from multimodal smartphone sensors; 2) suggesting 12 reference recognition scenarios, which are a superset of the tasks we identified in the related work; 3) suggesting relevant combinations of sensors to use based on energy considerations among the accelerometer, gyroscope, magnetometer, and global positioning system sensors; and 4) defining precise evaluation criteria, including training and testing sets, evaluation measures, and user-independent and sensor-placement-independent evaluations. Based on this, we report a systematic study of the relevance of statistical and frequency features, selected with information-theoretic criteria, to inform recognition systems. We then systematically report the reference performance obtained on all the identified recognition scenarios using a machine-learning recognition pipeline. The extent of this analysis and the clear definition of the recognition tasks enable future researchers to evaluate their own methods in a comparable manner, thus contributing to further advances in the field. The dataset and the code are available online at http://www.shl-dataset.org/.
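To illustrate the kind of statistical and frequency features the abstract refers to, the sketch below computes a few simple ones (mean, standard deviation, dominant frequency) over one window of accelerometer-magnitude samples. This is a minimal, stdlib-only illustration; the window length, sampling rate, and feature names here are assumptions for the example and do not reproduce the paper's actual feature set or pipeline.

```python
import math

def extract_features(window, fs):
    """Compute simple statistical and frequency-domain features from one
    window of accelerometer-magnitude samples.
    window: list of floats; fs: sampling rate in Hz.
    (Illustrative only; the paper's feature set is far larger.)"""
    n = len(window)
    mean = sum(window) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in window) / n)

    # Naive DFT magnitude spectrum over positive frequencies (DC removed),
    # used to find the dominant frequency component in the window.
    centred = [x - mean for x in window]
    mags = []
    for k in range(1, n // 2 + 1):
        re = sum(x * math.cos(2 * math.pi * k * i / n)
                 for i, x in enumerate(centred))
        im = sum(-x * math.sin(2 * math.pi * k * i / n)
                 for i, x in enumerate(centred))
        mags.append(math.hypot(re, im))
    dom_bin = max(range(len(mags)), key=lambda k: mags[k])
    dominant_freq_hz = (dom_bin + 1) * fs / n

    return {"mean": mean, "std": std, "dominant_freq_hz": dominant_freq_hz}

# Example: a 2 Hz oscillation (e.g. walking cadence) sampled at 100 Hz for 1 s
fs = 100
window = [1.0 + 0.5 * math.sin(2 * math.pi * 2 * i / fs) for i in range(fs)]
feats = extract_features(window, fs)
```

In a full pipeline, such feature vectors (computed per window, per sensor) would be ranked with an information-theoretic criterion and fed to a classifier, as the abstract describes.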

Keywords