Sensors (Jun 2016)

On Curating Multimodal Sensory Data for Health and Wellness Platforms

  • Muhammad Bilal Amin,
  • Oresti Banos,
  • Wajahat Ali Khan,
  • Hafiz Syed Muhammad Bilal,
  • Jinhyuk Gong,
  • Dinh-Mao Bui,
  • Soung Ho Cho,
  • Shujaat Hussain,
  • Taqdir Ali,
  • Usman Akhtar,
  • Tae Choong Chung,
  • Sungyoung Lee

DOI: https://doi.org/10.3390/s16070980
Journal volume & issue: Vol. 16, no. 7, p. 980

Abstract

In recent years, the focus of healthcare and wellness technologies has shifted markedly towards personal vital-signs devices. The technology has evolved from smartphone-based wellness applications to fitness bands and smartwatches. The novelty of these devices lies in accumulating activity data as their users go about their daily routines. However, these implementations are device-specific and cannot incorporate multimodal data sources. The data they accumulate does not offer contextual information rich enough to provide a holistic view of a user’s lifelog; as a result, decisions and recommendations based on this data are one-dimensional. In this paper, we present our Data Curation Framework (DCF), which is device-independent and accumulates a user’s sensory data from multimodal data sources in real time. DCF curates the context of this accumulated data over the user’s lifelog and provides rule-based anomaly detection over this context-rich lifelog in real time. To provide computation and persistence over the large volume of sensory data, DCF utilizes the distributed and ubiquitous environment of the cloud platform. DCF has been evaluated for its performance, correctness, ability to detect complex anomalies, and management support for a large volume of sensory data.
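
To make the idea of rule-based anomaly detection over multimodal lifelog data more concrete, the following is a minimal illustrative sketch, not the authors' DCF implementation: the Reading layout, the rule names, and the thresholds are all hypothetical, and a real deployment would run such rules over a streaming, cloud-backed lifelog rather than an in-memory list.

    # Toy sketch of rule-based anomaly detection over multimodal sensor readings.
    # All names, thresholds, and data layouts here are hypothetical.
    from dataclasses import dataclass
    from typing import Callable, List, Optional, Tuple

    @dataclass
    class Reading:
        timestamp: float   # seconds since epoch
        heart_rate: int    # beats per minute (wearable modality)
        step_count: int    # steps in the last minute (activity modality)

    # A rule inspects the current reading plus recent history and may return an anomaly label.
    Rule = Callable[[Reading, List[Reading]], Optional[str]]

    def high_hr_while_sedentary(current: Reading, history: List[Reading]) -> Optional[str]:
        # Hypothetical rule: elevated heart rate with almost no movement.
        if current.heart_rate > 120 and current.step_count < 5:
            return "high heart rate while sedentary"
        return None

    def sustained_inactivity(current: Reading, history: List[Reading]) -> Optional[str]:
        # Hypothetical rule: no steps recorded across the last several readings.
        recent = history[-5:] + [current]
        if len(recent) >= 5 and all(r.step_count == 0 for r in recent):
            return "sustained inactivity"
        return None

    def detect_anomalies(stream: List[Reading], rules: List[Rule]) -> List[Tuple[float, str]]:
        # Apply each rule to every reading, keeping earlier readings as context.
        anomalies: List[Tuple[float, str]] = []
        history: List[Reading] = []
        for reading in stream:
            for rule in rules:
                label = rule(reading, history)
                if label:
                    anomalies.append((reading.timestamp, label))
            history.append(reading)
        return anomalies

    if __name__ == "__main__":
        samples = [(72, 40), (130, 2), (68, 0), (70, 0), (69, 0), (71, 0), (70, 0), (72, 0)]
        stream = [Reading(float(t), hr, steps) for t, (hr, steps) in enumerate(samples)]
        for ts, label in detect_anomalies(stream, [high_hr_while_sedentary, sustained_inactivity]):
            print(f"t={ts}: {label}")

Running the sketch flags the second reading (high heart rate with few steps) and the final reading (five consecutive step-free intervals), illustrating how simple rules over combined modalities can surface anomalies that a single data source would miss.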

Keywords