Applied Sciences (Mar 2023)

A Computer Vision-Based Application for the Assessment of Head Posture: A Validation and Reliability Study

  • Andoni Carrasco-Uribarren,
  • Xavier Marimon,
  • Flora Dantony,
  • Sara Cabanillas-Barea,
  • Alejandro Portela,
  • Luis Ceballos-Laita,
  • Albert Massip-Álvarez

DOI
https://doi.org/10.3390/app13063910
Journal volume & issue
Vol. 13, no. 6
p. 3910

Abstract


The forward head position (FHP) is, as its name implies, a posture in which the head is positioned further forward of the trunk than normal. It can cause neck and shoulder tension, as well as headaches. The craniovertebral angle (CVA), measured with 2D systems such as the Kinovea software, is often used to assess the FHP. Computer vision applications have proven to be reliable in many areas of daily life. The aim of this study was to analyze the test-retest and inter-rater reliability and the concurrent validity of a computer vision-based smartphone application for the measurement of the CVA. Methods: The CVAs of fourteen healthy volunteers, fourteen patients with neck pain, and fourteen patients with tension-type headache were assessed. The assessment was carried out twice, with a week of rest between sessions. Each examiner took a lateral photograph of the participant in a standing position with the computer vision-based smartphone app. Test-retest and inter-rater reliability were calculated from the CVA measurements obtained with the smartphone application. A third examiner assessed the CVA using the 2D Kinovea software to calculate the app's concurrent validity. Results: The CVA was 54.65° (7.00°) in healthy volunteers, 57.67° (5.72°) in patients with neck pain, and 54.63° (6.48°) in patients with tension-type headache. The test-retest reliability was excellent, with an Intraclass Correlation Coefficient (ICC) of 0.92 (0.86–0.95) for the whole sample. The inter-rater reliability was also excellent, with an ICC of 0.91 (0.84–0.95) for the whole sample. The standard error of measurement for the app was 1.83°, and the minimum detectable change was 5.07°. The concurrent validity was high (r = 0.94, p < 0.001). Conclusion: The computer vision-based smartphone app showed excellent test-retest and inter-rater reliability and strong concurrent validity compared with the Kinovea software for the measurement of the CVA.
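
The abstract does not spell out how the CVA or the reliability statistics are computed, but both follow standard conventions in the posture literature. The short Python sketch below is illustrative only: it assumes the usual CVA definition (the angle between the horizontal line through C7 and the line from C7 to the tragus of the ear) and the conventional formula MDC95 = 1.96 × √2 × SEM, which reproduces the reported 5.07° from the reported SEM of 1.83°. The landmark coordinates are hypothetical and not taken from the study.

import math

def craniovertebral_angle(c7, tragus):
    # Angle (degrees) between the horizontal line through C7 and the
    # line joining C7 to the tragus; points are (x, y), x forward, y up.
    dx = tragus[0] - c7[0]
    dy = tragus[1] - c7[1]
    return math.degrees(math.atan2(dy, dx))

def mdc95(sem):
    # Minimum detectable change at the 95% level, assuming the
    # conventional formula MDC95 = 1.96 * sqrt(2) * SEM.
    return 1.96 * math.sqrt(2) * sem

print(round(craniovertebral_angle((0.0, 0.0), (10.0, 14.0)), 1))  # ~54.5 deg, hypothetical points
print(round(mdc95(1.83), 2))  # 5.07 deg, consistent with the reported MDC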

Keywords