SoftwareX (Sep 2024)
Call with eyes: A robust interface based on ANN to assist people with locked-in syndrome
Abstract
Given the critical need for effective communication among individuals with locked-in syndrome (LIS), significant efforts have been made to develop assistive technologies that minimize error margins and enhance reliability. In this work, we present a Python-based interface specifically designed to assist individuals with LIS. The interface uses the face mesh library, which makes the detection of key facial points robust to variations in facial illumination and background. It captures the spatial position of each iris center, along with other points of interest in the patient's eyes (30 points in total), and computes 54 Euclidean distances between them. Data from 10 individuals were used to train an artificial neural network (ANN) that identifies and classifies four gestures: vertical eye movements (looking up and looking down), closed eyes, and open eyes (no movement), achieving an accuracy of 99.8%. The ANN model was integrated into the interface, allowing the user to navigate a vertical menu with five options: the user moves over an option using eye movements and selects it by closing their eyes. The interface was validated with five individuals whose data were not used to train the ANN, achieving error-free selection of all five options. The proposed system demonstrates superior performance compared to other systems, with the added advantage of being a simple, robust solution that requires no calibration for new users.
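The feature-extraction step described above (pairwise Euclidean distances between tracked eye landmarks, forming the input vector for the ANN) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the landmark coordinates and the index pairs below are hypothetical placeholders, since the paper's specific 54-pair selection over its 30 eye-region points is not given in the abstract.

```python
import math

# Hypothetical pair selection: the paper derives 54 distances from 30
# eye-region points (including both iris centers); these three pairs
# are illustrative only.
ILLUSTRATIVE_PAIRS = [(0, 1), (0, 2), (1, 2)]

def euclidean(p, q):
    """2-D Euclidean distance between two (x, y) landmark points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def distance_features(points, pairs):
    """Build the vector of pairwise distances that would feed the ANN."""
    return [euclidean(points[i], points[j]) for i, j in pairs]

# Toy landmark coordinates standing in for detected eye points.
pts = [(0.0, 0.0), (3.0, 4.0), (6.0, 8.0)]
feats = distance_features(pts, ILLUSTRATIVE_PAIRS)
```

Because distances are computed between points detected in the same frame, the resulting features depend only on the relative geometry of the eyes, which is consistent with the paper's claim that no per-user calibration is needed.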