JOIV: International Journal on Informatics Visualization (Jan 2023)

Selecting Control Menu on Electric Wheelchair Using Eyeball Movement for Difable Person

  • Fitri Utaminingrum,
  • I Komang Somawirata,
  • Gusti Pengestu,
  • Tipajin Thaipisutikul,
  • Timothy K. Shih

DOI
https://doi.org/10.30630/joiv.7.1.1011
Journal volume & issue
Vol. 7, no. 1
pp. 37 – 43

Abstract


The number of people affected by disabilities and strokes increases every year in every country. Hand impairments and stroke-related paralysis limit their ability to perform daily activities, such as operating a wheelchair or choosing a menu on a screen display. One solution is to use eye movement as a navigation tool that replaces the role of the user's hand, so they can drive a wheelchair independently or select a menu item on a display by themselves through the movement of their eyes. In this study, eyeball movement is detected using only a camera mounted in front of the user, which is more practical and easier to use than attaching an electrooculography sensor around the user's eyes. This research proposes a new approach to detecting five gaze directions (upward, downward, leftward, rightward, and forward) of eyeball movement using a Backpropagation Neural Network (BPNN) and a Dynamic Line Sector Coordinate (DLSC). The Line Sector Coordinate detects eyeball movement from the position of the pupil coordinate, and the direction of movement is analyzed from the lengths of four lines. The proposed method can detect five gaze directions, which are used to select among four menus on the display monitor. The mean accuracy of the proposed method in detecting eye movement for each gaze direction is 88.6%.
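To illustrate the idea described in the abstract, the sketch below classifies gaze direction from the pupil's position inside the eye region, using four line lengths as features and a small backpropagation neural network. This is not the authors' implementation: the feature layout, the eye-region box, the synthetic training data, and the scikit-learn classifier are all illustrative assumptions.

```python
# Minimal sketch (assumptions noted above): gaze classification from
# pupil-centre coordinates via four line-length features and a small BPNN.
import numpy as np
from sklearn.neural_network import MLPClassifier

def line_features(pupil_xy, eye_box):
    """Normalised distances from the pupil centre to the four sides of the
    eye region, analogous to the four line lengths mentioned in the abstract."""
    x, y = pupil_xy
    left, top, right, bottom = eye_box
    w, h = right - left, bottom - top
    return np.array([(x - left) / w,     # leftward line length
                     (right - x) / w,    # rightward line length
                     (y - top) / h,      # upward line length
                     (bottom - y) / h])  # downward line length

# Synthetic training samples: pupil positions labelled with a gaze direction
# (image coordinates, so "upward" gaze means a smaller y value).
rng = np.random.default_rng(0)
eye_box = (0, 0, 100, 50)
centres = {"forward": (0.50, 0.50), "leftward": (0.20, 0.50),
           "rightward": (0.80, 0.50), "upward": (0.50, 0.25),
           "downward": (0.50, 0.75)}
X, y = [], []
for label, (cx, cy) in centres.items():
    for _ in range(200):
        px = cx * 100 + rng.normal(0, 5)
        py = cy * 50 + rng.normal(0, 3)
        X.append(line_features((px, py), eye_box))
        y.append(label)

# Backpropagation neural network classifier (one small hidden layer).
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(np.array(X), y)

# Example: a pupil detected near the left edge of the eye region.
print(clf.predict([line_features((18, 26), eye_box)]))  # expected: "leftward"
```

In practice the pupil coordinate would come from a camera-based pupil detector rather than synthetic samples, and the predicted gaze direction would drive the menu selection on the display.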

Keywords