IEEE Access (Jan 2024)

MediSign: An Attention-Based CNN-BiLSTM Approach of Classifying Word Level Signs for Patient-Doctor Interaction in Hearing Impaired Community

  • Md. Amimul Ihsan,
  • Abrar Faiaz Eram,
  • Lutfun Nahar,
  • Muhammad Abdul Kadir

DOI
https://doi.org/10.1109/ACCESS.2024.3370684
Journal volume & issue
Vol. 12
pp. 33803–33815

Abstract


Along with day-to-day communication, receiving medical care is quite challenging for the hearing-impaired and mute population, especially in developing countries where medical facilities are less modernized than in the West. A word-level sign language interpretation system aimed at detecting medically relevant signs can enable smooth communication between doctors and hearing-impaired patients, ensuring seamless medical care. To that end, a dataset was created from twenty distinct signers of diverse backgrounds performing 30 words frequently used in patient-doctor interaction. The proposed system employs MobileNetV2 in conjunction with an attention-based bidirectional LSTM network to achieve robust classification, attaining a validation accuracy of 95.83% and an F1-score of 93%. Notably, the accuracy of the proposed model surpasses a recent word-level sign language classification method in a medical context by 5%. Furthermore, comparison of evaluation metrics with contemporary word-level sign language recognition models for American, Arabic, and German Sign Language further affirmed the capability of the proposed architecture.
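The pipeline the abstract describes runs per-frame CNN features through a bidirectional LSTM and pools the time steps with an attention mechanism before classification. The temporal attention step can be sketched in NumPy as below; the additive (Bahdanau-style) scoring form, the 16-frame clip length, and the 256-dimensional BiLSTM output size are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: 16 frames per clip; a BiLSTM with 128 hidden units
# per direction yields a 256-dim vector per time step. H stands in for
# the BiLSTM outputs computed from MobileNetV2 frame features.
T, D = 16, 256
H = rng.standard_normal((T, D))

# Additive attention: score each time step, softmax-normalize, then take
# a weighted sum so the classifier sees one fixed-size clip descriptor.
W = rng.standard_normal((D, 64)) * 0.1   # attention projection (assumed size 64)
v = rng.standard_normal(64) * 0.1        # scoring vector

scores = np.tanh(H @ W) @ v              # shape (T,), one score per frame
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()                     # softmax attention weights over time
context = alpha @ H                      # shape (D,), attended clip descriptor
```

The `context` vector would then feed a dense softmax layer over the 30 word classes; frames the attention weights `alpha` rank highly are those most informative for the sign.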

Keywords