Acta Informatica Malaysia (Jan 2023)

A COMPARATIVE STUDY USING 2D CNN AND TRANSFER LEARNING TO DETECT AND CLASSIFY ARABIC-SCRIPT-BASED SIGN LANGUAGE

  • Karwan Mahdi Hama Rawf,
  • Aree Ali Mohammed,
  • Ayub Othman Abdulrahman,
  • Peshraw Ahmed Abdalla,
  • Karzan J. Ghafor

DOI
https://doi.org/10.26480/aim.01.2023.08.14
Journal volume & issue
Vol. 7, no. 1
pp. 08 – 14

Abstract

Sign Language Recognition (SLR) plays a significant role in the deaf and hard-of-hearing community, since it supports everyday tasks such as communication, education, training, and other human activities. Arabic, Persian, and Kurdish all share the same writing system, the Arabic script. To classify sign languages written in the Arabic alphabet, this article employs convolutional neural network (CNN) and transfer learning (MobileNet) methods. The study's primary goal is to develop a common standard for alphabetic sign language across Arabic, Persian, and Kurdish. The models were trained extensively on the ASSL2022 dataset using different activation functions. The dataset contains a total of 81,857 images, gathered from two sources and representing the 40 letters of the Arabic-script-based alphabets. The results show that the proposed models perform well, with an average training accuracy of 99.7% for the CNN and 99.32% for transfer learning. Compared with other research on languages written in the Arabic script, this work achieves higher detection and classification accuracy.
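The transfer-learning setup described above (a MobileNet backbone with a new classification head for the 40 Arabic-script letter signs) can be sketched in Keras roughly as follows. This is a minimal illustration, not the authors' actual architecture: the input resolution, head layers, and optimizer settings are assumptions, and `weights=None` is used here only so the sketch runs without downloading pretrained weights (real transfer learning would use `weights="imagenet"`).

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 40   # the 40 Arabic-script-based alphabet signs (from the paper)
IMG_SIZE = 128     # assumed input resolution; not stated in the abstract

# Pretrained feature extractor (MobileNet) without its ImageNet classifier head.
base = tf.keras.applications.MobileNet(
    input_shape=(IMG_SIZE, IMG_SIZE, 3),
    include_top=False,
    weights=None,      # swap in weights="imagenet" for actual transfer learning
    pooling="avg",     # global average pooling over the final feature maps
)
base.trainable = False  # freeze the backbone; train only the new head

# New classification head for the 40 sign classes.
model = models.Sequential([
    base,
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```

Training would then call `model.fit` on the labeled sign images; once the head converges, the backbone can optionally be unfrozen for fine-tuning at a lower learning rate.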

Keywords