Affiliations 

  • 1 Faculty of Cognitive Sciences and Human Development, Universiti Malaysia Sarawak, Kuching, Sarawak 94300, Malaysia
  • 2 Department of Computer Science & Information Technology, Hazara University Mansehra, Mansehra, 21120 Khyber Pakhtunkhwa, Pakistan
  • 3 Department of Computer Science, College of Computer and Information Sciences, King Saud University, Riyadh 11543, Saudi Arabia
  • 4 Faculty of Engineering and Information Technology, Taiz University, Taiz 6803, Yemen
  • 5 Information and Communication Engineering Technology, School of Engineering Technology and Applied Science, Centennial College, Toronto, Canada
Comput Intell Neurosci. 2021;2021:9023010.
PMID: 34925497 DOI: 10.1155/2021/9023010

Abstract

The deaf-mute population often feels helpless when they cannot be understood by others, and vice versa. This is a significant humanitarian problem that needs a localised solution. To address it, this study implements a convolutional neural network (CNN) with a convolutional block attention module (CBAM) to recognise Malaysian Sign Language (MSL) from images. Two experiments were conducted on MSL signs using CBAM-2DResNet (2-Dimensional Residual Network), implementing the "Within Blocks" and "Before Classifier" methods, which differ in where the attention module is inserted into the network. Metrics such as accuracy, loss, precision, recall, F1-score, the confusion matrix, and training time were recorded to evaluate the models' efficiency. The experimental results showed that the CBAM-ResNet models performed well on MSL sign recognition tasks, with accuracy rates above 90% and only small variations between models. The "Before Classifier" CBAM-ResNet models are more efficient than the "Within Blocks" CBAM-ResNet models. The best-trained CBAM-2DResNet model was therefore chosen to develop a real-time sign recognition system that translates sign language to text and text to sign language, easing communication between deaf-mute people and others. All experimental results indicated that the "Before Classifier" CBAM-ResNet models are more efficient at recognising MSL and are worth pursuing in future research.
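The abstract does not give implementation details, but the two placement strategies it compares can be sketched conceptually. The snippet below is a minimal, illustrative numpy mock-up (not the authors' code): CBAM's channel attention gates feature maps using avg- and max-pooled descriptors passed through a shared MLP, spatial attention gates locations using channel-wise pooling, and the two hypothetical helpers `block_with_cbam` and `backbone_before_classifier` show attention applied inside every residual block versus once before the classifier. The random MLP weights and the sigmoid stand-in for CBAM's 7x7 convolution are simplifications for the sketch.

```python
import numpy as np

def channel_attention(x, reduction=8):
    # x: (C, H, W). Squeeze spatial dims with avg- and max-pooling,
    # pass each through a shared 2-layer MLP, sum, then sigmoid-gate channels.
    C = x.shape[0]
    rng = np.random.default_rng(0)                 # toy fixed weights for the sketch
    w1 = rng.standard_normal((C // reduction, C)) * 0.1
    w2 = rng.standard_normal((C, C // reduction)) * 0.1
    mlp = lambda v: w2 @ np.maximum(w1 @ v, 0.0)   # FC -> ReLU -> FC
    avg, mx = x.mean(axis=(1, 2)), x.max(axis=(1, 2))
    scale = 1.0 / (1.0 + np.exp(-(mlp(avg) + mlp(mx))))
    return x * scale[:, None, None]

def spatial_attention(x):
    # x: (C, H, W). Pool across channels, then sigmoid-gate each location
    # (a stand-in for CBAM's 7x7 conv over the pooled maps).
    avg = x.mean(axis=0, keepdims=True)
    mx = x.max(axis=0, keepdims=True)
    mask = 1.0 / (1.0 + np.exp(-(avg + mx)))
    return x * mask

def cbam(x):
    # CBAM applies channel attention first, then spatial attention.
    return spatial_attention(channel_attention(x))

# "Within Blocks": attention refines every residual block's output.
def block_with_cbam(x, block):
    return x + cbam(block(x))

# "Before Classifier": attention is applied once to the final feature map.
def backbone_before_classifier(x, blocks):
    for block in blocks:
        x = x + block(x)                           # plain residual blocks
    return cbam(x)                                 # single CBAM before pooling/classifier
```

The "Before Classifier" variant adds only one attention module regardless of network depth, which is consistent with the abstract's finding that it is the more efficient configuration.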

* Title and MeSH Headings from MEDLINE®/PubMed®, a database of the U.S. National Library of Medicine.