A Framework for Malaysian Sign Language Recognition Using Deep Learning Initiatives

  • IMRAN MD JELAS, Faculty of Computer and Mathematical Sciences, Universiti Teknologi MARA, Perak Branch, Tapah Campus, Perak, Malaysia

Abstract

Problem: The greatest challenge since the introduction of Malaysian Sign Language (MSL) arises when deaf and hard-of-hearing people try to communicate in MSL with persons without disabilities who do not use it. To break this communication barrier, a substantial number of studies have been conducted to produce Malaysian Sign Language recognition systems [1]–[4].

Aims/Objectives: The main objective of this paper is to develop a Low-Cost Malaysian Sign Language Recognition from Action to Text Framework using Deep Learning.

Methodology/approach: To achieve this objective, we propose a framework consisting of three main modules, namely a learning module, a training module, and a detection module.
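As a rough illustration only (not code from the paper), the three modules can be viewed as a simple pipeline: the learning module collects labelled keypoint sequences, the training module fits a sequence model on them, and the detection module applies the trained model to a buffered window of live frames. The Python sketch below uses hypothetical helpers such as capture_sequences and extract_keypoints, which are not defined in the paper.

```python
# A minimal sketch of the three-module pipeline (illustrative names, not the paper's code).
import numpy as np

def learning_module(capture_sequences, extract_keypoints):
    """Collect labelled keypoint sequences for each sign (learning module)."""
    X, y = [], []
    for label, frames in capture_sequences():          # e.g. fixed-length clips per sign
        X.append([extract_keypoints(f) for f in frames])
        y.append(label)
    return np.array(X), np.array(y)

def training_module(model, X, y, epochs=200):
    """Fit the sequence model on the collected data (training module)."""
    model.fit(X, y, epochs=epochs)
    return model

def detection_module(model, frame_buffer, sign_names):
    """Predict the most likely sign from a buffered window of keypoint vectors (detection module)."""
    probs = model.predict(np.expand_dims(np.array(frame_buffer), axis=0))[0]
    return sign_names[int(np.argmax(probs))]
```

Keeping the three stages as separate functions is one way to support a scalable machine learning system, since each module can be revised or replaced independently.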

Results/finding: To achieve this aim, the MediaPipe framework was adopted. MediaPipe greatly improved image acquisition and simplified the image-processing stage. A long short-term memory (LSTM) artificial neural network (ANN) is used as the training algorithm in the training module and as the prediction algorithm in the detection module.
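As a hedged sketch of how MediaPipe and an LSTM could be wired together (not the paper's exact implementation): one common choice is MediaPipe Holistic, whose per-frame pose and hand landmarks are flattened into a fixed-length vector and stacked into short sequences that a Keras LSTM classifies. The 258-value feature vector, the 30-frame window, the layer sizes, and num_signs below are illustrative assumptions.

```python
# Illustrative sketch: MediaPipe Holistic keypoints feeding a Keras LSTM classifier.
import cv2
import numpy as np
import mediapipe as mp
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

mp_holistic = mp.solutions.holistic

def extract_keypoints(frame_bgr, holistic):
    """Flatten pose (33 x 4) and both hands (2 x 21 x 3) into a 258-value vector."""
    results = holistic.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    pose = (np.array([[lm.x, lm.y, lm.z, lm.visibility]
                      for lm in results.pose_landmarks.landmark]).flatten()
            if results.pose_landmarks else np.zeros(33 * 4))
    lh = (np.array([[lm.x, lm.y, lm.z]
                    for lm in results.left_hand_landmarks.landmark]).flatten()
          if results.left_hand_landmarks else np.zeros(21 * 3))
    rh = (np.array([[lm.x, lm.y, lm.z]
                    for lm in results.right_hand_landmarks.landmark]).flatten()
          if results.right_hand_landmarks else np.zeros(21 * 3))
    return np.concatenate([pose, lh, rh])

def build_lstm(num_signs, seq_len=30, n_features=258):
    """Stacked LSTM over keypoint sequences; one softmax output per sign class."""
    model = Sequential([
        LSTM(64, return_sequences=True, input_shape=(seq_len, n_features)),
        LSTM(128, return_sequences=False),
        Dense(64, activation='relu'),
        Dense(num_signs, activation='softmax'),
    ])
    model.compile(optimizer='adam', loss='categorical_crossentropy',
                  metrics=['categorical_accuracy'])
    return model

# Usage (assumed setup): process webcam frames with Holistic, buffer 30 keypoint
# vectors, then pass the buffered window to the trained model in the detection module.
# with mp_holistic.Holistic(min_detection_confidence=0.5,
#                           min_tracking_confidence=0.5) as holistic:
#     vec = extract_keypoints(frame, holistic)
```

In this arrangement, the same keypoint-extraction routine serves both data collection and live detection, which is what keeps the image-processing stage simple.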

Implication/impact: Thus, researchers need only focus on reducing human intervention and creating a scalable machine learning system to realize deep learning.

References

[1] K. Van Murugiah, G. Subhashini, and R. Abdulla, “Wearable IoT based Malaysian sign language recognition and text translation system,” J. Appl. Technol. Innov., vol. 5, no. 4, pp. 51–58, 2021.
[2] M. Karbasi, A. Zabidi, I. M. Yassin, A. Waqas, and Z. Bhatti, “Malaysian sign language dataset for automatic sign language recognition system,” J. Fundam. Appl. Sci., vol. 9, no. 4S, p. 459, 2018.
[3] F. Wong, G. Sainarayanan, W. M. Abdullah, A. Chekima, F. E. Jupirin, and Y. F. A. Gaus, “Software-based Malaysian sign language recognition,” Adv. Intell. Syst. Comput., vol. 182 AISC, pp. 297–306, 2013.
[4] A. Z. Shukor, M. F. Miskon, M. H. Jamaluddin, F. Bin Ali Ibrahim, M. F. Asyraf, and M. B. Bin Bahar, “A New Data Glove Approach for Malaysian Sign Language Detection,” Procedia Comput. Sci., vol. 76, pp. 60–67, 2015.
[5] E. Kavlakoglu, “AI vs. Machine Learning vs. Deep Learning vs. Neural Networks: What’s the Difference?,” 2020. [Online]. Available: https://www.ibm.com/cloud/blog/ai-vs-machine-learning-vs-deep-learning-vs-neural-networks.
[6] S. M. Kamal, Y. Chen, S. Li, X. Shi, and J. Zheng, “Technical Approaches to Chinese Sign Language Processing: A Review,” IEEE Access, vol. 7, pp. 96926–96935, 2019.
[7] Sutarman, M. B. A. Majid, J. B. M. Zain, and A. Hermawan, “Recognition of Malaysian Sign Language using skeleton data with Neural Network,” Proc. - 2015 Int. Conf. Sci. Inf. Technol. Big Data Spectr. Futur. Inf. Econ. ICSITech 2015, pp. 231–236, 2016.
[8] D. Kothadiya, C. Bhatt, K. Sapariya, K. Patel, A.-B. Gil-González, and J. M. Corchado, “Deepsign: Sign Language Detection and Recognition Using Deep Learning,” Electronics, vol. 11, no. 11, p. 1780, 2022.
[9] A. H. Alrubayi et al., “A pattern recognition model for static gestures in Malaysian Sign Language based on machine learning techniques,” Comput. Electr. Eng., vol. 95, p. 107383, 2021.
[10] T. S. Tan, A. K. Ariff, S. H. Salleh, K. S. Siew, and S. H. Leong, “Wireless data gloves Malay sign language recognition system,” 2007 6th Int. Conf. Information, Commun. Signal Process. ICICS, pp. 3–6, 2007.
[11] N. M. Kakoty and M. D. Sharma, “Recognition of Sign Language Alphabets and Numbers based on Hand Kinematics using A Data Glove,” Procedia Comput. Sci., vol. 133, pp. 55–62, 2018.
[12] M. S. Amin, S. T. H. Rizvi, and M. M. Hossain, “A Comparative Review on Applications of Different Sensors for Sign Language Recognition,” J. Imaging, vol. 8, no. 4, pp. 1–48, 2022.
[13] D. Kumar Choudhary, R. Singh, and D. Kamathania, “Sign Language Recognition System,” 4th Int. Conf. Innov. Comput. Commun. (ICICC 2021), pp. 4–7, 2021.
[14] I. A. Adeyanju, O. O. Bello, and M. A. Adegboye, “Machine learning methods for sign language recognition: A critical review and analysis,” Intell. Syst. with Appl., vol. 12, pp. 1–36, 2021.
[15] N. A. Mohammad, S. T. Kian, F. S. Chin, H. S. Toong, and R. Abdul Rahim, “Development of a Malaysian Sign Language interpreter using image recognition for the community to understand the deaf,” Elektr. J. Electr. Eng., vol. 20, no. 2-3, pp. 70–72, 2021.
[16] D. K. Singh, “3D-CNN based Dynamic Gesture Recognition for Indian Sign Language Modeling,” Procedia CIRP, vol. 189, pp. 76–83, 2021.
[17] S. Katoch, V. Singh, and U. S. Tiwary, “Indian Sign Language recognition system using SURF with SVM and CNN,” Array, vol. 14, p. 100141, 2022.
[18] A. Liew, S. Lien, and L. K. Yin, “Gesture Recognition-Malaysian Sign Language Recognition using Convolutional Neural Network,” in International Conference on Digital Transformation and Applications (ICDXA) 2020, 2020, pp. 1–6.
[19] M. A. M. M. Asri, Z. Ahmad, I. A. Mohtar, and S. Ibrahim, “A real time Malaysian sign language detection algorithm based on YOLOv3,” Int. J. Recent Technol. Eng., vol. 8, no. 2 Special Issue 11, pp. 651–656, 2019.
[20] K. Wangchuk, P. Riyamongkol, and R. Waranusast, “Real-time Bhutanese Sign Language digits recognition system using Convolutional Neural Network,” ICT Express, vol. 7, no. 2, pp. 215–220, 2021.
[21] Suharjito, N. Thiracitta, and H. Gunawan, “SIBI Sign Language Recognition Using Convolutional Neural Network Combined with Transfer Learning and non-trainable Parameters,” Procedia Comput. Sci., vol. 179, pp. 72–80, 2021.
[22] A. Ardiansyah, B. Hitoyoshi, M. Halim, N. Hanafiah, and A. Wibisurya, “Systematic Literature Review: American Sign Language Translator,” Procedia Comput. Sci., vol. 179, pp. 541–549, 2021.
[23] S. Subburaj and S. Murugavalli, “Survey on sign language recognition in context of vision-based and deep learning,” Meas. Sensors, vol. 23, p. 100385, 2022.
[24] S. Alashhab, A. J. Gallego, and M. Á. Lozano, “Efficient Gesture Recognition for the Assistance of Visually Impaired People using Multi-Head Neural Networks,” Eng. Appl. Artif. Intell., vol. 114, p. 105188, 2022.
[25] Ü. Atila and F. Sabaz, “Turkish lip-reading using Bi-LSTM and deep learning models,” Eng. Sci. Technol., an Int. J., 2022.
[26] C. Lugaresi et al., “MediaPipe: A Framework for Building Perception Pipelines,” Google Res., pp. 1–9, 2019.
[27] C. Lugaresi et al., “MediaPipe: A Framework for Perceiving and Processing Reality,” Google Res., pp. 1–4, 2019.
[28] F. Zhang et al., “MediaPipe Hands: On-device Real-time Hand Tracking,” Google Res., pp. 1–5, 2020.
Published
2022-11-15
How to Cite
MD JELAS, IMRAN. A Framework for Malaysian Sign Language Recognition Using Deep Learning Initiatives. Mathematical Sciences and Informatics Journal, [S.l.], v. 3, n. 2, p. 65-79, nov. 2022. ISSN 2735-0703. Available at: <https://myjms.mohe.gov.my/index.php/mij/article/view/19395>. Date accessed: 15 sep. 2024. doi: https://doi.org/10.24191/mij.v3i2.19395.
Section
Articles
