RT Journal Article
T1 A Machine Learning Based Full Duplex System Supporting Multiple Sign Languages for the Deaf and Mute
A1 Saleem, Muhammad Imran
A1 Siddiqui, Atif Ahmed
A1 Noor, Shaheena
A1 Luque-Nieto, Miguel Ángel
A1 Nava-Baro, Enrique
K1 Mute persons
K1 Deaf persons
K1 Sign language - Technological innovations
K1 Communication devices for persons with disabilities
AB This manuscript presents a full duplex communication system for the Deaf and Mute (D-M) based on Machine Learning (ML). These individuals, who generally communicate through sign language, are an integral part of our society, and their contribution is vital. They face communication difficulties mainly because others, who generally do not know sign language, are unable to communicate with them. This work presents a solution to this problem: a system that enables non-deaf and mute (ND-M) individuals to communicate with D-M individuals without needing to learn sign language. The system is low-cost, reliable, easy to use, and based on a commercial off-the-shelf (COTS) Leap Motion Device (LMD). The hand gesture data of D-M individuals is acquired using the LMD and processed using a Convolutional Neural Network (CNN) algorithm. A supervised ML algorithm completes the processing and converts the hand gesture data into speech. A new dataset for the ML-based algorithm is created and presented in this manuscript. This dataset combines three sign languages, i.e., American Sign Language (ASL), Pakistani Sign Language (PSL), and Spanish Sign Language (SSL). The proposed system automatically detects the sign language and converts it into an audio message for the ND-M. Similarities between the three sign languages are also explored, and further research can be carried out to create more datasets combining multiple sign languages. The ND-M can communicate by recording their speech, which is then converted into text and hand gesture images. The system can be upgraded in the future to support more sign language datasets. The system also provides a training mode that can help D-M individuals improve their hand gestures and understand how accurately the system detects these gestures. The proposed system has been validated through a series of experiments, resulting in hand gesture detection accuracy exceeding 95%.
PB MDPI
YR 2023
FD 2023-02-28
LK https://hdl.handle.net/10630/26584
UL https://hdl.handle.net/10630/26584
LA eng
NO Saleem MI, Siddiqui A, Noor S, Luque-Nieto M-A, Nava-Baro E. A Machine Learning Based Full Duplex System Supporting Multiple Sign Languages for the Deaf and Mute. Applied Sciences. 2023; 13(5):3114. https://doi.org/10.3390/app13053114
NO Funding for open access charge: Universidad de Málaga
DS RIUMA. Repositorio Institucional de la Universidad de Málaga
RD 19 Jan 2026