TWO-HANDED DYNAMIC GESTURE RECOGNITION USING RGB-D SENSORS
DOI: https://doi.org/10.29121/ijetmr.v12.i5.2025.1568
Keywords: Human-Computer Interaction, RGB-D Sensors, Gesture Recognition
Abstract
Sign language is a form of visual-gestural communication that conveys semantic content through hand movements and postures. It comprises a structured set of gestures, each associated with a specific meaning. For individuals with hearing impairments, sign language serves as a primary medium for expression, perception, and interaction with the world. However, because sign language proficiency is rare among the broader population, effective communication remains a significant challenge. This paper uses an RGB-D imaging device to capture both color and depth information from dynamic hand gestures, enabling more robust and discriminative feature extraction than traditional RGB-only approaches. The proposed system recognizes two-handed dynamic gestures by analyzing their spatial configurations and temporal motion patterns. A gesture symbolization mechanism facilitates recognition: complex gesture sequences are encoded into strings of primitive symbols representing key postures and transitions. These symbolic representations are then compared using a fuzzy matching algorithm that accounts for variations in gesture execution and temporal alignment, thereby improving the system's tolerance to differences among signers. This methodology aims to provide a reliable and flexible framework for real-time gesture recognition in natural interaction scenarios.
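The symbolize-then-fuzzy-match pipeline described in the abstract can be illustrated with a minimal sketch. The symbol alphabet, template strings, and similarity threshold below are hypothetical placeholders, not the paper's actual values; the fuzzy comparison is approximated here with a normalized edit-distance similarity, which is one common way to tolerate insertions, deletions, and substitutions in a symbol sequence.

```python
# Minimal sketch of symbolic gesture matching (illustrative, not the
# paper's implementation): gestures are assumed to already be encoded
# as strings of posture symbols; matching uses normalized edit distance.

def levenshtein(a: str, b: str) -> int:
    """Classic edit distance between two symbol strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,        # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def similarity(a: str, b: str) -> float:
    """Normalized fuzzy similarity in [0, 1]."""
    if not a and not b:
        return 1.0
    return 1.0 - levenshtein(a, b) / max(len(a), len(b))

def recognize(observed: str, templates: dict, threshold: float = 0.7):
    """Return the best-matching gesture label, or None if all fall
    below the similarity threshold."""
    best_label, best_score = None, threshold
    for label, symbols in templates.items():
        s = similarity(observed, symbols)
        if s >= best_score:
            best_label, best_score = label, s
    return best_label

# Hypothetical templates: each gesture is a string of posture symbols.
templates = {"wave": "LRLRLR", "clap": "OCOCOC"}
print(recognize("LRLRLL", templates))  # a slightly varied "wave"
```

Because the score is normalized by sequence length, a one-symbol deviation in a six-symbol gesture still scores about 0.83, so the varied execution is matched to its template rather than rejected.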
License
Copyright (c) 2025 Yu-Chi Pu, Wei-Chang Du, Kai-Wei Shih

This work is licensed under a Creative Commons Attribution 4.0 International License.