TWO-HANDED DYNAMIC GESTURE RECOGNITION USING RGB-D SENSORS

Authors

  • Yu-Chi Pu, Department of Maritime Information and Technology, National Kaohsiung University of Science and Technology, Kaohsiung, Taiwan, Republic of China
  • Wei-Chang Du, Department of Information Engineering, I-Shou University, Kaohsiung, Taiwan, Republic of China
  • Kai-Wei Shih, Department of Information Engineering, I-Shou University, Kaohsiung, Taiwan, Republic of China

DOI:

https://doi.org/10.29121/ijetmr.v12.i5.2025.1568

Keywords:

Human-Computer Interaction, RGB-D Sensors, Gesture Recognition

Abstract

Sign language is a form of visual-gestural communication that conveys semantic content through hand movements and postures. It comprises a structured set of gestures, each associated with specific meanings. For individuals with hearing impairments, sign language serves as a primary medium for expression, perception, and interaction with the external world. However, because sign language proficiency is uncommon among the broader population, effective communication remains a significant challenge. This paper uses an RGB-D imaging device to capture both color and depth information of dynamic hand gestures, enabling more robust and discriminative feature extraction than traditional RGB-based approaches. The proposed system focuses on recognizing two-handed dynamic gestures by analyzing spatial configurations and temporal motion patterns. A gesture symbolization mechanism facilitates the recognition process, wherein complex gesture sequences are encoded into a set of primitive symbols representing key postures and transitions. These symbolic representations are then compared using a fuzzy matching algorithm that accounts for variations in gesture execution and temporal alignment, thereby enhancing the system's tolerance to differences in how the same gesture is performed. This methodology aims to provide a reliable and flexible framework for real-time gesture recognition in natural interaction scenarios.
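
To make the abstract's symbolization and fuzzy-matching idea concrete, the short Python sketch below matches an observed gesture, encoded as a string of primitive symbols, against stored template strings using a Levenshtein-based similarity score. It is only an illustration of the general idea under assumed details; the symbol alphabet, the template gestures, the recognize and fuzzy_similarity helpers, and the 0.7 acceptance threshold are hypothetical and do not come from the paper.

```python
# Illustrative sketch only: the symbol alphabet ("U", "L", "R", "D"), the
# template gestures, and the 0.7 acceptance threshold are assumptions,
# not details taken from the paper.

def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance between two symbol strings (rolling-row DP)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # delete ca
                            curr[j - 1] + 1,      # insert cb
                            prev[j - 1] + cost))  # substitute ca -> cb
        prev = curr
    return prev[-1]


def fuzzy_similarity(a: str, b: str) -> float:
    """Normalized similarity in [0, 1]; 1.0 means identical sequences."""
    if not a and not b:
        return 1.0
    return 1.0 - edit_distance(a, b) / max(len(a), len(b))


def recognize(observed: str, templates: dict, threshold: float = 0.7):
    """Return (label, score) for the best template, or (None, score) if too weak."""
    best_label, best_score = None, 0.0
    for label, symbols in templates.items():
        score = fuzzy_similarity(observed, symbols)
        if score > best_score:
            best_label, best_score = label, score
    return (best_label, best_score) if best_score >= threshold else (None, best_score)


if __name__ == "__main__":
    # Each letter stands for one primitive posture/transition symbol.
    templates = {"hello": "ULRD", "thanks": "UDDL", "goodbye": "RLRL"}
    # One spurious symbol in the observed sequence still matches "hello" (score 0.8).
    print(recognize("ULRRD", templates))
```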

References

Keselman, L., Woodfill, J. I., Grunnet-Jepsen, A., & Bhowmik, A. (2017). Intel RealSense Stereoscopic Depth Cameras. 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 1267-1276. https://doi.org/10.1109/CVPRW.2017.167

Lei, L., Jinling, Z., Yingjie, Z., & Hong, L. (2015). A Static Gesture Recognition Method Based on Data Glove. Journal of Computer-Aided Design & Computer Graphics, 27(12), 2410-2418.

Obaid, F., Babadi, A., & Yoosofan, A. (2020). Hand Gesture Recognition in Video Sequences Using Deep Convolutional and Recurrent Neural Networks. Applied Computer Systems, 25(1), 57-61. https://doi.org/10.2478/acss-2020-0007

Osman Hashi, A., Zaiton Mohd Hashim, S., & Bte Asamah, A. (2024). A Systematic Review of Hand Gesture Recognition: An Update from 2018 to 2024. IEEE Access, 12, 143599-143626. https://doi.org/10.1109/ACCESS.2024.3421992

Parveen, N., Roy, A., & Sandesh, D. S. (2020). Human-Computer Interaction Through Hand Gesture Recognition Technology. International Journal of Computing and Digital Systems, 9(4).

Rahman, M. M., Uzzaman, A., Khatun, F., Aktaruzzaman, M., & Siddique, N. (2025). A Comparative Study of Advanced Technologies and Methods in Hand Gesture Analysis and Recognition Systems. Expert Systems with Applications, 266(C). https://doi.org/10.1016/j.eswa.2024.125929

Rudwan, M. S. M., & Fonou-Dombeu, J. V. (2023). Hybridizing Fuzzy String Matching and Machine Learning for Improved Ontology Alignment. Future Internet, 15(7), Article 7. https://doi.org/10.3390/fi15070229

Shaikh, M. B., & Chai, D. (2021). RGB-D Data-Based Action Recognition: A Review. Sensors, 21(12), Article 12. https://doi.org/10.3390/s21124246

Sun, Y., Weng, Y., Luo, B., Li, G., Tao, B., Jiang, D., & Chen, D. (2023). Gesture Recognition Algorithm Based on Multi-Scale Feature Fusion in RGB-D Images. IET Image Processing, 17(4), 1280-1290. https://doi.org/10.1049/ipr2.12712

Wu, Y., Huang, D., Du, W.-C., Wu, M., & Li, C.-Z. (2020). Joint-Based Hand Gesture Recognition Using RealSense. Journal of Computer, 31(2), 141-151. https://doi.org/10.3966/199115992020043102013

Yasen, M., & Jusoh, S. (2019). A Systematic Review on Hand Gesture Recognition Techniques, Challenges, and Applications. PeerJ Computer Science, 5, e218. https://doi.org/10.7717/peerj-cs.218

Published

2025-05-07

How to Cite

Pu, Y.-C., Du, W.-C., & Shih, K.-W. (2025). TWO-HANDED DYNAMIC GESTURE RECOGNITION USING RGB-D SENSORS. International Journal of Engineering Technologies and Management Research, 12(5), 1–10. https://doi.org/10.29121/ijetmr.v12.i5.2025.1568