DESIGNING EMOTION-SENSITIVE INTERACTIVE ART USING BIOMETRIC SENSORS AND ADAPTIVE COMPUTING MODELS
DOI: https://doi.org/10.29121/shodhkosh.v7.i4s.2026.7491

Keywords: Emotion Recognition, Interactive Art, Biometric Sensors, Adaptive Computing, Deep Learning, Reinforcement Learning

Abstract [English]
Emotion-sensitive interactive art lies at the dynamic intersection of technology and human perception, creativity, and expression: artworks capable of responding to the emotional states of their participants. This paper presents a detailed framework for building adaptive art systems from multimodal biometric sensors and intelligent computing models. Real-time emotional signals are acquired through physiological measurements, including electroencephalography (EEG), electrocardiography (ECG), galvanic skin response (GSR), and eye tracking. Advanced signal-preprocessing techniques, such as noise filtering, normalization, and feature extraction, ensure high-quality data representation. Machine learning and deep learning models, including Support Vector Machines (SVM), Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM) networks, and Transformer-based architectures, are applied to classify emotional states with high precision. The proposed system incorporates an adaptive computing layer guided by reinforcement learning, enabling an active dialogue between the participant's emotions and the artistic output. Feedback control mechanisms continuously refine the system's responses, making the experience more engaging and personalized. Experimental evaluations show improved emotion-recognition accuracy and responsiveness, demonstrating the effectiveness of the proposed approach.
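The pipeline the abstract describes (noise filtering, normalization, feature extraction, then classification) can be sketched in miniature. The code below is illustrative only, not the authors' implementation: synthetic 1-D signals stand in for real EEG/GSR streams, and a toy nearest-centroid rule stands in for the SVM/CNN classifiers named in the paper. All function and class names are assumptions introduced for this sketch.

```python
import numpy as np

def preprocess(signal: np.ndarray) -> np.ndarray:
    """Moving-average smoothing (noise filtering) followed by z-score normalization."""
    kernel = np.ones(5) / 5.0
    smoothed = np.convolve(signal, kernel, mode="same")
    return (smoothed - smoothed.mean()) / (smoothed.std() + 1e-8)

def extract_features(signal: np.ndarray) -> np.ndarray:
    """Simple time-domain features: mean, std, range, mean absolute derivative."""
    return np.array([
        signal.mean(),
        signal.std(),
        signal.max() - signal.min(),
        np.abs(np.diff(signal)).mean(),
    ])

class NearestCentroid:
    """Toy classifier: assign each sample the label of the closest class centroid."""
    def fit(self, X: np.ndarray, y: np.ndarray) -> "NearestCentroid":
        self.labels_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, X: np.ndarray) -> np.ndarray:
        # Euclidean distance from every sample to every centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.labels_[d.argmin(axis=1)]
```

In a full system, the feature vectors would feed a trained SVM or deep network, and a reinforcement-learning loop would adjust the artwork's output from the predicted emotional state; those stages are outside the scope of this sketch.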
License
Copyright (c) 2026 Dr. Ria Kohli, Suhas Bhise, Yan Zhao, Senthil Kumar A, Muhammad Muhammad Suleiman, Dr. Aneesh Wunnava

This work is licensed under a Creative Commons Attribution 4.0 International License.
Under the CC-BY license, authors retain copyright while allowing anyone to download, reuse, reprint, modify, distribute, and/or copy their contribution, provided the work is properly attributed to its author. No further permission from the author or journal board is required.
This journal provides immediate open access to its content on the principle that making research freely available to the public supports a greater global exchange of knowledge.