Multi-Modal Emotional Understanding in AI Virtual Characters: Integrating Micro-Expression-Driven Feedback within Context-Aware Facial Micro-Expression Processing Systems
Dr. Xiaoling Xie, Department of Fine Arts, International College, Krirk University, xiexiaoling@siva.edu.cn, ORCID: 0009-0004-3054-2529
Dr. Zeming Fang, Professor, Department of Fine Arts, International College, Krirk University, 13860669208@sohu.com, ORCID: 0009-0005-5958-1879
Keywords: Multi-Modal, Emotional Understanding, AI Virtual Characters, Micro-Expressions, Cultural Context, User Interaction
Abstract
To engage users effectively, AI virtual characters must be able to comprehend emotions. This paper develops and evaluates context-aware facial micro-expression processing algorithms and feedback mechanisms tailored to Chinese cultural contexts, with the aim of improving the multi-modal emotional understanding of AI virtual characters. Specialized algorithms were used to collect and analyze Chinese micro-expressions and to assess how well AI virtual characters understood emotions during user interactions. Chinese participants of varied ages, genders, and regions were recruited for micro-expression recognition to ensure cultural inclusiveness. A mixed-methods approach combined quantitative and qualitative data: interview responses and AI virtual character feedback were integrated with quantitative indicators such as emotion recognition accuracy, user engagement, and micro-expression intensity. The results show that demographic factors affect emotion recognition accuracy and that virtual avatars adapted to age, gender, and region increase emotional resonance. The study also demonstrated that context shapes micro-expression interpretation, particularly in distinguishing surprise and grief between urban and rural participants. Textual micro-expression recommendations and real-time adjustments to AI character expressions improved both recognition accuracy and user experience. Because micro-expressions, visual and auditory cues, and physiological responses are interrelated, multi-modal signal processing is required for reliable emotional awareness. Interactive virtual assistance, gaming, and education may benefit from culturally appropriate AI characters. The work contributes to the theory and methodology of AI emotional interaction through deep learning, multi-modal fusion, and explainable AI. In summary, by drawing on the intricacies of Chinese culture, this study improves the multi-modal emotional comprehension of AI virtual characters; its context-aware facial micro-expression analysis algorithms and feedback systems support the development of culturally sensitive AI characters and emotional AI technologies.
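The abstract describes combining micro-expressions with visual and auditory cues and physiological responses through multi-modal signal processing. The paper's fusion architecture is not specified here, so the following is only a minimal illustrative sketch, assuming a simple weighted late fusion of per-modality emotion probabilities; the modality names, weights, and emotion labels are hypothetical placeholders, not the authors' implementation.

```python
import numpy as np

# Hypothetical emotion labels; the actual label set used in the study is not
# specified in the abstract.
EMOTIONS = ["happiness", "sadness", "surprise", "anger", "fear", "neutral"]

def softmax(logits: np.ndarray) -> np.ndarray:
    """Convert raw per-emotion scores into a probability distribution."""
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def fuse_modalities(modality_logits: dict[str, np.ndarray],
                    weights: dict[str, float]) -> np.ndarray:
    """Weighted late fusion: combine per-modality emotion probabilities.

    `modality_logits` maps a modality name (e.g. "micro_expression",
    "audio", "physiology") to raw scores over EMOTIONS; `weights` gives
    each modality's contribution. Both are illustrative assumptions.
    """
    fused = np.zeros(len(EMOTIONS))
    total = 0.0
    for name, logits in modality_logits.items():
        w = weights.get(name, 0.0)
        fused += w * softmax(logits)
        total += w
    return fused / total if total > 0 else fused

if __name__ == "__main__":
    # Toy scores standing in for the outputs of per-modality recognizers.
    scores = {
        "micro_expression": np.array([0.2, 1.5, 2.8, 0.1, 0.3, 0.5]),
        "audio":            np.array([0.4, 0.9, 1.1, 0.2, 0.2, 1.0]),
        "physiology":       np.array([0.1, 1.2, 1.6, 0.3, 0.4, 0.7]),
    }
    weights = {"micro_expression": 0.5, "audio": 0.3, "physiology": 0.2}
    probs = fuse_modalities(scores, weights)
    print("Predicted emotion:", EMOTIONS[int(np.argmax(probs))])
```

A late-fusion scheme like this keeps each modality's recognizer independent, so culturally specific micro-expression models can be swapped in without retraining the audio or physiological components; whether the study uses late or feature-level fusion is not stated in this section.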