SUBMIT ONLINE: https://www.easychair.org/conferences/?conf=prai2026 (Please choose Special Session 7)
As a cutting-edge direction in interdisciplinary research, multimodal affective computing is progressively transcending the limitations of traditional unimodal affective analysis. By integrating multi-source heterogeneous data such as speech, text, images, and physiological signals, multimodal affective computing aims to develop more comprehensive and robust emotion recognition and understanding models. These advances enable natural and intelligent services in scenarios including human-computer interaction, mental health monitoring, personalized education, and intelligent healthcare. This special session solicits contributions on recent progress in multimodal affective computing and its applications.
Sirui Zhao, University of Science and Technology of China, China. Email: siruit@ustc.edu.cn
RELATED TOPICS
Topics of interest include, but are not limited to:
I. Technical Foundations
- Speech emotion analysis (recognition, synthesis, etc.)
- Facial expression or micro-expression analysis
- Conversational emotion recognition
- Emotional dialogue generation
- Stance detection and multimodal sarcasm analysis
- Emotionally driven AI virtual digital humans
- LLM-driven affective computing
- Physiological signal-based affective computing (e.g., EEG, ECG, GSR fusion)
- Knowledge-driven emotion reasoning (e.g., emotion-centric knowledge graphs)
- Personality-aware emotion modeling and generation
- Emotion-value alignment and regulation in large-scale language models
II. Application-Oriented Research
- Emotion recognition and monitoring in mental health support
(e.g., rPPG-based monitoring, depression and anxiety detection, emotion-aware interventions)
- Emotion-aware intelligent customer service and dialogue systems
(e.g., adaptive empathy-driven response strategies)
- Affective computing in smart driving and mobility
(e.g., driver stress, fatigue, and emotional distraction monitoring)
- Emotion-driven intelligent education technologies
(e.g., affect-aware tutoring and engagement tracking)
- Emotionally aware healthcare and human-robot interaction
(e.g., improving doctor-patient communication, robotic caregiving)
- Emotion analysis and generation for social media content
(e.g., emotional trend analysis, affect-based recommendation and moderation)
IMPORTANT DATES
Final Submission Deadline: July 05, 2026
Final Notification Date: July 20, 2026
Final Registration Deadline: July 25, 2026