Special Session 4. Multimodal Affective Computing and Applications
As a cutting-edge direction in interdisciplinary research, multimodal affective computing is progressively transcending the limitations of traditional unimodal affective analysis. By integrating multi-source heterogeneous data such as speech, text, images, and physiological signals, multimodal affective computing aims to develop more comprehensive and robust emotion recognition and understanding models. These advances deliver natural and intelligent services for scenarios including human-computer interaction, mental health monitoring, personalized education, and intelligent healthcare. This special session solicits contributions on recent progress in multimodal affective computing and its applications.
Topics of interest include, but are not limited to:
I. Technical Foundations
Speech emotion processing (recognition, synthesis, etc.)
Facial expression or micro-expression analysis
Conversational emotion recognition
Emotional dialogue generation
Stance detection and multimodal sarcasm analysis
Emotionally driven AI virtual digital humans
LLM-driven affective computing
Physiological signal-based affective computing (e.g., EEG, ECG, GSR fusion)
Knowledge-driven emotion reasoning (e.g., emotion-centric knowledge graphs)
Personality-aware emotion modeling and generation
Emotion-value alignment and regulation in large-scale language models
II. Application-Oriented Research
Emotion recognition and monitoring in mental health support
(e.g., rPPG, depression and anxiety detection, emotion-aware interventions)
Emotion-aware intelligent customer service and dialogue systems
(e.g., adaptive empathy-driven response strategies)
Affective computing in smart driving and mobility
(e.g., driver stress, fatigue, and emotional distraction monitoring)
Emotion-driven intelligent education technologies
(e.g., affect-aware tutoring and engagement tracking)
Emotionally aware healthcare and human-robot interaction
(e.g., improving doctor-patient communication, robotic caregiving)
Emotion analysis and generation for social media content
(e.g., emotional trend analysis, affect-based recommendation and moderation)
★ Submission Deadline: June 20, 2025
★ Notification Date: July 15, 2025
★ Submit Online: https://www.easychair.org/conferences/?conf=prai2025 (Please choose Special Session 4)