The INSPIRE Lab focuses on developing technologies that provide emotional intelligence to AI systems. We investigate statistical and algorithmic approaches for quantifying and analyzing nonverbal human behaviors in multimodal (audio-visual) data, particularly emotional expressions during interactions.

Human-human and human-machine interactions often evoke and involve affective and social signals such as emotion, social attitude, engagement, conflict, and persuasion. These signals profoundly influence the outcome of an interaction, so recognizing them automatically will enable us to build human-centered interactive technologies tailored to an individual user's needs, preferences, and capabilities. This research builds on multimodal signal processing, machine learning, and behavioral science.
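
As a rough illustration of what such multimodal recognition can look like in practice, below is a minimal late-fusion sketch: one classifier per modality, with the per-modality class probabilities averaged to produce a fused prediction. The feature dimensions, emotion label set, and synthetic data are illustrative assumptions, not the lab's actual pipeline.

# A minimal late-fusion sketch for audio-visual emotion recognition.
# Everything here is synthetic and illustrative; real systems would use
# learned features (e.g., prosodic statistics, facial descriptors).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
EMOTIONS = ["neutral", "happy", "sad", "angry"]  # assumed label set

# Synthetic stand-ins for per-utterance audio and visual feature vectors.
n = 400
y = rng.integers(0, len(EMOTIONS), size=n)
X_audio = rng.normal(size=(n, 32)) + 0.5 * y[:, None]  # fake audio features
X_video = rng.normal(size=(n, 64)) + 0.3 * y[:, None]  # fake visual features

# Train one classifier per modality on the first 300 samples.
clf_a = LogisticRegression(max_iter=1000).fit(X_audio[:300], y[:300])
clf_v = LogisticRegression(max_iter=1000).fit(X_video[:300], y[:300])

# Late fusion: average the class posteriors from the two modalities,
# then predict the emotion with the highest fused probability.
proba = 0.5 * clf_a.predict_proba(X_audio[300:]) + 0.5 * clf_v.predict_proba(X_video[300:])
pred = proba.argmax(axis=1)
print("fused accuracy:", (pred == y[300:]).mean())

The equal 0.5/0.5 fusion weights are a simple design choice; in practice they could be tuned on held-out data or replaced with a learned fusion model.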

Areas of Interest

Automatic emotion recognition, affective multimedia analysis, multimodal signal processing, human-centered computing

Current Projects

1. Multimodal Emotion Recognition
2. Affective Multimedia Analysis
3. Human-Centered Behavior Analysis