The INSPIRE Lab focuses on developing technologies that provide emotional intelligence to AI systems. We investigate statistical and algorithmic approaches that quantify and analyze nonverbal human behaviors in multimodal (audio-visual) data, particularly emotional expressions during interactions.

Human-human and human-machine interactions often evoke and involve affective and social signals, such as emotion, social attitude, engagement, conflict, and persuasion. These signals profoundly influence the overall outcome of an interaction, and hence automatic recognition of these signals will enable us to build human-centered interactive technology tailored to an individual user's needs, preferences, and capabilities. This research builds upon multimodal signal processing, machine learning, and behavioral science.

Areas of Interest

Automatic emotion recognition, affective multimedia analysis, multimodal signal processing, human-centered computing

Current Projects

1. Multimodal Emotion Recognition
2. Affective Multimedia Analysis
3. Human-centered Behavior Analysis
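As a toy illustration of the multimodal fusion idea underlying these projects, one common baseline is late fusion: combining per-modality emotion probabilities into a single prediction. The sketch below is hypothetical (the labels, weights, and function names are illustrative assumptions, not the lab's actual pipeline):

```python
# Hypothetical late-fusion sketch for audio-visual emotion recognition.
# Weights, labels, and scores are illustrative, not the lab's actual method.

def late_fusion(audio_probs, visual_probs, w_audio=0.5, w_visual=0.5):
    """Weighted average of per-modality class probabilities, renormalized."""
    labels = audio_probs.keys() & visual_probs.keys()
    fused = {
        label: w_audio * audio_probs[label] + w_visual * visual_probs[label]
        for label in labels
    }
    total = sum(fused.values())
    return {label: p / total for label, p in fused.items()}

# Example: per-modality classifier outputs for one utterance.
audio = {"happy": 0.6, "sad": 0.1, "neutral": 0.3}
visual = {"happy": 0.4, "sad": 0.3, "neutral": 0.3}
fused = late_fusion(audio, visual)
print(max(fused, key=fused.get))  # class with the highest fused score
```

In practice the modality weights would be tuned on validation data, and richer schemes (early fusion of features, or learned attention over modalities) often outperform this fixed-weight baseline.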

News

(Aug 2017) INSPIRE Lab undergraduate Jesse had his first paper accepted at the AAAI Fall Symposium Series: AI for HRI. Congratulations, Jesse!
(Aug 2017) Prof. Kim will serve as a Publicity Co-Chair for the IEEE International Conference on Automatic Face and Gesture Recognition (FG) 2018.
(May 2017) Prof. Kim will serve as a Doctoral Consortium Co-Chair for the IEEE Conference on Affective Computing and Intelligent Interaction (ACII) 2017.
(Apr 2017) Our journal paper was accepted to IEEE Transactions on Affective Computing (TAFFC).
(Feb 2017) Prof. Kim received a SUNY-A Faculty Research Award.
(Sep 2016) Two of our papers were accepted at the ACM International Conference on Multimodal Interaction (ICMI) 2016.
(Sep 2016) Prof. Kim joined UAlbany.