Bilkent University
Department of Computer Engineering
MS THESIS PRESENTATION

 

FACIAL ANALYSIS OF DYADIC INTERACTIONS USING MULTIPLE INSTANCE LEARNING

 

Dersu Giritlioğlu
MS Student
(Supervisor: Asst. Prof. Dr. Hamdi Dibeklioğlu)
Computer Engineering Department
Bilkent University

Interpretation of non-verbal behavior is vital for the reliable analysis of social interactions. To this end, we automatically analyze the facial expressions of romantic couples during their dyadic interactions, for the first time in the literature. Our study uses the recently collected Romantic Relationship Dataset, which includes videos of 167 couples talking, in separate sessions, about a conflict and a positive experience they share. To distinguish interactions during positive-experience sessions from those during conflict discussions, we model the couples' facial expressions with a deep multiple instance learning framework adapted from the anomaly detection literature. A spatio-temporal representation of facial behavior is obtained from short video segments through a 3D residual network and used as the instances in the bags of the multiple instance learning formulation. The goal is to detect conflict sessions by revealing distinctive facial cues displayed over short periods. To this end, the instance representations of positive-experience and conflict sessions are further optimized through deep metric learning so as to be more separable. In addition, for a more reliable analysis of the dyadic interaction, the facial expressions of both subjects are analyzed jointly. Our experiments on the Romantic Relationship Dataset show that the proposed approach reaches an accuracy of 71%. In addition to providing comparisons with several baseline models, we conducted a human evaluation study for the same task with 6 participants. The proposed approach is 5% (absolute) more accurate than humans and outperforms all baseline models. As the experimental results suggest, reliable modeling of facial behavior can greatly contribute to the analysis of dyadic interactions, yielding better performance than that of humans.
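
For readers unfamiliar with multiple instance learning on video, the following is a minimal illustrative sketch of the kind of pipeline described above, not the exact implementation used in the thesis. It assumes PyTorch and torchvision's r3d_18 as the 3D residual network, and uses a hinge-style MIL ranking loss in the spirit of the anomaly detection literature; all layer sizes, segment counts, and hyperparameters are assumptions made for illustration only.

    # Illustrative sketch (assumed components, not the thesis code):
    # a 3D ResNet encodes short video segments into instance features,
    # a small scorer assigns each instance a "conflict" score, and a
    # MIL ranking loss pushes the top-scoring instance of a conflict
    # bag above the top-scoring instance of a positive-experience bag.
    import torch
    import torch.nn as nn
    from torchvision.models.video import r3d_18

    class MILFacialAnalyzer(nn.Module):
        def __init__(self, feat_dim=512):
            super().__init__()
            backbone = r3d_18(weights=None)   # 3D residual network backbone
            backbone.fc = nn.Identity()       # keep 512-d segment features
            self.backbone = backbone
            self.scorer = nn.Sequential(      # per-instance score in [0, 1]
                nn.Linear(feat_dim, 128), nn.ReLU(),
                nn.Linear(128, 1), nn.Sigmoid(),
            )

        def forward(self, bag):
            # bag: (num_segments, 3, T, H, W) -- short face-video segments of one session
            feats = self.backbone(bag)                # (num_segments, 512) instance features
            scores = self.scorer(feats).squeeze(-1)   # (num_segments,) instance scores
            return feats, scores

    def mil_ranking_loss(conflict_scores, positive_scores, margin=1.0):
        # Hinge loss on the most salient instance of each bag.
        return torch.clamp(margin - conflict_scores.max() + positive_scores.max(), min=0.0)

    if __name__ == "__main__":
        model = MILFacialAnalyzer()
        conflict_bag = torch.randn(8, 3, 16, 112, 112)   # 8 segments of 16 frames each
        positive_bag = torch.randn(8, 3, 16, 112, 112)
        _, c_scores = model(conflict_bag)
        _, p_scores = model(positive_bag)
        print(mil_ranking_loss(c_scores, p_scores).item())

In such a setup, each session corresponds to one bag and each short segment to one instance, so a session can be flagged as a conflict discussion even if only a few segments carry distinctive facial cues; the metric learning and joint two-subject modeling described in the abstract would refine the instance features beyond this sketch.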

 

DATE: 29 September 2021, Wednesday @ 17:30