TY - GEN
T1 - Recognizing perceived interdependence in face-to-face negotiations through multimodal analysis of nonverbal behavior
AU - Dudzik, Bernd
AU - Columbus, Simon
AU - Hrkalovic, Tiffany Matej
AU - Balliet, Daniel
AU - Hung, Hayley
N1 - Funding: This research was (partially) funded by the Hybrid Intelligence Center, a 10-year programme funded by the Dutch Ministry of Education, Culture and Science through the Netherlands Organisation for Scientific Research, https://hybrid-intelligence-centre.nl, grant number 024.004.022, and by the MINGLE project, grant number 639.022.606. Data collection was funded by an ERC Starting Grant (#635356) awarded to Daniel Balliet.
PY - 2021/10/18
Y1 - 2021/10/18
AB - Enabling computer-based applications to display intelligent behavior in complex social settings requires them to relate to important aspects of how humans experience and understand such situations. One crucial driver of people's social behavior during an interaction is the interdependence they perceive, i.e., how the outcome of an interaction is determined by their own and others' actions. According to psychological studies, both the nonverbal behavior displayed by individuals themselves and that displayed by their interaction partners can be indicative of such interdependence perceptions. Motivated by this, we present a series of experiments to automatically recognize interdependence perceptions in dyadic face-to-face negotiations using these sources. Concretely, our approach draws on a combination of features describing individuals' Facial, Upper Body, and Vocal Behavior with state-of-the-art algorithms for multivariate time series classification. Our findings demonstrate that differences in some types of interdependence perceptions can be detected through the automatic analysis of nonverbal behaviors. We discuss implications for developing socially intelligent systems and opportunities for future research.
KW - Situation Perception
KW - Social Signal Processing
KW - User-Modeling
UR - http://www.scopus.com/inward/record.url?scp=85119016965&partnerID=8YFLogxK
U2 - 10.1145/3462244.3479935
DO - 10.1145/3462244.3479935
M3 - Conference contribution
AN - SCOPUS:85119016965
T3 - Proceedings of the International Conference on Multimodal Interaction
SP - 121
EP - 130
BT - ICMI 2021 - Proceedings of the 2021 International Conference on Multimodal Interaction
PB - ACM
T2 - 23rd ACM International Conference on Multimodal Interaction, ICMI 2021
Y2 - 18 October 2021 through 22 October 2021
ER -