Context-Aware Human Behaviour Forecasting in Dyadic Interactions

Nguyen Tan Viet Tuyen, Oya Celiktutan
Understanding Social Behavior in Dyadic and Small Group Interactions, PMLR 173:88-106, 2022.

Abstract

Non-verbal behaviours play an indispensable role in social interaction. People use a wide range of non-verbal channels, including eye gaze, body gestures, and facial expressions, to communicate their intentions and emotions to their interacting partners. Such social signals help the communicator’s verbal messages reach other interlocutors in a clear and transparent manner. At the same time, an essential aspect of communicative behaviour is the dynamic exchange of non-verbal signals among interlocutors, which serves to adapt to current social norms and to build common ground. This suggests that data observed from the interacting partner should be taken into account when modelling the target individual’s behaviours. Our paper introduces a context-aware generative framework that captures the influence of the interacting partner’s non-verbal signals on the target individual. The model consists of three components: a Context Encoder, a Generator, and a Discriminator. The Context Encoder extracts social signals observed from the interacting partner, while the Generator and Discriminator generate and optimize the target person’s gestures. We evaluate the framework on two different dyadic interaction datasets. The experimental results demonstrate that, compared to baselines, our solution produces human-like gestures that better support the interaction context. In dyadic interaction, the influence of the interacting partner’s social signals on the target individual is clearly observable, and the proposed approach efficiently captures those effects. The source code of our framework can be found at https://github.com/sairlab/Context-Aware-Human-Behavior-Forecasting.
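The data flow of the three components can be sketched minimally. Everything below — the layer shapes, the mean-pooled context vector, the single-step forecast, and all weight matrices — is a hypothetical illustration of how a partner-conditioned generator and discriminator fit together, not the authors' actual architecture (which is adversarially trained on real pose sequences):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: pose features per frame, context vector size,
# and the number of observed frames.
FEAT_DIM, CTX_DIM, SEQ_LEN = 24, 16, 30

# Randomly initialised stand-in weights (a trained model would learn these).
W_ctx = rng.standard_normal((FEAT_DIM, CTX_DIM)) * 0.1
W_gen = rng.standard_normal((FEAT_DIM + CTX_DIM, FEAT_DIM)) * 0.1
W_dis = rng.standard_normal((FEAT_DIM, 1)) * 0.1

def context_encoder(partner_seq):
    """Summarise the partner's observed non-verbal signals into one context vector."""
    return np.tanh(partner_seq @ W_ctx).mean(axis=0)           # shape (CTX_DIM,)

def generator(target_past, context):
    """Forecast the target person's next gesture frame, conditioned on context."""
    last = target_past[-1]                                     # shape (FEAT_DIM,)
    return np.tanh(np.concatenate([last, context]) @ W_gen)    # shape (FEAT_DIM,)

def discriminator(gesture):
    """Score how plausible a gesture frame looks, as a value in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-(gesture @ W_dis)))            # shape (1,)

# Forward pass: the partner's behaviour conditions the target's forecast gesture.
partner_seq = rng.standard_normal((SEQ_LEN, FEAT_DIM))
target_past = rng.standard_normal((SEQ_LEN, FEAT_DIM))
ctx = context_encoder(partner_seq)
next_frame = generator(target_past, ctx)
score = discriminator(next_frame)
```

The key design point the abstract emphasises is visible in `generator`: the forecast is a function of both the target's own history and the partner-derived context, so changing the partner's signals changes the generated gesture.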

Cite this Paper


BibTeX
@InProceedings{pmlr-v173-tuyen22a,
  title     = {Context-Aware Human Behaviour Forecasting in Dyadic Interactions},
  author    = {Tuyen, Nguyen Tan Viet and Celiktutan, Oya},
  booktitle = {Understanding Social Behavior in Dyadic and Small Group Interactions},
  pages     = {88--106},
  year      = {2022},
  editor    = {Palmero, Cristina and Jacques Junior, Julio C. S. and Clapés, Albert and Guyon, Isabelle and Tu, Wei-Wei and Moeslund, Thomas B. and Escalera, Sergio},
  volume    = {173},
  series    = {Proceedings of Machine Learning Research},
  month     = {16 Oct},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v173/tuyen22a/tuyen22a.pdf},
  url       = {https://proceedings.mlr.press/v173/tuyen22a.html},
  abstract  = {Non-verbal behaviours play an indispensable role in social interaction. People use a wide range of non-verbal channels, including eye gaze, body gestures, and facial expressions, to communicate their intentions and emotions to their interacting partners. Such social signals help the communicator’s verbal messages reach other interlocutors in a clear and transparent manner. At the same time, an essential aspect of communicative behaviour is the dynamic exchange of non-verbal signals among interlocutors, which serves to adapt to current social norms and to build common ground. This suggests that data observed from the interacting partner should be taken into account when modelling the target individual’s behaviours. Our paper introduces a context-aware generative framework that captures the influence of the interacting partner’s non-verbal signals on the target individual. The model consists of three components: a Context Encoder, a Generator, and a Discriminator. The Context Encoder extracts social signals observed from the interacting partner, while the Generator and Discriminator generate and optimize the target person’s gestures. We evaluate the framework on two different dyadic interaction datasets. The experimental results demonstrate that, compared to baselines, our solution produces human-like gestures that better support the interaction context. In dyadic interaction, the influence of the interacting partner’s social signals on the target individual is clearly observable, and the proposed approach efficiently captures those effects. The source code of our framework can be found at https://github.com/sairlab/Context-Aware-Human-Behavior-Forecasting.}
}
Endnote
%0 Conference Paper
%T Context-Aware Human Behaviour Forecasting in Dyadic Interactions
%A Nguyen Tan Viet Tuyen
%A Oya Celiktutan
%B Understanding Social Behavior in Dyadic and Small Group Interactions
%C Proceedings of Machine Learning Research
%D 2022
%E Cristina Palmero
%E Julio C. S. Jacques Junior
%E Albert Clapés
%E Isabelle Guyon
%E Wei-Wei Tu
%E Thomas B. Moeslund
%E Sergio Escalera
%F pmlr-v173-tuyen22a
%I PMLR
%P 88--106
%U https://proceedings.mlr.press/v173/tuyen22a.html
%V 173
%X Non-verbal behaviours play an indispensable role in social interaction. People use a wide range of non-verbal channels, including eye gaze, body gestures, and facial expressions, to communicate their intentions and emotions to their interacting partners. Such social signals help the communicator’s verbal messages reach other interlocutors in a clear and transparent manner. At the same time, an essential aspect of communicative behaviour is the dynamic exchange of non-verbal signals among interlocutors, which serves to adapt to current social norms and to build common ground. This suggests that data observed from the interacting partner should be taken into account when modelling the target individual’s behaviours. Our paper introduces a context-aware generative framework that captures the influence of the interacting partner’s non-verbal signals on the target individual. The model consists of three components: a Context Encoder, a Generator, and a Discriminator. The Context Encoder extracts social signals observed from the interacting partner, while the Generator and Discriminator generate and optimize the target person’s gestures. We evaluate the framework on two different dyadic interaction datasets. The experimental results demonstrate that, compared to baselines, our solution produces human-like gestures that better support the interaction context. In dyadic interaction, the influence of the interacting partner’s social signals on the target individual is clearly observable, and the proposed approach efficiently captures those effects. The source code of our framework can be found at https://github.com/sairlab/Context-Aware-Human-Behavior-Forecasting.
APA
Tuyen, N.T.V. & Celiktutan, O. (2022). Context-Aware Human Behaviour Forecasting in Dyadic Interactions. Understanding Social Behavior in Dyadic and Small Group Interactions, in Proceedings of Machine Learning Research 173:88-106. Available from https://proceedings.mlr.press/v173/tuyen22a.html.