Measuring Two-People Communication from Omnidirectional Video

Yui Niibori, Shigang Li
Proceedings of IJCAI 2019 3rd Workshop on Artificial Intelligence in Affective Computing, PMLR 122:28-35, 2020.

Abstract

In this paper we propose a method of measuring the communication between two people by analyzing their heads’ information: head pose, gaze vectors and facial action units. Assuming two people are sitting around a table, an omnidirectional camera is used to observe the two people simultaneously. Next, the visual cues of the heads of the two people, including head pose, gaze vectors and facial action units, are extracted using a popular facial behavior analysis toolkit, OpenFace. Then, an LSTM (Long Short-Term Memory) neural network is used to learn to measure the communication between the two people from the temporal sequence of the extracted head information. The preliminary experimental results show the effectiveness of the proposed method.
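The pipeline described above (per-frame head pose, gaze, and action-unit features fed as a temporal sequence into an LSTM that outputs a communication measure) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the feature dimensions, hidden size, and scalar sigmoid output are assumptions for the sake of the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Single LSTM cell: processes one time step of the feature sequence."""
    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        # One stacked weight matrix covering input, forget, cell, and output gates.
        self.W = rng.standard_normal((4 * hidden_dim, input_dim + hidden_dim)) * 0.1
        self.b = np.zeros(4 * hidden_dim)
        self.hidden_dim = hidden_dim

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        g = np.tanh(g)
        c_new = f * c + i * g          # update the cell state
        h_new = o * np.tanh(c_new)     # emit the new hidden state
        return h_new, c_new

def score_communication(frames, cell, w_out, b_out):
    """Run the LSTM over a (T, D) sequence of per-frame head features
    and map the final hidden state to a scalar score in (0, 1)."""
    h = np.zeros(cell.hidden_dim)
    c = np.zeros(cell.hidden_dim)
    for x in frames:
        h, c = cell.step(x, h, c)
    return float(sigmoid(w_out @ h + b_out))

# Hypothetical per-frame feature vector: for each of the two people,
# 3 head-pose angles + 3 gaze components + 17 AU intensities = 23 dims,
# concatenated to 46 dims per frame (the exact layout in the paper may differ).
D, H, T = 46, 32, 30
cell = LSTMCell(D, H)
rng = np.random.default_rng(1)
frames = rng.standard_normal((T, D))   # stand-in for OpenFace output over 30 frames
w_out, b_out = rng.standard_normal(H) * 0.1, 0.0
score = score_communication(frames, cell, w_out, b_out)
```

In a trained system the weights would be learned from labeled interaction sequences; here they are random, so the score only demonstrates the data flow, not a meaningful measurement.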

Cite this Paper


BibTeX
@InProceedings{pmlr-v122-niibori20a,
  title     = {Measuring Two-People Communication from Omnidirectional Video},
  author    = {Niibori, Yui and Li, Shigang},
  booktitle = {Proceedings of IJCAI 2019 3rd Workshop on Artificial Intelligence in Affective Computing},
  pages     = {28--35},
  year      = {2020},
  editor    = {Hsu, William},
  volume    = {122},
  series    = {Proceedings of Machine Learning Research},
  month     = {10 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v122/niibori20a/niibori20a.pdf},
  url       = {https://proceedings.mlr.press/v122/niibori20a.html},
  abstract  = {In this paper we propose a method of measuring the communication between two people by analyzing their heads’ information: head pose, gaze vectors and facial action units. Assuming two people are sitting around a table, an omnidirectional camera is used to observe the two people simultaneously. Next, the visual cues of the heads of the two people, including head pose, gaze vectors and facial action units, are extracted using a popular facial behavior analysis toolkit, OpenFace. Then, an LSTM (Long Short-Term Memory) neural network is used to learn to measure the communication between the two people from the temporal sequence of the extracted head information. The preliminary experimental results show the effectiveness of the proposed method.}
}
Endnote
%0 Conference Paper %T Measuring Two-People Communication from Omnidirectional Video %A Yui Niibori %A Shigang Li %B Proceedings of IJCAI 2019 3rd Workshop on Artificial Intelligence in Affective Computing %C Proceedings of Machine Learning Research %D 2020 %E William Hsu %F pmlr-v122-niibori20a %I PMLR %P 28--35 %U https://proceedings.mlr.press/v122/niibori20a.html %V 122 %X In this paper we propose a method of measuring the communication between two people by analyzing their heads’ information: head pose, gaze vectors and facial action units. Assuming two people are sitting around a table, an omnidirectional camera is used to observe the two people simultaneously. Next, the visual cues of the heads of the two people, including head pose, gaze vectors and facial action units, are extracted using a popular facial behavior analysis toolkit, OpenFace. Then, an LSTM (Long Short-Term Memory) neural network is used to learn to measure the communication between the two people from the temporal sequence of the extracted head information. The preliminary experimental results show the effectiveness of the proposed method.
APA
Niibori, Y. & Li, S. (2020). Measuring Two-People Communication from Omnidirectional Video. Proceedings of IJCAI 2019 3rd Workshop on Artificial Intelligence in Affective Computing, in Proceedings of Machine Learning Research 122:28-35. Available from https://proceedings.mlr.press/v122/niibori20a.html.