We propose a system that structures a meeting log by detecting and tagging participants' actions using acceleration sensors. The proposed system detects each participant's head movements, such as nodding, and motion during utterances by means of acceleration sensors attached to the heads of all participants in a meeting. In addition, we developed the Meeting Review Tree, an application that recognizes meeting participants' utterances and three kinds of actions using acceleration and angular velocity sensors and tags them to recorded movies. In the proposed system, the structure of the meeting is hierarchized into three layers and tagged with contexts as follows: the first layer represents transitions of the reporter during the meeting, the second layer represents changes of speaker within each report, and the third layer represents motions such as nodding. In an evaluation experiment, the recognition accuracy was 57.0% for the first layer and 61.0% for the second layer.
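To make the three-layer structure concrete, the following is a minimal Python sketch of one way the tagged hierarchy and a nod detector could be represented. All class, field, and function names (MeetingReviewTree, ReporterSegment, detect_nods, etc.) are hypothetical, and the variance-threshold nod detection is an illustrative assumption, not the recognition method used in the paper.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Motion:
    """Third layer: a detected motion such as nodding (hypothetical model)."""
    participant: str
    kind: str        # e.g. "nod"
    start_s: float   # offset into the recorded movie, in seconds
    end_s: float


@dataclass
class SpeakerSegment:
    """Second layer: a span in which one participant is speaking."""
    speaker: str
    start_s: float
    end_s: float
    motions: List[Motion] = field(default_factory=list)


@dataclass
class ReporterSegment:
    """First layer: a span in which one participant acts as reporter."""
    reporter: str
    start_s: float
    end_s: float
    speaker_segments: List[SpeakerSegment] = field(default_factory=list)


@dataclass
class MeetingReviewTree:
    """Root: an ordered list of reporter segments tagging one recorded movie."""
    movie_path: str
    segments: List[ReporterSegment] = field(default_factory=list)


def detect_nods(accel_y: List[float], rate_hz: float,
                window_s: float = 0.5,
                var_threshold: float = 1.5) -> List[Tuple[float, float]]:
    """Naive nod detector (assumed, for illustration only): flags fixed
    windows whose vertical-axis acceleration variance exceeds a threshold,
    returning (start_s, end_s) intervals."""
    win = max(1, int(window_s * rate_hz))
    nods = []
    for i in range(0, len(accel_y) - win + 1, win):
        chunk = accel_y[i:i + win]
        mean = sum(chunk) / win
        var = sum((a - mean) ** 2 for a in chunk) / win
        if var > var_threshold:
            nods.append((i / rate_hz, (i + win) / rate_hz))
    return nods
```

Under this sketch, intervals returned by detect_nods would be wrapped as Motion entries and attached to the enclosing SpeakerSegment, which in turn nests under a ReporterSegment, mirroring the three-layer tagging described above.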