Recognition of User Engagement and Intention Based on Annotation of Interaction Videos
A pattern classifier-based approach for recognizing the internal states of human participants in interactions is presented along with experimental results. The approach first collects video recordings of human-human or human-robot interactions and then analyzes the videos using human-coded annotations. The annotation covers social signals directly observed in the video recordings and the internal states of human participants indirectly inferred from those observed signals. A pattern classifier is then trained and tested on the annotation data. In our human-robot interaction experiments, 7 video recordings were collected and annotated with 20 social signals and 7 internal states. Using a C4.5-based decision tree classifier, the experiments yielded recall rates of 84.83% for interaction engagement, 93% for concentration intention, and 81% for task comprehension level.
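The training step described above can be illustrated with a minimal sketch: a feature vector of annotated social signals per video segment, a label for one internal state, and a decision tree trained on that data. The feature layout, label definition, and data below are hypothetical, and scikit-learn's CART-based `DecisionTreeClassifier` with an entropy criterion stands in for the paper's C4.5 classifier; this is not the authors' implementation.

```python
# Minimal sketch of training a decision tree on annotated social signals.
# Assumptions: synthetic data, hypothetical feature/label semantics, and
# scikit-learn's CART (entropy criterion) substituting for C4.5.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)

# Hypothetical annotation matrix: one row per video segment, one binary
# column per annotated social signal (e.g., gaze at robot, nodding, ...).
n_segments, n_signals = 500, 20
X = rng.integers(0, 2, size=(n_segments, n_signals))

# Hypothetical binary label for one internal state (engaged / not engaged),
# loosely tied to a few signal columns purely for demonstration.
y = (X[:, :3].sum(axis=1) >= 2).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Entropy-based splits approximate C4.5's information-gain criterion.
clf = DecisionTreeClassifier(criterion="entropy", random_state=0)
clf.fit(X_train, y_train)

print("recall:", recall_score(y_test, clf.predict(X_test)))
```

In practice one classifier would be trained per internal state (engagement, concentration intention, task comprehension), each evaluated by its own recall rate as reported in the abstract.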