Human behavior modeling for multimodal interaction with robot partner
[1] Elena Mugellini, et al. Context-Aware 3D Gesture Interaction Based on Multiple Kinects , 2011 .
[2] Timothy F. Cootes, et al. Feature Detection and Tracking with Constrained Local Models , 2006, BMVC.
[3] Jake K. Aggarwal, et al. View invariant human action recognition using histograms of 3D joints , 2012, 2012 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops.
[4] Takenori Obo, et al. Human gesture recognition for robot partners by spiking neural network and classification learning , 2012, The 6th International Conference on Soft Computing and Intelligent Systems, and The 13th International Symposium on Advanced Intelligence Systems.
[5] Kanad K. Biswas, et al. Gesture recognition using Microsoft Kinect® , 2011, The 5th International Conference on Automation, Robotics and Applications.
[6] Hans-Paul Schwefel, et al. Numerical Optimization of Computer Models , 1982 .
[7] S. Mitra, et al. Gesture Recognition: A Survey , 2007, IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews).
[8] Takenori Obo, et al. Mutual adaptation in neuro fuzzy system for human posture recognition , 2015, 2015 10th Asian Control Conference (ASCC).
[9] Junsong Yuan, et al. Robust Part-Based Hand Gesture Recognition Using Kinect Sensor , 2013, IEEE Transactions on Multimedia.
[10] Christopher M. Bishop, et al. Pulsed Neural Networks , 1998 .
[11] Honghai Liu, et al. Human Behavior Measurement Based on Sensor Network and Robot Partners , 2010, J. Adv. Comput. Intell. Intell. Informatics.
[12] Gilbert Syswerda, et al. A Study of Reproduction in Generational and Steady State Genetic Algorithms , 1990, FOGA.
[13] Z. Liu, et al. A real time system for dynamic hand gesture recognition with a depth sensor , 2012, 2012 Proceedings of the 20th European Signal Processing Conference (EUSIPCO).
[14] Chu Kiong Loo, et al. Geometric Feature-Based Facial Emotion Recognition Using Two-Stage Fuzzy Reasoning Model , 2014, ICONIP.