Human-Robot Interaction Using Pointing Gestures
[1] Andrew Blake, et al. Efficient Human Pose Estimation from Single Depth Images, 2013, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[2] Yael Edan, et al. Comparison of Interaction Modalities for Mobile Indoor Robot Guidance: Direct Physical Interaction, Person Following, and Pointing Control, 2015, IEEE Transactions on Human-Machine Systems.
[3] Michal Tolgyessy, et al. Foundations of Visual Linear Human–Robot Interaction via Pointing Gesture Navigation, 2017, International Journal of Social Robotics.
[4] Jörg Stückler, et al. Learning to interpret pointing gestures with a time-of-flight camera, 2011, 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[5] Maria Pateraki, et al. Visual estimation of pointed targets for robot guidance via fusion of face pose and hand orientation, 2011, IEEE International Conference on Computer Vision Workshops (ICCV Workshops).
[6] Yoichiro Maeda, et al. Evaluation of pointing navigation interface for mobile robot with spherical vision system, 2011, IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2011).
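The works cited above share a common geometric core: a pointing direction is estimated from body joints (e.g. elbow and hand from a depth or time-of-flight camera) and intersected with the floor to obtain a navigation goal for the robot. The sketch below is only an illustration of that general idea, not code from any of the cited papers; the frame convention (z up, floor at z = 0), the elbow-to-hand ray choice, and the function name are assumptions made for the example.

```python
import numpy as np


def pointed_ground_target(elbow, hand, floor_z=0.0):
    """Illustrative sketch: extend the elbow-to-hand ray to the floor plane.

    elbow, hand : 3-vectors in a world frame assumed to have z pointing up.
    Returns the (x, y, z) intersection with the plane z = floor_z, or None
    if the arm does not point toward the floor.
    """
    elbow = np.asarray(elbow, dtype=float)
    hand = np.asarray(hand, dtype=float)
    direction = hand - elbow

    # A ray parallel to the floor or pointing upward never reaches it.
    if direction[2] >= -1e-9:
        return None

    # Solve elbow_z + t * direction_z = floor_z for the ray parameter t >= 0.
    t = (floor_z - elbow[2]) / direction[2]
    if t < 0:
        return None
    return elbow + t * direction


if __name__ == "__main__":
    # Hand slightly below and ahead of the elbow: indicates a floor point
    # roughly two metres in front of the person.
    print(pointed_ground_target(elbow=[0.0, 0.0, 1.3], hand=[0.1, 0.3, 1.1]))
```

In practice the cited systems differ mainly in which ray they use (elbow-hand, head-hand, or a fused face-pose and hand-orientation estimate) and in how they filter noisy joint estimates before computing the intersection.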