3D Semantic Map-Based Shared Control for Smart Wheelchair

Conventional perception and control systems for smart wheelchairs typically do not distinguish between object types and treat everything as an obstacle. Consequently, object-related navigation tasks such as furniture docking or door passage are difficult to realize, because the obstacle-avoidance behavior interferes with them. In this article, a local 3D semantic map is built online using a low-cost RGB-D camera. The map provides the semantic and geometric data of the recognized objects to the shared-control modules for user intention estimation, target selection, motion control, and the adjustment of weight-optimization parameters for different targets. With the object information provided by the 3D semantic map, the control system can select different behaviors according to the user's intention and carry out object-related navigation. A smart wheelchair prototype equipped with a Kinect was developed and tested in a real environment. The experiments showed that 3D semantic map-based shared control effectively enhances the smart wheelchair's mobility and improves the collaboration between the user and the wheelchair.
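The core of a shared-control scheme like the one described above is blending the user's joystick command with an autonomous command, with a weight that can be tuned per target class using the semantic map. The abstract does not give the blending formula, so the sketch below is only a minimal illustration under assumed names: `Velocity`, `blend`, and the fixed `alpha` weight are all hypothetical, not the authors' actual method.

```python
from dataclasses import dataclass

@dataclass
class Velocity:
    linear: float   # forward speed, m/s
    angular: float  # turn rate, rad/s

def blend(user: Velocity, robot: Velocity, alpha: float) -> Velocity:
    """Linearly blend the user command with the autonomous command.

    alpha in [0, 1] is the weight given to the autonomous controller.
    In a semantic-map-based scheme, this weight would be adjusted
    according to the recognized target (e.g. higher near a docking
    table, lower in open space); here it is simply a parameter.
    """
    a = min(max(alpha, 0.0), 1.0)  # clamp to the valid range
    return Velocity(
        linear=(1 - a) * user.linear + a * robot.linear,
        angular=(1 - a) * user.angular + a * robot.angular,
    )

# Example: near a docking target, lean more on the autonomous command.
cmd = blend(Velocity(0.4, 0.0), Velocity(0.2, 0.3), alpha=0.75)
```

A real implementation would compute `alpha` online from the estimated user intention and the geometry of the selected target, rather than fixing it as a constant.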
