A Next-Best-View Method with Self-Termination in Active Modeling of 3D Objects

The objective of view planning in a visual sensing system is to make task-directed decisions for optimal sensing-pose selection. The primary focus of this paper is a new method for creating a complete model of a free-form surface object from multiple range images acquired by a scanning sensor at different poses. Using a view sphere to limit the number of possible sensor positions, candidates for the next-best-view (NBV) position are determined by detecting and measuring occlusions of the camera's view in an image. The candidate that maximizes the boundary integral of the vector field over the known partial model is selected as the next-best-view position. We also present a self-termination criterion for judging when the measurement and reconstruction process is complete; the termination condition is based on the change in volume computed from two successive viewpoints. Experimental results show that the method is effective in practical implementation.
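The self-termination idea above can be sketched as a simple convergence test: scanning stops once the volume gained by adding a new viewpoint becomes negligible relative to the model volume already acquired. The following is a minimal sketch of that test; the function name `should_terminate` and the 1% relative tolerance are illustrative assumptions, not the paper's exact formulation.

```python
def should_terminate(prev_volume: float, new_volume: float,
                     rel_tol: float = 0.01) -> bool:
    """Return True when the relative change in reconstructed model volume
    between two successive viewpoints falls below rel_tol.
    (Hypothetical helper; threshold value is an assumption.)"""
    if prev_volume <= 0.0:
        return False  # not enough data yet to judge convergence
    change = abs(new_volume - prev_volume) / prev_volume
    return change < rel_tol

# Example: model volumes from successive scans converging on the object
volumes = [80.0, 95.0, 99.0, 99.5]
for prev, new in zip(volumes, volumes[1:]):
    if should_terminate(prev, new):
        print("measurement complete")  # triggered at the 99.0 -> 99.5 step
        break
```

In practice the volumes would come from the partial mesh or voxel model reconstructed after each scan; the test itself is independent of how the volume is computed.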
