Intelligent Lighting Control for Vision-Based Robotic Manipulation

The ability of a robot vision system to capture informative images is strongly affected by the lighting conditions in the scene. This paper demonstrates the importance of active lighting control for robotic manipulation and proposes novel strategies for reliable visual interpretation of objects in the workspace. Good illumination yields images with a high signal-to-noise ratio, a wide linear dynamic range, high contrast, and faithful rendering of the object's natural colors, while avoiding highlights and extreme intensity imbalance. With passive illumination alone, the robot often captures images so poor that no algorithm can extract useful information from them. A fuzzy controller is further developed to maintain a lighting level suitable for robotic manipulation and guidance in dynamic environments. Both the numerical simulations and the practical experiments presented in this paper show that the proposed active lighting control achieves satisfactory results.
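The paper does not give the controller's rule base here, but the idea of a fuzzy controller that regulates lighting from an image-quality measurement can be sketched as follows. This is a minimal illustration under assumed design choices (triangular membership functions, a three-rule base, centroid defuzzification, mean image brightness as the sole input); the function and set names are hypothetical, not the authors' implementation.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_light_adjust(mean_brightness, target=128.0):
    """Map mean image brightness (0-255) to a lamp-power correction in [-1, 1].

    Positive output asks for more light, negative for less.
    """
    e = mean_brightness - target  # positive error -> scene too bright
    # Fuzzify the error into three overlapping sets.
    dark   = tri(e, -256.0, -128.0, 0.0)
    ok     = tri(e, -64.0, 0.0, 64.0)
    bright = tri(e, 0.0, 128.0, 256.0)
    # Rule base: dark -> raise power (+1), ok -> hold (0), bright -> lower (-1).
    num = dark * 1.0 + ok * 0.0 + bright * (-1.0)
    den = dark + ok + bright
    return num / den if den > 0.0 else 0.0
```

In a closed loop, the robot would grab a frame, compute its mean brightness, apply the returned correction to the lamp driver, and repeat, so the illumination tracks the target level as the scene changes.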
