Evaluation of an Experimental Framework for Exploiting Vision in Swarm Robotics

Visual feature detection with the limited resources of simple robots is an essential requirement for swarm robotic systems. Robots need to localize themselves, determine their orientation, and acquire additional information from their surrounding environment using their sensors, while their computational and storage capabilities may be very limited. This paper evaluates the performance of an experimental framework in which environmental elements such as landmarks and QR codes serve as key visual features. The performance is evaluated under environmental light disturbances and distance variations, and the feature detection speed is examined in detail. The applicability of the approach is demonstrated in a real-robot scenario using e-puck robots. Finally, results of applying the approach in a completely different setting, namely the simulation of pheromones via glowing-trail detection, are presented. These results indicate the broad applicability of the developed feature detection techniques.
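The glowing-trail detection mentioned above can be illustrated as a brightness threshold followed by a centroid computation over the bright pixels, which a robot could steer toward. This is a minimal sketch under assumed conditions; the toy frame, the threshold value of 200, and the function names are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch: detect a glowing (bright) trail in a grayscale frame by
# thresholding, then locate it via the centroid of the bright pixels.
# The frame, threshold, and helper names are illustrative assumptions.

def detect_trail(frame, threshold=200):
    """Return (row, col) coordinates of pixels brighter than threshold."""
    return [(r, c)
            for r, row in enumerate(frame)
            for c, value in enumerate(row)
            if value > threshold]

def trail_centroid(pixels):
    """Centroid of the detected trail pixels, or None if the trail is lost."""
    if not pixels:
        return None
    n = len(pixels)
    return (sum(r for r, _ in pixels) / n,
            sum(c for _, c in pixels) / n)

# Toy 4x6 grayscale frame with a bright vertical trail in column 2.
frame = [
    [10, 20, 250, 15, 12, 11],
    [12, 18, 240, 14, 10, 13],
    [11, 22, 255, 16, 13, 12],
    [13, 19, 245, 17, 11, 10],
]
pixels = detect_trail(frame)
centroid = trail_centroid(pixels)  # → (1.5, 2.0)
```

A real robot would run this per camera frame and turn toward the centroid's column offset from the image center; the simplicity of thresholding is what makes it feasible on resource-limited swarm platforms.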
