Real-time audio-visual calls detection system for a Chicken Robot

The design, study, and control of mixed animal-robot societies is a field of scientific exploration that can open new opportunities for studying and controlling groups of social animals. In the Chicken Robot project we are developing a mobile robot that is socially acceptable to chicks and able to interact with them through appropriate communication channels. To interact, the robot has to know the positions of all birds in the experimental area and detect which of them are uttering calls. In this paper, we present an audio-visual approach to locating the chicks in the scene and detecting their calling activity in real time. Visual tracking is provided by a marker-based tracker with the help of an overhead camera. Sound localization is achieved by beamforming with an array of sixteen microphones. The visual and sound information are fused probabilistically to detect calling activity. Experiments using e-puck robots in place of real chicks demonstrate that our system detects sound emission activity with more than 90% probability.
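The beamforming idea mentioned above can be illustrated with a minimal delay-and-sum sketch. This is an assumption of the general technique, not the paper's actual implementation: the array geometry is a hypothetical uniform linear array, and the spacing, sample rate, and function names are invented for illustration. A sound arriving from an angle reaches each microphone with a slightly different delay; compensating the hypothesized delays for each candidate angle and summing the channels yields maximum power at the true direction of arrival.

```python
import numpy as np

FS = 16000          # sample rate (Hz)
C = 343.0           # speed of sound (m/s)
N_MICS = 16         # array size, as in the paper
SPACING = 0.05      # hypothetical inter-microphone spacing (m)

def simulate_array(signal, angle_deg):
    """Delay one source signal across a uniform linear array (far field)."""
    angle = np.deg2rad(angle_deg)
    mic_pos = np.arange(N_MICS) * SPACING
    delays = mic_pos * np.sin(angle) / C          # seconds per channel
    n = len(signal)
    freqs = np.fft.rfftfreq(n, 1.0 / FS)
    spec = np.fft.rfft(signal)
    # Apply each fractional delay as a phase shift in the frequency domain.
    return np.stack([
        np.fft.irfft(spec * np.exp(-2j * np.pi * freqs * d), n)
        for d in delays
    ])

def estimate_doa(channels, candidates=np.arange(-90, 91, 1)):
    """Steered-response power: return the candidate angle with maximum power."""
    n = channels.shape[1]
    freqs = np.fft.rfftfreq(n, 1.0 / FS)
    specs = np.fft.rfft(channels, axis=1)
    mic_pos = np.arange(N_MICS) * SPACING
    best_angle, best_power = None, -np.inf
    for ang in candidates:
        delays = mic_pos * np.sin(np.deg2rad(ang)) / C
        # Undo the hypothesized delays so the channels sum coherently
        # only when the candidate angle matches the true one.
        steered = specs * np.exp(2j * np.pi * freqs[None, :] * delays[:, None])
        power = np.sum(np.abs(steered.sum(axis=0)) ** 2)
        if power > best_power:
            best_angle, best_power = ang, power
    return best_angle

rng = np.random.default_rng(0)
call = rng.standard_normal(4096)           # broadband stand-in for a chick call
channels = simulate_array(call, angle_deg=30)
print(estimate_doa(channels))              # should be close to 30
```

In the paper the search is presumably over positions in the arena rather than a single angle, and the resulting acoustic likelihood is combined with the visual tracker's chick positions; the sketch only shows the coherent-summation core that both share.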
