Wireless steerable vision for live insects and insect-scale robots

A mechanically steerable vision system that imitates insect head motion can be mounted on live insects and small robots. Vision is an essential sensory input for insects but consumes substantial energy. The cost of supporting sensitive photoreceptors has led many insects to develop high visual acuity in only small retinal regions and to move their visual systems independently of their bodies through head motion. Understanding the trade-offs made by insect vision systems in nature can inform the design of vision systems for insect-scale robots that balance energy, computation, and mass. Here, we report a fully wireless, power-autonomous, mechanically steerable vision system that imitates head motion in a form factor small enough to mount on the back of a live beetle or a similarly sized terrestrial robot. Our electronics and actuator weigh 248 milligrams and can steer the camera over a 60° range in response to commands from a smartphone. The camera streams “first person” 160-by-120-pixel monochrome video at 1 to 5 frames per second (fps) over Bluetooth to a receiver up to 120 meters away. We mounted this vision system on two species of freely walking live beetles and demonstrated that triggering image capture with an onboard accelerometer achieves operating times of up to 6 hours on a 10-milliamp-hour battery. We also built a small terrestrial robot (1.6 centimeters by 2 centimeters) that can move at up to 3.5 centimeters per second, support vision, and operate for 63 to 260 minutes. Our results demonstrate that steerable vision can enable object tracking and wide-angle views at 26 to 84 times lower energy than moving the whole robot.
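
The accelerometer-triggered capture described above amounts to a simple motion-gated duty cycle: the camera and radio are exercised only while the accelerometer reports movement. The sketch below is a minimal illustration of that idea, not the authors' firmware; the interfaces (read_accel, capture_frame, send_over_ble, sleep) and the motion threshold are hypothetical placeholders, and the closing arithmetic only restates the reported figures (a 10-milliamp-hour battery lasting about 6 hours implies an average draw of roughly 1.7 milliamps).

    # Illustrative sketch of motion-gated image capture (hypothetical interfaces).
    # Frames are captured and streamed only while the accelerometer reports motion,
    # so the camera and radio stay idle while the insect or robot is at rest.

    MOTION_THRESHOLD_G = 0.05   # hypothetical threshold on deviation from 1 g
    IDLE_POLL_S = 0.5           # how often to poll the accelerometer while idle
    FRAME_PERIOD_S = 0.5        # ~2 fps, within the reported 1 to 5 fps range

    def run(read_accel, capture_frame, send_over_ble, sleep):
        """Run the capture loop using caller-supplied hardware callbacks."""
        while True:
            ax, ay, az = read_accel()            # acceleration in units of g
            magnitude = (ax**2 + ay**2 + az**2) ** 0.5
            if abs(magnitude - 1.0) > MOTION_THRESHOLD_G:
                send_over_ble(capture_frame())   # one 160-by-120 monochrome frame
                sleep(FRAME_PERIOD_S)
            else:
                sleep(IDLE_POLL_S)               # low-power wait between polls

    # Back-of-envelope battery life from the reported numbers:
    # 10 mAh / 6 h ~= 1.7 mA average current when capture is motion-gated.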
