Enhancing optical-flow-based control by learning visual appearance cues for flying robots
[1] R. Wehner, et al. Path integration in desert ants, Cataglyphis: how to make a homing ant run away from home, 2004, Proceedings of the Royal Society of London. Series B: Biological Sciences.
[2] Matthew Garratt, et al. An overview of insect-inspired guidance for application in ground and airborne platforms, 2004.
[3] Nicolas H. Franceschini, et al. Small Brains, Smart Machines: From Fly Vision to Robot Vision and Back Again, 2014, Proceedings of the IEEE.
[4] F. Ruffier, et al. Optic flow-based collision-free strategies: From insects to robots, 2017, Arthropod Structure & Development.
[5] Andrew M. Hyslop, et al. Autonomous Navigation in Three-Dimensional Urban Environments Using Wide-Field Integration of Optic Flow, 2010.
[6] R. Wehner, et al. Look and turn: landmark-based goal navigation in honey bees, 2005, Journal of Experimental Biology.
[7] William Holderbaum, et al. Bioinspired autonomous visual vertical control of a quadrotor unmanned aerial vehicle, 2015.
[8] Yiannis Aloimonos, et al. GapFlyt: Active Vision Based Minimalist Structure-Less Gap Detection For Quadrotor Flight, 2018, IEEE Robotics and Automation Letters.
[9] Barbara Webb, et al. Robots in invertebrate neuroscience, 2002, Nature.
[10] Giacomo Indiveri, et al. Obstacle Avoidance and Target Acquisition for Robot Navigation Using a Mixed Signal Analog/Digital Neuromorphic Processing System, 2017, Frontiers in Neurorobotics.
[11] Norbert Boeddeker, et al. A universal strategy for visually guided landing, 2013, Proceedings of the National Academy of Sciences.
[12] Friedrich Fraundorfer, et al. Visual Odometry Part I: The First 30 Years and Fundamentals, 2011, IEEE Robotics & Automation Magazine.
[13] Thomas S. Collett, et al. Insect Vision: Controlling Actions through Optic Flow, 2002, Current Biology.
[14] Svetha Venkatesh, et al. How honeybees make grazing landings on flat surfaces, 2000, Biological Cybernetics.
[15] Kristi Morgansen, et al. Monocular distance estimation from optic flow during active landing maneuvers, 2014, Bioinspiration & Biomimetics.
[16] Lars Chittka, et al. Honeybee (Apis mellifera) vision can discriminate between and recognise images of human faces, 2005, Journal of Experimental Biology.
[17] Kari Pulli, et al. Real-time computer vision with OpenCV, 2012, Communications of the ACM.
[18] Robert J. Wood, et al. Science, technology and the future of small autonomous drones, 2015, Nature.
[19] N. Franceschini, et al. From insect vision to robot vision, 1992.
[20] James Sean Humbert, et al. Implementation of wide-field integration of optic flow for autonomous quadrotor navigation, 2009, Autonomous Robots.
[21] Martin Giurfa, et al. Local-feature assembling in visual pattern recognition and generalization in honeybees, 2004, Nature.
[22] Michael H. Dickinson, et al. Flies Evade Looming Targets by Executing Rapid Visually Directed Banked Turns, 2014, Science.
[23] Tom Drummond, et al. Faster and Better: A Machine Learning Approach to Corner Detection, 2008, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[24] Giovanni M. Bianco, et al. The turn-back-and-look behaviour: bee versus robot, 2000, Biological Cybernetics.
[25] G. D. Croon. Monocular distance estimation with optical flow maneuvers and efference copies: a stability-based strategy, 2016.
[26] Yiannis Aloimonos, et al. Obstacle Avoidance Using Flow Field Divergence, 1989, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[27] Paul Y. Oh, et al. Optic-Flow-Based Collision Avoidance, 2008, IEEE Robotics & Automation Magazine.
[28] Martin Egelhaaf, et al. A Bio-inspired Collision Avoidance Model Based on Spatial Information Derived from Motion Detectors Leads to Common Routes, 2015, PLoS Computational Biology.