An autonomous vision-guided helicopter

Helicopters are indispensable air vehicles for many applications, ranging from rescue and crime fighting to inspection and surveillance. They are most effective when flown in close proximity to objects of interest while performing tasks such as delivering critical supplies, rescuing stranded individuals, or inspecting damaged buildings. These tasks require dangerous flight patterns that put human pilots at risk. An unmanned helicopter that operates autonomously can carry out such tasks more effectively without risking human lives. The work presented in this dissertation develops an autonomous helicopter system for such applications. The system employs on-board vision for stability and guidance relative to objects of interest in the environment.

Developing a vision-based helicopter positioning and control system is challenging for several reasons. First, helicopters are inherently unstable and capable of high acceleration rates. They are highly sensitive to control inputs and require high-frequency feedback with minimal delay for stability; for stable hovering, for example, vision-based feedback rates must be at least 30-60 Hz with no more than 1/30 second of latency. Second, since helicopters rotate at high angular rates to direct main rotor thrust for translational motion, it is difficult to disambiguate rotation from translation with vision alone when estimating helicopter 3D motion. Third, helicopters have limited on-board power and payload capacity, so vision and control systems must be compact, efficient, and lightweight for effective on-board integration. Finally, helicopters are extremely dangerous, which presents major obstacles to safe and calibrated experimentation for designing and evaluating on-board systems.
This dissertation addresses these issues by developing: a "visual odometer" for helicopter position estimation, a real-time, low-latency vision machine architecture to implement an on-board visual odometer machine, and an array of innovative indoor testbeds for calibrated experimentation to design, build, and demonstrate an airworthy vision-guided autonomous helicopter. The odometer visually locks on to ground objects viewed by a pair of on-board cameras. Using high-speed image template matching, it estimates helicopter motion by sensing object displacements in consecutive images. The visual odometer is implemented with a custom-designed real-time, low-latency vision machine which modularly integrates field-rate (60 Hz) template matching processors, synchronized attitude sensing and image tagging circuitry, and image acquisition, convolution, and display hardware. The visual odometer machine, along with a carrier-phase differential Global Positioning System receiver, a classical PD control system, and human augmentation and safety systems, is integrated on board a mid-sized helicopter, the Yamaha R50, for vision-guided autonomous flight.
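The core of the odometer described above is template matching between consecutive frames: a patch locked onto a ground object in one image is searched for in the next, and its pixel displacement gives the apparent motion. The following is a minimal sketch of that idea using exhaustive normalized cross-correlation over a small search window; the function names, window sizes, and search radius are illustrative assumptions, not the dissertation's actual implementation (which runs on custom field-rate hardware).

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between two equal-size patches."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return 0.0 if denom == 0 else float((p * t).sum() / denom)

def track_template(prev, curr, top, left, h, w, search=4):
    """Find the (dy, dx) shift that best relocates a locked-on template.

    Exhaustively scores every shift within +/- `search` pixels using NCC,
    mirroring the lock-on-and-match idea (all parameters hypothetical).
    """
    template = prev[top:top + h, left:left + w]
    best, best_dy, best_dx = -2.0, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > curr.shape[0] or x + w > curr.shape[1]:
                continue
            score = ncc(curr[y:y + h, x:x + w], template)
            if score > best:
                best, best_dy, best_dx = score, dy, dx
    return best_dy, best_dx

# Synthetic check: shift a random image by (2, 3) pixels and recover it.
rng = np.random.default_rng(0)
frame = rng.random((64, 64))
shifted = np.roll(frame, (2, 3), axis=(0, 1))
print(track_template(frame, shifted, 20, 20, 16, 16))  # -> (2, 3)
```

In the real system this per-frame displacement, combined with attitude from the synchronized angle sensors and depth from the stereo camera pair, is what lets the odometer separate rotation from translation and integrate 3D position at 60 Hz.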
