Real-time intelligent characters for a non-visual simulation environment

In recent years, a fair amount of research has been directed toward the goal of developing virtual, human-like characters for simulation environments. Much of this work has focused on creating high-fidelity graphical animations that represent realistic human forms and movement. We are approaching the same goal from a different angle, focusing on the real-time generation of the autonomous, intelligent behaviors that a virtual human must use to attain its goals in a complex environment. Because our emphasis is on high-level behavior rather than visual representation, our current work is geared toward non-visual simulation environments. TacAir-Soar is a system that generates intelligent behavior for flying missions in simulated fixed-wing aircraft within military training simulations. The system has been in development for over five years and has participated in multiple military training exercises. This paper presents many of the lessons we have learned in developing an autonomous, real-time system, together with suggestions for how these lessons might apply to the development of a full real-time, autonomous virtual human that also incorporates realistic visual representation and movement.