Adaptive User Interfaces for Planning and Decision Aids in C3I Systems

IF PLANNING and decision aids are to meet expectations as force multipliers on the battlefield, something must be done to analyze, summarize, aggregate, and display their data in a manner consistent with the goal structure and cognitive characteristics of each individual user. Without this preprocessing, voluminous data will rapidly overload the processing capacity of the human half of the human-machine system. It then becomes likely that an increasingly intelligent generation of support tools will be abandoned in favor of a less informed "seat of the pants" approach to meeting highly dynamic mission objectives.

Probable users of coordinated, integrated battlefield databases vary widely. Not only do users have significantly different functional requirements (such as the commander, G-2, and G-3), but each user may require a different view of the data depending on his skills, experience, mental acuity (considering such factors as fatigue and morale), and accustomed problem-solving style. To these parameters must be added the influence of context: different classes of situation interact with user characteristics to yield variations in optimum data selection, prioritization, level of detail and resolution, "chunking," figure-ground relations, alarm thresholds, envelopes constraining exception processing, descriptive characterization of uncertainties, and choice of metaphor.

Halpin [1], [2] and others [3] have proposed the creation of general-purpose intelligent interfaces, or intelligent assistants, to facilitate the dialog between the user and complex systems. Fig. 1 illustrates the architecture for such an intelligent interface.
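The interaction of user characteristics and situational context described above can be sketched as a rule-based selection of display parameters. The sketch below is illustrative only: the profile fields, role-to-detail mapping, fatigue cutoff, and threshold values are assumptions for the sake of the example, not drawn from the source.

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    role: str          # functional requirement, e.g. "commander", "G-2", "G-3"
    experience: int    # years in role; crude proxy for skill
    fatigue: float     # 0.0 (rested) .. 1.0 (exhausted); affects mental acuity

def display_parameters(user: UserProfile, situation: str) -> dict:
    """Choose a level of detail and an alarm threshold for one user in one context.

    Hypothetical mapping: role selects a default view, fatigue coarsens it,
    experience permits denser displays, and a time-critical situation
    tightens the alarm threshold (more sensitive alerting).
    """
    # Assumed convention: 1 = highly aggregated summary, 5 = raw data.
    detail = {"commander": 2, "G-2": 4, "G-3": 3}.get(user.role, 3)
    # Fatigued users get one step more aggregation.
    if user.fatigue > 0.6:
        detail = max(1, detail - 1)
    # Experienced users tolerate one step more density.
    if user.experience >= 10:
        detail = min(5, detail + 1)
    # Time-critical situations lower the threshold at which alarms fire.
    alarm_threshold = 0.5 if situation == "time-critical" else 0.8
    return {"level_of_detail": detail, "alarm_threshold": alarm_threshold}

params = display_parameters(UserProfile("commander", 12, 0.7), "time-critical")
```

A real adaptive interface would of course learn and update such mappings dynamically rather than hard-code them, but the sketch shows the basic idea: the same data stream is parameterized differently for each user-context pair.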
Interposed between the user and the application software in a system, the intelligent interface consists of models or representations of the user, of the user's goals and tasks, and of the world (goal/task context), as well as the software that updates those models dynamically and uses them to control information flow to and from the user. The premise is that we can develop the interface to the point that it understands the user well enough to interpret his requests and commands,