Abstract: Live fire training keeps warfighting capabilities at peak effectiveness. However, providing realistic targets for live fire exercises is prohibitively expensive. The United States Marine Corps uses a variety of target proxies in live fire exercises, such as derelict vehicles or piles of waste, which are non-reactive and remain in fixed locations. Augmented Reality (AR) can provide realistic, animated, and reactive virtual targets, as well as special effects such as explosions, for real-world training exercises with no significant changes to the current training procedure. As part of USMC Fire Support Team (FiST) training, trainees learn to call for fire as forward observers (FOs). The FO determines the location of a target and calls for fire. After the round is fired, an instructor determines the effect on the target, and the FO adjusts. Initial FiST training takes place on a scale-model firing range using pneumatic mortars, and it is into this setting that we inserted an AR system. Our system provides a head-mounted display for the forward observer and a touch screen for the instructor, each showing virtual targets on the real range. The FO sees a simulated magnified view with a reticle for determining target identity and location. The instructor controls the targets through a simple interface. The FO calls for fire and a real round is fired. The instructor sees where the round lands in the augmented touch-screen view and designates the effect on the target. The forward observer sees that effect and adjusts. The system was demonstrated at Marine Corps Base Quantico in October 2004, where it was well received by mortar trainees and instructors. The system can also show virtual terrain and control measures. Future plans include testing at a full-scale live fire range such as Twentynine Palms and completing a Semi-Automated Forces (SAF) interface for more intelligent targets.