FUSING LIDAR DATA, PHOTOGRAPHS, AND OTHER DATA USING 2D AND 3D VISUALIZATION TECHNIQUES

Robert J. McGaughey, Research Forester
Ward W. Carson, Research Engineer
USDA Forest Service, Pacific Northwest Research Station
University of Washington, PO Box 352100, Seattle, WA 98195-2100
bmcgaughey@fs.fed.us
wcarson@fs.fed.us

The process of understanding, analyzing, and communicating the large spatial datasets commonly produced by LIDAR systems is difficult. Such data typically consist of several million 3D sample points distributed over several thousand hectares. Visualization techniques can help users better understand the raw data, formulate suitable analysis methods, and present results. Unfortunately, few visualization environments can represent entire LIDAR datasets in a way that promotes easy interaction and clear understanding. Even immersive techniques such as CAVE- and headset-based systems tend to overwhelm the viewer with the sheer volume of data presented. This paper describes a prototype visualization framework that fuses LIDAR data, photographs, and GIS data using 2D displays and a 3D stereoscopic visualization environment. We have used the system for projects where the terrain surface is of interest and for deriving vegetation characteristics from LIDAR data. The primary interface consists of a 2D display of a geo-referenced orthophoto; various point, line, and polygon overlays; and contour lines derived from a digital elevation model. Users interact with the 2D display to specify a subset of the LIDAR data for stereoscopic viewing. The 3D viewing tool displays the terrain represented as a shaded surface, the specified subset of the raw LIDAR data points, and a portion of the orthophoto image mapped onto a horizontal plane that can be positioned anywhere along the vertical axis of the scene. The system provides methods for organizing and indexing large datasets, making it useful for a wide variety of LIDAR projects. This paper presents an overview of the data interface and visualization system along with several examples of its use.
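The abstract does not detail the indexing scheme, so the sketch below is only a rough illustration (not the authors' implementation) of how a rectangle drawn on the 2D orthophoto display could be resolved against a multi-million-point dataset: returns are binned into square tiles, and a query touches only the tiles overlapping the selection. The point layout, tile size, and the build_tile_index / query_rectangle helpers are hypothetical.

    # Minimal sketch, assuming points are stored as (easting, northing, elevation)
    # tuples and binned into fixed-size square tiles keyed by (column, row).
    from collections import defaultdict
    from typing import Dict, List, Tuple

    Point = Tuple[float, float, float]      # (easting, northing, elevation)
    TileKey = Tuple[int, int]               # (column index, row index)

    def build_tile_index(points: List[Point], tile_size: float) -> Dict[TileKey, List[Point]]:
        """Bin points into square tiles so later queries scan only a few tiles."""
        index: Dict[TileKey, List[Point]] = defaultdict(list)
        for x, y, z in points:
            index[(int(x // tile_size), int(y // tile_size))].append((x, y, z))
        return index

    def query_rectangle(index: Dict[TileKey, List[Point]], tile_size: float,
                        xmin: float, ymin: float, xmax: float, ymax: float) -> List[Point]:
        """Return points whose (x, y) falls inside a user-drawn selection rectangle."""
        subset: List[Point] = []
        for col in range(int(xmin // tile_size), int(xmax // tile_size) + 1):
            for row in range(int(ymin // tile_size), int(ymax // tile_size) + 1):
                for x, y, z in index.get((col, row), []):
                    if xmin <= x <= xmax and ymin <= y <= ymax:
                        subset.append((x, y, z))
        return subset

    if __name__ == "__main__":
        # Tiny synthetic example: three returns, 100 m tiles, and one selection rectangle.
        returns = [(512.3, 1040.8, 231.1), (618.9, 1102.4, 240.7), (905.0, 990.2, 228.3)]
        index = build_tile_index(returns, tile_size=100.0)
        print(query_rectangle(index, 100.0, 500.0, 1000.0, 700.0, 1150.0))

Under this assumption, the cost of extracting a subset for the 3D viewer depends on the size of the selection rather than on the size of the full dataset, which is one plausible way to keep the 2D interface responsive.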