Parallel processing and immersive visualization of sonar point clouds

The investigation of underwater structures and natural features with Autonomous Underwater Vehicles (AUVs) is an expanding field, with applications in archaeology, engineering, environmental science, and astrobiology. Processing and analyzing the raw sonar data generated by automated surveys is challenging due to complex error sources such as water chemistry, zero-depth variations, inertial navigation errors, and multipath reflections. Furthermore, the sheer size and complexity of the collected data make effective analysis on a standard display difficult: point clouds of hundreds of millions to billions of points are not uncommon. Highly interactive, immersive visualization is therefore a desirable tool that researchers can use to improve the quality of the final sonar-based data product. In this paper we present a scalable toolkit for processing and visualizing sonar point clouds on a cluster-based, large-scale immersive visualization environment. The cluster serves simultaneously as a parallel processing platform that performs sonar beam-tracing of the raw source data and as the rendering driver of the immersive display.
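To fix the idea of the beam-tracing step mentioned above, the following is a minimal illustrative sketch (not the toolkit's actual implementation) of projecting a single sonar beam return into a 3D point. It assumes a straight-ray model with a constant sound speed, a beam lying in the vehicle's athwartship plane, and angles in radians; all function and parameter names are hypothetical. A real beam-tracer would instead refract the ray through a measured sound velocity profile and apply the navigation and zero-depth corrections discussed in the paper.

```python
import numpy as np

def beam_to_point(vehicle_pos, heading, beam_angle, travel_time,
                  sound_speed=1500.0):
    """Project one sonar beam return to a 3D world point (hypothetical model).

    Straight-ray approximation: slant range = sound_speed * travel_time / 2.
    vehicle_pos is (x, y, z) in world coordinates; heading and beam_angle
    are in radians; depth is negative z.
    """
    rng = sound_speed * travel_time / 2.0   # one-way slant range
    # Offsets in the vehicle frame: across-track (athwartship) and vertical
    y = rng * np.sin(beam_angle)
    z = -rng * np.cos(beam_angle)
    # Rotate the across-track offset into world coordinates by heading
    dx = -y * np.sin(heading)
    dy = y * np.cos(heading)
    return np.array(vehicle_pos, dtype=float) + np.array([dx, dy, z])
```

Because each return is independent under this model, millions of such projections can be distributed trivially across cluster nodes, which is what makes the parallel-processing role of the cluster natural.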