This paper provides an update on a continuing research effort focused on developing a collaborative navigation solution for various multi-sensor platforms in indoor, outdoor, and transitional environments. The emphasis of this paper is on investigating the usefulness of 3D imagery, acquired from optical or laser sensors, in aiding indoor cooperative navigation of multi-sensor pedestrian platforms. Data characterization is provided and the performance of the Microsoft Kinect™ sensor is evaluated. In addition, indoor navigation test results using 3D image-based navigation are presented, and the potential of the Kinect for identification of and ranging between platforms is discussed.