Three-dimensional neuron tracing from confocal microscope data using a head-coupled display

We describe a new interaction technique based on head-coupling, designed to free a user's hands for other tasks such as 3D tracing. We use head motion to manipulate the 3D view of confocal microscope data of neurons. The simplest interaction mode shows the projected view from the user's eye point; other modes implement more sophisticated motions such as ratcheting. Tracing is done by marking a point of interest in two different views controlled by the head. Under suitable restrictions, this locates a point in 3D corresponding to a feature of interest in the data set. By repeatedly marking points in this way, the user traces a 3D path through the data. Because of pointing errors and rendering artifacts, the path traced by a user must be refined by consulting the original data set. Using a rough data model, along with several successive points in a feature, the program can automatically supply the next point, thereby implementing a form of semi-automatic tracing. An advantage of our method is that we have access to the entire 3D volume data, so feature geometry can be computed with reference to the complete 3D data set rather than just a 2D projected view.
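The two-view marking step amounts to a triangulation: each mark defines a ray from the tracked eye point through the cursor position in that view, and because of pointing error the two rays rarely intersect exactly, so the feature point can be taken as the midpoint of the shortest segment between them. A minimal sketch of this idea (the `triangulate` helper is hypothetical, not the paper's implementation):

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    """Estimate a 3D point from two rays q_i(t) = p_i + t * d_i.

    p1, p2 -- eye points for the two head-controlled views
    d1, d2 -- ray directions through the marked screen point
    Returns the midpoint of the common-perpendicular segment,
    i.e. the least-squares meeting point of the two rays.
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    r = p2 - p1
    b = d1 @ d2                      # cosine of the angle between rays
    denom = 1.0 - b * b              # a = c = 1 after normalization
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; the two views must differ")
    t1 = ((d1 @ r) - b * (d2 @ r)) / denom
    t2 = (b * (d1 @ r) - (d2 @ r)) / denom
    q1 = p1 + t1 * d1                # closest point on ray 1
    q2 = p2 + t2 * d2                # closest point on ray 2
    return (q1 + q2) / 2.0
```

When the rays are nearly parallel the estimate degenerates, which is one reason the paper restricts the views under which marking is valid and refines the traced path against the original volume data.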