An intelligent method for selecting maintenance tools in an immersive virtual environment

Selection and positioning of virtual objects are difficult aspects of human-machine interaction in an immersive virtual environment. In a virtual maintenance simulation, maintenance personnel are mainly concerned with how to select tools appropriately. Currently, maintenance tools are selected in two ways. In the first, a 3D interaction process is achieved with traditional 2D input devices. The second uses 3D input devices (e.g., force-feedback equipment or data gloves): the virtual tool is grasped when the hand approaches it, and the grasping intention is recognized from the geometric positions of the hand and the tool. However, both methods ignore the specific geometric shape of the tool, so a hand gesture recognized according to fixed rules may appear distorted. Moreover, the selection process becomes tedious when the tool kit contains many tools. This paper presents an intelligent tool-selection method based on operating-gesture identification and the matching of interaction feature points. Gesture features and interaction-area information are extracted by analyzing experimental data, and the features of virtual hands and maintenance tools are modeled accordingly. According to the hand gesture adopted in the immersive environment, the corresponding tool is retrieved from a standard tool library. Intelligent selection of maintenance tools is thus implemented, and better immersive interaction is achieved during the positioning process.
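The abstract does not give implementation details; the following Python sketch only illustrates the general idea of matching a recognized operating gesture (here assumed to be normalized finger-joint angles plus an interaction-area descriptor) against gesture templates stored in a standard tool library. The tool names, feature layout, and nearest-template matching rule are illustrative assumptions, not the paper's actual models.

```python
import numpy as np

# Hypothetical gesture templates for a standard tool library: each tool is
# described by the operating gesture used to grasp it (five normalized
# finger-joint angles) plus the size of its interaction area.
TOOL_LIBRARY = {
    "screwdriver":      np.array([0.9, 0.9, 0.9, 0.9, 0.2, 0.03]),
    "open_end_wrench":  np.array([0.7, 0.8, 0.8, 0.8, 0.4, 0.08]),
    "socket_wrench":    np.array([0.6, 0.6, 0.6, 0.6, 0.5, 0.10]),
}

def extract_gesture_features(joint_angles, interaction_area):
    """Combine normalized joint angles with an interaction-area descriptor."""
    return np.append(np.asarray(joint_angles, dtype=float), interaction_area)

def select_tool(joint_angles, interaction_area, threshold=0.5):
    """Return the library tool whose gesture template is closest to the
    observed operating gesture, or None if no template is close enough."""
    query = extract_gesture_features(joint_angles, interaction_area)
    best_tool, best_dist = None, float("inf")
    for tool, template in TOOL_LIBRARY.items():
        dist = np.linalg.norm(query - template)
        if dist < best_dist:
            best_tool, best_dist = tool, dist
    return best_tool if best_dist <= threshold else None

if __name__ == "__main__":
    # A power-grip-like gesture with a small contact area maps to the
    # screwdriver template in this toy library.
    print(select_tool([0.88, 0.92, 0.90, 0.87, 0.25], 0.04))
```

Retrieving the tool from a template library in this way avoids relying solely on the geometric proximity of hand and tool, which is the limitation of the two existing methods discussed above.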