Transforming Human Interaction with Virtual Worlds

Virtual worlds, 3D simulations of real or imagined environments, are far richer and more dynamic than standard 2D computer applications. Extended realities, which blend the physical and virtual worlds into a single experience, provide even more possibilities. We believe this richness cries out for a more expressive, more powerful, more dynamic human control paradigm. To effect a paradigm shift away from traditional human-computer interaction, we are investigating high-performance interfaces modeled on the techniques of musicians and other performing artists. We approach the problem by extracting structured information from the actions of performing artists, translating that information into an appropriate control language, and applying it to high-performance interactions with virtual worlds. These developments employ automated learning and data-mining techniques to extract features and relationships from multiple streams of data (audio, motion capture, etc.), to discover meaningful performative "gestures", and to provide mappings between multiple semantic domains.
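To make the pipeline concrete, the sketch below illustrates one possible shape for the feature-extraction, gesture-discovery, and mapping stages described above. It is not the authors' system: the windowed mean/std features, the naive k-means clustering, and the control verbs (`orbit`, `zoom`, `grasp`) are all hypothetical stand-ins chosen for brevity.

```python
import numpy as np

def extract_features(frames, win=8):
    # Windowed summary features over a multi-channel stream (audio, mocap, ...).
    # Mean + std per channel is a hypothetical, minimal feature set.
    n = len(frames) // win
    feats = []
    for i in range(n):
        w = frames[i * win:(i + 1) * win]
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0)]))
    return np.array(feats)

def discover_gestures(feats, k=3, iters=50, seed=0):
    # Naive k-means: each discovered cluster stands in for a performative "gesture".
    rng = np.random.default_rng(seed)
    centers = feats[rng.choice(len(feats), k, replace=False)].copy()
    for _ in range(iters):
        dists = np.linalg.norm(feats[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = feats[labels == j].mean(axis=0)
    return labels, centers

def map_gesture(label, controls=("orbit", "zoom", "grasp")):
    # Mapping between semantic domains: gesture class -> virtual-world control verb.
    return controls[label % len(controls)]
```

In a real system the clustering would run over richer, time-aligned features and the mapping would be learned rather than fixed, but the three stages, extract, discover, map, would keep the same roles.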