Virtual Pottery: a virtual 3D audiovisual interface using natural hand motions

In this paper, we present our approach to designing and implementing a virtual 3D sound sculpting interface that creates audiovisual results from hand motions in real time. The interface, “Virtual Pottery,” uses the metaphor of pottery creation to adapt natural hand motions to 3D spatial sculpting. Users create their own pottery pieces by changing the positions of their hands in real time, and also generate 3D sound sculptures based on pre-existing rules of music composition. The two versions of Virtual Pottery are distinguished by shape design and camera-sensing type. This paper describes how we developed both versions and implemented the technical aspects of their interfaces. We also investigate how hand motions can be translated into musical sound; accurate detection of hand motions is crucial for carrying natural gestures into the virtual environment. Preliminary evaluations show that both the motion-capture tracking system and the portable depth-sensing camera track hand motions with accuracy close to the actual (reference) data. We carried out user studies at two exhibitions with users of various ages. Overall, Virtual Pottery serves as a bridge between the virtual environment and traditional art practices, and it points toward the potential of virtual musical instruments and future art education programs.
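As a purely illustrative sketch of the kind of mapping the abstract describes, the Python snippet below shows one way a tracked hand position could deform a lathe-style pottery profile and simultaneously yield a pitch. The coordinate ranges, the pentatonic scale standing in for the paper's composition rules, and the sculpt() function are assumptions for illustration, not the authors' implementation.

# Illustrative sketch (not the authors' implementation): mapping a tracked hand
# position to a lathe-style pottery profile and to a pitch. Coordinate
# conventions, ranges, and the radius-to-pitch rule below are assumptions.

import math

# Pottery profile stored as one radius per vertical layer (surface of revolution).
NUM_LAYERS = 64
MIN_RADIUS, MAX_RADIUS = 0.02, 0.30      # metres, assumed working range
MIN_HEIGHT, MAX_HEIGHT = 0.80, 1.40      # metres above the floor, assumed

profile = [0.10] * NUM_LAYERS            # start from a plain cylinder

# Pentatonic degrees used as a stand-in for the paper's composition rules.
PENTATONIC = [0, 2, 4, 7, 9]

def sculpt(hand_x, hand_y, hand_z):
    """Deform the profile layer nearest the hand's height toward the hand's
    distance from the central axis, and return a MIDI note derived from the
    resulting radius."""
    # Which horizontal layer is the hand at?
    t = (hand_y - MIN_HEIGHT) / (MAX_HEIGHT - MIN_HEIGHT)
    layer = min(NUM_LAYERS - 1, max(0, int(t * NUM_LAYERS)))

    # Radial distance of the hand from the pot's axis of rotation.
    radius = math.hypot(hand_x, hand_z)
    radius = min(MAX_RADIUS, max(MIN_RADIUS, radius))
    profile[layer] = radius

    # Map radius to a pentatonic pitch: wider layers -> lower notes (assumed rule).
    u = (MAX_RADIUS - radius) / (MAX_RADIUS - MIN_RADIUS)
    index = int(u * (len(PENTATONIC) * 3 - 1))            # span three octaves
    octave, degree = divmod(index, len(PENTATONIC))
    return 48 + 12 * octave + PENTATONIC[degree]          # MIDI note number

# Example: a hand about 0.15 m from the axis at chest height.
note = sculpt(0.10, 1.10, 0.11)

In a real system this function would be called once per tracking frame, with the profile driving the rendered mesh and the returned note driving the sound engine.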
