Supplementary Material for "Hand Postures for Sonification Control"
Sonification is a relatively new technique in human-computer interaction that addresses auditory perception. In contrast to speech interfaces, sonification uses non-verbal sound to present information. The most common sonification technique is parameter mapping, where a sonic event is generated for each data point and its acoustic attributes are determined from the data values by a mapping function. For acoustic data exploration, this mapping must be adjusted or manipulated by the user. We propose the use of hand postures as a particularly natural and intuitive means of parameter manipulation for this data exploration task. As a demonstration prototype, we developed a hand posture recognition system for the gestural control of sound. The presented implementation applies artificial neural networks to identify continuous hand postures from camera images and uses a real-time sound synthesis engine. In this paper, we present our system and first applications of gestural sound control. Techniques for applying gestures to control sonification are proposed, and sound examples are given.
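For readers who want to experiment with the parameter-mapping idea, the following minimal Python/NumPy sketch renders one short sine event per data point, with the pitch determined by a linear mapping function. It is not the paper's implementation (which uses a real-time synthesis engine and hand posture input); the event duration, pitch range, and helper names are illustrative assumptions.

```python
import numpy as np
from scipy.io import wavfile

SR = 44100  # sample rate in Hz

def map_linear(x, x_min, x_max, y_min, y_max):
    """Linearly map a data value x from [x_min, x_max] into [y_min, y_max]."""
    return y_min + (x - x_min) / (x_max - x_min) * (y_max - y_min)

def sonify(data, dur=0.15, pitch_range=(220.0, 880.0)):
    """Render one short sine event per data point; pitch encodes the data value."""
    lo, hi = data.min(), data.max()
    t = np.arange(int(dur * SR)) / SR
    env = np.hanning(t.size)  # simple smooth envelope per event
    events = [env * np.sin(2 * np.pi * map_linear(x, lo, hi, *pitch_range) * t)
              for x in data]
    return np.concatenate(events)

# Example: sonify a toy one-dimensional data set and write it to disk
signal = sonify(np.array([0.1, 0.4, 0.35, 0.9, 0.7]))
wavfile.write("parameter_mapping.wav", SR, (signal * 32767).astype(np.int16))
```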
#### S1: Handposture-driven soundscape control
This example is a real-time synthesis of a “soundscape” with a relaxing effect; it can be thought of as playing a simple instrument just by moving the hand. The mapping of finger positions to pitch and amplitude modulation is described in the paper; a rough sketch follows the sound example below.
+ [Sound Example 1](https://pub.uni-bielefeld.de/download/2707114/2707115)
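As a rough illustration of this kind of mapping, the sketch below derives a drone's pitch and amplitude-modulation depth from two normalised finger-bend values. The two-octave pitch range, the 4 Hz tremolo, and the assignment of fingers are assumptions made for this example, not the paper's exact parameter settings.

```python
import numpy as np

SR = 44100

def soundscape_block(bend_pitch, bend_am, dur=0.5):
    """One audio block of a drone; `bend_pitch` and `bend_am` are finger-bend
    values assumed to be normalised to [0, 1]."""
    t = np.arange(int(dur * SR)) / SR
    f0 = 110.0 * 2.0 ** (bend_pitch * 2.0)  # bend -> pitch over two octaves (assumed range)
    depth = 0.8 * bend_am                    # bend -> tremolo depth
    am = 1.0 - depth * (0.5 + 0.5 * np.sin(2 * np.pi * 4.0 * t))  # 4 Hz amplitude modulation
    return am * np.sin(2 * np.pi * f0 * t)

# Example: sweep the "pitch finger" while the "AM finger" stays half bent
blocks = [soundscape_block(p, 0.5) for p in np.linspace(0.0, 1.0, 8)]
signal = np.concatenate(blocks)
```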
#### S2: Hand postures for articulated voice control
This example illustrates how the hand posture control interface is used to render complex vowel transitions via a mapping to features such as formant frequencies, formant bandwidths, and overall gain, as described in the paper; a sketch of such a formant synthesis follows the sound example below.
+ [Sound Example 2](https://pub.uni-bielefeld.de/download/2707114/2707131)
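The following sketch shows one plausible way to realise such a posture-driven vowel synthesis: a pulse train is fed through a bank of two-pole resonators whose centre frequencies, bandwidths, and output gain would be controlled by the hand posture features. The formant values for /a/ and /i/ and the bandwidths are textbook figures used for illustration, not values taken from the paper.

```python
import numpy as np
from scipy.signal import lfilter

SR = 16000

def resonator(x, freq, bw):
    """Two-pole resonant filter (one formant) at centre frequency `freq`
    with bandwidth `bw`, both in Hz."""
    r = np.exp(-np.pi * bw / SR)
    theta = 2 * np.pi * freq / SR
    a = [1.0, -2 * r * np.cos(theta), r * r]
    b = [1.0 - r]  # rough gain normalisation
    return lfilter(b, a, x)

def vowel(formants, bandwidths, gain, dur=0.4, f0=120.0):
    """Excite a formant filter bank with a simple impulse-train source."""
    src = np.zeros(int(dur * SR))
    src[::int(SR / f0)] = 1.0  # glottal-like pulse train
    out = sum(resonator(src, f, b) for f, b in zip(formants, bandwidths))
    return gain * out / (np.abs(out).max() + 1e-9)

# Example: interpolate from /a/ to /i/ (textbook formant values, assumed)
a_formants, i_formants = (730, 1090, 2440), (270, 2290, 3010)
steps = [vowel([(1 - w) * fa + w * fi for fa, fi in zip(a_formants, i_formants)],
               (90, 110, 170), gain=0.8) for w in np.linspace(0, 1, 6)]
signal = np.concatenate(steps)
```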
#### S3: Interactive sonification of parameter mappings
This is a series of sonifications of the Iris data set in which the finger bendings control different ranges or centers of the mapping interval. The class label is not used in any of these sonifications. The sounds provide different auditory views of one and the same data set; a sketch of the interval manipulation follows the sound examples below.
+ [Sound Example 3.1](https://pub.uni-bielefeld.de/download/2707114/2707117)
+ [Sound Example 3.2](https://pub.uni-bielefeld.de/download/2707114/2707116)
+ [Sound Example 3.3](https://pub.uni-bielefeld.de/download/2707114/2707119)
+ [Sound Example 3.4](https://pub.uni-bielefeld.de/download/2707114/2707118)
+ [Sound Example 3.5](https://pub.uni-bielefeld.de/download/2707114/2707120)
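A compact sketch of the interval manipulation might look as follows: a finger-bend value shifts the centre of the pitch interval into which one Iris feature is mapped, while the interval width stays fixed. Loading the data via scikit-learn, the particular feature, and the concrete frequency ranges are assumptions for illustration; event rendering would then proceed as in the parameter-mapping sketch above.

```python
import numpy as np
from sklearn.datasets import load_iris

iris = load_iris().data  # 150 x 4 feature matrix; class labels are ignored here

def pitch_mapping(values, bend, width=300.0):
    """Map feature values into a pitch interval whose centre follows the
    finger bend (assumed normalised to [0, 1]); the interval width stays fixed."""
    centre = 300.0 + bend * 600.0  # bend shifts the interval centre (assumed range)
    lo, hi = centre - width / 2, centre + width / 2
    v_min, v_max = values.min(), values.max()
    return lo + (values - v_min) / (v_max - v_min) * (hi - lo)

# Two "auditory views" of the same feature (petal length), differing only in the
# interval centre selected by the hand posture
low_view = pitch_mapping(iris[:, 2], bend=0.2)
high_view = pitch_mapping(iris[:, 2], bend=0.8)
print(low_view.min(), low_view.max(), high_view.min(), high_view.max())
```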
#### S4: Audible contrast explorations
This set of sonifications of the Iris data set plays mappings of all three classes of the Iris data in sequence, so that differences between the classes are perceived as audible contrast. The four sound examples, in turn, use different mappings obtained with the hand posture control interface; a sketch of such a class-by-class rendering follows the sound examples below.
+ [Sound Example 4.1](https://pub.uni-bielefeld.de/download/2707114/2707124)
+ [Sound Example 4.2](https://pub.uni-bielefeld.de/download/2707114/2707125)
+ [Sound Example 4.3](https://pub.uni-bielefeld.de/download/2707114/2707126)
+ [Sound Example 4.4](https://pub.uni-bielefeld.de/download/2707114/2707127)
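A sketch of such a class-by-class rendering is given below: the same pitch mapping is applied to all samples, but the events are played grouped by class and separated by short pauses, so that between-class differences in the feature distribution become audible as contrast. The feature choice, event duration, pause length, and pitch range are assumptions for illustration.

```python
import numpy as np
from sklearn.datasets import load_iris

SR = 44100
iris = load_iris()

def events(values, v_min, v_max, dur=0.07, lo=200.0, hi=1000.0):
    """One short sine event per sample, pitch mapped linearly into [lo, hi] Hz."""
    t = np.arange(int(dur * SR)) / SR
    env = np.hanning(t.size)
    freqs = lo + (values - v_min) / (v_max - v_min) * (hi - lo)
    return np.concatenate([env * np.sin(2 * np.pi * f * t) for f in freqs])

feature = iris.data[:, 2]                    # petal length (illustrative choice)
v_min, v_max = feature.min(), feature.max()  # normalise over the whole data set, not per class
pause = np.zeros(int(0.3 * SR))

# Play all three classes back to back under the same mapping, separated by pauses
per_class = [events(feature[iris.target == c], v_min, v_max) for c in range(3)]
signal = np.concatenate([np.concatenate([blk, pause]) for blk in per_class])
```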