InterFACE: new faces for musical expression
[1] Perry R. Cook, et al. Don't forget the laptop: using native input capabilities for expressive musical control, 2007, NIME '07.
[2] Rafael Ramírez, et al. Temporal Control In the EyeHarp Gaze-Controlled Musical Interface, 2012, NIME.
[3] Paul A. Viola, et al. Rapid object detection using a boosted cascade of simple features, 2001, Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2001).
[4] Kazuhiro Kuwabara, et al. Sonification of Facial Actions for Musical Expression, 2005, NIME.
[5] Luke Dahl, et al. Sound Bounce: Physical Metaphors in Designing Mobile Music Performance, 2010, NIME.
[6] Hideki Kawahara, et al. YIN, a fundamental frequency estimator for speech and music, 2002, The Journal of the Acoustical Society of America.
[7] Ge Wang, et al. The Laptop Accordion, 2016, NIME.
[8] Eric Singer, et al. Sonic Banana: A Novel Bend-Sensor-Based MIDI Controller, 2003, NIME.
[9] Michael J. Lyons, et al. Facing the music: a facial action controlled musical interface, 2001, CHI Extended Abstracts.