Effects of Continuous Auditory Feedback on Drawing Trajectory-Based Finger Gestures
BoYu Gao | HyungSeok Kim | Jee-In Kim | Jooyoung Lee | Hasup Lee
[1] Shumin Zhai, et al. Optimizing Touchscreen Keyboards for Gesture Typing, 2015, CHI.
[2] Richard Kronland-Martinet, et al. Comparison and Evaluation of Sonification Strategies for Guidance Tasks, 2016, IEEE Transactions on Multimedia.
[3] Stephen A. Brewster, et al. Overcoming the Lack of Screen Space on Mobile Computers, 2002, Personal and Ubiquitous Computing.
[4] Robert Pastel, et al. Measuring the difficulty of steering through corners, 2006, CHI.
[5] Jooyoung Lee, et al. Use of Sound to Provide Occluded Visual Information in Touch Gestural Interface, 2015, CHI Extended Abstracts.
[6] Pierre Dragicevic, et al. Earpod: eyes-free menu selection using touch input and reactive audio feedback, 2007, CHI.
[7] Daniel Vogel, et al. Shift: a technique for operating pen-based interfaces using touch, 2007, CHI.
[8] Shumin Zhai, et al. High precision touch screen interaction, 2003, CHI.
[9] Uran Oh, et al. Audio-Based Feedback Techniques for Teaching Touchscreen Gestures, 2015, ACM Trans. Access. Comput.
[10] Jizhong Xiao, et al. Being Aware of the World: Toward Using Social Media to Support the Blind With Navigation, 2015, IEEE Transactions on Human-Machine Systems.
[11] Walter F. Bischof, et al. Hands, hover, and nibs: understanding stylus accuracy on tablets, 2015, Graphics Interface.
[12] Shumin Zhai, et al. "Writing with music": Exploring the use of auditory feedback in gesture interfaces, 2008, ACM Trans. Appl. Percept.
[13] Jon Froehlich, et al. Evaluating Haptic and Auditory Directional Guidance to Assist Blind People in Reading Printed Text Using Finger-Mounted Cameras, 2016, ACM Trans. Access. Comput.
[14] Xiang Cao, et al. Effects of Multimodal Error Feedback on Human Performance in Steering Tasks, 2010, J. Inf. Process.
[15] Peter Wolf, et al. Sonification and haptic feedback in addition to visual feedback enhances complex motor task learning, 2014, Experimental Brain Research.
[16] Shumin Zhai, et al. Modeling human performance of pen stroke gestures, 2007, CHI.
[17] Patrick Susini, et al. Investigating three types of continuous auditory feedback in visuo-manual tracking, 2017, Experimental Brain Research.
[18] Sriram Subramanian, et al. Modeling steering within above-the-surface interaction layers, 2007, CHI.
[19] Radu-Daniel Vatavu, et al. Relative accuracy measures for stroke gestures, 2013, ICMI.
[20] Jian Zhao, et al. A model of scrolling on touch-sensitive displays, 2014, Int. J. Hum. Comput. Stud.
[21] Jaehoon Kim, et al. Effects of Auditory Feedback on Menu Selection in Hand-Gesture Interfaces, 2015, IEEE MultiMedia.
[22] Shota Yamanaka, et al. Modeling the Steering Time Difference between Narrowing and Widening Tunnels, 2016, CHI.
[23] H. S. Vitense, et al. Multimodal feedback: an assessment of performance and mental workload, 2003, Ergonomics.
[24] Yvonne Rogers, et al. Fat Finger Worries: How Older and Younger Users Physically Interact with PDAs, 2005, INTERACT.
[25] Matthew J. Jensen, et al. A Customizable Automotive Steering System With a Haptic Feedback Control Strategy for Obstacle Avoidance Notification, 2011, IEEE Transactions on Vehicular Technology.
[26] Richard Kronland-Martinet, et al. From sound to shape: auditory perception of drawing movements, 2014, Journal of Experimental Psychology: Human Perception and Performance.
[27] Andy Cockburn, et al. Multimodal feedback for the acquisition of small targets, 2005, Ergonomics.
[28] Patrick Baudisch, et al. Precise selection techniques for multi-touch screens, 2006, CHI.
[29] Patrick Baudisch, et al. The generalized perceived input point model and how to double touch accuracy by extracting fingerprints, 2010, CHI.
[30] R. Riener, et al. Augmented visual, auditory, haptic, and multimodal feedback in motor learning: A review, 2012, Psychonomic Bulletin & Review.
[31] Bin Liu, et al. Using sound in multi-touch interfaces to change materiality and touch behavior, 2014, NordiCHI.
[32] S. Hart, et al. Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research, 1988.
[33] Carl Gutwin, et al. Understanding performance in touch selections: Tap, drag and radial pointing drag with finger, stylus and mouse, 2012, Int. J. Hum. Comput. Stud.
[34] Richard Kronland-Martinet, et al. Controlling the Perceived Material in an Impact Sound Synthesizer, 2011, IEEE Transactions on Audio, Speech, and Language Processing.
[35] Radu-Daniel Vatavu, et al. Understanding the consistency of users' pen and finger stroke gesture articulation, 2013, Graphics Interface.
[36] Tovi Grossman, et al. NanoStylus: Enhancing Input on Ultra-Small Displays with a Finger-Mounted Stylus, 2015, UIST.
[37] P. Fitts. The information capacity of the human motor system in controlling the amplitude of movement, 1954, Journal of Experimental Psychology.
[38] Shumin Zhai, et al. FFitts law: modeling finger touch with Fitts' law, 2013, CHI.
[39] M. Smyth, et al. Functions of vision in the control of handwriting, 1987.
[40] Xiang Cao, et al. Grips and gestures on a multi-touch pen, 2011, CHI.
[41] Shumin Zhai, et al. A comparative evaluation of finger and pen stroke gestures, 2012, CHI.
[42] Stephen A. Brewster, et al. Multimodal Trajectory Playback for Teaching Shape Information and Trajectories to Visually Impaired Computer Users, 2008, ACM Trans. Access. Comput.