Eliciting Mid-Air Gestures for Wall-Display Interaction
[1] Andy Cockburn, et al. User-defined gestures for augmented reality, 2013, INTERACT.
[2] Bongshin Lee, et al. Reducing legacy bias in gesture elicitation studies, 2014, INTR.
[3] Olivier Chapuis, et al. Mid-air pan-and-zoom on wall-sized displays, 2011, CHI.
[4] Daniel Vogel, et al. Distant freehand pointing and clicking on very large, high resolution displays, 2005, UIST.
[5] Veronica Teichrieb, et al. An open catalog of hand gestures from sci-fi movies, 2015, CHI Extended Abstracts.
[6] Meredith Ringel Morris, et al. Web on the wall: insights from a multimodal interaction elicitation study, 2012, ITS.
[7] Meredith Ringel Morris, et al. Understanding users' preferences for surface gestures, 2010, Graphics Interface.
[8] Meredith Ringel Morris, et al. User-defined gestures for surface computing, 2009, CHI.
[9] Laurent Grisoni, et al. Small, medium, or large? Estimating the user-perceived scale of stroke gestures, 2013, CHI.
[10] Mikkel Rønne Jakobsen, et al. An exploratory study of how abundant display space may support data analysis, 2012, NordiCHI.
[11] Chris North, et al. Move to improve: promoting physical navigation to increase user performance with large displays, 2007, CHI.
[12] Radu-Daniel Vatavu, et al. User-defined gestures for free-hand TV control, 2012, EuroITV.
[13] Kasper Hornbæk, et al. Vulture: a mid-air word-gesture keyboard, 2014, CHI.
[14] M. Sheelagh T. Carpendale, et al. A comparison of ray pointing techniques for very large displays, 2010, Graphics Interface.
[15] Brad A. Myers, et al. Maximizing the guessability of symbolic input, 2005, CHI Extended Abstracts.
[16] Jörg Müller, et al. StrikeAPose: revealing mid-air gestures on public displays, 2013, CHI.