Enabling In Situ & Context-Based Motion Gesture Design
[1] Gary M. Weiss,et al. Activity recognition using cell phone accelerometers , 2011, SIGKDD Explor..
[2] Maribeth Gandy Coleman,et al. The Gesture Pendant: A Self-illuminating, Wearable, Infrared Computer Vision System for Home Automation Control and Medical Monitoring , 2000, Digest of Papers. Fourth International Symposium on Wearable Computers.
[3] Eric Horvitz,et al. Sensing techniques for mobile interaction , 2000, UIST '00.
[4] Joseph A. Paradiso,et al. The gesture recognition toolkit , 2014, J. Mach. Learn. Res..
[5] Otmar Hilliges,et al. In-air gestures around unmodified mobile devices , 2014, UIST.
[6] Céline Coutrix,et al. Designing guiding systems for gesture-based interaction , 2015, EICS.
[7] Desney S. Tan,et al. Enabling always-available input with muscle-computer interfaces , 2009, UIST '09.
[8] John Krumm,et al. Ubiquitous Computing Fundamentals , 2009 .
[9] Jerry Alan Fails,et al. A design tool for camera-based interaction , 2003, CHI '03.
[10] Pedro M. Domingos,et al. Programming by demonstration: a machine learning approach , 2001 .
[11] Thad Starner. The Challenges of Wearable Computing: Part 2 , 2001, IEEE Micro.
[12] Chih-Jen Lin,et al. LIBSVM: A library for support vector machines , 2011, TIST.
[13] Michel Beaudouin-Lafon,et al. Instrumental interaction: an interaction model for designing post-WIMP user interfaces , 2000, CHI.
[14] Gregory D. Abowd,et al. Charting past, present, and future research in ubiquitous computing , 2000, TCHI.
[15] Jonna Häkkilä,et al. Charting user preferences on wearable visual markers , 2016, ISWC.
[16] Ken Hinckley,et al. Sensor synaesthesia: touch in motion, and motion in touch , 2011, CHI.
[17] Da-Yuan Huang,et al. Cyclops: Wearable and Single-Piece Full-Body Gesture Input Devices , 2015, CHI.
[18] Ted Selker,et al. Context-aware design and interaction in computer systems , 2000, IBM Syst. J..
[19] Thomas P. Moran,et al. Embodied User Interfaces: Towards Invisible User Interfaces , 1998, EHCI.
[20] Austin Henderson,et al. Interaction design: beyond human-computer interaction , 2002, UBIQ.
[21] William Buxton,et al. Pen + touch = new tools , 2010, UIST.
[22] Jock D. Mackinlay,et al. The design space of input devices , 1990, CHI '90.
[23] Anind K. Dey,et al. a CAPpella: programming by demonstration of context-aware applications , 2004, CHI.
[24] James A. Landay,et al. "Those look similar!" issues in automating gesture design advice , 2001, PUI '01.
[25] Orit Shaer,et al. Reality-based interaction: a framework for post-WIMP interfaces , 2008, CHI.
[26] Desney S. Tan,et al. An ultra-low-power human body motion sensor using static electric field sensing , 2012, UbiComp.
[27] Tracy L. Westeyn,et al. Georgia tech gesture toolkit: supporting experiments in gesture recognition , 2003, ICMI '03.
[28] Ken Hinckley,et al. Synchronous gestures for multiple persons and computers , 2003, UIST '03.
[29] Yang Li,et al. Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes , 2007, UIST.
[30] James D. Hollan,et al. Direct Manipulation Interfaces , 1985, Hum. Comput. Interact..
[31] Thad Starner. The Challenges of Wearable Computing , 2001, IEEE Micro.
[32] Yücel Yemez,et al. Hotspotizer: end-user authoring of mid-air gestural interactions , 2014, NordiCHI.
[33] Yang Li,et al. Tap, swipe, or move: attentional demands for distracted smartphone input , 2012, AVI.
[34] Dimitre Novatchev,et al. Chunking and Phrasing and the Design of Human-Computer Dialogues - Response , 1986, IFIP Congress.
[35] Peter Andras,et al. On preserving statistical characteristics of accelerometry data using their empirical cumulative distribution , 2013, ISWC '13.
[36] Gaël Varoquaux,et al. Scikit-learn: Machine Learning in Python , 2011, J. Mach. Learn. Res..
[37] Gregory D. Abowd,et al. BackTap: robust four-point tapping on the back of an off-the-shelf smartphone , 2013, UIST '13 Adjunct.
[38] Von-Wun Soo,et al. Context-dependent Action Interpretation in Interactive Storytelling Games , 2012, ACHI 2012.
[39] Joseph J. LaViola,et al. Context aware 3D gesture recognition for games and virtual reality , 2015, SIGGRAPH Courses.
[40] Yang Li,et al. Activity-based prototyping of ubicomp applications for long-lived, everyday human activities , 2008, CHI.
[41] Teddy Seyed,et al. User Elicitation on Single-hand Microgestures , 2016, CHI.
[42] Yang Li,et al. Mogeste: mobile tool for in-situ motion gesture design , 2016, UbiComp Adjunct.
[43] Shwetak N. Patel,et al. ContextType: using hand posture information to improve mobile touch screen text entry , 2013, CHI.
[44] Thad Starner,et al. MAGIC summoning: towards automatic suggesting and testing of gestures with low probability of false positives during use , 2013, J. Mach. Learn. Res..
[45] James A. Landay,et al. Implications for a gesture design tool , 1999, CHI '99.
[46] Tony DeRose,et al. Proton: multitouch gestures as regular expressions , 2012, CHI.
[47] James A. Landay,et al. Quill: a gesture design tool for pen-based user interfaces , 2001 .
[48] Shwetak N. Patel,et al. GripSense: using built-in sensors to detect hand posture and pressure on commodity mobile phones , 2012, UIST.
[49] James A. Landay,et al. Investigating statistical machine learning as a tool for software development , 2008, CHI.
[50] Gregory D. Abowd,et al. What next, ubicomp?: celebrating an intellectual disappearing act , 2012, UbiComp.
[51] Beat Signer,et al. Midas: a declarative multi-touch interaction framework , 2010, TEI.
[52] Pattie Maes,et al. SixthSense: a wearable gestural interface , 2009, SIGGRAPH ASIA Art Gallery & Emerging Technologies.
[53] Tom Igoe,et al. Physical computing: sensing and controlling the physical world with computers , 2004 .
[54] Niels Henze,et al. Gesture recognition with a Wii controller , 2008, TEI.
[55] Meredith Ringel Morris,et al. User-defined gestures for surface computing , 2009, CHI.
[56] Gregory D. Abowd,et al. A Conceptual Framework and a Toolkit for Supporting the Rapid Prototyping of Context-Aware Applications , 2001, Hum. Comput. Interact..
[57] Yang Li,et al. Gesture coder: a tool for programming multi-touch gestures by demonstration , 2012, CHI.
[58] Eamonn J. Keogh,et al. Scaling and time warping in time series querying , 2005, The VLDB Journal.
[59] Brad A. Myers,et al. Marquise: creating complete user interfaces by demonstration , 1993, CHI '93.
[60] Gregory D. Abowd,et al. WatchOut: extending interactions on a smartwatch with inertial sensing , 2016, MobileHCI.
[61] Dean Rubine,et al. Specifying gestures by example , 1991, SIGGRAPH.
[62] Desney S. Tan,et al. Humantenna: using the body as an antenna for real-time whole-body interaction , 2012, CHI.
[63] Gregory D. Abowd,et al. The context toolkit: aiding the development of context-enabled applications , 1999, CHI '99.
[64] Richard A. Bolt,et al. “Put-that-there”: Voice and gesture at the graphics interface , 1980, SIGGRAPH '80.
[65] Nicolai Marquardt,et al. WatchConnect: A Toolkit for Prototyping Smartwatch-Centric Cross-Device Applications , 2015, CHI.
[66] Austin Henderson,et al. Making sense of sensing systems: five questions for designers and researchers , 2002, CHI.
[67] Tony P. Pridmore,et al. Expected, sensed, and desired: A framework for designing sensing-based interaction , 2005, TCHI.
[68] Kent L. Norman,et al. Development of an instrument measuring user satisfaction of the human-computer interface , 1988, CHI '88.
[69] Daqing Zhang,et al. Gesture Recognition with a 3-D Accelerometer , 2009, UIC.
[70] William Buxton,et al. Usability evaluation considered harmful (some of the time) , 2008, CHI.
[71] Henry Lieberman,et al. Watch what I do: programming by demonstration , 1993 .
[72] Yang Li,et al. CrowdLearner: rapidly creating mobile recognizers using crowdsourcing , 2013, UIST.
[73] Scott R. Klemmer,et al. How bodies matter: five themes for interaction design , 2006, DIS '06.
[74] Lawrence R. Rabiner,et al. A tutorial on hidden Markov models and selected applications in speech recognition , 1989, Proc. IEEE.
[75] Anind K. Dey,et al. Serendipity: Finger Gesture Recognition using an Off-the-Shelf Smartwatch , 2016, CHI.
[76] Brad A. Myers,et al. User interface software tools , 1995, TCHI.
[77] Michael Rohs,et al. User-defined gestures for connecting mobile phones, public displays, and tabletops , 2010, Mobile HCI.
[78] Paul Dourish,et al. Where the action is , 2001 .
[79] Desney S. Tan,et al. SoundWave: using the doppler effect to sense gestures , 2012, CHI.
[80] Stephen A. Brewster,et al. Foot tapping for mobile interaction , 2010, BCS HCI.
[81] Sean A. Munson,et al. Exploring the design space of glanceable feedback for physical activity trackers , 2016, UbiComp.
[82] Scott R. Klemmer,et al. Authoring sensor-based interactions by demonstration with direct manipulation and pattern recognition , 2007, CHI.
[83] Alan Wexelblat. Research Challenges in Gesture: Open Issues and Unsolved Problems , 1997, Gesture Workshop.
[84] Joseph J. LaViola,et al. GestureBar: improving the approachability of gesture-based interfaces , 2009, CHI.
[85] K. Hinckley. Input technologies and techniques , 2002 .
[86] Gregory D. Abowd,et al. Whoosh: non-voice acoustics for low-cost, hands-free, and rapid input on smartwatches , 2016, ISWC.
[87] Daniel Ashbrook. Enabling mobile microinteractions , 2010 .
[88] Yang Li,et al. DoubleFlip: a motion gesture delimiter for mobile interaction , 2010, UIST '10.
[89] Anoop K. Sinha,et al. Suede: a Wizard of Oz prototyping tool for speech user interfaces , 2000, UIST '00.
[90] Bernt Schiele,et al. A tutorial on human activity recognition using body-worn inertial sensors , 2014, CSUR.
[91] Xiang 'Anthony' Chen,et al. Air+touch: interweaving touch & in-air gestures , 2014, UIST.
[92] Yang Li,et al. Protractor: a fast and accurate gesture recognizer , 2010, CHI.
[93] Ken Hinckley,et al. A survey of design issues in spatial input , 1994, UIST '94.
[94] Perry R. Cook,et al. A Meta-Instrument for Interactive, On-the-Fly Machine Learning , 2009, NIME.
[95] Jun Rekimoto,et al. GestureWrist and GesturePad: unobtrusive wearable interaction devices , 2001, Proceedings Fifth International Symposium on Wearable Computers.
[96] Ayanna M. Howard,et al. Towards a canine-human communication system based on head gestures , 2015, Advances in Computer Entertainment.
[97] Christina Boucher,et al. Exploring Non-touchscreen Gestures for Smartwatches , 2016, CHI.
[98] Gregory D. Abowd,et al. TapSkin: Recognizing On-Skin Input for Smartwatches , 2016, ISS.
[99] Hasti Seifi,et al. Exploring the design space of touch-based vibrotactile interactions for smartwatches , 2016, MobileHCI.
[100] Dan R. Olsen,et al. Evaluating user interface systems research , 2007, UIST.
[101] Tek-Jin Nam,et al. CompositeGesture: Creating Custom Gesture Interfaces with Multiple Mobile or Wearable Devices , 2017 .
[102] Gaetano Borriello,et al. Location Systems for Ubiquitous Computing , 2001, Computer.
[103] James A. Landay,et al. Sketching Interfaces: Toward More Human Interface Design , 2001, Computer.
[104] Ivan E. Sutherland. Sketchpad: a man-machine graphical communication system , 1964, DAC.
[105] Anind K. Dey,et al. Understanding and Using Context , 2001, Personal and Ubiquitous Computing.
[106] Timothy Sohn,et al. iCAP: Interactive Prototyping of Context-Aware Applications , 2006, Pervasive.
[107] Saul Greenberg,et al. Phidgets: easy development of physical interfaces through physical widgets , 2001, UIST '01.
[108] Kent Lyons,et al. GART: The Gesture and Activity Recognition Toolkit , 2007, HCI.
[109] Michael Rohs,et al. A $3 gesture recognizer: simple gesture recognition for devices equipped with 3D acceleration sensors , 2010, IUI '10.
[110] James A. Landay,et al. Interactive sketching for the early stages of user interface design , 1995, CHI '95.
[111] Pengfei Liu,et al. Mobile WEKA as Data Mining Tool on Android , 2012 .
[112] Fabrice Matulic,et al. Sensing techniques for tablet+stylus interaction , 2014, UIST.
[113] Xiang 'Anthony' Chen,et al. Motion and context sensing techniques for pen computing , 2013, Graphics Interface.
[114] Joëlle Coutaz,et al. A design space for multimodal systems: concurrent processing and data fusion , 1993, INTERCHI.
[115] Scott E. Hudson,et al. Concepts, Values, and Methods for Technical Human-Computer Interaction Research , 2014, Ways of Knowing in HCI.
[116] Thad Starner,et al. MAGIC: a motion gesture design tool , 2010, CHI.
[117] Tek-Jin Nam,et al. EventHurdle: supporting designers' exploratory interaction prototyping with gesture-based sensors , 2013, CHI.
[118] Michael Rohs,et al. Protractor3D: a closed-form solution to rotation-invariant 3D gestures , 2011, IUI '11.
[119] Jakob Nielsen,et al. Gestural interfaces: a step backward in usability , 2010, INTR.
[120] Thad Starner,et al. Detecting Mastication: A Wearable Approach , 2015, ICMI.
[121] Pedro M. Domingos. A few useful things to know about machine learning , 2012, Commun. ACM.
[122] J. B. Brooke,et al. SUS: A 'Quick and Dirty' Usability Scale , 1996 .
[123] Anind K. Dey,et al. Toolkit to support intelligibility in context-aware applications , 2010, UbiComp.
[124] Bill Buxton,et al. Sketching User Experiences: Getting the Design Right and the Right Design , 2007 .
[125] Tanja Schultz,et al. Airwriting: a wearable handwriting recognition system , 2013, Personal and Ubiquitous Computing.
[126] Paul Lukowicz,et al. Performance metrics for activity recognition , 2011, TIST.
[127] Li-Wei Chan,et al. CyclopsRing: Enabling Whole-Hand and Context-Aware Interactions Through a Fisheye Ring , 2015, UIST.
[128] Ted Selker,et al. Context-aware design and interaction in computer systems , 2000, IBM Syst. J..
[129] Yang Li,et al. User-defined motion gestures for mobile interaction , 2011, CHI.
[130] Brad A. Myers,et al. Past, Present and Future of User Interface Software Tools , 2000, TCHI.
[131] Tek-Jin Nam,et al. M.Gesture: An Acceleration-Based Gesture Authoring System on Multiple Handheld and Wearable Devices , 2016, CHI.
[132] Tony DeRose,et al. Proton++: a customizable declarative multitouch framework , 2012, UIST.
[133] Jerry Alan Fails,et al. Interactive machine learning , 2003, IUI '03.
[134] Ian H. Witten,et al. The WEKA data mining software: an update , 2009, SIGKDD Explor..
[135] Jeremy Scott,et al. Sensing foot gestures from the pocket , 2010, UIST.
[136] Yang Li,et al. Teaching motion gestures via recognizer feedback , 2014, IUI.
[137] Yang Li,et al. Gesture studio: authoring multi-touch interactions through demonstration and declaration , 2013, CHI.
[138] Gregory D. Abowd,et al. BeyondTouch: Extending the Input Language with Built-in Sensors on Commodity Smartphones , 2015, IUI.
[139] Sriram Subramanian,et al. Would you do that?: understanding social acceptance of gestural interfaces , 2010, Mobile HCI.
[140] Frédéric Bevilacqua,et al. Fluid gesture interaction design: Applications of continuous recognition for the design of modern gestural interfaces , 2014, TIIS.