The human body as an interactive computing platform
[1] Paul Kabbash,et al. Human performance using computer input devices in the preferred and non-preferred hands , 1993, INTERCHI.
[2] Matthew J. Hertenstein. Touch: Its Communicative Functions in Infancy , 2002, Human Development.
[3] Steven K. Feiner,et al. Exploring interaction with a simulated wrist-worn projection display , 2005, Ninth IEEE International Symposium on Wearable Computers (ISWC'05).
[4] Patrick Baudisch,et al. Back-of-device interaction allows creating very small touch devices , 2009, CHI.
[5] Melody Moore Jackson,et al. A galvanic skin response interface for people with severe motor disabilities , 2004, Assets '04.
[6] Regan L. Mandryk,et al. Using psychophysiological techniques to measure user experience with entertainment technologies , 2006, Behav. Inf. Technol..
[7] A. Montagu. Touching: The Human Significance of the Skin , 1978.
[8] Matheen Siddiqui,et al. Robust real-time upper body limb detection and tracking , 2006, VSSN '06.
[9] Desney S. Tan,et al. Skinput: appropriating the body as an input surface , 2010, CHI.
[10] Andrew D. Wilson. Using a depth camera as a touch sensor , 2010, ITS '10.
[11] Patrick Baudisch,et al. Imaginary phone: learning imaginary interfaces by transferring spatial memory from a familiar device , 2011, UIST.
[12] Maribeth Gandy Coleman,et al. The Gesture Pendant: A Self-illuminating, Wearable, Infrared Computer Vision System for Home Automation Control and Medical Monitoring , 2000, Digest of Papers. Fourth International Symposium on Wearable Computers.
[13] Yaser Sheikh,et al. Motion capture from body-mounted cameras , 2011, SIGGRAPH.
[14] R W Cholewiak,et al. The generation of vibrotactile patterns on a linear array: Influences of body site, time, and presentation mode , 2000, Perception & psychophysics.
[15] Jun Rekimoto,et al. SmartSkin: an infrastructure for freehand manipulation on interactive surfaces , 2002, CHI.
[16] David Isaacson,et al. Electrical Impedance Tomography , 2002, IEEE Trans. Medical Imaging.
[17] M. Tomita,et al. Exploratory study of touch zones in college students on two campuses , 2008.
[18] C. Spence,et al. The science of interpersonal touch: An overview , 2010, Neuroscience & Biobehavioral Reviews.
[19] J. Warren. Unencumbered Full Body Interaction in Video Games , 2003 .
[20] W. A. Sarnacki,et al. Brain–computer interface (BCI) operation: optimizing information transfer rates , 2003, Biological Psychology.
[21] Kent Lyons,et al. Quickdraw: the impact of mobility and on-body placement on device access time , 2008, CHI.
[22] Ravin Balakrishnan,et al. Zliding: fluid zooming and sliding for high precision parameter manipulation , 2005, UIST.
[23] Khai N. Truong,et al. Virtual shelves: interactions with orientation aware devices , 2009, UIST '09.
[24] Daniel J. Wigdor,et al. TiltText: using tilt for text input to mobile phones , 2003, UIST '03.
[25] Joshua R. Smith. Field Mice: Extracting Hand Geometry from Electric Field Measurements , 1996, IBM Syst. J..
[26] Dieter Fox,et al. Sparse distance learning for object recognition combining RGB and depth information , 2011, 2011 IEEE International Conference on Robotics and Automation.
[27] Chris Harrison,et al. Whack gestures: inexact and inattentive interaction with mobile devices , 2010, TEI '10.
[28] François Guimbretière,et al. Techniques , 2011, Laboratory Investigation.
[29] Ivan Poupyrev,et al. Motionbeam: a metaphor for character interaction with handheld projectors , 2011, CHI.
[30] Yvonne Rogers,et al. Fat Finger Worries: How Older and Younger Users Physically Interact with PDAs , 2005, INTERACT.
[31] Tovi Grossman,et al. The design and evaluation of multitouch marking menus , 2010, CHI.
[32] Sami Laakso,et al. Design of a body-driven multiplayer game system , 2006, CIE.
[33] Desney S. Tan,et al. Using a low-cost electroencephalograph for task classification in HCI research , 2006, UIST.
[34] Ian H. Witten,et al. The WEKA data mining software: an update , 2009, SIGKDD Explor.
[35] Masatoshi Ishikawa,et al. Smart laser-scanner for 3D human-machine interface , 2005, CHI EA '05.
[36] Ken Perlin,et al. The UnMousePad: an interpolating multi-touch force-sensing input pad , 2009, SIGGRAPH.
[37] Ken Hinckley,et al. Sensor synaesthesia: touch in motion, and motion in touch , 2011, CHI.
[38] Khai N. Truong,et al. Leveraging proprioception to make mobile phones more accessible to users with visual impairments , 2010, ASSETS '10.
[39] Bruce H. Thomas,et al. Where Does the Mouse Go? An Investigation into the Placement of a Body-Attached TouchPad Mouse for Wearable Computers , 2002, Personal and Ubiquitous Computing.
[40] R. A. Faste,et al. The Role of Aesthetics in Engineering , 1995 .
[41] Gregory D. Abowd,et al. Opportunistic Annexing for Handheld Devices: Opportunities and Challenges , 2003 .
[42] Adam Kendon,et al. How gestures can become like words , 1988 .
[43] Joseph A. Paradiso,et al. Passive acoustic knock tracking for interactive windows , 2002, CHI Extended Abstracts.
[44] Henry Been-Lirn Duh,et al. Trends in augmented reality tracking, interaction and display: A review of ten years of ISMAR , 2008, 2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality.
[45] Chris Harrison,et al. Abracadabra: wireless, high-precision, and unpowered finger input for very small mobile devices , 2009, UIST '09.
[46] D. Patten. What lies beneath: the use of three-dimensional projection in living anatomy teaching , 2007.
[47] Chris Harrison,et al. Scratch input: creating large, inexpensive, unpowered and mobile finger input surfaces , 2008, UIST '08.
[48] Paul A. Beardsley,et al. Interaction using a handheld projector , 2005, IEEE Computer Graphics and Applications.
[49] Y. Guiard. Asymmetric division of labor in human skilled bimanual action: the kinematic chain as a model. , 1987, Journal of motor behavior.
[50] Abigail Sellen,et al. Two-handed input in a compound task , 1994, CHI.
[51] Daniel C. McFarlane,et al. Interactive dirt: increasing mobile work performance with a wearable projector-camera system , 2009, UbiComp.
[52] Eric Lecolinet,et al. MicroRolls: expanding touch-screen input vocabulary by distinguishing rolls vs. slides of the thumb , 2009, CHI.
[53] Shogo Nishida,et al. Mobile Interfaces Using Body Worn Projector and Camera , 2009, HCI.
[54] M. Knapp,et al. Nonverbal communication in human interaction , 1972 .
[55] Xiao Li,et al. The vocal joystick: evaluation of voice-based cursor control techniques , 2006, Assets '06.
[56] John Zimmerman,et al. The SenseChair: the lounge chair as an intelligent assistive device for elders , 2005, DUX '05.
[57] Joseph A. Paradiso,et al. Swept-frequency, magnetically-coupled resonant tags for realtime, continuous, multiparameter control , 1999, CHI EA '99.
[58] W. Buxton,et al. A study in two-handed input , 1986, CHI '86.
[59] Peter Eades,et al. Information Display , 2006, Handbook of Nature-Inspired and Innovative Computing.
[60] Kosuke Sato,et al. PALMbit-Silhouette: A User Interface by Superimposing Palm-Silhouette to Access Wall Displays , 2009, HCI.
[61] Jacob Buur,et al. Getting a grip on tangible interaction: a framework on physical space and social interaction , 2006, CHI.
[62] Patrick Baudisch,et al. Disappearing mobile devices , 2009, UIST '09.
[63] Ivan Poupyrev,et al. Tactile interfaces for small touch screens , 2003, UIST '03.
[64] Shumin Zhai,et al. Camera phone based motion sensing: interaction techniques, applications and performance study , 2006, UIST.
[65] Regan L. Mandryk,et al. A fuzzy physiological approach for continuously modeling emotion during interaction with play technologies , 2007, Int. J. Hum. Comput. Stud..
[66] W. Buxton,et al. A multi-touch three dimensional touch-sensitive tablet , 1985, CHI '85.
[67] Angela Barnett. The dancing body as a screen: Synchronizing projected motion graphics onto the human form in contemporary dance , 2009, CIE.
[68] F. R. Wilson. The Hand: How Its Use Shapes the Brain, Language, and Human Culture , 1998 .
[69] Ankur Agarwal,et al. Learning to track 3D human motion from silhouettes , 2004, ICML.
[70] G. Matheson,et al. Vibromyography as a quantitative measure of muscle force production. , 1997, Scandinavian journal of rehabilitation medicine.
[71] Andrew D. Wilson. Robust computer vision-based detection of pinching for one and two-handed gesture input , 2006, UIST.
[72] Richard A. Bolt,et al. “Put-that-there”: Voice and gesture at the graphics interface , 1980, SIGGRAPH '80.
[73] Brett Kaufman,et al. OsteoConduct: wireless body-area communication based on bone conduction , 2007, BODYNETS.
[74] Joseph A. Paradiso,et al. Sensor systems for interactive surfaces , 2000, IBM Syst. J..
[75] Yang Li,et al. Experimental analysis of mode switching techniques in pen-based user interfaces , 2005, CHI.
[76] Roel Vertegaal,et al. PaperPhone: understanding the use of bend gestures in mobile devices with flexible electronic paper displays , 2011, CHI.
[77] Jeremy M. Wolfe,et al. Sensation and Perception , 2006, Sinauer Associates, Sunderland, Massachusetts.
[78] John G. Webster,et al. Medical Instrumentation: Application and Design , 1997 .
[79] Joseph A. Paradiso,et al. Tangible Music Interfaces using Passive Magnetic Tags , 2001, NIME.
[80] Robert Rosenberg. The biofeedback pointer: EMG control of a two dimensional pointer , 1998, Digest of Papers. Second International Symposium on Wearable Computers (Cat. No.98EX215).
[81] Stefan Rennick Egglestone,et al. Breath control of amusement rides , 2011, CHI.
[82] Simon Rogers,et al. AnglePose: robust, precise capacitive touch tracking via 3d orientation estimation , 2011, CHI.
[83] Desney S. Tan,et al. Enabling always-available input with muscle-computer interfaces , 2009, UIST '09.
[84] Desney S. Tan,et al. Demonstrating the feasibility of using forearm electromyography for muscle-computer interfaces , 2008, CHI.
[85] Richard Martin,et al. Design for wearability , 1998, Digest of Papers. Second International Symposium on Wearable Computers (Cat. No.98EX215).
[86] Joseph A. Paradiso,et al. Electric Field Sensing For Graphical Interfaces , 1998, IEEE Computer Graphics and Applications.
[87] Joseph A. Paradiso,et al. Applying electric field sensing to human-computer interfaces , 1995, CHI '95.
[88] Claudio S. Pinhanez. The Everywhere Displays Projector: A Device to Create Ubiquitous Graphical Interfaces , 2001, UbiComp.
[89] Jun Rekimoto,et al. HoloWall: designing a finger, hand, body, and object sensitive wall , 1997, UIST '97.
[90] Shahram Izadi,et al. SideSight: multi-"touch" interaction around small devices , 2008, UIST '08.
[91] Erin Manning,et al. Politics of Touch: Sense, Movement, Sovereignty , 2006 .
[92] Carlo Tomasi,et al. Full-size projection keyboard for handheld devices , 2003, CACM.
[93] Chris Harrison,et al. Where to locate wearable displays?: reaction time performance of visual alerts from tip to toe , 2009, CHI.
[94] J. C. Allen. Out of the lab and into the real world , 2005 .
[95] Ian H. Witten,et al. Data Mining: Practical Machine Learning Tools and Techniques , 2014.
[96] Chris Harrison,et al. Using shear as a supplemental two-dimensional input channel for rich touchscreen interaction , 2012, CHI.
[97] Leo Donnelly,et al. Virtual human dissector as a learning tool for studying cross-sectional anatomy , 2009, Medical teacher.
[98] Shumin Zhai,et al. More than dotting the i's --- foundations for crossing-based interfaces , 2002, CHI.
[99] Patrick Baudisch,et al. The generalized perceived input point model and how to double touch accuracy by extracting fingerprints , 2010, CHI.
[100] Scott Counts,et al. Exploring wearable ambient displays for social awareness , 2006, CHI EA '06.
[101] Guy Weinzapfel,et al. One-point touch input of vector information for computer displays , 1978, SIGGRAPH '78.
[102] Geehyuk Lee,et al. Force gestures: augmented touch screen gestures using normal and tangential force , 2011, CHI Extended Abstracts.
[103] Andrew D. Wilson. PlayAnywhere: a compact interactive tabletop projection-vision system , 2005, UIST.
[104] Desney S. Tan,et al. Feasibility and pragmatics of classifying working memory load with an electroencephalograph , 2008, CHI.
[105] Feng Wang,et al. Empirical evaluation for finger input properties in multi-touch interaction , 2009, CHI.
[106] Kent Lyons,et al. Augmenting conversations using dual-purpose speech , 2004, UIST '04.
[107] Loren G. Terveen,et al. The sound of one hand: a wrist-mounted bio-acoustic fingertip gesture interface , 2002, CHI Extended Abstracts.
[108] Jacob O. Wobbrock,et al. Bonfire: a nomadic system for hybrid laptop-tabletop interaction , 2009, UIST '09.
[109] Bill Tomlinson,et al. Who are the crowdworkers?: shifting demographics in mechanical turk , 2010, CHI Extended Abstracts.
[110] Joseph A. Paradiso,et al. PingPongPlus: design of an athletic-tangible interface for computer-supported cooperative play , 1999, CHI '99.
[111] Clayton Valli,et al. The Gallaudet Dictionary of American Sign Language , 2021 .
[112] S. Gallagher. How the body shapes the mind , 2005 .
[113] Ali Israr,et al. TeslaTouch: electrovibration for touch surfaces , 2010, UIST.
[114] Patrick Olivier,et al. SurfaceMouse: supplementing multi-touch interaction with a virtual mouse , 2011, Tangible and Embedded Interaction.
[115] Ivan Poupyrev,et al. Touché: enhancing touch interaction on humans, screens, liquids, and everyday objects , 2012, CHI.
[116] Thad Starner,et al. The Role of Speech Input in Wearable Computing , 2002, IEEE Pervasive Comput..
[117] Virpi Roto,et al. Interaction in 4-second bursts: the fragmented nature of attentional resources in mobile HCI , 2005, CHI.
[118] Kosuke Sato,et al. A wearable mixed reality with an on-board projector , 2003, Second IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR '03).
[119] Xiang Cao,et al. Multi-user interaction using handheld projectors , 2007, UIST.
[120] Patrick Baudisch,et al. Touch input on curved surfaces , 2011, CHI.
[121] D. Keltner,et al. Touch communicates distinct emotions. , 2006, Emotion.
[122] Desney S. Tan,et al. Making muscle-computer interfaces more practical , 2010, CHI.
[123] Thomas G. Zimmerman. Personal Area Networks: Near-field intrabody communication , 1996, IBM Syst. J.
[124] Chris Harrison,et al. Lean and zoom: proximity-aware user interface and content magnification , 2008, CHI.
[125] Manolis I. A. Lourakis,et al. Vision-Based Interpretation of Hand Gestures for Remote Control of a Computer Mouse , 2006, ECCV Workshop on HCI.
[126] J. Löwgren,et al. Touching a Stranger: Designing for Engaging Experience in Embodied Interaction , 2011 .
[127] Per Ola Kristensson,et al. I did that! Measuring users' experience of agency in their own actions , 2012, CHI.
[128] Mircea Nicolescu,et al. Vision-based hand pose estimation: A review , 2007, Comput. Vis. Image Underst..
[129] Hrvoje Benko,et al. Combining multiple depth cameras and projectors for interactions on, above and between surfaces , 2010, UIST.
[130] Lorna M. Brown,et al. Multidimensional tactons for non-visual information presentation in mobile devices , 2006, Mobile HCI.
[131] H. Harry Asada,et al. Measurement of finger posture and three-axis fingertip touch force using fingernail sensors , 2004, IEEE Transactions on Robotics and Automation.
[132] Stefan Weber,et al. A Portable Image Overlay Projection Device for Computer-Aided Open Liver Surgery , 2011, IEEE Transactions on Biomedical Engineering.
[133] J. A. Wilson,et al. Two-dimensional movement control using electrocorticographic signals in humans , 2008, Journal of neural engineering.
[134] Hal Philipp,et al. Charge transfer sensing , 1999 .
[135] Ivan Poupyrev,et al. SideBySide: ad-hoc multi-user interaction with handheld projectors , 2011, UIST.
[136] Andrew Sears,et al. Improving Touchscreen Keyboards: Design Issues and a Comparison with Other Devices , 1991, Interact. Comput..
[137] Chris Harrison,et al. Minput: enabling interaction on small mobile devices with high-precision, low-cost, multipoint optical tracking , 2010, CHI.
[138] Xiang Cao,et al. ShapeTouch: Leveraging contact shape on interactive surfaces , 2008, 2008 3rd IEEE International Workshop on Horizontal Interactive Human Computer Systems.
[139] Kent Lyons,et al. An investigation into round touchscreen wristwatch interaction , 2008, Mobile HCI.
[140] Alva Noë,et al. Action in Perception , 2006, Representation and Mind.
[141] K R Foster,et al. Whole-body impedance--what does it measure? , 1996, The American journal of clinical nutrition.
[142] Desney S. Tan,et al. Enhancing input on and above the interactive surface with muscle sensing , 2009, ITS '09.
[143] Kent Lyons,et al. Twiddler typing: one-handed chording text entry for mobile phones , 2004, CHI.
[144] Xiang Cao,et al. Interacting with dynamically defined information spaces using a handheld projector and a pen , 2006, UIST.
[145] Frederick P. Brooks,et al. Moving objects in space: exploiting proprioception in virtual-environment interaction , 1997, SIGGRAPH.
[146] A. Chapanis. Theory and methods for analyzing errors in man-machine systems , 1951, Annals of the New York Academy of Sciences.
[147] Chris Harrison,et al. OmniTouch: wearable multitouch interaction everywhere , 2011, UIST.
[148] Alex Pentland,et al. Real-Time American Sign Language Recognition Using Desk and Wearable Computer Based Video , 1998, IEEE Trans. Pattern Anal. Mach. Intell..
[149] Stuart K. Card,et al. Evaluation of mouse, rate-controlled isometric joystick, step keys, and text keys, for text selection on a CRT , 1987 .
[150] Robert J. K. Jacob,et al. Brain measurement for usability testing and adaptive interfaces: an example of uncovering syntactic workload with functional near infrared spectroscopy , 2009, CHI.
[151] Roel Vertegaal,et al. Organic user interfaces: designing computers in any way, shape, or form , 2007, CACM.
[152] Ken Hinckley,et al. A survey of design issues in spatial input , 1994, UIST '94.
[153] Stanley E. Jones,et al. A naturalistic study of the meanings of touch , 1985 .
[154] Patrick Baudisch,et al. Imaginary interfaces: spatial interaction with empty hands and without visual feedback , 2010, UIST.
[155] Philip R. Cohen. The role of natural language in a multimodal interface , 1992, UIST '92.
[156] J. Löwgren. Towards an articulation of interaction aesthetics , 2009 .
[157] Adrian Hilton,et al. A survey of advances in vision-based human motion capture and analysis , 2006, Comput. Vis. Image Underst..