Sensing through structure

We present an approach to designing input devices that focuses on the structure of materials. We explore and visualize how a material reacts under manipulation, and harness the material's properties to design new movement sensors. This approach yields two benefits. First, simpler sensing emerges from making use of the structure already present in the material. Second, by working with the material's natural structure, we create input devices with readily recognizable affordances. We present six projects that use this approach. We exploit the natural structure (coordination) of the human body to map readings from five clothing-mounted accelerometers to high-quality motion capture data, creating a low-cost performance animation system. We design silicone input devices with embedded texture that allows single-camera tracking. We study squishable, conformable materials such as foam and silicone, and create a vocabulary of unit structures (shaped cuts in the material) that harness patterns of compression and tension to capture particular manipulations. We use this vocabulary to build soft sensing skeletons for stuffed animals, constructing foam cores with e-textile versions of our unit structures, and to design a tongue input device in collaboration with Disney Imagineering. Finally, we rethink the vocabulary for hollow 3D-printed rubber shapes, using air pressure sensors to capture manipulations, and 3D-print several interactive robots that incorporate the new vocabulary.
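
For concreteness, here is a minimal sketch of one plausible realization of the accelerometer-to-mocap mapping: a nearest-neighbor lookup of short windows of live sensor readings against a database of pre-recorded motion with synthesized accelerations. The array names, window length, and plain Euclidean metric are illustrative assumptions, not the system's actual implementation.

```python
# Hedged sketch: reconstruct full-body poses from a few accelerometers by
# nearest-neighbor lookup in a motion database. ACCEL_DB / POSE_DB and the
# Euclidean window match are assumptions for illustration only.
import numpy as np

# ACCEL_DB: (n_frames, 15) accelerations synthesized from mocap
# (5 sensors x 3 axes); POSE_DB: (n_frames, n_dofs) joint angles.
rng = np.random.default_rng(0)
ACCEL_DB = rng.normal(size=(10_000, 15))
POSE_DB = rng.normal(size=(10_000, 60))

def nearest_pose(live_accel: np.ndarray, window: int = 8) -> np.ndarray:
    """Match a window of live readings (window x 15) against every
    same-length window in the database; return the pose at the end of
    the best-matching window."""
    flat = live_accel.reshape(-1)
    n = ACCEL_DB.shape[0] - window + 1
    # Stack database windows as rows: (n, window * 15).
    windows = np.stack([ACCEL_DB[i:i + window].reshape(-1) for i in range(n)])
    best = int(np.argmin(np.linalg.norm(windows - flat, axis=1)))
    return POSE_DB[best + window - 1]

live = rng.normal(size=(8, 15))   # stand-in for real sensor input
print(nearest_pose(live).shape)   # -> (60,)
```

In practice such a lookup would likely be accelerated with a spatial index and smoothed over time; the sketch only shows the shape of the data flow from sparse accelerations to full poses.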
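
Similarly, a minimal sketch of capturing manipulations of a hollow rubber shape with an air pressure sensor, assuming a baseline-plus-threshold detector. The abstract does not specify the detection scheme, so read_pressure() and the thresholding below are illustrative placeholders, not the project's actual detector.

```python
# Hedged sketch: detect squeezes of a hollow 3D-printed rubber shape from
# cavity air pressure. read_pressure() stands in for a real sensor driver;
# the baseline/threshold scheme is an assumption for illustration.
import random
import time

def read_pressure() -> float:
    # Placeholder for a real read (e.g. an I2C barometer), in kPa.
    return 101.3 + random.gauss(0.0, 0.05)

def detect_squeezes(n_samples: int = 1000,
                    threshold_kpa: float = 0.5) -> None:
    """Track a slow-moving baseline and report when cavity pressure rises
    more than threshold_kpa above it (i.e. the shape is squeezed)."""
    baseline = read_pressure()
    squeezed = False
    for _ in range(n_samples):
        p = read_pressure()
        baseline += 0.01 * (p - baseline)  # exponential moving average
        if not squeezed and p - baseline > threshold_kpa:
            squeezed = True
            print("squeeze started")
        elif squeezed and p - baseline < threshold_kpa / 2:  # hysteresis
            squeezed = False
            print("squeeze released")
        time.sleep(0.01)

detect_squeezes()
```

The moving baseline compensates for ambient pressure drift, and the half-threshold release adds hysteresis so a single squeeze is not reported twice.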
