Happy Moves, Sad Grooves: Using Theories of Biological Motion and Affect to Design Shape-Changing Interfaces
Haodan Tan | John Tiab | Selma Šabanović | Kasper Hornbæk
[1] Panos Markopoulos,et al. The design space of shape-changing interfaces: a repertory grid study , 2014, Conference on Designing Interactive Systems.
[2] Sriram Subramanian,et al. Is my phone alive?: a large-scale study of shape change in handheld devices using videos , 2014, CHI.
[3] Selma Sabanovic,et al. Deriving Minimal Features for Human-Like Facial Expressions in Robotic Faces , 2014, International Journal of Social Robotics.
[4] Fabian Hemmert,et al. Animate mobiles: proxemically reactive posture actuation as a means of relational interaction with mobile phones , 2013, TEI '13.
[5] S. Marsella,et al. Expressing Emotion Through Posture and Gesture , 2015 .
[6] Wendy Ju,et al. Designing robots with movement in mind , 2014, Journal of Human-Robot Interaction.
[7] Karon E. MacLean,et al. It's alive!: exploring the design space of a gesturing phone , 2013, Graphics Interface.
[8] J. Russell. A circumplex model of affect. , 1980 .
[9] Mark A. Neerincx,et al. Child's recognition of emotions in robot's face and body , 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[10] K. Scherer,et al. Emotion expression in body action and posture. , 2012, Emotion.
[11] Sang-Su Lee,et al. Interactivity attributes: a new way of thinking and describing interactivity , 2009, CHI.
[12] Myung-Suk Kim,et al. Make it move: a movement design method of simple standing products based on systematic mapping of torso movements & product messages , 2013, CHI.
[13] K. Scherer,et al. Bodily expression of emotion , 2009 .
[14] Da Young Ju,et al. Emotional Interaction and Notification of Flexible Handheld Devices , 2015, CHI Extended Abstracts.
[15] G. Johansson. Visual perception of biological motion and a model for its analysis , 1973 .
[16] Marek P. Michalowski,et al. Keepon: A Playful Robot for Research, Therapy, and Entertainment , 2009, Int. J. Soc. Robotics.
[17] Markus Löchtefeld,et al. Morphees: toward high "shape resolution" in self-actuated flexible mobile devices , 2013, CHI.
[18] Jamie Zigelbaum,et al. Shape-changing interfaces , 2011, Personal and Ubiquitous Computing.
[19] Youngwoo Park,et al. The Trial of Bendi in a Coffeehouse: Use of a Shape-Changing Device for a Tactile-Visual Phone Conversation , 2015, CHI.
[20] P. Ekman. An argument for basic emotions , 1992 .
[21] Majken Kirkegaard Rasmussen,et al. Shape-changing interfaces: a review of the design space and open research questions , 2012, CHI.
[22] Oliver G. B. Garrod,et al. Dynamic Facial Expressions of Emotion Transmit an Evolving Hierarchy of Signals over Time , 2014, Current Biology.
[23] M. D. Meijer. The contribution of general features of body movement to the attribution of emotions , 1989 .
[24] Tek-Jin Nam,et al. Inflatable mouse: volume-adjustable mouse with air-pressure-sensitive input and haptic feedback , 2008, CHI.
[25] Mason Bretan,et al. Emotionally expressive dynamic physical behaviors in robots , 2015, Int. J. Hum. Comput. Stud..
[26] Sarah Diefenbach,et al. An interaction vocabulary. describing the how of interaction. , 2013, CHI Extended Abstracts.
[27] Kenji Amaya,et al. Emotion from Motion , 1996, Graphics Interface.
[28] Roel Vertegaal,et al. Organic user interfaces: designing computers in any way, shape, or form , 2007, CACM.
[29] Fabian Hemmert,et al. Living interfaces: the thrifty faucet , 2009, TEI.
[30] M. Bradley,et al. Measuring emotion: the Self-Assessment Manikin and the Semantic Differential. , 1994, Journal of behavior therapy and experimental psychiatry.
[31] Youngwoo Park,et al. Wrigglo: shape-changing peripheral for interpersonal mobile communication , 2014, CHI.
[32] Jérôme Monceaux,et al. Demonstration: first steps in emotional expression of the humanoid robot Nao , 2009, ICMI-MLMI '09.
[33] Robin R. Murphy,et al. Survey of Non-facial/Non-verbal Affective Expressions for Appearance-Constrained Robots , 2008, IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews).
[34] Youn-kyung Lim,et al. Interactivity Attributes for Expression-oriented Interaction Design , 2011 .
[35] Armin Bruderlin,et al. Perceiving affect from arm movement , 2001, Cognition.
[36] Wendy E. Mackay,et al. CHI '13 Extended Abstracts on Human Factors in Computing Systems , 2013, CHI 2013.
[37] Ben Matthews,et al. Easy doesn’t do it: skill and expression in tangible aesthetics , 2007, Personal and Ubiquitous Computing.
[38] Thomas Hanke. HamNoSys – Representing Sign Language Data in Language Resources and Language Processing Contexts , 2004 .