Shift and Blend: Understanding the hybrid character of computing artefacts on a tool-agent spectrum

In the context of human-agent interaction, we see the emergence of computational artefacts that display hybridity because they can be experienced as both tools and agents. In this paper we propose a tool-agent spectrum as an analytical lens that uses 'intention' as a central concept. This spectrum aims to clarify how a computational object can shift from being conducive to the intentions of others ('tool') to appearing to have intentions of its own ('agent'), or vice versa. We applied this analytical lens to unravel people's experiences in two hybrid cases: guide dogs as a living mobility aid for the visually impaired, and an experimental wearable object named 'BagSight' as a rudimentary artificial counterpart. We compared both cases through the lens of the tool-agent spectrum and elaborate on the results by discussing some of the principles by which computational artefacts can shift across the spectrum. We conclude by discussing the limitations of this study and providing suggestions for future work.
