Figaro: A Tabletop Authoring Environment for Human-Robot Interaction

Human-robot interaction designers and developers navigate a complex design space, which creates a need for tools that support intuitive design processes and harness the programming capacity of state-of-the-art authoring environments. We introduce Figaro, an expressive tabletop authoring environment for mobile robots, inspired by shadow puppetry, that provides designers with a natural, situated representation of human-robot interactions while exploiting the intuitiveness of tabletop and tangible programming interfaces. On the tabletop, Figaro projects a representation of an environment. Users demonstrate sequences of behaviors, or scenes, of an interaction by manipulating instrumented figurines that represent the robot and the human. During a scene, Figaro records the movement of figurines on the tabletop and narrations uttered by users. Subsequently, Figaro employs real-time program synthesis to assemble a complete robot program from all scenes provided. Through a user study, we demonstrate the ability of Figaro to support design exploration and development for human-robot interaction.
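To give a flavor of the synthesis step described above, the following is a minimal illustrative sketch, not Figaro's actual algorithm: it assumes each demonstrated scene can be reduced to a sequence of hypothetical (event, action) steps, and merges the scenes into a shared prefix-tree transition system that can then be executed as a simple robot program. All names (`synthesize`, `run`, the example events and actions) are invented for illustration.

```python
def synthesize(scenes):
    """Merge demonstrated scenes into a transition table:
    (state, event) -> (action, next_state).
    Scenes sharing a prefix of events share the same states,
    so later events can branch to different behaviors."""
    transitions = {}
    next_state = 1  # state 0 is the start state
    for scene in scenes:
        state = 0
        for event, action in scene:
            key = (state, event)
            if key not in transitions:
                transitions[key] = (action, next_state)
                next_state += 1
            state = transitions[key][1]
    return transitions

def run(transitions, events):
    """Execute the synthesized program on a stream of observed events,
    returning the sequence of robot actions it triggers."""
    state, actions = 0, []
    for event in events:
        if (state, event) in transitions:
            action, state = transitions[(state, event)]
            actions.append(action)
    return actions
```

For example, two scenes that begin identically but diverge would be merged at their shared prefix: demonstrating `[("human_approaches", "greet"), ("human_speaks", "respond")]` and `[("human_approaches", "greet"), ("human_leaves", "wave")]` yields one program that greets on approach and then branches on what the human does next.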
