Help me help you: interfaces for personal robots

The communication bottleneck between robots and people [1] presents an enormous challenge to the human-robot interaction community. Rather than focusing exclusively on improving robot object learning, task learning, and natural language understanding, we propose also designing interfaces that compensate for low communication bandwidth by explicitly accounting for robots' constrained capabilities [2]. People are adept at compensating for communication limitations, adapting their communicative strategies when talking to pets, babies [3], foreigners [4], and robots [5]. Communicative accommodation already exists. Thus, rather than requiring robots to understand natural language, gestures, and other cues perfectly, there is a wide space of research and design to explore in alternative communicative modalities. We approach this problem by accounting for the limits of robot abilities and building on familiar human-computer interaction models, framed by a communication model drawn from Information Theory. Using this design perspective, we present three mobile user interfaces, fully developed and implemented on a PR2 (Personal Robot 2) [6], for task domains in navigation, perception, learning, and manipulation.
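As a rough illustration of this framing (the textbook Shannon channel formulation, assumed here rather than taken from this work), the usable bandwidth of any human-robot channel is bounded by its capacity, where X is what the person expresses and Y is what the robot's sensing and understanding recover:

\[
C = \max_{p(x)} I(X; Y)
\]

Under this reading, the "low communication bandwidth" above corresponds to a small effective capacity for speech and gesture channels given current robot perception, which motivates offering alternative, higher-capacity channels such as graphical mobile interfaces.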