Finding People by Contingency: An Infomax Controller Approach

We frame social interaction as a problem in real-time system identification and control. System identification refers to the task of identifying the properties of the system we are trying to control. Control refers to the problem of sending an input sequence to a system so as to maximize expected returns with respect to desired goals. The brain faces a control problem when sending motor commands to the limbs, where it has to account for the inertial properties of the arm and for the delays and levels of uncertainty in the efferent and afferent transmission lines. Riding a bicycle, shooting baskets, and playing a musical instrument are also control problems. Social interaction can be approached from the point of view of stochastic control theory, under random delays and levels of uncertainty much larger than those encountered when controlling non-social instruments.

We illustrate this framework on the problem of finding people via contingency. There is strong evidence that infants use contingency analysis to identify other humans (Watson 1972, 1979) and that contingency information may in fact be more powerful than morphological properties of human faces (such as the presence of eyes). Indeed, we found that by ten months of age infants used contingency information in a very active manner, ascertaining in a matter of seconds whether a robot was or was not responsive to them (Movellan and Watson, 1987, 2002).

We formalize the control problem in terms of a generative model in which observed behaviors can be produced by three different causes: (1) self-feedback (e.g., when we hear our own vocalizations); (2) responses from other humans that are related to our activity; and (3) background responses unrelated to our activity. There are two possible control conditions: (1) a human is responding to us; and (2) no human is responding to us. Given the random delays and noise typical of social interaction, the goal of the controller is to generate a sequence of behaviors that gathers as much information as possible, in the minimum amount of time, about which of the two conditions is in effect. We call this an infomax controller. Interestingly, the controller exhibits patterns of behavior very similar to those found in 10-month-old infants interacting with a robot for the first time. The controller is currently being implemented on RUBI, a social robot under development at our laboratory, and will be ready for demonstration at ICDL2004.
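To make the infomax criterion concrete, the following Python sketch shows a minimal one-step greedy controller for the two-condition problem. It is an illustration under simplifying assumptions, not the model implemented on RUBI: observations are binary (a vocalization is heard or not), the random response delays and the self-feedback channel are omitted, and the parameters P_RESPONSE_GIVEN_PROBE and P_BACKGROUND are hypothetical placeholders. At each step the controller picks the action (vocalize or stay silent) with the largest expected reduction in uncertainty about the hypothesis "a responsive person is present," and then updates its belief by Bayes' rule.

import math
import random

# Hypotheses: H1 = "a responsive person is present", H0 = "background only".
# Observations are binary per time step: 1 = a vocalization was heard, 0 = silence.
# Illustrative parameters (assumptions, not taken from the paper):
P_RESPONSE_GIVEN_PROBE = 0.6   # chance a responsive person answers shortly after we vocalize
P_BACKGROUND = 0.1             # chance of an unrelated background vocalization

def p_obs(obs, action, responsive):
    """Likelihood of hearing a vocalization (obs=1) given our action and the hypothesis."""
    p1 = P_BACKGROUND
    if responsive and action == "vocalize":
        p1 = p1 + (1 - p1) * P_RESPONSE_GIVEN_PROBE  # the other person answers our probe
    return p1 if obs == 1 else 1.0 - p1

def entropy(p):
    """Binary entropy of the belief P(H1) = p."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def expected_info_gain(p_h1, action):
    """One-step expected reduction in uncertainty about H1 if we take `action`."""
    gain = entropy(p_h1)
    for obs in (0, 1):
        p_marginal = p_h1 * p_obs(obs, action, True) + (1 - p_h1) * p_obs(obs, action, False)
        if p_marginal == 0.0:
            continue
        posterior = p_h1 * p_obs(obs, action, True) / p_marginal
        gain -= p_marginal * entropy(posterior)
    return gain

def infomax_step(p_h1):
    """Pick the action (probe vs. stay silent) with the largest expected information gain."""
    return max(("vocalize", "silent"), key=lambda a: expected_info_gain(p_h1, a))

def update(p_h1, action, obs):
    """Bayesian update of P(H1) after observing `obs` following `action`."""
    num = p_h1 * p_obs(obs, action, True)
    den = num + (1 - p_h1) * p_obs(obs, action, False)
    return num / den

# Simulate interaction with a truly responsive partner.
random.seed(0)
p_h1, truly_responsive = 0.5, True
for t in range(20):
    action = infomax_step(p_h1)
    obs = int(random.random() < p_obs(1, action, truly_responsive))
    p_h1 = update(p_h1, action, obs)
    print(f"t={t:2d} action={action:8s} heard={obs} P(responsive)={p_h1:.3f}")

In this stripped-down setting the expected information gain of staying silent is zero, so the sketch probes on every step; modeling the random response delays, self-feedback, and the cost of acting is what would make silent listening informative and could yield the richer probe-and-wait behavior discussed above.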