This paper discusses associative learning of a partner robot through interaction with people. Human interaction based on gestures is very important for realizing natural communication, and the meaning of gestures can be understood through actual interaction with a human and imitation of a human. Therefore, we propose a method for associative learning based on imitation and conversation to realize natural communication. Steady-state genetic algorithms are applied to detect the human face and objects in image processing, and spiking neural networks are applied to memorize spatio-temporal patterns of human hand motions and the relationships among perceptual information. Furthermore, we conduct several experiments on the interaction of the partner robot with people based on imitation and conversation. The experimental results show that the proposed method can refine the relationships among the perceptual information and can reflect the updated relationships in natural communication with a human.

Basically, imitative learning is composed of model observation and model reproduction. Furthermore, model learning is required to memorize and generalize motion patterns as gestures. In addition, model clustering is required to distinguish a specific gesture from others, and model selection is also performed for human interaction. In this way, imitative learning simultaneously requires the learning capabilities of model observation, model clustering, model selection, model reproduction, and model learning. We previously proposed a method for imitative learning of partner robots based on visual perception [11, 12]. First, the robot detects a human by image processing with a steady-state genetic algorithm (SSGA) [13]. Next, a series of movements of the human hand is extracted by SSGA as model observation, and the hand motion pattern is extracted by a spiking neural network (SNN). Furthermore, SSGA is used to generate a trajectory similar to the human hand motion pattern as model reproduction [14].

In addition to imitative learning, the robot requires the capability of extracting the necessary perceptual information in finite time for natural communication with a human. Associative memory in cognitive development is very important for perception. Therefore, we propose a method for the simultaneous associative learning of various types of perceptual information, such as colors, shapes, and gestures, related to the symbolic information used in conversation with a human. Symbolic information used in utterances is very important and helpful for associative learning, because human language has been improved and refined over a long time. The meaning of symbols is neither exact nor precise among people, but linguistic information is very useful for robots in sharing the meanings of patterns in visual images with people. We apply an SNN to the associative learning of perceptual information. Furthermore, we conduct several experiments on the interaction of the partner robot with people.
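
To make the SSGA-based detection step concrete, the following Python sketch shows the defining loop of a steady-state genetic algorithm applied to window-based visual search: one offspring is produced per iteration and only the worst individual is replaced. The window encoding, the toy template_score function, and all parameter values are assumptions made for illustration; they are not the implementation used for the partner robot.

import random

# Minimal SSGA sketch for window-based detection. Each candidate encodes an
# image window (x, y, scale). template_score is a stand-in for a real
# template/color-matching score on the camera image; here a known toy target
# is stored in "image" only so that the sketch runs end to end.

POP_SIZE = 30
IMG_W, IMG_H = 320, 240

def template_score(image, x, y, scale):
    tx, ty, ts = image["target"]
    return -(abs(x - tx) + abs(y - ty) + 50.0 * abs(scale - ts))

def random_candidate():
    return [random.uniform(0, IMG_W), random.uniform(0, IMG_H), random.uniform(0.5, 2.0)]

def select_parent(scores):
    # Binary tournament selection.
    i, j = random.sample(range(len(scores)), 2)
    return i if scores[i] >= scores[j] else j

def ssga_detect(image, iterations=500):
    population = [random_candidate() for _ in range(POP_SIZE)]
    scores = [template_score(image, *c) for c in population]
    for _ in range(iterations):
        p1, p2 = select_parent(scores), select_parent(scores)
        # Arithmetic crossover followed by Gaussian mutation.
        child = [(a + b) / 2.0 for a, b in zip(population[p1], population[p2])]
        child[0] += random.gauss(0.0, 8.0)
        child[1] += random.gauss(0.0, 8.0)
        child[2] = max(0.2, child[2] + random.gauss(0.0, 0.1))
        child_score = template_score(image, *child)
        # Steady-state replacement: only the worst individual is discarded.
        worst = min(range(POP_SIZE), key=scores.__getitem__)
        if child_score > scores[worst]:
            population[worst], scores[worst] = child, child_score
    best = max(range(POP_SIZE), key=scores.__getitem__)
    return population[best], scores[best]

if __name__ == "__main__":
    toy_image = {"target": (120.0, 90.0, 1.0)}  # hypothetical detection target
    print(ssga_detect(toy_image))

Because most of the population survives each iteration, such a steady-state scheme can resume the search on the next camera frame without reinitialization, which is one reason it suits time-critical visual processing.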
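
A similarly reduced sketch can illustrate how a spiking network with a Hebbian rule can memorize spatio-temporal patterns such as hand motions and build associations among perceptual cues. The discrete-time leaky integrate-and-fire neuron model, the decay constant, the threshold, and the learning rate below are illustrative assumptions, not the SNN described in this paper.

import numpy as np

# Discrete-time SNN sketch: leaky integration, hard threshold, reset on firing,
# and a Hebbian update that strengthens connections from neurons that fired at
# the previous step to neurons firing now, so temporally ordered activity
# (e.g., frames of an observed hand motion) is memorized in the weights.

class SimpleSNN:
    def __init__(self, n_neurons, decay=0.8, threshold=1.0, eta=0.05):
        self.n = n_neurons
        self.decay = decay
        self.threshold = threshold
        self.eta = eta
        self.w = np.zeros((n_neurons, n_neurons))  # synaptic weights
        self.h = np.zeros(n_neurons)               # membrane potentials
        self.spikes = np.zeros(n_neurons)          # spike outputs (0/1)

    def step(self, external_input):
        self.h = self.decay * self.h + self.w @ self.spikes + external_input
        self.spikes = (self.h >= self.threshold).astype(float)
        self.h[self.spikes > 0] = 0.0              # reset fired neurons
        return self.spikes

    def train(self, pattern):
        # pattern: array of shape (T, n_neurons), one row per time step.
        prev = np.zeros(self.n)
        for x in pattern:
            self.step(x)
            self.w += self.eta * np.outer(self.spikes, prev)
            np.fill_diagonal(self.w, 0.0)
            prev = self.spikes.copy()

if __name__ == "__main__":
    snn = SimpleSNN(n_neurons=8)
    # Toy spatio-temporal pattern: a wave of activity sweeping across neurons.
    pattern = np.zeros((20, 8))
    for t in range(20):
        pattern[t, t % 8] = 1.5
    snn.train(pattern)
    print(np.round(snn.w, 2))

Driving separate groups of neurons with different modalities (color, shape, gesture, spoken symbol) and applying the same Hebbian update is one way such a network could acquire the cross-modal associations exploited during conversation.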
[1] Justine Cassell, et al. Embodied Conversational Agents: Representation and Intelligence in User Interfaces. AI Mag., 2001.
[2] Naoyuki Kubota, et al. Computational intelligence for structured learning of a partner robot based on imitation. Inf. Sci., 2005.
[3] Naoyuki Kubota, et al. Behavior Coordination of A Partner Robot based on Imitation. 2004.
[4] Alex A. Freitas, et al. Evolutionary Computation. 2002.
[5] M. Eysenck. Psychology: An Integrated Approach. 1998.
[6] Yasushi Nakauchi, et al. A Social Robot that Stands in Line. Proceedings 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2000), 2000.
[7] Rolf Pfeifer, et al. Understanding Intelligence. 1999.
[8] A Taxonomy of Computational and Social Learning. 2001.
[9] Michael Don Palmer, et al. Reflections on Language. 1977.
[10] Naoyuki Kubota, et al. Development of Internal Models for Communication of A Partner Robot based on Computational Intelligence. 2005.
[11] Naoyuki Kubota, et al. Temporal Coding in Spiking Neural Network for Gestures Recognition of A Partner Robot. 2006.
[12] Gilbert Syswerda, et al. A Study of Reproduction in Generational and Steady State Genetic Algorithms. FOGA, 1990.
[13] Christopher J. Bishop, et al. Pulsed Neural Networks. 1998.
[14] Tetsuo Ono, et al. Physical relation and expression: joint attention for human-robot interaction. IEEE Trans. Ind. Electron., 2003.
[15] N. Kubota, et al. Computational intelligence for human detection of a partner robot. Proceedings World Automation Congress, 2004.
[16] D. Sperber, et al. Relevance: Communication and Cognition. 1997.
[17] D. Ruppert. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. 2004.