Using Weighted Networks to Represent Classification Knowledge in Noisy Domains

Experience is not always a benign teacher. Examples often contain irrelevant features, the relevant features may be noisy, and the resulting classifications may not be categorical. Most learning systems, however, assume that the teacher is benign, and in particular that the training cases are free of noise. This paper describes a system, IWN, which can learn classification knowledge from a relatively small number of training cases and whose performance does not rapidly deteriorate when the training cases contain noise. IWN uses a network of weighted links to represent classification knowledge. This non-discrete knowledge representation makes IWN more robust in the face of noisy data. In this paper we describe IWN's procedure for building its weighted networks and compare its performance with that of other systems. A central focus is how several different evaluation functions for propagating values in the network affect the tradeoff between handling noisy data and handling exceptional cases.
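To make the idea concrete, the following is a minimal sketch of classification with a network of weighted links. All names, link weights, and the particular evaluation function (a weighted sum over active links) are illustrative assumptions for this sketch; the paper's actual network structure and evaluation functions are not reproduced here.

```python
from typing import Dict, Set, Tuple

# Hypothetical link weights: weights[class_node][feature_node] -> link weight.
# Positive weights are evidence for a class; negative weights are evidence against.
weights: Dict[str, Dict[str, float]] = {
    "poisonous": {"odor=foul": 2.0, "cap=red": 0.3, "gills=narrow": 0.8},
    "edible":    {"odor=foul": -1.5, "cap=red": 0.1, "gills=narrow": 0.4},
}

def evaluate(active_features: Set[str], class_weights: Dict[str, float]) -> float:
    """One possible evaluation function: sum the weights on active links.

    Because evidence is graded rather than all-or-none, a single noisy
    feature shifts the score only slightly instead of flipping the decision.
    """
    return sum(class_weights.get(f, 0.0) for f in active_features)

def classify(active_features: Set[str]) -> Tuple[str, Dict[str, float]]:
    """Propagate feature activations to each class node; pick the top score."""
    scores = {c: evaluate(active_features, w) for c, w in weights.items()}
    return max(scores, key=scores.get), scores

label, scores = classify({"odor=foul", "gills=narrow"})
```

The choice of evaluation function matters: a plain weighted sum tolerates noise well but can drown out a rare, strongly predictive feature, which is the noise-versus-exceptions tradeoff the abstract refers to.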