Forming global representations with extended backpropagation

The authors present an alternative to fixed microfeature encoding. Meaningful global representations are developed automatically while the network learns the processing task. When backward error propagation is extended to the input layer, the representations of the input items evolve to reflect the underlying relations relevant to the processing task. The resulting representations contain no microfeatures and no discrete categorization; all aspects of a concept are distributed over the whole set of units as an activity profile. The representation is determined by all the contexts in which the concept has been encountered, and consequently it is also a representation of all these contexts.
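
As a rough illustration only, the following Python sketch shows the general idea of propagating the error gradient one layer further, into the input representations themselves, so that a concept's stored vector is updated alongside the weights. It is not the authors' implementation; the network sizes, learning rate, lexicon entries, and toy task are illustrative assumptions.

```python
# Minimal sketch (not the original implementation) of extending backpropagation
# to the input layer: concept representations stored in a lexicon are updated by
# the same error gradient that trains the weights, so they gradually come to
# reflect the contexts in which each concept appears.
# Network sizes, learning rate, and the toy task are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

REP_DIM, HIDDEN, OUT_DIM, LR = 8, 12, 8, 0.1

# Lexicon: one evolving distributed representation (activity profile) per concept.
lexicon = {c: rng.uniform(0.0, 1.0, REP_DIM) for c in ["man", "woman", "ball", "hit"]}

W1 = rng.normal(0.0, 0.5, (HIDDEN, REP_DIM))
W2 = rng.normal(0.0, 0.5, (OUT_DIM, HIDDEN))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(concept, target):
    """One forward/backward pass; the error is propagated one layer further
    than usual, into the concept's input representation."""
    global W1, W2
    x = lexicon[concept]

    # Forward pass.
    h = sigmoid(W1 @ x)
    y = sigmoid(W2 @ h)

    # Backward pass (squared-error loss, sigmoid derivatives).
    delta_out = (y - target) * y * (1.0 - y)
    delta_hid = (W2.T @ delta_out) * h * (1.0 - h)
    delta_in = W1.T @ delta_hid          # error propagated to the input layer

    # Update the connection weights as usual...
    W2 -= LR * np.outer(delta_out, h)
    W1 -= LR * np.outer(delta_hid, x)
    # ...and also update the concept's representation itself.
    lexicon[concept] = np.clip(x - LR * delta_in, 0.0, 1.0)

# Toy usage: repeatedly pairing a concept with a context-dependent target
# gradually reshapes its stored representation.
target = rng.uniform(0.0, 1.0, OUT_DIM)
for _ in range(100):
    train_step("ball", target)
```

Because each concept's vector is pulled by the gradients from every context in which it occurs, the final representation is shaped by all of those contexts, as described above.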