The 'detailed balance' net: a stable asymmetric artificial neural system for unsupervised learning

A dynamically stable artificial neural network with graded-response neurons employing an unsupervised learning rule for connection weights of restricted asymmetry is investigated. In particular, the quality of the network's performance after fixing the constants entering the model (passive decay constants, forgetting constants, asymmetry factors, and the steepness of the signal function) is discussed. The passive decay and forgetting constants, together with the asymmetry factors, are first estimated from the stationary solutions of the differential equations describing the dynamics of the net; the constants are then quantified further by optimizing the recognition rate in a computer simulation of a specific model problem in the highly nonlinear (high-gain) limit. Working in the high-gain limit is justified by the behavior of the net's storage capacity as a function of the steepness of the signal function. First results for an application to a real-world problem (work-piece recognition) indicate that the numerical values obtained for the constants are independent of net size.
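
To make the ingredients named in the abstract concrete, the following minimal sketch (in Python) simulates graded-response neurons with leaky-integrator dynamics, a sigmoidal signal function whose steepness controls the high-gain limit, and an unsupervised Hebbian-style weight update with a forgetting term; restricted asymmetry of the connection weights is modelled by an asymmetry factor scaling a fixed antisymmetric component. The concrete equations, parameter names (a, c, eta, beta, kappa), and numerical values are illustrative assumptions, not the paper's actual model.

import numpy as np

# Assumed dynamics (illustrative, not the paper's equations):
#   du_i/dt = -a * u_i + sum_j T_ij * g(u_j) + I_i   (passive decay constant a)
#   dT_ij/dt = -c * T_ij + eta * g(u_i) * g(u_j)     (forgetting constant c)
# g is a sigmoid of steepness beta; the "high-gain limit" corresponds to large beta.
# Restricted asymmetry: T = T_sym + kappa * A, with A antisymmetric and kappa the
# asymmetry factor.

rng = np.random.default_rng(0)

N = 16             # number of neurons (illustrative size)
a = 1.0            # passive decay constant (assumed value)
c = 0.2            # forgetting constant (assumed value)
eta = 0.2          # learning rate (assumed value)
beta = 50.0        # steepness of the signal function (high-gain regime)
kappa = 0.2        # asymmetry factor (assumed value)
dt = 0.01          # Euler integration step

def g(u):
    """Sigmoid signal function with steepness beta."""
    return 1.0 / (1.0 + np.exp(-beta * u))

u = rng.normal(scale=0.1, size=N)          # membrane potentials
T_sym = np.zeros((N, N))                   # learned (symmetric) Hebbian part
A = rng.normal(scale=0.05, size=(N, N))
A = A - A.T                                # fixed antisymmetric part
I = rng.normal(scale=0.2, size=N)          # external input ("pattern" to be learned)

for step in range(2000):
    T = T_sym + kappa * A                  # weights of restricted asymmetry
    s = g(u)
    u += dt * (-a * u + T @ s + I)         # graded-response neuron dynamics
    # unsupervised Hebbian update; the forgetting term -c*T_sym keeps weights bounded
    T_sym += dt * (-c * T_sym + eta * np.outer(s, s))
    np.fill_diagonal(T_sym, 0.0)

print("near-stationary activities g(u):", np.round(g(u), 2))

The stationary condition du/dt = 0 in this sketch gives u = (T @ g(u) + I) / a, which is the kind of fixed-point relation from which decay and forgetting constants could be estimated; the actual estimation procedure used in the paper is not reproduced here.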