Neural network applications for interconnection networks
Artificial Neural Networks (ANNs) have two features that can be exploited to solve computational problems. First, the inherently parallel structure of some ANNs makes them suitable for use as parallel computers. Second, the ability of some ANNs to learn input/output mappings and generalize has generated considerable interest.
In this thesis, ANNs are used to solve certain problems for Interconnection Networks (INs). The first subject examined is routing through INs. Specifically, an ANN is presented that solves the circuit switching routing problem for Multistage Interconnection Networks (MINs). The method reduces the routing problem to a constraint satisfaction problem that can be solved by an energy relaxation ANN, such as a Hopfield network. The ANN routing methodology is compared to exhaustive search routing and greedy routing. For MINs that are not too large, the three routing methodologies produce solutions of similar quality. However, since the ANN is itself a parallel computer, it is likely to have speed advantages over the other routing approaches.
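The energy-relaxation idea can be illustrated with a minimal sketch: constraints are encoded as mutual inhibition between neurons, and hard-limiting asynchronous updates drive the network to a low-energy state that satisfies them. The toy instance below (two requests, two candidate paths each, one shared-link conflict) and all weight values are illustrative assumptions, not the thesis's MIN formulation.

```python
import numpy as np

# Hopfield-style energy relaxation for a toy routing constraint problem
# (hypothetical instance, not the thesis's MIN model).
# Neuron x[i] = 1 means "use candidate path i".
#   - exactly one path per request: (x0, x1) and (x2, x3) are rivals
#   - paths 0 and 2 share a link and must not both be on

W = np.zeros((4, 4))
for a, b in [(0, 1), (2, 3), (0, 2)]:   # violating pairs inhibit each other
    W[a, b] = W[b, a] = -2.0
bias = np.ones(4)                        # reward turning a path on

def energy(x):
    # Standard Hopfield energy: E = -1/2 x'Wx - b'x
    return -0.5 * x @ W @ x - bias @ x

x = np.zeros(4)
changed = True
while changed:                           # sweep until a stable (minimal) state
    changed = False
    for i in range(4):
        new = 1.0 if W[i] @ x + bias[i] > 0 else 0.0
        if new != x[i]:
            x[i] = new
            changed = True

print(x, energy(x))                      # a conflict-free path assignment
```

Because W is symmetric with zero diagonal, each flip strictly lowers the energy, so the sweep terminates at a stable state; with these penalty weights every stable state satisfies all the constraints.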
The representational abilities of certain Recurrent Neural Networks (RNNs) are the second subject addressed. It is proved that a first order, Single Layer RNN (SLRNN) with hard-limiting transfer functions is incapable of implementing a finite state recognizer for certain regular languages, including parity. It is also demonstrated that a second order SLRNN can implement a finite state recognizer for any regular language.
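Why second-order units suffice can be seen in a small sketch: a second-order weight W[i, j, k] gates the product of state unit j and input unit k, so a DFA transition table can be wired in directly. The hand-constructed network below recognizes parity with hard-limiting units; it is an illustrative construction, not the thesis's proof.

```python
import numpy as np

# Hand-wired second-order SLRNN with hard-limiting units recognizing
# parity (odd number of 1s). W[i, j, k] maps the product state_j * input_k
# to next-state unit i, mirroring the DFA transition table.
# (Toy illustration; names and encoding are assumptions.)

W = np.zeros((2, 2, 2))
# states: 0 = even, 1 = odd; inputs: one-hot encoding of bit 0 / bit 1
W[0, 0, 0] = 1.0   # even, read 0 -> even
W[0, 1, 1] = 1.0   # odd,  read 1 -> even
W[1, 0, 1] = 1.0   # even, read 1 -> odd
W[1, 1, 0] = 1.0   # odd,  read 0 -> odd

def parity(bits):
    s = np.array([1.0, 0.0])                # start in "even"
    for b in bits:
        x = np.eye(2)[b]                    # one-hot input symbol
        # hard-limit the second-order activation sum_jk W[i,j,k] s_j x_k
        s = (np.einsum('ijk,j,k->i', W, s, x) > 0.5).astype(float)
    return int(s[1])                        # accept iff final state is "odd"

print(parity([1, 0, 1, 1]))
```

A first-order hard-limiting SLRNN cannot do this, since parity requires the next state to depend on the conjunction of state and input, which the product terms provide directly.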
The third subject investigated is the ability of RNNs to learn the structure of an IN from examples. The approach is essentially an extension of the grammatical inference problem studied in the RNN literature. Gradient descent methods are used to train a second order SLRNN, and the IN structure can then be extracted from the SLRNN using simple clustering algorithms.
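The extraction step can be sketched as follows: run the recurrent net over sample strings, cluster the continuous hidden states into discrete states, and record the induced transitions. The network below is randomly weighted (standing in for a trained SLRNN), and the coarse quantization used as the clustering step is an illustrative assumption.

```python
import numpy as np

# Sketch of state extraction from a second-order recurrent net: cluster
# hidden-state vectors and read off a transition table. The random weights
# stand in for a trained SLRNN; quantization replaces a real clustering
# algorithm. (All names and parameters are illustrative assumptions.)

rng = np.random.default_rng(1)
H, A = 3, 2                          # hidden units, alphabet size
W = rng.normal(size=(H, H, A))       # second-order weights (untrained)

def step(s, sym):
    x = np.eye(A)[sym]
    return np.tanh(np.einsum('ijk,j,k->i', W, s, x))

def cluster(s, bins=2):
    # crude clustering: quantize each hidden unit over (-1, 1) into bins
    return tuple(np.minimum(((s + 1) / 2 * bins).astype(int), bins - 1))

transitions = {}
for string in [[0, 1, 1, 0], [1, 1, 0, 1], [0, 0, 0, 1]]:
    s = np.zeros(H)                  # reset to the start state
    for sym in string:
        nxt = step(s, sym)
        transitions[(cluster(s), sym)] = cluster(nxt)
        s = nxt

for (state, sym), nxt in sorted(transitions.items()):
    print(state, sym, '->', nxt)     # extracted finite-state transitions
```

Each (cluster, symbol) -> cluster entry is one edge of the extracted automaton; with a trained network the clusters correspond to the states of the learned IN structure.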