Backpropagation algorithm which varies the number of hidden units
Summary form only given, as follows. A backpropagation algorithm is presented that varies the number of hidden units. The algorithm is expected to escape local minima and eliminates the need to decide on the number of hidden units in advance. Exclusive-OR training and 8*8-dot alphanumeric font training using this algorithm are described. In exclusive-OR training, the probability of being trapped in local minima is reduced. In alphanumeric font training, the network converged two to three times faster than with the conventional backpropagation algorithm.
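The abstract does not give the growth rule itself, so the following is only a minimal sketch of the general idea: standard backpropagation on a single-hidden-layer network (here trained on XOR) that starts with few hidden units and adds one whenever the training error plateaus. The plateau threshold, patience counter, growth limit, and learning rate are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR training data.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def init_layer(n_in, n_out):
    return rng.uniform(-0.5, 0.5, size=(n_in, n_out))

n_hidden = 1                      # start small; units are added as needed
W1 = init_layer(2, n_hidden)      # input -> hidden weights
W2 = init_layer(n_hidden, 1)      # hidden -> output weights
lr = 0.5
prev_err, patience = np.inf, 0

for epoch in range(20000):
    # Forward pass.
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    err = np.mean((T - Y) ** 2)
    if err < 1e-3:
        break

    # Backward pass: standard backpropagation for sigmoid units.
    delta_out = (Y - T) * Y * (1 - Y)
    delta_hid = (delta_out @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ delta_out
    W1 -= lr * X.T @ delta_hid

    # Grow the hidden layer if the error has stopped improving
    # (assumed plateau heuristic, not taken from the paper).
    if prev_err - err < 1e-6:
        patience += 1
    else:
        patience = 0
    prev_err = err
    if patience > 200 and n_hidden < 8:
        W1 = np.hstack([W1, init_layer(2, 1)])   # new input->hidden column
        W2 = np.vstack([W2, init_layer(1, 1)])   # new hidden->output row
        n_hidden += 1
        patience = 0

print(f"hidden units: {n_hidden}, final MSE: {err:.4f}")
```

Starting from a single hidden unit, XOR is not separable, so the error stalls and the heuristic adds units until training can proceed; this mirrors the claim that the user need not fix the hidden-layer size beforehand and that growth can help the network move away from poor solutions.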