Transformation and signal-separation neural networks
This chapter presents transformation and signal-separation neural networks in three parts: the first part describes the neurodynamical aspects of neural networks, the second part deals with principal component analysis (PCA) and related neural networks, and the third part deals with independent component analysis (ICA) and neural architectures that perform signal separation. Neural networks are excellent candidates for feature extraction and selection as well as for signal separation. The underlying architectures mostly employ unsupervised learning algorithms and are viewed as nonlinear dynamical systems. Several neural network models, such as the generalized Hebbian algorithm, adaptive principal component extraction, and the linear and nonlinear Oja algorithms, are reviewed. Medical image coding, an emerging application area, is chosen to illustrate PCA. ICA is gaining importance for artifact separation in medical imaging, and the chapter reviews the most important ICA algorithms, such as Infomax, FastICA, and topographic ICA. Imaging brain dynamics is becoming key to understanding the cognitive processes of the human brain, and the chapter describes ICA-based artifact separation for two modalities of imaging brain dynamics: magnetoencephalographic (MEG) recordings and functional magnetic resonance imaging (fMRI). The chapter examines two unsupervised learning laws, the signal Hebbian learning law and the competitive (Grossberg) learning law. It also presents neural network architectures that extract the principal components in a self-organized manner and discusses PCA-type neural networks by examining their use in medical image coding.
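For orientation, the single-unit Oja PCA rule and the one-unit FastICA fixed-point update named above are commonly stated in the following standard textbook forms; these are not quoted from the chapter itself:

\[
\Delta \mathbf{w} = \eta\, y\,(\mathbf{x} - y\,\mathbf{w}), \qquad y = \mathbf{w}^{\mathsf T}\mathbf{x},
\]
\[
\mathbf{w} \leftarrow \mathrm{E}\{\mathbf{x}\, g(\mathbf{w}^{\mathsf T}\mathbf{x})\} - \mathrm{E}\{g'(\mathbf{w}^{\mathsf T}\mathbf{x})\}\,\mathbf{w}, \qquad \mathbf{w} \leftarrow \mathbf{w} / \lVert \mathbf{w} \rVert,
\]

where \(\mathbf{x}\) is a zero-mean (and, for FastICA, whitened) input vector, \(\eta\) is a learning rate, and \(g\) is a nonlinearity such as \(\tanh\). The first rule converges to the dominant principal component of the input covariance; the second iterates until \(\mathbf{w}\) points along one independent component.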