Novel approaches in machine learning and computational intelligence

This special issue of Neurocomputing presents 18 original papers, which are extended versions of selected papers from the 20th European Symposium on Artificial Neural Networks (ESANN). ESANN is a single-track conference held annually in Bruges, Belgium, one of the most beautiful medieval towns in Europe and a UNESCO World Heritage site, whose atmosphere is favorable to efficient work as well as to enjoyable cultural visits. ESANN is organized by Prof. Michel Verleysen from the Université Catholique de Louvain, Belgium. In addition to regular sessions, the conference welcomes special sessions focused on particular topics such as machine learning for spectral data or multimedia applications, new trends in kernel design, motion recognition, the effects and handling of missing data, or interpretable models. The contributions in this special issue show that ESANN covers a broad range of topics in neuro-computing, machine learning and neuroscience, from theoretical aspects to state-of-the-art applications, as well as many related themes in signal processing and computational intelligence. More than 130 researchers from 19 countries and five continents participated in the 20th ESANN in April 2012. They presented 105 contributions and enjoyed the especially communicative atmosphere in Bruges. Based on the recommendations of the special-session organizers, the reviews of the conference papers, and the quality of the presentations made at the conference, a number of authors were invited to submit an extended version of their conference paper for this special issue of Neurocomputing. All of these papers were thoroughly reviewed once more by at least two independent experts and, finally, the 18 papers presented in this volume were accepted for publication. In this special issue we find a multitude of examples of neuro-computing and related techniques applied in different branches of research.
The first six papers analyze theoretical aspects of different learning systems and present results on learning dynamics, potential optimization schemes, and novel strategies to improve learning under different constraints. The first paper, Magnitude Sensitive Competitive Learning by Orrite et al., presents a new view on competitive learning. Standard methods distribute the representative data points that the final model consists of according to the data density. The new method allows for additional flexibility during learning, in that any magnitude calculated from the input data inside a unit's Voronoi region can be used to control the competition process. An important topic in kernel machines is the iterative construction of more complex kernels from a collection of simpler ones. This topic is considered in the paper by Belanche et al., Averaging of kernel functions, which studies one particular way of building such
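The magnitude-sensitive competition summarized above can be illustrated with a minimal sketch. Note that this is not the authors' exact algorithm: the function name, the choice of local quantization error as the magnitude, and the way the magnitude biases the winner selection are all illustrative assumptions; the paper's point is precisely that any magnitude computed over a unit's Voronoi region could be plugged in here.

```python
import numpy as np

rng = np.random.default_rng(0)

def ms_competitive_learning(X, n_units=4, lr=0.05, epochs=30, magnitude=None):
    """Illustrative sketch of magnitude-sensitive competitive learning.

    `magnitude` maps the samples currently inside a unit's Voronoi region
    (plus the unit's weight vector) to a scalar; units with a larger
    magnitude are favored in the competition.  The default below uses the
    local quantization error as magnitude -- an assumption for this
    sketch, not necessarily the measure used by Orrite et al.
    """
    if magnitude is None:
        def magnitude(pts, w):
            return np.mean(np.linalg.norm(pts - w, axis=1)) if len(pts) else 0.0

    # initialize units on randomly chosen data points
    W = X[rng.choice(len(X), n_units, replace=False)].copy()
    mag = np.ones(n_units)

    for _ in range(epochs):
        # assign every sample to the Voronoi region of its nearest unit
        d = np.linalg.norm(X[:, None, :] - W[None, :, :], axis=2)
        assign = d.argmin(axis=1)
        # recompute each unit's magnitude from the samples in its region
        for j in range(n_units):
            mag[j] = magnitude(X[assign == j], W[j])
        # magnitude-weighted competition: distances are discounted for
        # high-magnitude units, so they win (and move) more often
        for x in X[rng.permutation(len(X))]:
            scores = np.linalg.norm(W - x, axis=1) / (mag + 1e-12)
            win = scores.argmin()
            W[win] += lr * (x - W[win])
    return W
```

With the quantization-error magnitude, units covering poorly represented regions win more often and are pulled toward those regions, which is the kind of density-independent control the paper contrasts with standard competitive learning.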