Special Issue on Autonomous Learning
Barbara Hammer (CITEC Centre of Excellence, Bielefeld University, 33594 Bielefeld, Germany; bhammer@techfak.uni-bielefeld.de) and Marc Toussaint (Machine Learning and Robotics Lab, University of Stuttgart, 70569 Stuttgart, Germany; marc.toussaint@informatik.uni-stuttgart.de)

Künstl Intell (2015) 29:323–327, DOI 10.1007/s13218-015-0392-x

Machine learning has its origins in the imitation of learning processes as observed in neurobiology; quite a few early learning models are based on biological paradigms such as Hebbian learning, i.e. the corresponding algorithms rest on heuristics. The perceptron learning rule for classification and Oja's rule for dimensionality reduction constitute prominent examples [11]. Although these learning rules are often strikingly simple, their mathematical foundations can be quite complicated and are in part still unsolved [4, 9]. Over time, more and more mathematical formalisms emerged within machine learning that enable researchers to derive a learning rule from an explicit mathematical model of the underlying goal: most modern techniques are accompanied by cost functions such as the least squares error for regression, constrained margin maximisation for robust classification in support vector machines, or the data log likelihood for model inference [5]. While this mathematical treatment leads to highly efficient learning techniques, it often narrows their applicability to the specific settings covered by the chosen cost function and data representation. In this way, the machine learning algorithm is decoupled from the questions of what constitutes a good cost function and its parameterisation, what constitutes a good data representation, and what constitutes an informative data set from which learning becomes possible. These crucial ingredients of machine learning are typically left to a human expert rather than being determined by the learning system itself.
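To make the contrast concrete, the following sketch (not taken from the editorial or its references; the toy data, the step size eta, the number of sweeps and all variable names are illustrative assumptions) juxtaposes a heuristic, biologically inspired update, Oja's rule for extracting the first principal component, with a learner derived from an explicitly stated cost, a least-squares regression fit:

    # Minimal illustrative sketch (assumed toy Gaussian data, eta=0.01, 20 sweeps).
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: 500 zero-mean 2-d points with a dominant direction of variance.
    X = rng.normal(size=(500, 2)) * np.array([2.0, 0.5])
    X -= X.mean(axis=0)

    # Heuristic rule (Oja): w <- w + eta * y * (x - y * w), with y = w . x.
    # No cost function is written down; the update rule itself is the algorithm.
    w = rng.normal(size=2)
    eta = 0.01
    for _ in range(20):
        for x in X:
            y = w @ x
            w += eta * y * (x - y * w)

    # Cost-based learner: minimise E(a) = sum_i (a * x_i1 - x_i2)^2 explicitly;
    # the closed-form minimiser follows directly from the stated cost.
    a = (X[:, 0] @ X[:, 1]) / (X[:, 0] @ X[:, 0])

    print("Oja direction (normalised):    ", w / np.linalg.norm(w))
    print("Leading covariance eigenvector:", np.linalg.eigh(np.cov(X.T))[1][:, -1])
    print("Least-squares slope:", a)

The heuristic rule is stated directly as an update and happens to converge towards the leading eigenvector of the data covariance, whereas the least-squares learner is obtained by minimising an explicitly written cost; neither sketch addresses the choice of cost function, data representation, or data set that this special issue puts into focus.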

[1] Pat Langley et al. The changing science of machine learning. Machine Learning, 2011.

[2] Thomas Villmann et al. Similarity-Based Clustering: Recent Developments and Biomedical Applications (outcome of a Dagstuhl Seminar). 2009.

[3] Michael Biehl et al. Statistical Mechanics of On-line Learning. Similarity-Based Clustering, 2009.

[4] J. Armstrong. The Natural Learning Project. 2005.

[5] Marc Toussaint et al. Planning with Noisy Probabilistic Relational Rules. J. Artif. Intell. Res., 2010.

[6] Jean-Claude Fort et al. SOM's mathematics. Neural Networks, 2006.

[7] Alessandro Sperduti et al. Special issue on neural networks and kernel methods for structured domains. Neural Networks, 2005.

[8] Thomas G. Dietterich et al. Structured machine learning: the next ten years. Machine Learning, 2008.

[9] Marc Toussaint et al. Kognitive Robotik — Herausforderungen an unser Verständnis natürlicher Umgebungen. Autom., 2013.

[10] John N. Tsitsiklis et al. Neuro-Dynamic Programming. Encyclopedia of Machine Learning, 1996.

[11] Stephen Grossberg et al. Future Challenges for the Science and Engineering of Learning. National Science Foundation workshop report, July 23-25, 2007.

[12] Yoshua Bengio et al. Scaling learning algorithms towards AI. 2007.

[13] Gábor Lugosi et al. Introduction to Statistical Learning Theory. Advanced Lectures on Machine Learning, 2004.

[14] Christian Igel et al. Evolutionary tuning of multiple SVM parameters. ESANN, 2005.

[15] Barbara Hammer. Challenges in Neural Computation. KI - Künstliche Intelligenz, 2012.

[16] Christopher M. Bishop. Pattern Recognition and Machine Learning (Information Science and Statistics). 2006.