Introduction to VC learning theory
In recent years, there has been explosive growth in methods for estimating (learning) dependencies from data. Such learning methods have been developed in statistics, neural networks, signal processing, fuzzy systems, and other fields. These methods share the common goal of estimating unknown dependencies from available (historical) data (samples). The estimated dependencies are then used for accurate prediction of future data (generalization); hence this problem is known as Predictive Learning. Statistical Learning Theory (also known as VC theory or Vapnik-Chervonenkis theory) has recently emerged as a general conceptual and mathematical framework for estimating (learning) dependencies from finite samples. Unfortunately, perhaps because of its mathematical rigor and complexity, this theory is not well known in the financial engineering community. Hence, the purpose of this tutorial is to discuss:
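To make the predictive-learning setting concrete, the following is a minimal sketch of the standard risk-minimization formulation used in VC theory; the notation (loss $L$, approximating functions $f(\mathbf{x},\omega)$ indexed by parameters $\omega$, unknown distribution $P(\mathbf{x},y)$) is conventional textbook usage and is not taken from this tutorial itself:

$$R(\omega) = \int L\bigl(y, f(\mathbf{x},\omega)\bigr)\, dP(\mathbf{x},y) \qquad \text{(expected risk, the quantity one would like to minimize)}$$

$$R_{\mathrm{emp}}(\omega) = \frac{1}{n}\sum_{i=1}^{n} L\bigl(y_i, f(\mathbf{x}_i,\omega)\bigr) \qquad \text{(empirical risk, computed from the } n \text{ available samples)}$$

Since $P(\mathbf{x},y)$ is unknown, learning methods choose $\omega$ by minimizing $R_{\mathrm{emp}}$; VC theory characterizes when and how well such an empirical minimizer generalizes, i.e., how close $R_{\mathrm{emp}}(\omega)$ stays to $R(\omega)$ for a finite number of samples $n$.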