On sample complexity of neural networks

We consider functions computed by deep neural networks as definable objects in an o-minimal expansion of the real field, and derive an almost linear (in the number of weights) bound on the sample complexity of such networks.
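To make the claim concrete, here is a minimal sketch of the standard route from a VC-dimension bound to a sample complexity bound; the class $\mathcal{H}_W$, the bound $d = O(W \log W)$, and the symbols $\varepsilon$, $\delta$ are illustrative assumptions, not notation taken from the paper.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Sketch (illustrative assumption): suppose the definability argument yields
% a VC-dimension bound almost linear in the number of weights.
Let $\mathcal{H}_W$ denote a class of $\{0,1\}$-valued functions computed by
networks with $W$ weights, and suppose
$d := \operatorname{VCdim}(\mathcal{H}_W) = O(W \log W)$.
The fundamental theorem of PAC learning then gives, in the realizable case,
\[
  m(\varepsilon,\delta)
  \;=\; O\!\left(\frac{d \log(1/\varepsilon) + \log(1/\delta)}{\varepsilon}\right)
  \;=\; \tilde{O}\!\left(\frac{W}{\varepsilon}\right),
\]
i.e.\ a sample complexity that is almost linear in the number of weights $W$.
\end{document}
```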
