Book contents
- Frontmatter
- Contents
- Preface
- 1 Introduction
- Part one Pattern Classification with Binary-Output Neural Networks
- 2 The Pattern Classification Problem
- 3 The Growth Function and VC-Dimension
- 4 General Upper Bounds on Sample Complexity
- 5 General Lower Bounds on Sample Complexity
- 6 The VC-Dimension of Linear Threshold Networks
- 7 Bounding the VC-Dimension using Geometric Techniques
- 8 Vapnik-Chervonenkis Dimension Bounds for Neural Networks
- Part two Pattern Classification with Real-Output Networks
- Part three Learning Real-Valued Functions
- Part four Algorithmics
- Appendix 1 Useful Results
- Bibliography
- Author index
- Subject index
8 - Vapnik-Chervonenkis Dimension Bounds for Neural Networks
Published online by Cambridge University Press: 26 February 2010
- Type: Chapter
- Information: Neural Network Learning: Theoretical Foundations, pp. 108-130
- Publisher: Cambridge University Press
- Print publication year: 1999