Book contents
- Frontmatter
- Contents
- Preface
- 1 Getting Started
- 2 Perceptron Learning – Basics
- 3 A Choice of Learning Rules
- 4 Augmented Statistical Mechanics Formulation
- 5 Noisy Teachers
- 6 The Storage Problem
- 7 Discontinuous Learning
- 8 Unsupervised Learning
- 9 On-line Learning
- 10 Making Contact with Statistics
- 11 A Bird's Eye View: Multifractals
- 12 Multilayer Networks
- 13 On-line Learning in Multilayer Networks
- 14 What Else?
- Appendices
- Bibliography
- Index
6 - The Storage Problem
Published online by Cambridge University Press: 05 June 2012
Summary
There is an important extreme case of learning from a noisy source, as discussed in the previous chapter, which deserves special consideration. It concerns the situation of an extremely noisy teacher in which the added noise is so strong that it completely dominates the teacher's output. The task for the student is then to reproduce a mapping with no correlations between input and output, so that the notion of a teacher actually becomes obsolete. The central question is how many input–output pairs can typically be implemented by an appropriate choice of the couplings J. This is the so-called storage problem. Its investigation yields a measure of the flexibility of the network under consideration with respect to the implementation of different mappings between input and output.
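The central question above can be probed numerically. The following sketch (not from the book; the perceptron learning rule and the classical capacity result α_c = 2 for random dichotomies are standard) estimates, for a simple perceptron with N couplings, the fraction of random input–output sets of size p = αN that can actually be implemented. The function name `can_store`, the system size, and the iteration cap are illustrative choices.

```python
import numpy as np

def can_store(N, p, rng, max_epochs=500):
    """Try to implement p random input-output pairs (random +-1 inputs,
    random uncorrelated +-1 outputs) with a perceptron of N couplings,
    using the perceptron learning rule. Returns True if all pairs are
    correctly classified within max_epochs sweeps."""
    xi = rng.choice([-1.0, 1.0], size=(p, N))    # random input patterns
    sigma = rng.choice([-1.0, 1.0], size=p)      # random target outputs
    J = np.zeros(N)                              # couplings
    for _ in range(max_epochs):
        errors = 0
        for mu in range(p):
            if sigma[mu] * (J @ xi[mu]) <= 0:    # pattern mu misclassified
                J += sigma[mu] * xi[mu] / N      # Hebbian correction step
                errors += 1
        if errors == 0:
            return True                          # all pairs implemented
    return False                                 # (or not separable in time)

rng = np.random.default_rng(0)
N = 25
for alpha in (0.5, 1.5, 3.0):
    p = int(alpha * N)
    frac = np.mean([can_store(N, p, rng) for _ in range(20)])
    print(f"alpha = {alpha:.1f}: fraction of storable sets = {frac:.2f}")
```

For small loads (α well below 2) almost every random set is storable, while for α well above 2 almost none is; the transition sharpens as N grows, in line with the capacity analysis developed in this chapter.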
The reason why we include a discussion of this case in the present book, which is mainly devoted to the generalization behaviour of networks, is threefold. Firstly, there is a historical point: in the physics community the storage properties of neural networks were discussed before emphasis was laid on their ability to learn from examples, and several important concepts have been introduced in connection with these earlier investigations. Secondly, in several situations the storage problem is somewhat simpler to analyse and therefore forms a suitable starting point for the more complicated investigation of the generalization performance. Thirdly, we will see in chapter 10 that the flexibility of a network architecture with respect to the implementation of different input–output relations also gives useful information on its generalization behaviour.
- Statistical Mechanics of Learning, pp. 85–108. Publisher: Cambridge University Press. Print publication year: 2001.