Book contents
- Frontmatter
- Contents
- Preface
- Acknowledgements
- 1 Neural Networks: A Control Approach
- 2 Pseudoinverses and Tensor Products
- 3 Associative Memories
- 4 The Gradient Method
- 5 Nonlinear Neural Networks
- 6 External Learning Algorithm for Feedback Controls
- 7 Internal Learning Algorithm for Feedback Controls
- 8 Learning Processes of Cognitive Systems
- 9 Qualitative Analysis of Static Problems
- 10 Dynamical Qualitative Simulation
- Appendix 1 Convex and Nonsmooth Analysis
- Appendix 2 Control of an AUV
- Bibliography
- Index
3 - Associative Memories
Published online by Cambridge University Press: 05 August 2012
Summary
Introduction
We investigate in this chapter the case of linear neural networks, which T. Kohonen named associative memories (Figure 3.1). We begin by specializing the heavy algorithm, studied earlier for general adaptive systems, to neural networks, where the controls are matrices. This algorithm shows how to modify the last synaptic matrix that has learned a set of patterns so that it learns a new pattern without forgetting the previous ones.
Because right-inverses of tensor products are tensor products of right-inverses, the heavy algorithm has a Hebbian character: it states that the correction of a synaptic matrix during learning is the product of the activities of the presynaptic and postsynaptic neurons. This feature, which plain vectors do not enjoy, is what justifies studying systems controlled by matrices, rather than by vectors, in their own right.
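To make this outer-product form concrete, here is a minimal NumPy sketch of an incremental, minimal-norm correction in this spirit. It illustrates projection-style pattern storage; it is not a transcription of the book's heavy algorithm, and the function name store_pattern and the variables W and P are our own.

```python
import numpy as np

def store_pattern(W, P, x, y, tol=1e-10):
    """Store the pair (x, y) in synaptic matrix W without disturbing
    previously stored pairs.

    W : current synaptic matrix (maps presynaptic x to postsynaptic y)
    P : orthogonal projector onto the complement of the span of the
        presynaptic patterns already stored (initially the identity)
    The correction is an outer product of a postsynaptic error and a
    presynaptic activity -- the Hebbian form discussed above.
    """
    px = P @ x                      # component of x orthogonal to old patterns
    norm2 = x @ px
    if norm2 < tol:                 # x lies in the span of old patterns:
        return W, P                 # nothing can be stored without forgetting
    error = y - W @ x               # postsynaptic mismatch on the new pattern
    W = W + np.outer(error, px) / norm2   # Hebbian (outer-product) correction
    P = P - np.outer(px, px) / norm2      # shrink the unexplored subspace
    return W, P

# Usage: start from a blank memory and store two pattern pairs.
n, m = 5, 3
W, P = np.zeros((m, n)), np.eye(n)
rng = np.random.default_rng(0)
x1, y1 = rng.normal(size=n), rng.normal(size=m)
x2, y2 = rng.normal(size=n), rng.normal(size=m)
W, P = store_pattern(W, P, x1, y1)
W, P = store_pattern(W, P, x2, y2)
assert np.allclose(W @ x1, y1) and np.allclose(W @ x2, y2)
```

Because the correction applied for the second pair lies in the subspace orthogonal to the first presynaptic pattern, the first association is preserved exactly, which is the "learning without forgetting" property described above.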
We then proceed to associative memories with postprocessing, and to multilayer and continuous-layer associative memories. We conclude the chapter with associative memories with gates, in which the synaptic matrices link conjuncts (i.e., subsets) of presynaptic neurons to each postsynaptic neuron; these allow the computation of any Boolean function and require a short presentation of fuzzy sets.
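As a small, self-contained illustration (our own, not taken from the chapter) of why conjuncts suffice: every Boolean function of n inputs can be written exactly as a linear combination of products of the inputs over subsets, so a single linear readout of conjunct activities reproduces it. For example, XOR(x1, x2) = x1 + x2 - 2*x1*x2.

```python
from itertools import combinations, product

def conjuncts(x):
    """Activities of all conjuncts (products over subsets) of the inputs x."""
    n = len(x)
    feats = []
    for k in range(n + 1):
        for S in combinations(range(n), k):
            v = 1
            for i in S:
                v *= x[i]
            feats.append(v)
    return feats            # for n = 2: [1, x1, x2, x1*x2]

# XOR as a linear combination of conjunct activities, weights [0, 1, 1, -2].
weights = [0, 1, 1, -2]
for x in product([0, 1], repeat=2):
    out = sum(w * f for w, f in zip(weights, conjuncts(x)))
    assert out == (x[0] ^ x[1])
```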
Neural Networks and Qualitative Physics: A Viability Approach, pp. 44–60. Publisher: Cambridge University Press. Print publication year: 1996.