13 - Neural networks
from Part III - Learning and memory
Summary
In previous chapters we encountered examples of learning and memory that varied widely in complexity, from rats learning to press a bar at one end, to humans trying to remember lessons in physics at the other. Ideally, we would like a theory that could encompass all these forms of learning, from rats to humans, from classical conditioning to language learning. In short, a theory of everything.
This might at first seem an outrageous requirement – or, at any rate, one exceedingly unlikely to be fulfilled – but a theory has recently emerged that supporters claim has the potential to meet it. The new theory sets out to explain virtually every aspect of learning, from classical conditioning in animals to language learning in humans. And it does all this using just a single, almost unbelievably simple principle: when two neurons are active at the same time, the connection between them is strengthened.
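The principle stated above (commonly known as Hebbian learning) can be sketched as a simple weight-update rule: the strength of a connection grows in proportion to the joint activity of the two neurons it links. The following minimal illustration is not from the chapter; the function name, learning rate, and activity values are illustrative assumptions.

```python
def hebbian_update(w, pre, post, lr=0.1):
    """One Hebbian step: strengthen the connection weight w
    in proportion to the joint activity of the presynaptic
    (pre) and postsynaptic (post) neurons."""
    return w + lr * pre * post

# Two neurons that are repeatedly co-active: the connection grows.
w = 0.0
for _ in range(5):
    w = hebbian_update(w, pre=1.0, post=1.0)
print(round(w, 2))  # the weight has increased from 0.0 to about 0.5

# If either neuron is silent, the connection is unchanged.
print(hebbian_update(0.5, pre=1.0, post=0.0))
```

Note that under this sketch the weight only ever increases when both activities are positive; richer variants of the rule (not covered here) add decay or competition to keep weights bounded.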
Type: Chapter
In: Learning and Memory, pp. 477–503
Publisher: Cambridge University Press
Print publication year: 2011