Part Two - Information Theory and Artificial Networks
Published online by Cambridge University Press: 04 May 2010
Summary
Information theory makes two main contributions to the study of neural systems: it provides a theoretically clean, task-independent objective function for unsupervised learning, and it serves as an analytical tool for evaluating the performance of a given model or of a biological neural system. We must keep in mind, however, that an information measure is not in itself some kind of Holy Grail; the evolutionary success of an agent ultimately comes down to its performance, or fitness, in a specific ecological niche. A frog may be exposed to the information in the patterns on the pages of a book, in the pattern of clouds, or in a huge number of other possible pattern configurations in its sensory input stream, but the vast majority of this information is irrelevant to it, and it will do better concentrating only on small dark moving spots and large dark blobs in its proximity. The nervous system of higher animals, e.g. mammals, however, has a much more ambitious goal, one that allows it to operate far more flexibly in environments that are completely novel and unexpected on an evolutionary time scale: to build a model of the sensory environment. This model is still subject to biological constraints, such as the nature and resolution of the sensors that register physical signals, but the ways in which these signals can be combined become gradually more sophisticated.
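As a minimal illustration of the first contribution (this sketch is not from the chapter, and the function names are our own): the quantity that such task-independent objectives typically maximize is the mutual information between an input and a representation, which can be computed directly for a small discrete example.

```python
import numpy as np

def entropy(p):
    # Shannon entropy in bits of a discrete distribution p
    # (assumed non-negative and summing to 1).
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), computed from a joint
    # probability table with X indexing rows and Y columns.
    px = joint.sum(axis=1)
    py = joint.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(joint.ravel())

# Toy example: a noiseless one-bit channel conveys exactly 1 bit,
# whereas an independent joint distribution conveys 0 bits.
noiseless = np.array([[0.5, 0.0],
                      [0.0, 0.5]])
independent = np.outer([0.5, 0.5], [0.5, 0.5])
print(mutual_information(noiseless))    # → 1.0
print(mutual_information(independent))  # → 0.0
```

An "infomax" objective in this spirit would adjust the parameters of a mapping so as to increase the mutual information between its input and output, subject to constraints such as noise or limited channel capacity.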
- Type: Chapter
- Information: Information Theory and the Brain, pp. 79–83
- Publisher: Cambridge University Press
- Print publication year: 2000