Abstract
Neural networks have often been motivated by superficial analogy with biological nervous systems. Recently, however, it has become widely recognised that the effective application of neural networks instead requires a deeper understanding of the theoretical foundations of these models. Insight into neural networks comes from a number of fields, including statistical pattern recognition, computational learning theory, statistics, information geometry and statistical mechanics. As an illustration of the importance of understanding the theoretical basis for neural network models, we consider their application to the solution of multi-valued inverse problems. We show how a naive application of the standard least-squares approach can lead to very poor results, and how an appreciation of the underlying statistical goals of the modelling process allows the development of a more general and more powerful formalism which can tackle the problem of multi-modality.
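To make the failure mode concrete, here is a minimal numerical sketch (not the paper's own code or dataset): the forward map t = x² is single-valued, but its inverse x = ±√t has two branches, so least-squares regression of x on t converges on the conditional mean, which lies on neither branch. A polynomial fit stands in for a trained network, and a two-component Gaussian mixture fitted by EM stands in for the kind of conditional-density model the abstract's "more general formalism" points towards; the mixture form is an assumption of this sketch, not a detail taken from the paper.

```python
import numpy as np

# Assumed toy inverse problem (not the paper's dataset): the forward
# map t = x^2 is single-valued, but its inverse x = +/- sqrt(t) is
# two-valued, so the conditional distribution p(x | t) is bimodal.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 2000)
t = x**2 + rng.normal(0.0, 0.02, x.size)

# Naive least-squares regression of x on t; a degree-5 polynomial
# stands in for a network trained on a sum-of-squares error.
coeffs = np.polyfit(t, x, deg=5)

# The fit approximates the conditional mean E[x | t], which averages
# the branches +sqrt(t) and -sqrt(t) to roughly zero: a "solution"
# lying on neither branch of the true inverse.
t_query = 0.49
print("least-squares prediction:", np.polyval(coeffs, t_query))  # ~ 0.0
print("true inverse branches:  +/-", np.sqrt(t_query))           # +/- 0.7

# Sketch of the remedy: model the whole conditional density rather
# than only its mean. A two-component Gaussian mixture, fitted here by
# EM to the x-values observed near t_query, recovers both branches;
# mixture-density approaches attach such a model to every input
# rather than to a single fixed slice of the data.
sel = x[np.abs(t - t_query) < 0.05]
mu, var, pi = np.array([-0.1, 0.1]), np.array([0.1, 0.1]), np.array([0.5, 0.5])
for _ in range(50):
    # E-step: component responsibilities for each point
    dens = pi / np.sqrt(2 * np.pi * var) * np.exp(-(sel[:, None] - mu) ** 2 / (2 * var))
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate mixing weights, means and variances
    nk = resp.sum(axis=0)
    pi, mu = nk / sel.size, (resp * sel[:, None]).sum(axis=0) / nk
    var = (resp * (sel[:, None] - mu) ** 2).sum(axis=0) / nk
print("mixture means (both branches):", np.sort(mu))             # ~ -0.7, +0.7
```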
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of Physics Computing 96 |
| Editors | P. Borcherds, M. Bubak, A. Maksymowicz |
| Place of Publication | Krakow |
| Publisher | Academic Computer Centre |
| Pages | 500-507 |
| Number of pages | 8 |
| Publication status | Published - 1996 |
| Event | Physics Computing '96, 1 Jan 1996 → 1 Jan 1996 |
Keywords
- neural networks
- nervous systems
- statistical pattern recognition
- computational learning theory
- statistics
- information geometry
- statistical mechanics