Limitations and Future Trends in Neural Computation

Editors: Ablameyko, S., Gori, M., Goras, L., Piuri, V.
Publication date: May 2003
Pages: 256
Binding: hardcover
Volume: 186 of NATO Science Series, III: Computer and Systems Sciences
ISBN (print): 978-1-58603-324-8
Subject: Computer & Communication Sciences, Computer Science
Price: €116 / US$168 (excl. VAT)

This book reports critical analyses of complexity issues in the continuum setting and of generalization to new examples, two basic milestones in learning from examples in connectionist models. The problem of loading the weights of neural networks, often framed as continuous optimization, has drawn many criticisms, since the potential solution of any learning problem is severely limited by the presence of local minima in the error function. The maturity of the field requires converting the quest for a general solution to all learning problems into an understanding of which learning problems are likely to be solved efficiently. Likewise, the notion of an efficient solution needs to be formalized so as to permit useful comparisons with the traditional theory of computational complexity in the discrete setting. The book covers these topics, also focusing on recent developments in computational mathematics, where interesting notions of computational complexity emerge in the continuum setting.
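The local-minima issue mentioned above can be illustrated with a minimal sketch, not taken from the book: gradient descent on a hypothetical one-dimensional non-convex "error function" converges to different critical points depending on where it starts, so a run that begins in the wrong basin never reaches the global minimum.

```python
# Illustrative (hypothetical) non-convex error function with two minima:
# a global minimum near x = -1.04 and a shallower local minimum near x = 0.96.
def error(x):
    return (x**2 - 1)**2 + 0.3 * x

def grad(x):
    # Analytic derivative of error(x).
    return 4 * x * (x**2 - 1) + 0.3

def descend(x, lr=0.01, steps=2000):
    # Plain gradient descent; it can only move downhill, so it
    # converges to whichever minimum's basin contains the start point.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

x_trapped = descend(1.5)    # starts in the basin of the local minimum
x_global = descend(-1.5)    # starts in the basin of the global minimum

# The trapped run ends at a strictly worse error value.
print(error(x_trapped) > error(x_global))
```

The same phenomenon, scaled up to the weight space of a neural network, is why framing weight loading as continuous optimization gives no general guarantee of solving the underlying learning problem.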