One of the most challenging and fascinating problems in the theory of neural nets is that of asymptotic behavior: how a system behaves as time proceeds. This is of particular relevance to many practical applications. Here we focus on association, generalization, and representation. We turn to the last topic first. The introductory chapter, "Global Analysis of Recurrent Neural Networks" by Andreas Herz, presents an in-depth analysis of how to construct a Lyapunov function for various types of dynamics and neural coding. It includes a review of his recent work with John Hopfield on integrate-and-fire neurons with local interactions. The chapter "Receptive Fields and Maps in the Visual Cortex: Models of Ocular Dominance and Orientation Columns" by Ken Miller explains how the primary visual cortex may asymptotically acquire its specific structure through a self-organization process based on Hebbian learning. His argument has since proven amenable to considerable generalization.
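As a point of reference for the kind of construction the first chapter discusses (this is the standard textbook example, not a result taken from the book itself): for a symmetric Hopfield network the energy function is the prototypical Lyapunov function, since it never increases under asynchronous updates and therefore guarantees convergence to fixed points.

    % Standard Hopfield energy function (textbook illustration, assumed here):
    % binary units s_i in {-1,+1}, symmetric weights w_{ij} = w_{ji}, w_{ii} = 0,
    % thresholds theta_i.
    E(\mathbf{s}) \;=\; -\tfrac{1}{2}\sum_{i \neq j} w_{ij}\, s_i s_j \;+\; \sum_i \theta_i s_i
    % Under the asynchronous update
    %   s_i \leftarrow \operatorname{sgn}\Bigl(\sum_j w_{ij} s_j - \theta_i\Bigr),
    % each flip changes E by a non-positive amount, so E decreases monotonically
    % and the dynamics settle into a local minimum (a stored pattern or spurious state).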
1. Global Analysis of Recurrent Neural Networks
   1.1 Global Analysis - Why?
   1.2 A Framework for Neural Dynamics
   1.3 Fixed Points
   1.4 Periodic Limit Cycles and Beyond
   1.5 Synchronization of Action Potentials
   1.6 Conclusions
   References
2. Receptive Fields and Maps in the Visual Cortex: Models of Ocular Dominance and Orientation Columns
   2.1 Introduction
   2.2 Correlation-Based Models
   2.3 The Problem of Map Structure
   2.4 The Computational Significance of Correlation-Based Rules
   2.5 Open Questions
   References
3. Associative Data Storage and Retrieval in Neural Networks
   3.1 Introduction and Overview
   3.1.1 Memory and Representation
   3.2 Neural Associative Memory Models
   3.3 Analysis of the Retrieval Process
   3.4 Information Theory of the Memory Process
   3.5 Model Performance
   3.6 Discussion
   Appendix 3.1
   Appendix 3.2
   References
4. Inferences Modeled with Neural Networks
   4.1 Introduction
   4.2 Model for Cognitive Systems and for Experiences
   4.3 Inductive Inference
   4.4 External Memory
   4.5 Limited Use of External Memory
   4.6 Deductive Inference
   4.7 Conclusion
   References
5. Statistical Mechanics of Generalization
   5.1 Introduction
   5.2 General Results
   5.3 The Perceptron
   5.4 Geometry in Phase Space and Asymptotic Scaling
   5.5 Applications to Perceptrons
   5.6 Summary and Outlook
   Appendix 5.1: Proof of Sauer's Lemma
   Appendix 5.2: Order Parameters for ADALINE
   References
6. Bayesian Methods for Backpropagation Networks
   6.1 Probability Theory and Occam's Razor
   6.2 Neural Networks as Probabilistic Models
   6.3 Setting Regularization Constants α and β
   6.4 Model Comparison
   6.5 Error Bars and Predictions
   6.6 Pruning
   6.7 Automatic Relevance Determination
   6.8 Implicit Priors
   6.9 Cheap and Cheerful Implementations
   6.10 Discussion
   References
7. Penacée: A Neural Net System for Recognizing On-Line Handwriting
   7.1 Introduction
   7.2 Description of the Building Blocks
   7.3 Applications
   7.4 Conclusion
   References
8. Topology Representing Network in Robotics
   8.1 Introduction
   8.2 Problem Description
   8.3 Topology Representing Network Algorithm
   8.4 Experimental Results and Discussion
   References