CONNECTIONISM is a "hands-on" introduction to connectionist modeling. Three different types of connectionist architecture - the distributed associative memory, the perceptron, and the multilayer perceptron - are explored. In an accessible style, Dawson provides a brief overview of each architecture, a detailed introduction to the program used to explore that network, and a series of practical exercises designed to highlight the advantages and disadvantages of each, and to provide a "road map" to the field of cognitive modeling. The book is designed to be used as a stand-alone volume or alongside Minds and Machines: Connectionism and Psychological Modeling (Blackwell Publishing, 2004). An accompanying website at www.bcp.psych.ualberta.ca/%7emike/book3/index.html includes practice exercises and software, as well as the files and blank exercise sheets required for performing the exercises.
Contents
1. Hands-on Connectionism. 1.1 Connectionism In Principle And In Practice. 1.2 The Organization Of This Book.
2. The Distributed Associative Memory. 2.1 The Paired Associates Task. 2.2 The Standard Pattern Associator. 2.3 Exploring The Distributed Associative Memory.
3. The James Program. 3.1 Introduction. 3.2 Installing The Program. 3.3 Teaching A Distributed Memory. 3.4 Testing What The Memory Has Learned. 3.5 Using The Program.
4. Introducing Hebb Learning. 4.1 Overview Of The Exercises. 4.2 Hebb Learning Of Basis Vectors. 4.3 Hebb Learning Of Orthonormal, Non-Basis Vectors.
5. Limitations Of Hebb Learning. 5.1 Introduction. 5.2 The Effect Of Repetition. 5.3 The Effect Of Correlation.
6. Introducing The Delta Rule. 6.1 Introduction. 6.2 The Delta Rule. 6.3 The Delta Rule And The Effect Of Repetition. 6.4 The Delta Rule And The Effect Of Correlation.
7. Distributed Networks And Human Memory. 7.1 Background On The Paired Associate Paradigm. 7.2 The Effect Of Similarity On The Distributed Associative Memory.
8. Limitations Of Delta Rule Learning. 8.1 Introduction. 8.2 The Delta Rule And Linear Dependency.
9. The Perceptron. 9.1 Introduction. 9.2 The Limits Of Distributed Associative Memories, And Beyond. 9.3 Properties Of The Perceptron. 9.4 What Comes Next.
10. The Rosenblatt Program. 10.1 Introduction. 10.2 Installing The Program. 10.3 Training A Perceptron. 10.4 Testing What The Memory Has Learned.
11. Perceptrons And Logic Gates. 11.1 Introduction. 11.2 Boolean Algebra. 11.3 Perceptrons And Two-Valued Algebra.
12. Performing More Logic With Perceptrons. 12.1 Two-Valued Algebra And Pattern Spaces. 12.2 Perceptrons And Linear Separability. 12.3 Appendix Concerning The DawsonJots Font.
13. Value Units And Linear Nonseparability. 13.1 Linear Separability And Its Implications. 13.2 Value Units And The Exclusive-Or Relation. 13.3 Value Units And Connectedness.
14. Network By Problem Type Interactions. 14.1 All Networks Were Not Created Equally. 14.2 Value Units And The Two-Valued Algebra.
15. Perceptrons And Generalization. 15.1 Background. 15.2 Generalization And Savings For The 9-Majority Problem.
16. Animal Learning Theory And Perceptrons. 16.1 Discrimination Learning. 16.2 Linearly Separable Versions Of Patterning.
17. The Multilayer Perceptron. 17.1 Creating Sequences Of Logical Operations. 17.2 Multilayer Perceptrons And The Credit Assignment Problem. 17.3 The Implications Of The Generalized Delta Rule.
18. The Rumelhart Program. 18.1 Introduction. 18.2 Installing The Program. 18.3 Training A Multilayer Perceptron. 18.4 Testing What The Network Has Learned.
19. Beyond The Perceptron's Limits. 19.1 Introduction. 19.2 The Generalized Delta Rule And Exclusive-Or.
20. Symmetry As A Second Case Study. 20.1 Background. 20.2 Solving Symmetry Problems With Multilayer Perceptrons.
21. How Many Hidden Units? 21.1 Background. 21.2 How Many Hidden Value Units Are Required For 5-Bit Parity?
22. Scaling Up With The Parity Problem. 22.1 Overview Of The Exercises. 22.2 Background. 22.3 Exploring The Parity Problem.
23. Selectionism And Parity. 23.1 Background. 23.2 From Connectionism To Selectionism.
24. Interpreting A Small Network. 24.1 Background. 24.2 A Small Network. 24.3 Interpreting This Small Network.
25. Interpreting Networks Of Value Units. 25.1 Background. 25.2 Banding In The First Monks Problem. 25.3 Definite Features In The First Monks Problem.
26. Interpreting Distributed Representations. 26.1 Background. 26.2 Interpreting A 5-Parity Network.
27. Creating Your Own Training Sets. 27.1 Background. 27.2 Designing And Building A Training Set.
References.
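The two learning procedures at the heart of the early chapters, Hebb learning (chapters 4-5) and the delta rule (chapters 6-8), can be sketched for a simple pattern associator as follows. This is a minimal illustration of the general techniques, not the book's James software; all function and variable names here are illustrative.

```python
import numpy as np

def hebb_learn(inputs, targets):
    """Hebb rule: the weight matrix is a sum of outer products
    of each target (recall) vector with its cue vector."""
    W = np.zeros((targets.shape[1], inputs.shape[1]))
    for x, t in zip(inputs, targets):
        W += np.outer(t, x)
    return W

def delta_learn(inputs, targets, lr=0.1, epochs=100):
    """Delta rule: weights are adjusted iteratively in proportion
    to the error between desired and actual recall."""
    W = np.zeros((targets.shape[1], inputs.shape[1]))
    for _ in range(epochs):
        for x, t in zip(inputs, targets):
            error = t - W @ x          # difference between target and recall
            W += lr * np.outer(error, x)
    return W

# Hebb learning recalls perfectly when cues are orthonormal (e.g. basis vectors).
cues = np.eye(2)
assocs = np.array([[1.0, 0.0], [0.0, 1.0]])
W = hebb_learn(cues, assocs)           # W @ cues[i] reproduces assocs[i]

# Correlated (non-orthogonal but linearly independent) cues defeat the Hebb
# rule, but the delta rule converges to accurate recall on them.
correlated = np.array([[1.0, 0.0], [1.0, 1.0]])
W2 = delta_learn(correlated, assocs)
```

The contrast in the last step mirrors the book's progression: Hebb learning fails once cue vectors are correlated, which motivates the error-correcting delta rule, whose own limit (linear dependency, chapter 8) in turn motivates the perceptron.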
Reviews
"This is a first-rate textbook. Enabling readers to perform the simulations described, it provides a very user-friendly introduction to the essential material, which it sets in an engaging, historically informed context." Anne Jaap Jacobson, University of Houston