Introduction to Neural and Cognitive Modeling   

Daniel S. Levine 

Mahwah, NJ: Lawrence Erlbaum Associates, 2nd Edition, 2000. 491 pp.

Paperback:  ISBN 0-8058-2006-X, $36.00
Cloth:  ISBN 0-8058-2005-1, $99.00

Yes, the interdisciplinary field of neural networks has swept the world, and there are many different introductory textbooks around.  But Levine's book, which has sold close to 2,000 copies since the first edition was published in 1991, has many distinct advantages over its competitors:

Systematic development. Levine's book is organized around a flow chart, building from simple cognitive functions in the first four chapters to more complex functions in the next three.  Chapter 1 lays out the premises of the field and Chapter 2 traces its historical development.  Chapter 3 discusses simple models of long-term memory and learning, whereas Chapter 4 deals with lateral inhibition, perception, and short-term memory.  Chapter 5 shows how learning and perceptual processes combine in models of conditioning.  Chapter 6 combines these same processes in different ways to model coding and categorization; that chapter also covers three of the most popular large networks: back propagation, brain-state-in-a-box (BSB), and adaptive resonance theory (ART).  Chapter 7 moves beyond these processes into the field's frontier areas: optimization, control, knowledge representation, models of specific brain regions, and models of mental illness.

Historical perspective.  Neural networks are widely thought to have sprung into being with a few well-known publications from the mid-1980s that challenged symbolic artificial intelligence and restored interest in brain-like computation.  The surveys in Levine's book make it clear that many currently popular ideas about neural networks are rooted in the work of pioneers who wrote between 1943 and 1969.  Several researchers who began publishing in the 1960s (Amari, Anderson, Grossberg, Kohonen, and Widrow, for example) are still active leaders in the field.

Accessibility to a varied audience.  The book is of interest to psychologists, biologists, and cognitive scientists who want an introduction to neural network ideas with a minimum of technical prerequisites.  It is also of interest to engineers and computer scientists who wish not only to design networks that perform specific tasks, but also to understand the principles behind their design.  The discussion in each chapter relies not on equations and formulas, but on intuitive design principles and network diagrams.  The equations and formulas can be skipped on a first reading, since most of them are listed at the end of each chapter, preceding the homework exercises.  Some of the exercises rely on computer simulation of existing networks based on those equations, whereas others involve open-ended, qualitative thought experiments.   The book also includes two appendices.  One lays out the basic facts of neurobiology for those lacking a background in that area.  The other reviews a few facts of calculus and differential equations, and demonstrates with code examples how to program some simple neural network equations on a computer.
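To give a flavor of those code examples, here is a minimal sketch in Python (not taken from the book; the function name and parameter values are illustrative assumptions) of the kind of exercise the second appendix describes: integrating the leaky-integrator equation dx/dt = -A*x + I, the simplest additive model of a single node's activity, with Euler's method.

    # Minimal sketch, not from the book: Euler integration of the
    # leaky-integrator equation dx/dt = -A*x + I, a standard additive
    # model of a single node's activity.  Parameters are illustrative.
    def euler_leaky_integrator(I, A=1.0, dt=0.01, steps=500):
        """Integrate dx/dt = -A*x + I from x(0) = 0 with Euler's method."""
        x = 0.0
        for _ in range(steps):
            x += dt * (-A * x + I)  # Euler update: x(t+dt) = x(t) + dt * dx/dt
        return x

    # The activity relaxes toward the equilibrium x* = I / A.
    print(euler_leaky_integrator(I=2.0, A=1.0))  # approximately 2.0

Smaller values of dt give a more accurate approximation of the continuous equation, a point the kind of numerical exercise sketched here makes concrete.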

Price. Very few technical books carry this much information for under $40.  The price in Europe and Asia is only slightly higher.

From the back cover:

This thoroughly and thoughtfully revised edition of a very successful textbook makes the principles and the details of neural network modeling accessible to cognitive scientists of all varieties, as well as to other scholars interested in these models. Research since the publication of the first edition has been systematically incorporated into a framework of proven pedagogical value.

Features of the second edition include:

As in the first edition, the text includes extensive introductions to neuroscience and to differential and difference equations as appendices for students without the requisite background in these areas. As graphically revealed in the flowchart in the front of the book, the text begins with simpler processes and builds up to more complex multilevel functional systems.

Table of contents:

Chapters 2 through 7 each include equations and exercises (computational, mathematical, and qualitative) at the end of the chapter. The text sections are as follows.

Flow Chart of the Book
Preface
Preface to the Second Edition
Chapter 1: Brain and Machine: The Same Principles?
What Are Neural Networks?
Is Biological Realism a Virtue?
What Are Some Principles of Neural Network Theory?
Methodological Considerations
Chapter 2: Historical Outline
2.1. Digital Approaches
The McCulloch-Pitts Network
Early Approaches to Modeling Learning: Hull and Hebb 
Rosenblatt’s Perceptrons
Some Experiments With Perceptrons
The Divergence of Artificial Intelligence and Neural Modeling
2.2. Continuous and Random Net Approaches
Rashevsky’s Work
Early Random Net Models
Reconciling Randomness and Specificity
Chapter 3: Associative Learning and Synaptic Plasticity
3.1. Physiological Bases for Learning
3.2. Rules for Associative Learning
Outstars and Other Early Models of Grossberg
Anderson’s Connection Matrices
Kohonen’s Early Work
3.3. Learning Rules Related to Changes in Node Activities
Klopf’s Hedonistic Neurons and the Sutton-Barto Learning Rule
Error Correction and Back Propagation
The Differential Hebbian Idea
Gated Dipole Theory
3.4. Associative Learning of Patterns
Kohonen’s Recent Work: Autoassociation and Heteroassociation
Kosko’s Bidirectional Associative Memory
Chapter 4: Competition, Lateral Inhibition, and Short-Term Memory
4.1. Contrast Enhancement, Competition, and Normalization
Hartline and Ratliff’s Work, and Other Early Visual Models
Nonrecurrent Versus Recurrent Lateral Inhibition
4.2. Lateral Inhibition and Excitation Between Sensory Representations
Wilson and Cowan’s Work
Work of Grossberg and Colleagues
Work of Amari and Colleagues
Energy Functions in the Cohen-Grossberg and Hopfield-Tank Models
The Implications of Approach to Equilibrium
Networks With Synchronized Oscillations
4.3. Visual Pattern Recognition Models
Visual Illusions
Boundary Detection Versus Feature Detection
Binocular and Stereoscopic Vision
Visual Motion
Comparison of Grossberg’s and Marr’s Approaches
4.4. Uses of Lateral Inhibition in Higher Level Processing
Chapter 5: Conditioning, Attention, and Reinforcement
5.1. Network Models of Classical Conditioning
Early Work: Brindley and Uttley
Rescorla and Wagner’s Psychological Model
Grossberg: Drive Representations and Synchronization
Aversive Conditioning and Extinction
Differential Hebbian Theory Versus Gated Dipole Theory
5.2. Attention and Short-Term Memory in Conditioning Models
Grossberg’s Approach to Attention
Sutton and Barto’s Approach: Blocking and Interstimulus Interval Effects
Some Contrasts Between the Grossberg and Sutton-Barto Approaches
Further Connections With Invertebrate Neurophysiology
Further Connections With Vertebrate Neurophysiology
Gated Dipoles, Aversive Conditioning, and Timing
Chapter 6: Coding and Categorization
6.1. Interactions Between Short- and Long-Term Memory in Code Development
Malsburg’s Model With Synaptic Conservation
Grossberg’s Model With Pattern Normalization
Mathematical Results of Grossberg and Amari
Feature Detection Models With Stochastic Elements
From Feature Coding to Categorization
6.2. Supervised Classification Models
The Back Propagation Network and its Variants
The RCE Model
6.3. Unsupervised Classification Models
The Rumelhart-Zipser Competitive Learning Algorithm
Adaptive Resonance Theory
Edelman and Neural Darwinism
6.4. Models That Combine Supervised and Unsupervised Parts
ARTMAP and Other Supervised Adaptive Resonance Networks
Brain-State-in-a-Box (BSB) Models
6.5. Translation and Scale Invariance
6.6. Processing Spatiotemporal Patterns
Chapter 7: Optimization, Control, Decision, and Knowledge Representation
7.1. Optimization and Control
Classical Optimization Problems
Simulated Annealing and Boltzmann Machines
Motor Control: The Example of Eye Movements
Motor Control: Arm Movements
Speech Recognition and Synthesis
Robotic and Other Industrial Control Problems
7.2. Decision Making and Knowledge Representation
What, If Anything, Do Biological Organisms Optimize?
Affect, Habit, and Novelty in Neural Network Theories
Knowledge Representation: Letters and Words
Knowledge Representation: Concepts and Inference
7.3. Neural Control Circuits, Mental Illness, and Brain Areas
Overarousal, Underarousal, Parkinsonism, and Depression
Frontal Lobe Function and Dysfunction
Disruption of Cognitive-Motivational Interactions
Impairment of Motor Task Sequencing
Disruption of Context Processing
Models of Specific Brain Areas
Models of the Cerebellum
Models of the Hippocampus
Models of the Basal Ganglia
Models of the Cerebral Cortex
Chapter 8: A Few Recent Technical Advances
8.1. Some "Toy" and Real World Computing Applications
Pattern Recognition
Knowledge Engineering
Financial Engineering
"Oddball" Applications
8.2. Some Neurobiological Discoveries

Appendix 1: Basic Facts of Neurobiology
The Neuron
Synapses, Transmitters, Messengers, and Modulators
Invertebrate and Vertebrate Nervous Systems
Functions of Vertebrate Subcortical Regions
Functions of the Mammalian Cerebral Cortex

Appendix 2: Difference and Differential Equations in Neural Networks
Example: The Sutton-Barto Difference Equations
Differential Versus Difference Equations
Outstar Equations: Network Interpretation and Numerical Implementation
The Chain Rule and Back Propagation
Dynamical Systems: Steady States, Limit Cycles, and Chaos

ABOUT THE AUTHOR: Daniel S. Levine is Professor of Psychology at the University of Texas at Arlington. A former president of the International Neural Network Society, he is the organizer of the MIND conferences, which bring together leading neural network researchers from academia and industry. Since 1975, he has written nearly 100 books, articles, and chapters for various audiences interested in neural networks.