Author: Dana H. Ballard
Publisher: MIT Press
Release Date: 2015-02-20
The vast differences between the brain's neural circuitry and a computer's silicon circuitry might suggest that they have nothing in common. In fact, as Dana Ballard argues in this book, computational tools are essential for understanding brain function. Ballard shows that the hierarchical organization of the brain has many parallels with the hierarchical organization of computing; as in silicon computing, the complexities of brain computation can be dramatically simplified when its computation is factored into different levels of abstraction.

Drawing on several decades of progress in computational neuroscience, together with recent results in Bayesian and reinforcement learning methodologies, Ballard factors the brain's principal computational issues in terms of their natural place in an overall hierarchy. Each of these factors leads to a fresh perspective. A neural level focuses on the basic forebrain functions and shows how processing demands dictate the extensive use of timing-based circuitry and an overall organization of tabular memories. An embodiment level organization works in reverse, making extensive use of multiplexing and on-demand processing to achieve fast parallel computation. An awareness level focuses on the brain's representations of emotion, attention, and consciousness, showing that they can operate with great economy in the context of the neural and embodiment substrates.
Author: Dana H. Ballard
Publisher: MIT Press (Computational Neuroscience Series)
Release Date: 2016-12-13
Genre: Computational neuroscience
The vast differences between the brain's neural circuitry and a computer's silicon circuitry might suggest that they have nothing in common. In fact, as Dana Ballard argues in this book, computational tools are essential for understanding brain function. Ballard shows that the hierarchical organisation of the brain has many parallels with the hierarchical organisation of computing; as in silicon computing, the complexities of brain computation can be dramatically simplified when its computation is factored into different levels of abstraction.
Author: Shimon Edelman
In a culmination of humanity's millennia-long quest for self-knowledge, the sciences of the mind are now in a position to offer concrete, empirically validated answers to the most fundamental questions about human nature. What does it mean to be a mind? How is the mind related to the brain? How are minds shaped by their embodiment and environment? What are the principles behind cognitive functions such as perception, memory, language, thought, and consciousness? By analyzing the tasks facing any sentient being that is subject to stimulation and a pressure to act, Shimon Edelman identifies computation as the common denominator in the emerging answers to all these questions. Any system composed of elements that exchange signals with each other and occasionally with the rest of the world can be said to be engaged in computation. A brain composed of neurons is one example of a system that computes, and the computations that the neurons collectively carry out constitute the brain's mind. Edelman presents a computational account of the entire spectrum of cognitive phenomena that constitutes the mind. He begins with sentience, and uses examples from visual perception to demonstrate that it must, at its very core, be a type of computation. Throughout his account, Edelman acknowledges the human mind's biological origins. Along the way, he also demystifies traits such as creativity, language, and individual and collective consciousness, and hints at how naturally evolved minds can transcend some of their limitations by moving to computational substrates other than brains. The account that Edelman gives in this book is accessible, yet unified and rigorous, and the big picture he presents is supported by evidence ranging from neurobiology to computer science. The book should be read by anyone seeking a comprehensive and current introduction to cognitive psychology.
Author: Dana Harry Ballard
Publisher: MIT Press
Release Date: 1999
"This is a wonderful book that brings together in one place the modern view of computation as found in nature. It is well written and has something for everyone from the undergraduate to the advanced researcher." -- Terrence J. Sejnowski, Howard Hughes Medical Institute at The Salk Institute for Biological Studies, La Jolla, California It is now clear that the brain is unlikely to be understood without recourse to computational theories. The theme of An "Introduction to Natural Computation" is that ideas from diverse areas such as neuroscience, information theory, and optimization theory have recently been extended in ways that make them useful for describing the brain's programs. This book provides a comprehensive introduction to the computational material that forms the underpinnings of the currently evolving set of brain models. It stresses the broad spectrum of learning models--ranging from neural network learning through reinforcement learning to genetic learning--and situates the various models in their appropriate neural context. To write about models of the brain before the brain is fully understood is a delicate matter. Very detailed models of the neural circuitry risk losing track of the task the brain is trying to solve. At the other extreme, models that represent cognitive constructs can be so abstract that they lose all relationship to neurobiology. An "Introduction to Natural Computation" takes the middle ground and stresses the computational task while staying near the neurobiology. The material is accessible to advanced undergraduates as well as beginning graduate students. CONTENTS: 1. Introduction Part I "Core Concepts" 2. Fitness 3. Programs 4. Data 5. Dynamics 6. Optimization Part II "Memories" 7. Content Addressible Memories 8. Supervised Learning 9. Unsupervised Learning Part III "Programs" 10. Markov Models 11. Reinforcement Learning Part IV "Systems" 12. Genetic Algorithms
Author: Tomaso A. Poggio
Publisher: MIT Press
Release Date: 2016-09-23
The ventral visual stream is believed to underlie object recognition in primates. Over the past fifty years, researchers have developed a series of quantitative models that are increasingly faithful to the biological architecture. Recently, deep learning convolution networks -- which do not reflect several important features of the ventral stream architecture and physiology -- have been trained with extremely large datasets, resulting in model neurons that mimic object recognition but do not explain the nature of the computations carried out in the ventral stream. This book develops a mathematical framework that describes learning of invariant representations of the ventral stream and is particularly relevant to deep convolutional learning networks. The authors propose a theory based on the hypothesis that the main computational goal of the ventral stream is to compute neural representations of images that are invariant to transformations commonly encountered in the visual environment and are learned from unsupervised experience. They describe a general theoretical framework of a computational theory of invariance (with details and proofs offered in appendixes) and then review the application of the theory to the feedforward path of the ventral stream in the primate visual cortex.
Author: Chris Eliasmith
Publisher: Oxford University Press
Release Date: 2013-04-16
How to Build a Brain provides a detailed exploration of a new cognitive architecture - the Semantic Pointer Architecture - that takes biological detail seriously, while addressing cognitive phenomena. Topics ranging from semantics and syntax, to neural coding and spike-timing-dependent plasticity are integrated to develop the world's largest functional brain model.
Author: Joaquin M. Fuster, Professor, Department of Psychiatry and Biobehavioral Sciences, School of Medicine
Publisher: Oxford University Press, USA
Release Date: 2002-09-19
This book presents a unique synthesis of the current neuroscience of cognition by one of the world's authorities in the field. The guiding principle of this synthesis is the tenet that the entirety of our knowledge is encoded by relations, and thus by connections, in neuronal networks of our cerebral cortex. Cognitive networks develop by experience on a base of widely dispersed modular cell assemblies representing elementary sensations and movements. As they develop, cognitive networks organize themselves hierarchically by order of complexity or abstraction of their content. Because networks intersect profusely, sharing common nodes, a neuronal assembly anywhere in the cortex can be part of many networks, and therefore of many items of knowledge. All cognitive functions consist of neural transactions within and between cognitive networks. After reviewing the neurobiology and architecture of cortical networks (also named cognits), the author undertakes a systematic study of cortical dynamics in each of the major cognitive functions--perception, memory, attention, language, and intelligence. In this study, he makes use of a large body of evidence from a variety of methodologies, in the brain of the human as well as the nonhuman primate. The outcome of his interdisciplinary endeavor is the emergence of a structural and dynamic order in the cerebral cortex that, though still sketchy and fragmentary, mirrors with remarkable fidelity the order in the human mind.
Author: Michael A. Arbib
Release Date: 2001-04-02
Computing the Brain provides readers with an integrated view of current informatics research related to the field of neuroscience. This book clearly defines the new work being done in neuroinformatics and offers information on resources available on the Web to researchers using this new technology. It contains chapters that should appeal to a multidisciplinary audience, with introductory chapters for the nonexpert reader. Neuroscientists will find this book an excellent introduction to informatics technologies and the use of these technologies in their research. Computer scientists will be interested in exploring how these technologies might benefit the neuroscience community.
- An integrated view of neuroinformatics for a multidisciplinary audience
- Explores and explains new work being done in neuroinformatics
- Cross-disciplinary, with chapters for computer scientists and neuroscientists
- An excellent tool for graduate students coming to neuroinformatics research from diverse disciplines and for neuroscientists seeking a comprehensive introduction to the subject
- Discusses, in depth, the structuring of masses of data by a variety of computational models
- Clearly defines computational neuroscience: the use of computational techniques and metaphors to investigate relations between neural structure and function
- Offers a guide to resources and algorithms that can be found on the Web
- Written by internationally renowned experts in the field
Author: Patricia S. Churchland
Publisher: MIT Press
Release Date: 2016-10-28
Before The Computational Brain was published in 1992, conceptual frameworks for brain function were based on the behavior of single neurons, applied globally. In The Computational Brain, Patricia Churchland and Terrence Sejnowski developed a different conceptual framework, based on large populations of neurons. They did this by showing that patterns of activity among the units in trained artificial neural network models had properties resembling those of neuronal populations recorded one neuron at a time. It is one of the first books to bring together computational concepts and behavioral data within a neurobiological framework. Aimed at a broad audience of neuroscientists, computer scientists, cognitive scientists, and philosophers, The Computational Brain is written for both expert and novice. This approach influenced a generation of researchers. Even today, when neuroscientists can routinely record from hundreds of neurons using optics rather than electricity, and the 2013 White House BRAIN Initiative has heralded a new era in innovative neurotechnologies, the main message of The Computational Brain is still relevant. This anniversary edition offers a new preface by the authors that puts the book in the context of current research.
Author: Dana Harry Ballard
Publisher: Prentice Hall
Release Date: 1982
Investigates fundamental concepts and recent developments in computer vision, and describes the representations and mechanisms that allow image information and prior knowledge to interact in image understanding.
Author: Olaf Sporns
Publisher: MIT Press
Release Date: 2010-10-01
Over the last decade, the study of complex networks has expanded across diverse scientific fields. Increasingly, science is concerned with the structure, behavior, and evolution of complex systems ranging from cells to ecosystems. In Networks of the Brain, Olaf Sporns describes how the integrative nature of brain function can be illuminated from a complex network perspective. Highlighting the many emerging points of contact between neuroscience and network science, the book serves to introduce network theory to neuroscientists and neuroscience to those working on theoretical network models. Sporns emphasizes how networks connect levels of organization in the brain and how they link structure to function, offering an informal and nonmathematical treatment of the subject. Networks of the Brain provides a synthesis of the sciences of complex networks and the brain that will be an essential foundation for future research.