This book considers a relatively new metric in complex systems, transfer entropy, derived from a series of measurements, usually a time series. After a qualitative introduction and a chapter that explains the key ideas from statistics required to understand the text, the authors present information theory and transfer entropy in depth. A key feature of the approach is the authors' work showing the relationship between information flow and complexity. The later chapters demonstrate information transfer in canonical systems and in applications such as neuroscience and finance. The book will be of value to advanced undergraduate and graduate students and researchers in computer science, neuroscience, physics, and engineering. ISBN: 978-3-319-43221-2 (Print), 978-3-319-43222-9 (Online)
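The blurb's central quantity can be made concrete. As a minimal sketch (not code from the book), the following plug-in estimator computes transfer entropy with history length 1 for discrete-valued time series, T(Y→X) = Σ p(x_{t+1}, x_t, y_t) · log₂[ p(x_{t+1}|x_t, y_t) / p(x_{t+1}|x_t) ]:

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(source, target):
    """Plug-in transfer entropy source -> target in bits, history length 1:
    T = sum over (x_next, x, y) of p(x_next, x, y)
        * log2[ p(x_next | x, y) / p(x_next | x) ]."""
    triples = Counter()   # (x_next, x, y) counts
    pairs_xy = Counter()  # (x, y) counts
    pairs_nx = Counter()  # (x_next, x) counts
    singles = Counter()   # x counts
    n = len(target) - 1
    for t in range(n):
        x, x_next, y = target[t], target[t + 1], source[t]
        triples[(x_next, x, y)] += 1
        pairs_xy[(x, y)] += 1
        pairs_nx[(x_next, x)] += 1
        singles[x] += 1
    te = 0.0
    for (x_next, x, y), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x, y)]                 # p(x_next | x, y)
        p_cond_self = pairs_nx[(x_next, x)] / singles[x]   # p(x_next | x)
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te

random.seed(1)
y = [random.randint(0, 1) for _ in range(5000)]  # a random driving series
x = [0] + y[:-1]                                 # x copies y one step later
te_y_to_x = transfer_entropy(y, x)               # close to 1 bit
te_x_to_y = transfer_entropy(x, y)               # close to 0 bits
```

Because x deterministically copies y with a one-step lag, the estimate in the driving direction approaches one bit per step, while the reverse direction stays near zero (up to the plug-in estimator's small positive bias).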
The intimate relation between biology and cognition can be formally examined through statistical models constrained by the asymptotic limit theorems of communication theory, augmented by methods from statistical mechanics and nonequilibrium thermodynamics. Cognition, often involving submodules that act as information sources, is ubiquitous across the living state. Less metabolic free energy is consumed by permitting crosstalk between biological information sources than by isolating them, leading to evolutionary exaptations that assemble shifting, tunable cognitive arrays at multiple scales and levels of organization to meet dynamic patterns of threat and opportunity. Cognition is thus necessary for life, but it is not sufficient: An organism represents a highly patterned outcome of path-dependent blind variation, selection, interaction, and chance extinction in the context of an adequate flow of free energy and an environment fit for development. Complex, interacting cognitive processes within an organism both record and instantiate those evolutionary and developmental trajectories.
🔖 Human Evolution: Our Brains and Behavior by Robin Dunbar (Oxford University Press) marked as want to read.
Official release date: November 1, 2016
09/14/16: downloaded a review copy via NetGalley
The story of human evolution has fascinated us like no other: we seem to have an insatiable curiosity about who we are and where we have come from. Yet studying the “stones and bones” skirts around what is perhaps the most real, and most relatable, story of human evolution – the social and cognitive changes that gave rise to modern humans.
In Human Evolution: Our Brains and Behavior, Robin Dunbar appeals to the human side of every reader, discussing mating, friendship, and community from an evolutionary psychology perspective. With a table of contents ranging from prehistoric times to the modern day, Human Evolution focuses on an aspect of evolution that has typically been overshadowed by the archaeological record: the biological, neurological, and genetic changes that occurred with each “transition” in the evolutionary narrative. Dunbar’s interdisciplinary approach – inspired by his background as both an anthropologist and an accomplished psychologist – brings the reader into all aspects of the evolutionary process, which he describes as the “jigsaw puzzle” of evolution that he and the reader will help solve. In doing so, the book carefully maps out each stage of the evolutionary process, from anatomical changes such as bipedalism and the increase in brain size, to cognitive and behavioral changes such as the ability to cook, laugh, and use language to form communities through religion and story-telling. Most importantly and interestingly, Dunbar hypothesizes the order in which these evolutionary changes occurred – conclusions reached with the “time budget model” that Dunbar himself developed. As definitive as the “stones and bones” are for the hard dates of archaeological evidence, this book explores far more complex psychological questions that require a degree of intellectual speculation: What does it really mean to be human (as opposed to being an ape), and how did we come to be that way?
Methods originally developed in Information Theory have found wide applicability in computational neuroscience. Beyond these original methods there is a need to develop novel tools and approaches that are driven by problems arising in neuroscience. A number of researchers in computational/systems neuroscience and in information/communication theory are investigating problems of information representation and processing. While the goals are often the same, these researchers bring different perspectives and points of view to a common set of neuroscience problems. Often they participate in different fora and their interaction is limited. The goal of the workshop is to bring some of these researchers together to discuss challenges posed by neuroscience and to exchange ideas and present their latest work. The workshop is targeted towards computational and systems neuroscientists with interest in methods of information theory as well as information/communication theorists with interest in neuroscience.
Inspiration for biologically inspired computing is often drawn from neural systems. This article shows how to analyze neural systems using information theory, with the aim of obtaining constraints that help to identify the algorithms run by neural systems and the information they represent. Algorithms and representations identified in this way may then guide the design of biologically inspired computing systems. The material covered includes the necessary introduction to information theory and to the estimation of information-theoretic quantities from neural recordings. We then show how to analyze the information a system encodes about its environment, and discuss recent methodological developments on the question of how much information each agent carries about the environment uniquely, redundantly, or synergistically together with others. Finally, we introduce the framework of local information dynamics, in which information processing is partitioned into component processes of information storage, transfer, and modification – locally in space and time. We close by discussing example applications of these measures to neural data and other complex systems.
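The local information dynamics framework mentioned in the abstract's closing sentences assigns information values to individual time points. As an illustrative sketch under simple assumptions (discrete symbols, history length k, plug-in probabilities; not the article's own code), local active information storage, a(t) = log₂[ p(x_t | x_{t−k..t−1}) / p(x_t) ], can be estimated as:

```python
from collections import Counter
from math import log2

def local_active_info_storage(series, k=1):
    """Local (pointwise) active information storage in bits at each step,
    a(t) = log2[ p(x_t | past_k) / p(x_t) ], via plug-in count estimates."""
    past = Counter()    # counts of length-k histories
    joint = Counter()   # counts of (history, next symbol)
    marg = Counter()    # counts of next symbols
    samples = []
    for t in range(k, len(series)):
        h, x = tuple(series[t - k:t]), series[t]
        samples.append((h, x))
        past[h] += 1
        joint[(h, x)] += 1
        marg[x] += 1
    m = len(samples)
    return [log2((joint[(h, x)] / past[h]) / (marg[x] / m))
            for h, x in samples]

series = [0, 1] * 50 + [0]                    # strictly alternating series
vals = local_active_info_storage(series, k=1)
# each symbol is fully predicted by its predecessor: every local value is 1 bit
```

Averaging the local values recovers the usual (average) active information storage; in less regular series, negative local values flag moments where the past is misinformative about the next state.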