🔖 Upcoming Special Issue “Information Theory in Neuroscience” | Entropy (MDPI)

Special Issue "Information Theory in Neuroscience" (Entropy | MDPI)
As the ultimate information-processing device, the brain naturally lends itself to study with information theory. Applying information theory to neuroscience has spurred the development of principled theories of brain function, led to advances in the study of consciousness, and produced analytical techniques to crack the neural code, that is, to unveil the language neurons use to encode and process information. In particular, experimental techniques that allow precise, large-scale recording and manipulation of neural activity now make it possible, for the first time, to formulate and quantitatively test hypotheses about how the brain encodes information for specific functions and transmits it across areas. This Special Issue emphasizes contributions on novel information-theoretic approaches in neuroscience and on new information-theoretic results inspired by problems in neuroscience. Research at the interface of neuroscience, information theory, and other disciplines is also welcome.

A special issue of Entropy (ISSN 1099-4300), belonging to the section "Information Theory". Deadline for manuscript submissions: 1 December 2017.

👓 Steven Pinker Explains the Neuroscience of Swearing | Open Culture

Steven Pinker Explains the Neuroscience of Swearing by Matthias Rascher (Open Culture)
Pinker talks about his then-new book, The Stuff of Thought: Language as a Window into Human Nature, and does what he does best: combining psychology and neuroscience with linguistics. The result is as entertaining as it is insightful.


🔖 An Introduction to Transfer Entropy: Information Flow in Complex Systems

An Introduction to Transfer Entropy: Information Flow in Complex Systems by Terry Bossomaier, Lionel Barnett, Michael Harré, Joseph T. Lizier (Springer, 1st edition, 2016)
This book considers a relatively new metric in complex systems, transfer entropy, derived from a series of measurements, usually a time series. After a qualitative introduction and a chapter that explains the key ideas from statistics required to understand the text, the authors then present information theory and transfer entropy in depth. A key feature of the approach is the authors' work to show the relationship between information flow and complexity. The later chapters demonstrate information transfer in canonical systems and present applications, for example in neuroscience and in finance. The book will be of value to advanced undergraduate and graduate students and researchers in the areas of computer science, neuroscience, physics, and engineering. ISBN: 978-3-319-43221-2 (Print), 978-3-319-43222-9 (Online)

Want to read; h/t to Joseph Lizier.
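As a rough sketch of the quantity at the book's core (not code from the book), transfer entropy from a source series Y to a target X asks how much the recent past of Y reduces uncertainty about the next value of X beyond what X's own past already tells us. A minimal plug-in estimate for binary series with history length 1, with toy data and variable names of my own choosing, might look like this:

```python
# Toy plug-in estimate of transfer entropy T(Y -> X) for binary time series,
# with history length k = l = 1. Data and names are illustrative only.
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """T(Y->X) = sum p(x_next, x_past, y_past) * log2[ p(x_next|x_past,y_past) / p(x_next|x_past) ]."""
    triples = list(zip(x[1:], x[:-1], y[:-1]))      # (x_{t+1}, x_t, y_t)
    n = len(triples)
    c_xxy = Counter(triples)                        # joint counts over (x_{t+1}, x_t, y_t)
    c_xx  = Counter((a, b) for a, b, _ in triples)  # counts over (x_{t+1}, x_t)
    c_xy  = Counter((b, c) for _, b, c in triples)  # counts over (x_t, y_t)
    c_x   = Counter(b for _, b, _ in triples)       # counts over x_t
    te = 0.0
    for (a, b, c), cnt in c_xxy.items():
        cond_full = cnt / c_xy[(b, c)]              # p(x_{t+1} | x_t, y_t)
        cond_self = c_xx[(a, b)] / c_x[b]           # p(x_{t+1} | x_t)
        te += (cnt / n) * np.log2(cond_full / cond_self)
    return te

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 1000)
x = np.roll(y, 1)                                   # x copies y with a one-step lag
x[0] = rng.integers(0, 2)
print(transfer_entropy(x, y))                       # close to 1 bit; transfer_entropy(y, x) is near 0
```

For real recordings, longer histories, continuous-valued estimators, and bias correction matter far more than this toy case suggests, which is part of what makes a book-length treatment worthwhile.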

🔖 Cognition and biology: perspectives from information theory

Cognition and biology: perspectives from information theory by Roderick Wallace (ncbi.nlm.nih.gov)
The intimate relation between biology and cognition can be formally examined through statistical models constrained by the asymptotic limit theorems of communication theory, augmented by methods from statistical mechanics and nonequilibrium thermodynamics. Cognition, often involving submodules that act as information sources, is ubiquitous across the living state. Less metabolic free energy is consumed by permitting crosstalk between biological information sources than by isolating them, leading to evolutionary exaptations that assemble shifting, tunable cognitive arrays at multiple scales and levels of organization to meet dynamic patterns of threat and opportunity. Cognition is thus necessary for life, but it is not sufficient: An organism represents a highly patterned outcome of path-dependent, blind variation, selection, interaction, and chance extinction in the context of an adequate flow of free energy and an environment fit for development. Complex, interacting cognitive processes within an organism both record and instantiate those evolutionary and developmental trajectories.
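One way to unpack the free-energy claim (my gloss, not the paper's own derivation) is via the subadditivity of entropy: a joint description of two correlated information sources needs fewer bits than two separate descriptions, and the saving is exactly their mutual information:

\[
H(X,Y) \;\le\; H(X) + H(Y), \qquad H(X) + H(Y) - H(X,Y) = I(X;Y) \;\ge\; 0 .
\]

If, as the abstract's framing suggests, the bits an organism must process ultimately cost metabolic free energy, then letting correlated sources engage in crosstalk (sharing a joint code) is cheaper than coding each in isolation.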

🔖 Human Evolution: Our Brains and Behavior by Robin Dunbar (Oxford University Press)

Human Evolution: Our Brains and Behavior by Robin Dunbar (Oxford University Press) marked as want to read.
Official release date: November 1, 2016
09/14/16: downloaded a review copy via NetGalley

Description
The story of human evolution has fascinated us like no other: we seem to have an insatiable curiosity about who we are and where we have come from. Yet studying the “stones and bones” skirts around what is perhaps the most real, and most relatable, story of human evolution – the social and cognitive changes that gave rise to modern humans.

In Human Evolution: Our Brains and Behavior, Robin Dunbar appeals to the human aspects of every reader, as subjects of mating, friendship, and community are discussed from an evolutionary psychology perspective. With a table of contents ranging from prehistoric times to modern days, Human Evolution focuses on an aspect of evolution that has typically been overshadowed by the archaeological record: the biological, neurological, and genetic changes that occurred with each “transition” in the evolutionary narrative. Dunbar’s interdisciplinary approach – inspired by his background as both an anthropologist and an accomplished psychologist – brings the reader into all aspects of the evolutionary process, which he describes as the “jigsaw puzzle” of evolution that he and the reader will help solve. In doing so, the book carefully maps out each stage of the evolutionary process, from anatomical changes such as bipedalism and the increase in brain size, to cognitive and behavioral changes, such as the ability to cook, laugh, and use language to form communities through religion and storytelling.

Most importantly and interestingly, Dunbar hypothesizes the order in which these evolutionary changes occurred, reaching conclusions through the “time budget model” that he himself developed. As definitive as the “stones and bones” are for the hard dates of archaeological evidence, this book explores far more complex psychological questions that require a degree of intellectual speculation: What does it really mean to be human (as opposed to being an ape), and how did we come to be that way?


Workshop on Methods of Information Theory in Computational Neuroscience | CNS 2016

Workshop on Methods of Information Theory in Computational Neuroscience (CNS 2016) by Joseph T. Lizier (lizier.me)
Methods originally developed in Information Theory have found wide applicability in computational neuroscience. Beyond these original methods there is a need to develop novel tools and approaches that are driven by problems arising in neuroscience. A number of researchers in computational/systems neuroscience and in information/communication theory are investigating problems of information representation and processing. While the goals are often the same, these researchers bring different perspectives and points of view to a common set of neuroscience problems. Often they participate in different fora and their interaction is limited. The goal of the workshop is to bring some of these researchers together to discuss challenges posed by neuroscience and to exchange ideas and present their latest work. The workshop is targeted towards computational and systems neuroscientists with interest in methods of information theory as well as information/communication theorists with interest in neuroscience.

Bits from Brains for Biologically Inspired Computing | Computational Intelligence

Bits from Brains for Biologically Inspired Computing by Michael Wibral, Joseph T. Lizier, and Viola Priesemann (Frontiers in Robotics and AI, Computational Intelligence section | journal.frontiersin.org)
Inspiration for artificial biologically inspired computing is often drawn from neural systems. This article shows how to analyze neural systems using information theory with the aim of obtaining constraints that help to identify the algorithms run by neural systems and the information they represent. Algorithms and representations identified this way may then guide the design of biologically inspired computing systems. The material covered includes the necessary introduction to information theory and to the estimation of information-theoretic quantities from neural recordings. We then show how to analyze the information encoded in a system about its environment, and also discuss recent methodological developments on the question of how much information each agent carries about the environment either uniquely or redundantly or synergistically together with others. Last, we introduce the framework of local information dynamics, where information processing is partitioned into component processes of information storage, transfer, and modification – locally in space and time. We close by discussing example applications of these measures to neural data and other complex systems.
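As a toy illustration of the first kind of analysis described above, quantifying the information a neural response carries about its environment (this is not code from the article), a plug-in estimate of the mutual information between a discrete stimulus and a spike count might look like the sketch below; the stimulus set, firing rates, and Poisson responses are assumptions made for the example.

```python
# Toy plug-in estimate of mutual information I(stimulus; spike count), in bits.
# The stimulus set, firing rates, and simulated Poisson responses are assumptions.
import numpy as np

rng = np.random.default_rng(1)
stimuli = rng.integers(0, 4, size=5000)        # 4 equiprobable stimulus classes
rates = np.array([2.0, 5.0, 8.0, 12.0])        # assumed mean spike count per stimulus
counts = rng.poisson(rates[stimuli])           # spike count in one response window

def mutual_information(s, r):
    """I(S;R) = sum_{s,r} p(s,r) * log2[ p(s,r) / (p(s) p(r)) ], plug-in estimate."""
    s_vals, s_idx = np.unique(s, return_inverse=True)
    r_vals, r_idx = np.unique(r, return_inverse=True)
    joint = np.zeros((len(s_vals), len(r_vals)))
    np.add.at(joint, (s_idx, r_idx), 1.0)      # joint histogram of (stimulus, count)
    joint /= joint.sum()
    ps = joint.sum(axis=1, keepdims=True)      # marginal p(s)
    pr = joint.sum(axis=0, keepdims=True)      # marginal p(r)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz])).sum())

print(mutual_information(stimuli, counts))     # bits the spike count carries about the stimulus
```

Plug-in estimates like this are biased upward when data are limited, which is one reason the estimation of information-theoretic quantities from neural recordings, introduced in the article, is a topic in its own right.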