👓 Does Donald Trump write his own tweets? Sometimes | The Boston Globe

Read Does Donald Trump write his own tweets? Sometimes (The Boston Globe)
It’s not always Trump tapping out a tweet, even when it sounds like his voice.

I wonder how complicated/in-depth the applied information theory behind the Twitter bot described here is?

Following Michael Levin

Followed Michael Levin (ase.tufts.edu)

Investigating information storage and processing in biological systems

We work on novel ways to understand and control complex pattern formation. We use techniques of molecular genetics, biophysics, and computational modeling to address large-scale control of growth and form. We work in whole frogs and flatworms, and sometimes zebrafish and human tissues in culture. Our projects span regeneration, embryogenesis, cancer, and learning plasticity – all examples of how cellular networks process information. In all of these efforts, our goal is not only to understand the molecular mechanisms necessary for morphogenesis, but also to uncover and exploit the cooperative signaling dynamics that enable complex bodies to build and remodel themselves toward a correct structure. Our major goal is to understand how individual cell behaviors are orchestrated towards appropriate large-scale outcomes despite unpredictable environmental perturbations.

👓 How Many Genes Do Cells Need? Maybe Almost All of Them | Quanta Magazine

Read How Many Genes Do Cells Need? Maybe Almost All of Them (Quanta Magazine)
An ambitious study in yeast shows that the health of cells depends on the highly intertwined effects of many genes, few of which can be deleted together without consequence.

There could be some interesting data to play with here if available.

I also can’t help but wonder about applying some of Stuart Kauffman’s ideas to something like this. In particular, this sounds very reminiscent of his analogy of what happens when one randomly strings thread among a pile of buttons and the resulting complexity.
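For reference, Kauffman’s buttons-and-threads picture is easy to simulate. This is my own toy sketch (not anything from the article): randomly thread pairs of buttons together, and the size of the largest connected cluster jumps abruptly once the thread-to-button ratio passes roughly one half — the same threshold as the classic random-graph phase transition.

```python
import random

def largest_cluster(n_buttons, n_threads, seed=0):
    """Thread random pairs of buttons together and return the size of
    the largest connected cluster, using a simple union-find forest."""
    rng = random.Random(seed)
    parent = list(range(n_buttons))

    def find(x):
        # Walk to the root, halving the path as we go.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for _ in range(n_threads):
        a = rng.randrange(n_buttons)
        b = rng.randrange(n_buttons)
        parent[find(a)] = find(b)

    sizes = {}
    for i in range(n_buttons):
        root = find(i)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values())

# Near threads/buttons of about 0.5, a giant cluster abruptly appears:
for threads in (100, 300, 500, 700, 1000):
    print(threads, largest_cluster(1000, threads))
```

With 100 threads among 1000 buttons the clusters stay tiny; by 1000 threads most of the buttons hang together in one lump when you lift a single thread.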

👓 Voting me, voting you: Eurovision | The Economist (Espresso)

Read Voting me, voting you: Eurovision (Economist Espresso)
​The competition, whose finals play out tonight, is as famed for its politics as its cheesy

I often read the Economist’s Espresso daily round-up, but don’t explicitly post that I do. I’m making an exception in this case because I find the voting partnerships mentioned here quite interesting. It might be worth delving into some of the underlying voting statistics for potential application to other real-life examples. I’m also enamored of the nice visualization they provide. I wonder what the overlap of this data with other related world politics looks like?

🔖 The Theory of Quantum Information by John Watrous

Bookmarked The Theory of Quantum Information by John Watrous (cs.uwaterloo.ca)

To be published by Cambridge University Press in April 2018.

Upon publication this book will be available for purchase through Cambridge University Press and other standard distribution channels. Please see the publisher's web page to pre-order the book or to obtain further details on its publication date.

A draft, pre-publication copy of the book can be found below. This draft copy is made available for personal use only and must not be sold or redistributed.

This largely self-contained book on the theory of quantum information focuses on precise mathematical formulations and proofs of fundamental facts that form the foundation of the subject. It is intended for graduate students and researchers in mathematics, computer science, and theoretical physics seeking to develop a thorough understanding of key results, proof techniques, and methodologies that are relevant to a wide range of research topics within the theory of quantum information and computation. The book is accessible to readers with an understanding of basic mathematics, including linear algebra, mathematical analysis, and probability theory. An introductory chapter summarizes these necessary mathematical prerequisites, and starting from this foundation, the book includes clear and complete proofs of all results it presents. Each subsequent chapter includes challenging exercises intended to help readers to develop their own skills for discovering proofs concerning the theory of quantum information.

h/t to @michael_nielsen via Nuzzel

👓 Large Cache of Texts May Offer Insight Into One of Africa’s Oldest Written Languages | Smithsonian Magazine

Read Large Cache of Texts May Offer Insight Into One of Africa's Oldest Written Languages (Smithsonian)
Archaeologists in Sudan have uncovered the largest assemblage of Meroitic inscriptions to date

This is a cool discovery, in great part because their documentation was interesting enough to be able to suggest further locations to check for more archaeological finds. This might also be something one could apply some linguistic analysis and information theory to in an attempt to better pull apart the language and grammar.

h/t to @ArtsJournalNews, bookmarked on April 17, 2018 at 08:16AM

🔖 Special Issue : Information Dynamics in Brain and Physiological Networks | Entropy

Bookmarked Special Issue "Information Dynamics in Brain and Physiological Networks" (mdpi.com)

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory".

Deadline for manuscript submissions: 30 December 2018

It is, nowadays, widely acknowledged that the brain and several other organ systems, including the cardiovascular, respiratory, and muscular systems, among others, exhibit complex dynamic behaviors that result from the combined effects of multiple regulatory mechanisms, coupling effects and feedback interactions, acting in both space and time.

The field of information theory is becoming more and more relevant for the theoretical description and quantitative assessment of the dynamics of the brain and physiological networks, defining concepts, such as those of information generation, storage, transfer, and modification. These concepts are quantified by several information measures (e.g., approximate entropy, conditional entropy, multiscale entropy, transfer entropy, redundancy and synergy, and many others), which are being increasingly used to investigate how physiological dynamics arise from the activity and connectivity of different structural units, and evolve across a variety of physiological states and pathological conditions.

This Special Issue focuses on blending theoretical developments in the new emerging field of information dynamics with innovative applications targeted to the analysis of complex brain and physiological networks in health and disease. To favor this multidisciplinary view, contributions are welcome from different fields, ranging from mathematics and physics to biomedical engineering, neuroscience, and physiology.

Prof. Dr. Luca Faes
Prof. Dr. Alberto Porta
Prof. Dr. Sebastiano Stramaglia
Guest Editors

👓 Living Bits: Information and the Origin of Life | PBS

Read Living Bits: Information and the Origin of Life by Christoph Adami (pbs.org)
What is life? When Erwin Schrödinger posed this question in 1944, in a book of the same name, he was 57 years old. He had won the Nobel in Physics eleven years earlier, and was arguably past his glory days. Indeed, at that time he was working mostly on his ill-fated “Unitary Field Theory.” By all accounts, the publication of “What is Life?”—venturing far outside of a theoretical physicist’s field of expertise—raised many eyebrows. How presumptuous for a physicist to take on one of the deepest questions in biology! But Schrödinger argued that science should not be compartmentalized: “Some of us should venture to embark on a synthesis of facts and theories, albeit with second-hand and incomplete knowledge of some of them—and at the risk of making fools of ourselves.” Schrödinger’s “What is Life” has been extraordinarily influential, in one part because he was one of the first who dared to ask the question seriously, and in another because it was the book that was read by a good number of physicists—famously both Francis Crick and James Watson independently, but also many a member of the “Phage group,” a group of scientists that started the field of bacterial genetics—and steered them to new careers in biology. The book is perhaps less famous for the answers Schrödinger suggested, as almost all of them have turned out to be wrong.

Highlights, Quotes, & Marginalia

our existence can succinctly be described as “information that can replicate itself,” the immediate follow-up question is, “Where did this information come from?”

from an information perspective, only the first step in life is difficult. The rest is just a matter of time.

Through decades of work by legions of scientists, we now know that the process of Darwinian evolution tends to lead to an increase in the information coded in genes. That this must happen on average is not difficult to see. Imagine I start out with a genome encoding n bits of information. In an evolutionary process, mutations occur on the many representatives of this information in a population. The mutations can change the amount of information, or they can leave the information unchanged. If the information changes, it can increase or decrease. But very different fates befall those two different changes. The mutation that caused a decrease in information will generally lead to lower fitness, as the information stored in our genes is used to build the organism and survive. If you know less than your competitors about how to do this, you are unlikely to thrive as well as they do. If, on the other hand, you mutate towards more information—meaning better prediction—you are likely to use that information to have an edge in survival.

There are some plants with huge amounts of DNA compared to their “peers” – perhaps these would make interesting test cases for experimenting with this idea?

🔖 Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory

Bookmarked Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory by Jun Kitazono, Ryota Kanai, Masafumi Oizumi (MDPI)
The ability to integrate information in the brain is considered to be an essential property for cognition and consciousness. Integrated Information Theory (IIT) hypothesizes that the amount of integrated information ( Φ ) in the brain is related to the level of consciousness. IIT proposes that, to quantify information integration in a system as a whole, integrated information should be measured across the partition of the system at which information loss caused by partitioning is minimized, called the Minimum Information Partition (MIP). The computational cost for exhaustively searching for the MIP grows exponentially with system size, making it difficult to apply IIT to real neural data. It has been previously shown that, if a measure of Φ satisfies a mathematical property, submodularity, the MIP can be found in a polynomial order by an optimization algorithm. However, although the first version of Φ is submodular, the later versions are not. In this study, we empirically explore to what extent the algorithm can be applied to the non-submodular measures of Φ by evaluating the accuracy of the algorithm in simulated data and real neural data. We find that the algorithm identifies the MIP in a nearly perfect manner even for the non-submodular measures. Our results show that the algorithm allows us to measure Φ in large systems within a practical amount of time.
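For a sense of why the exhaustive search is prohibitive: finding the MIP means checking every bipartition of the system, and the number of bipartitions grows as 2^(n−1) − 1 with system size n. A generic brute-force sketch of my own (the loss function here is just a placeholder argument, not any of the paper’s actual Φ measures):

```python
from itertools import combinations

def all_bipartitions(elements):
    """Yield every split of elements into two non-empty parts.
    There are 2**(n-1) - 1 of them, hence the exponential cost."""
    elements = list(elements)
    n = len(elements)
    for r in range(1, n // 2 + 1):
        for part in combinations(elements, r):
            rest = tuple(e for e in elements if e not in part)
            if r == n - r and part > rest:
                continue  # skip the mirror image of an even split
            yield part, rest

def minimum_information_partition(elements, loss):
    """Exhaustively pick the bipartition minimizing the supplied loss."""
    return min(all_bipartitions(elements), key=lambda p: loss(*p))
```

Plugging in a real Φ measure as the loss gives the exponential-time baseline; the paper’s point is that a submodularity-based optimizer finds (nearly) the same answer in polynomial time.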

h/t Christoph Adami, Erik Hoel, and @kanair

👓 Stephen Hawking, Who Examined the Universe and Explained Black Holes, Dies at 76 | The New York Times

Read Stephen Hawking, Who Examined the Universe and Explained Black Holes, Dies at 76 by Dennis Overbye (nytimes.com)
A physicist and best-selling author, Dr. Hawking did not allow his physical limitations to hinder his quest to answer “the big question: Where did the universe come from?”

Some sad news after getting back from Algebraic Geometry class tonight. RIP Stephen Hawking.

The Physics of Life: Summer School | Center for the Physics of Biological Function

Bookmarked The Physics of Life: Summer School | Center for the Physics of Biological Function (biophysics.princeton.edu)
A summer school for advanced undergraduates
June 11-22, 2018 @ Princeton University

What would it mean to have a physicist’s understanding of life? How do DYNAMICS and the EMERGENCE of ORDER affect biological function? How do organisms process INFORMATION, LEARN, ADAPT, and EVOLVE?

See how physics problems emerge from thinking about developing embryos, communicating bacteria, dynamic neural networks, animal behaviors, evolution, and more. Learn how ideas and methods from statistical physics, simulation and data analysis, optics and microscopy connect to diverse biological phenomena. Explore these questions, tools, and concepts in an intense two weeks of lectures, seminars, hands-on exercises, and projects.

👓 On digital archaeology | Andrew Eckford

Read On digital archaeology by Andrew Eckford (A Random Process)
The year is 4018. German is widely studied by scholars of classical antiquity, but all knowledge of the mysterious English language has died out. Scene: A classics department faculty lounge; a few professors are relaxing.

I worry about things like this all the time. Apparently it’s a terrible affliction that strikes those with a background in information theory at higher rates than the general public.

🔖 9th International Conference on Complex Systems | NECSI

Bookmarked 9th International Conference on Complex Systems | NECSI (necsi.edu)
The International Conference on Complex Systems is a unique interdisciplinary forum that unifies and bridges the traditional domains of science and a multitude of real world systems. Participants will contribute and be exposed to mind expanding concepts and methods from across the diverse field of complex systems science. The conference will be held July 22-27, 2018, in Cambridge, MA, USA. Special Topic - Artificial Intelligence: This year’s conference will include a day on AI, including its development and potential future. This session will be chaired by Iyad Rahwan of MIT's Media Lab.

A great-looking conference coming up with a strong lineup of people whose work I appreciate. It could certainly use some more balance, however, as it’s almost all white men.

In particular I’d want to see:
Albert-László Barabási (Northeastern University, USA)
Nassim Nicholas Taleb (Real World Risk Institute, USA)
Stuart Kauffman (Institute for Systems Biology, USA)
Simon DeDeo (Carnegie Mellon University, USA)
Stephen Wolfram (Wolfram Research)
César Hidalgo (MIT Media Lab, USA)

Others include:
Marta González (University of California Berkeley, USA)
Peter Turchin (University of Connecticut, USA)
Mercedes Pascual (University of Chicago, USA) Pending confirmation
Iyad Rahwan (MIT Media Lab, USA)
Sandy Pentland (MIT Media Lab, USA)
Theresa Whelan (U.S. Department of Defense) Pending DOD approval
H. Eugene Stanley (Boston University, USA)
Ricardo Hausmann (Harvard University, USA)
Stephen Grossberg (Boston University, USA)
Daniela Rus (MIT Computer Science & Artificial Intelligence Lab, USA) Pending confirmation
Olaf Sporns (Indiana University Network Science Institute, USA)
Michelle Girvan (University of Maryland, USA) Pending confirmation
Cameron Kerry (MIT Media Lab, USA)
Irving Epstein (Brandeis University, USA)

🔖 Decoding Anagrammed Texts Written in an Unknown Language and Script

Bookmarked Decoding Anagrammed Texts Written in an Unknown Language and Script by Bradley Hauer, Grzegorz Kondrak (Transactions of the Association for Computational Linguistics)
Algorithmic decipherment is a prime example of a truly unsupervised problem. The first step in the decipherment process is the identification of the encrypted language. We propose three methods for determining the source language of a document enciphered with a monoalphabetic substitution cipher. The best method achieves 97% accuracy on 380 languages. We then present an approach to decoding anagrammed substitution ciphers, in which the letters within words have been arbitrarily transposed. It obtains the average decryption word accuracy of 93% on a set of 50 ciphertexts in 5 languages. Finally, we report the results on the Voynich manuscript, an unsolved fifteenth century cipher, which suggest Hebrew as the language of the document.
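I can’t resist sketching the simplest version of that first step. This is my own toy illustration, not the authors’ method: a monoalphabetic substitution only relabels letters, so the sorted letter-frequency profile of a ciphertext is identical to that of its plaintext, and comparing profiles against language samples can already suggest a source language.

```python
from collections import Counter

def freq_profile(text):
    """Relative letter frequencies, sorted in decreasing order. A
    monoalphabetic substitution only relabels letters, so this profile
    is invariant under the cipher."""
    counts = Counter(c for c in text.lower() if c.isalpha())
    total = sum(counts.values())
    return sorted((n / total for n in counts.values()), reverse=True)

def profile_distance(p, q):
    """L1 distance between two profiles, padded to equal length."""
    n = max(len(p), len(q))
    p = p + [0.0] * (n - len(p))
    q = q + [0.0] * (n - len(q))
    return sum(abs(a - b) for a, b in zip(p, q))

def guess_language(ciphertext, samples):
    """Return the sample language whose profile best matches the ciphertext."""
    target = freq_profile(ciphertext)
    return min(samples, key=lambda lang: profile_distance(target, freq_profile(samples[lang])))
```

A Caesar-shifted English sentence scores distance exactly zero against its own plaintext’s profile, so this picks the right sample trivially; the hard part the paper tackles is doing this reliably across hundreds of languages, and then on top of anagramming.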

Aside: It’s been ages since I’ve seen someone with Refbacks listed on their site!