👓 How Many Genes Do Cells Need? Maybe Almost All of Them | Quanta Magazine

Read How Many Genes Do Cells Need? Maybe Almost All of Them (Quanta Magazine)
An ambitious study in yeast shows that the health of cells depends on the highly intertwined effects of many genes, few of which can be deleted together without consequence.
There could be some interesting data to play with here, if it's available.

I also can’t help but wonder about applying some of Stuart Kauffman’s ideas to something like this. In particular, it sounds very reminiscent of his analogy of randomly stringing threads among a pile of buttons and the complexity that emerges once the pile becomes interconnected.
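Kauffman’s buttons-and-threads picture is essentially a random graph, and the classic result is that a giant connected cluster appears rather suddenly once the ratio of threads to buttons crosses roughly one half. Below is a minimal sketch of that transition in plain Python (the button and thread counts are just illustrative choices, not anything from the Quanta piece):

```python
import random

def largest_cluster(n_buttons, n_threads, seed=0):
    """Tie threads between random pairs of buttons and return the size
    of the largest connected cluster, tracked with union-find."""
    rng = random.Random(seed)
    parent = list(range(n_buttons))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for _ in range(n_threads):
        a, b = rng.randrange(n_buttons), rng.randrange(n_buttons)
        parent[find(a)] = find(b)

    sizes = {}
    for i in range(n_buttons):
        r = find(i)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values())

# Sweep the thread-to-button ratio; the largest cluster jumps from a
# small fraction of the pile to most of it near a ratio of about 0.5.
n = 10_000
for ratio in [0.1, 0.3, 0.5, 0.7, 1.0, 1.5]:
    size = largest_cluster(n, int(ratio * n))
    print(f"threads/buttons = {ratio:.1f}: largest cluster = {size / n:.2%}")
```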

👓 Voting me, voting you: Eurovision | The Economist (Espresso)

Read Voting me, voting you: Eurovision (Economist Espresso)
The competition, whose finals play out tonight, is as famed for its politics as its cheesy…
I often read the Economist’s Espresso daily round-up, but don’t explicitly post that I do. I’m making an exception in this case because I find the voting partnerships mentioned here quite interesting. It might be worth delving into some of the underlying voting statistics for potential application to other real-life examples. I’m also enamored of the nice visualization they provide, and I wonder what the overlap between this data and other related world politics looks like.
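As a rough sketch of what delving into those voting statistics might look like: given a table of (year, giver, receiver, points) rows, one could flag pairs of countries that consistently award each other more points than the receiver’s overall average. The column layout and the toy rows below are hypothetical stand-ins, not the Economist’s data:

```python
from collections import defaultdict

# Hypothetical rows: (year, giver, receiver, points awarded in the final).
votes = [
    (2016, "Cyprus", "Greece", 12), (2016, "Greece", "Cyprus", 12),
    (2016, "Sweden", "Greece", 4),  (2016, "Norway", "Cyprus", 2),
    (2017, "Cyprus", "Greece", 12), (2017, "Greece", "Cyprus", 10),
    (2017, "Norway", "Greece", 5),  (2017, "Sweden", "Cyprus", 3),
]

given = defaultdict(list)      # (giver, receiver) -> points given over the years
received = defaultdict(list)   # receiver -> all points received from anyone

for _, giver, receiver, pts in votes:
    given[(giver, receiver)].append(pts)
    received[receiver].append(pts)

# "Partnership" score: how far above the receiver's overall average a giver sits.
bias = {
    pair: sum(p) / len(p) - sum(received[pair[1]]) / len(received[pair[1]])
    for pair, p in given.items()
}

for (giver, receiver), b in sorted(bias.items(), key=lambda kv: -kv[1]):
    print(f"{giver} -> {receiver}: {b:+.1f} points above {receiver}'s average")
```

With the real score histories one would also want to control for year-to-year song quality, but even this crude average surfaces the familiar reciprocal pairs.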

🔖 The Theory of Quantum Information by John Watrous

Bookmarked The Theory of Quantum Information by John Watrous (cs.uwaterloo.ca)

To be published by Cambridge University Press in April 2018.

Upon publication this book will be available for purchase through Cambridge University Press and other standard distribution channels. Please see the publisher's web page to pre-order the book or to obtain further details on its publication date.

A draft, pre-publication copy of the book can be found below. This draft copy is made available for personal use only and must not be sold or redistributed.

This largely self-contained book on the theory of quantum information focuses on precise mathematical formulations and proofs of fundamental facts that form the foundation of the subject. It is intended for graduate students and researchers in mathematics, computer science, and theoretical physics seeking to develop a thorough understanding of key results, proof techniques, and methodologies that are relevant to a wide range of research topics within the theory of quantum information and computation. The book is accessible to readers with an understanding of basic mathematics, including linear algebra, mathematical analysis, and probability theory. An introductory chapter summarizes these necessary mathematical prerequisites, and starting from this foundation, the book includes clear and complete proofs of all results it presents. Each subsequent chapter includes challenging exercises intended to help readers to develop their own skills for discovering proofs concerning the theory of quantum information.

h/t to @michael_nielsen via Nuzzel

👓 Large Cache of Texts May Offer Insight Into One of Africa’s Oldest Written Languages | Smithsonian Magazine

Read Large Cache of Texts May Offer Insight Into One of Africa's Oldest Written Languages (Smithsonian)
Archaeologists in Sudan have uncovered the largest assemblage of Meroitic inscriptions to date
This is a cool discovery, in large part because the texts themselves were detailed enough to suggest further locations to check for more archaeological finds. This might also be something to which one could apply some linguistic analysis and information theory in an attempt to better pull apart the language and grammar.
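A first information-theoretic pass on a corpus like this is usually just symbol-frequency entropy and conditional (bigram) entropy, which can hint at how alphabetic, syllabic, or logographic a script behaves. A minimal sketch, with a placeholder string standing in for a transliterated corpus of the inscriptions:

```python
import math
from collections import Counter

def unigram_entropy(text):
    """Shannon entropy (bits per symbol) of the symbol distribution."""
    counts = Counter(text)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def conditional_entropy(text):
    """H(next symbol | current symbol) estimated from bigram counts."""
    bigrams = Counter(zip(text, text[1:]))
    firsts = Counter(text[:-1])
    total = sum(bigrams.values())
    h = 0.0
    for (a, b), c in bigrams.items():
        p_ab = c / total
        p_b_given_a = c / firsts[a]
        h -= p_ab * math.log2(p_b_given_a)
    return h

# Placeholder text; a real analysis would use the transliterated inscriptions.
corpus = "replace this with a transliterated corpus of the inscriptions"
print(f"H(symbol)         = {unigram_entropy(corpus):.2f} bits")
print(f"H(next | current) = {conditional_entropy(corpus):.2f} bits")
```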

h/t to @ArtsJournalNews, bookmarked on April 17, 2018 at 08:16AM

🔖 Special Issue : Information Dynamics in Brain and Physiological Networks | Entropy

Bookmarked Special Issue "Information Dynamics in Brain and Physiological Networks" (mdpi.com)

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory".

Deadline for manuscript submissions: 30 December 2018

It is, nowadays, widely acknowledged that the brain and several other organ systems, including the cardiovascular, respiratory, and muscular systems, among others, exhibit complex dynamic behaviors that result from the combined effects of multiple regulatory mechanisms, coupling effects and feedback interactions, acting in both space and time.

The field of information theory is becoming more and more relevant for the theoretical description and quantitative assessment of the dynamics of the brain and physiological networks, defining concepts, such as those of information generation, storage, transfer, and modification. These concepts are quantified by several information measures (e.g., approximate entropy, conditional entropy, multiscale entropy, transfer entropy, redundancy and synergy, and many others), which are being increasingly used to investigate how physiological dynamics arise from the activity and connectivity of different structural units, and evolve across a variety of physiological states and pathological conditions.

This Special Issue focuses on blending theoretical developments in the new emerging field of information dynamics with innovative applications targeted to the analysis of complex brain and physiological networks in health and disease. To favor this multidisciplinary view, contributions are welcome from different fields, ranging from mathematics and physics to biomedical engineering, neuroscience, and physiology.

Prof. Dr. Luca Faes
Prof. Dr. Alberto Porta
Prof. Dr. Sebastiano Stramaglia
Guest Editors
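Of the measures listed in the call above, transfer entropy is perhaps the easiest to sketch from scratch. Below is a minimal plug-in (histogram) estimator for already-discretized series, computing TE(X→Y) = I(Y_{t+1}; X_t | Y_t) at lag one; the coupled toy series are just for illustration, and real physiological data would need careful embedding and bias correction:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE(X -> Y) = I(Y_{t+1}; X_t | Y_t) in bits,
    for integer-valued (already discretized) series of equal length."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))            # (y_next, y_now, x_now)
    n = len(triples)
    p_xyz = Counter(triples)
    p_yz = Counter((yn, yp) for yn, yp, _ in triples)      # (y_next, y_now)
    p_zx = Counter((yp, xp) for _, yp, xp in triples)      # (y_now, x_now)
    p_z = Counter(yp for _, yp, _ in triples)              # (y_now,)
    te = 0.0
    for (yn, yp, xp), c in p_xyz.items():
        p_joint = c / n
        num = c / p_zx[(yp, xp)]           # p(y_next | y_now, x_now)
        den = p_yz[(yn, yp)] / p_z[yp]     # p(y_next | y_now)
        te += p_joint * np.log2(num / den)
    return te

# Toy coupling: y mostly copies the previous value of x, with occasional noise.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=5000)
flip = rng.random(size=4999) < 0.25
y = np.empty_like(x)
y[0] = 0
y[1:] = np.where(flip, rng.integers(0, 2, size=4999), x[:-1])

print("TE(X -> Y):", transfer_entropy(x, y))   # clearly positive
print("TE(Y -> X):", transfer_entropy(y, x))   # near zero
```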

👓 Living Bits: Information and the Origin of Life | PBS

Highlights, Quotes, & Marginalia

our existence can succinctly be described as “information that can replicate itself,” the immediate follow-up question is, “Where did this information come from?”

from an information perspective, only the first step in life is difficult. The rest is just a matter of time.

Through decades of work by legions of scientists, we now know that the process of Darwinian evolution tends to lead to an increase in the information coded in genes. That this must happen on average is not difficult to see. Imagine I start out with a genome encoding n bits of information. In an evolutionary process, mutations occur on the many representatives of this information in a population. The mutations can change the amount of information, or they can leave the information unchanged. If the information changes, it can increase or decrease. But very different fates befall those two different changes. The mutation that caused a decrease in information will generally lead to lower fitness, as the information stored in our genes is used to build the organism and survive. If you know less than your competitors about how to do this, you are unlikely to thrive as well as they do. If, on the other hand, you mutate towards more information—meaning better prediction—you are likely to use that information to have an edge in survival.

There are some plants with huge amounts of DNA compared to their “peers”; perhaps these would make interesting test cases for experimenting with this idea?
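Adami’s argument in the highlighted passage is easy to poke at with a toy simulation: treat the “environment” as a fixed bit string, fitness as how many bits a genome predicts correctly, and let mutation plus selection run. This model is my own deliberately crude stand-in, not anything from the article, but the average “information about the environment” does drift upward on average:

```python
import random

rng = random.Random(42)
ENV = [rng.randint(0, 1) for _ in range(64)]   # the "environment" to be predicted
POP, GENS, MU = 200, 300, 0.01                 # population, generations, per-bit mutation rate

def fitness(genome):
    """Bits of the environment the genome 'knows' (predicts correctly)."""
    return sum(g == e for g, e in zip(genome, ENV))

pop = [[rng.randint(0, 1) for _ in range(len(ENV))] for _ in range(POP)]

for gen in range(GENS):
    # Fitness-proportional selection followed by per-bit mutation.
    weights = [1 + fitness(g) for g in pop]
    parents = rng.choices(pop, weights=weights, k=POP)
    pop = [[b ^ (rng.random() < MU) for b in p] for p in parents]
    if gen % 50 == 0:
        avg = sum(fitness(g) for g in pop) / POP
        print(f"generation {gen:3d}: mean correct bits = {avg:.1f} / {len(ENV)}")
```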

🔖 Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory

Bookmarked Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory by Jun Kitazono, Ryota Kanai, Masafumi Oizumi (MDPI)
The ability to integrate information in the brain is considered to be an essential property for cognition and consciousness. Integrated Information Theory (IIT) hypothesizes that the amount of integrated information ( Φ ) in the brain is related to the level of consciousness. IIT proposes that, to quantify information integration in a system as a whole, integrated information should be measured across the partition of the system at which information loss caused by partitioning is minimized, called the Minimum Information Partition (MIP). The computational cost for exhaustively searching for the MIP grows exponentially with system size, making it difficult to apply IIT to real neural data. It has been previously shown that, if a measure of Φ satisfies a mathematical property, submodularity, the MIP can be found in a polynomial order by an optimization algorithm. However, although the first version of Φ is submodular, the later versions are not. In this study, we empirically explore to what extent the algorithm can be applied to the non-submodular measures of Φ by evaluating the accuracy of the algorithm in simulated data and real neural data. We find that the algorithm identifies the MIP in a nearly perfect manner even for the non-submodular measures. Our results show that the algorithm allows us to measure Φ in large systems within a practical amount of time.
h/t Christoph Adami, Erik Hoel, and @kanair
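The exponential blow-up the abstract mentions is easy to see in a brute-force version: an n-node system has on the order of 2^(n-1) bipartitions to check. The sketch below searches them exhaustively for a tiny toy system, using the mutual information across the cut as a crude stand-in for Φ; the actual measures and the polynomial-time submodular search studied in the paper are considerably more involved:

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 4 binary "nodes"; nodes 0-1 are coupled, nodes 2-3 are coupled,
# and the two pairs are (nearly) independent of each other.
n, samples = 4, 20000
a = rng.integers(0, 2, size=samples)
b = rng.integers(0, 2, size=samples)
data = np.stack([a, (a ^ (rng.random(samples) < 0.1)).astype(int),
                 b, (b ^ (rng.random(samples) < 0.1)).astype(int)], axis=1)

def mutual_information(states_a, states_b):
    """Plug-in mutual information (bits) between two discrete multivariate columns."""
    pairs = list(zip(map(tuple, states_a), map(tuple, states_b)))
    p_ab, p_a, p_b = {}, {}, {}
    for ka, kb in pairs:
        p_ab[(ka, kb)] = p_ab.get((ka, kb), 0) + 1
        p_a[ka] = p_a.get(ka, 0) + 1
        p_b[kb] = p_b.get(kb, 0) + 1
    total = len(pairs)
    return sum(c / total * np.log2(c * total / (p_a[ka] * p_b[kb]))
               for (ka, kb), c in p_ab.items())

# Exhaustive search over bipartitions: exponential in n, hence the paper's
# interest in polynomial-time search via submodularity.
best = None
for r in range(1, n // 2 + 1):
    for part in itertools.combinations(range(n), r):
        rest = tuple(i for i in range(n) if i not in part)
        mi = mutual_information(data[:, list(part)], data[:, list(rest)])
        if best is None or mi < best[0]:
            best = (mi, part, rest)

print(f"MIP ~ {best[1]} | {best[2]} with cross-cut MI = {best[0]:.3f} bits")
```

The minimum lands on the cut between the two independent pairs, which is exactly the intuition behind the MIP: partition the system where the least information is lost.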

👓 Stephen Hawking, Who Examined the Universe and Explained Black Holes, Dies at 76 | The New York Times

Read Stephen Hawking, Who Examined the Universe and Explained Black Holes, Dies at 76 by Dennis Overbye (nytimes.com)
A physicist and best-selling author, Dr. Hawking did not allow his physical limitations to hinder his quest to answer “the big question: Where did the universe come from?”
Some sad news after getting back from Algebraic Geometry class tonight. RIP Stephen Hawking.

🔖 The Physics of Life: Summer School | Center for the Physics of Biological Function

Bookmarked The Physics of Life: Summer School | Center for the Physics of Biological Function (biophysics.princeton.edu)
A summer school for advanced undergraduates, June 11-22, 2018 @ Princeton University. What would it mean to have a physicist’s understanding of life? How do DYNAMICS and the EMERGENCE of ORDER affect biological function? How do organisms process INFORMATION, LEARN, ADAPT, and EVOLVE? See how physics problems emerge from thinking about developing embryos, communicating bacteria, dynamic neural networks, animal behaviors, evolution, and more. Learn how ideas and methods from statistical physics, simulation and data analysis, optics and microscopy connect to diverse biological phenomena. Explore these questions, tools, and concepts in an intense two weeks of lectures, seminars, hands-on exercises, and projects.

👓 On digital archaeology | Andrew Eckford

Read On digital archaeology by Andrew Eckford (A Random Process)
The year is 4018. German is widely studied by scholars of classical antiquity, but all knowledge of the mysterious English language has died out. Scene: A classics department faculty lounge; a few professors are relaxing.
I worry about things like this all the time. Apparently it’s a terrible affliction that strikes those with a background in information theory at higher rates than the general public.

🔖 9th International Conference on Complex Systems | NECSI

Bookmarked 9th International Conference on Complex Systems | NECSI (necsi.edu)
The International Conference on Complex Systems is a unique interdisciplinary forum that unifies and bridges the traditional domains of science and a multitude of real world systems. Participants will contribute and be exposed to mind expanding concepts and methods from across the diverse field of complex systems science. The conference will be held July 22-27, 2018, in Cambridge, MA, USA. Special Topic - Artificial Intelligence: This year’s conference will include a day on AI, including its development and potential future. This session will be chaired by Iyad Rahwan of MIT's Media Lab.
A great-looking conference coming up with a strong lineup of people whose work I appreciate. It could certainly use some more balance, however, as it’s almost all white men.

In particular I’d want to see:
Albert-László Barabási (Northeastern University, USA)
Nassim Nicholas Taleb (Real World Risk Institute, USA)
Stuart Kauffman (Institute for Systems Biology, USA)
Simon DeDeo (Carnegie Mellon University, USA)
Stephen Wolfram (Wolfram Research)
César Hidalgo (MIT Media Lab, USA)

Others include:
Marta González (University of California Berkeley, USA)
Peter Turchin (University of Connecticut, USA)
Mercedes Pascual (University of Chicago, USA) Pending confirmation
Iyad Rahwan (MIT Media Lab, USA)
Sandy Pentland (MIT Media Lab, USA)
Theresa Whelan (U.S. Department of Defense) Pending DOD approval
H. Eugene Stanley (Boston University, USA)
Ricardo Hausmann (Harvard University, USA)
Stephen Grossberg (Boston University, USA)
Daniela Rus (MIT Computer Science & Artificial Intelligence Lab, USA) Pending confirmation
Olaf Sporns (Indiana University Network Science Institute, USA)
Michelle Girvan (University of Maryland, USA) Pending confirmation
Cameron Kerry (MIT Media Lab, USA)
Irving Epstein (Brandeis University, USA)

🔖 Decoding Anagrammed Texts Written in an Unknown Language and Script

Bookmarked Decoding Anagrammed Texts Written in an Unknown Language and Script by Bradley Hauer, Grzegorz Kondrak (Transactions of the Association for Computational Linguistics)
Algorithmic decipherment is a prime example of a truly unsupervised problem. The first step in the decipherment process is the identification of the encrypted language. We propose three methods for determining the source language of a document enciphered with a monoalphabetic substitution cipher. The best method achieves 97% accuracy on 380 languages. We then present an approach to decoding anagrammed substitution ciphers, in which the letters within words have been arbitrarily transposed. It obtains the average decryption word accuracy of 93% on a set of 50 ciphertexts in 5 languages. Finally, we report the results on the Voynich manuscript, an unsolved fifteenth century cipher, which suggest Hebrew as the language of the document.
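A monoalphabetic substitution permutes the symbols but leaves the shape of the frequency distribution intact, so a crude first cut at the language-identification step is to compare the ciphertext’s sorted letter-frequency profile against each candidate language’s profile. This is far simpler than the authors’ actual approach, and the tiny sample texts below are mere placeholders for real corpora, but it shows the invariance being exploited:

```python
from collections import Counter

def sorted_profile(text):
    """Relative letter frequencies sorted in descending order; this profile
    is invariant under any monoalphabetic substitution of the alphabet."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = sorted(Counter(letters).values(), reverse=True)
    total = sum(counts)
    return [c / total for c in counts]

def profile_distance(p, q):
    """L1 distance between two sorted profiles, padded to equal length."""
    n = max(len(p), len(q))
    p = p + [0.0] * (n - len(p))
    q = q + [0.0] * (n - len(q))
    return sum(abs(a - b) for a, b in zip(p, q))

# Placeholder reference texts; a real run would use sizable corpora per language.
references = {
    "english": "the quick brown fox jumps over the lazy dog and then some more text",
    "german":  "der schnelle braune fuchs springt ueber den faulen hund und noch mehr",
    "spanish": "el rapido zorro marron salta sobre el perro perezoso y algo mas de texto",
}

# Atbash-enciphered English ("the quick brown fox jumps over the lazy dog").
ciphertext = "gsv jfrxp yildm ulc qfnkh levi gsv ozab wlt"

scores = {lang: profile_distance(sorted_profile(ciphertext), sorted_profile(txt))
          for lang, txt in references.items()}
for lang, d in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{lang:8s} distance = {d:.3f}")
```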
Aside: It’s been ages since I’ve seen someone with Refbacks listed on their site!

👓 Sexual harassment allegations roil Princeton University | WHYY

Read Sexual harassment allegations roil Princeton University by Avi Wolfman-Arent (WHYY)
Another high-profile instance of sexual harassment has rocked a major institution — this time Princeton University in New Jersey. And students say administrators didn’t act transparently or strongly enough when disciplining the alleged perpetrator, a decorated professor.
Once you reach Sergio Verdu’s age, and particularly with his achievements, your value to the university becomes geared more toward service. How much service can a professor do with an albatross like this hanging around their neck?

It would be nice if universities were required to register offenders like this so that applicants to programs would be aware of them prior to applying: a sort of Megan’s Law for the professoriate. Naturally they don’t do this because it goes against their interests, but by the same token this is how a lot of issues run out of control within their sports programs as well. If someone did create such a website, I imagine the chilling effect on colleges and universities would be such that they might change their tune about how these cases are handled. Recent cases like Michigan State’s athletics problem, USC’s medical school dean issues, and Christian Ott at Caltech immediately come to mind, but I’m sure there must be hundreds if not thousands of others.

Maybe we need a mashup site that’s a cross between RateMyProfessors.com and California’s Megan’s Law site, but which specifically targets universities?

Fortunately, even given Sergio’s accomplishments and profile, it will probably be a long time before web searches for his name stop surfacing the story within the top few links, but this is sad consolation, particularly in a field like information theory, where women are already heavily underrepresented.

👓 Read Professor Verdu’s emails to student where he invites her over to watch explicit film before sexually harassing her | The Tab

Read Read Professor Verdu’s emails to student where he invites her over to watch explicit film before sexually harassing her (Princeton University)
‘P.S. Please call me Sergio ☺️’
I was just wondering why Sergio Verdu was so quiet on Twitter. Then I wondered why his Twitter account had disappeared.

Now I know the sad and painfully disappointing answer.