Einstein’s Equations From Entanglement

In a lecture at Caltech, Brian Swingle reviews the idea that entanglement is the glue which holds spacetime together and shows how Einstein's equations plausibly emerge from this perspective. One ubiquitous feature of these dynamical equations is the formation of black holes, so he concludes by discussing some new ideas about the nature of spacetime inside a black hole.

Brian Swingle Colloquium at Caltech

From the Physics Research Conference 2015-2016
on Thursday, November 19, 2015 at 4:00 pm
at the California Institute of Technology, East Bridge 201 – Norman Bridge Laboratory of Physics, East

All talks are intended for a broad audience, and everyone is encouraged to attend. A list of future conferences can be found here.
Sponsored by Division of Physics, Mathematics and Astronomy

In recent years we have learned that the physics of quantum information plays a crucial role in the emergence of spacetime from microscopic degrees of freedom.

I will review the idea that entanglement is the glue which holds spacetime together and show how Einstein’s equations plausibly emerge from this perspective. One ubiquitous feature of these dynamical equations is the formation of black holes, so I will conclude by discussing some new ideas about the nature of spacetime inside a black hole.

Brian Swingle, postdoctoral fellow at the Stanford Institute for Theoretical Physics and physicist focusing on quantum matter, quantum information, and quantum gravity
in Physics Research Conference | Caltech

Click here for full screen presentation.

Syndicated copies to:

Why Information Grows: The Evolution of Order, from Atoms to Economies

I just ordered a copy of Why Information Grows: The Evolution of Order, from Atoms to Economies by Cesar Hidalgo. Although it seems more focused on economics, the base theory seems to fit right into some similar thoughts I’ve long held about biology.

Why Information Grows: The Evolution of Order, from Atoms to Economies by Cesar Hidalgo

 

From the book description:

“What is economic growth? And why, historically, has it occurred in only a few places? Previous efforts to answer these questions have focused on institutions, geography, finances, and psychology. But according to MIT’s antidisciplinarian César Hidalgo, understanding the nature of economic growth demands transcending the social sciences and including the natural sciences of information, networks, and complexity. To understand the growth of economies, Hidalgo argues, we first need to understand the growth of order.

At first glance, the universe seems hostile to order. Thermodynamics dictates that over time, order–or information–will disappear. Whispers vanish in the wind just like the beauty of swirling cigarette smoke collapses into disorderly clouds. But thermodynamics also has loopholes that promote the growth of information in pockets. Our cities are pockets where information grows, but they are not all the same. For every Silicon Valley, Tokyo, and Paris, there are dozens of places with economies that accomplish little more than pulling rocks off the ground. So, why does the US economy outstrip Brazil’s, and Brazil’s that of Chad? Why did the technology corridor along Boston’s Route 128 languish while Silicon Valley blossomed? In each case, the key is how people, firms, and the networks they form make use of information.

Seen from Hidalgo’s vantage, economies become distributed computers, made of networks of people, and the problem of economic development becomes the problem of making these computers more powerful. By uncovering the mechanisms that enable the growth of information in nature and society, Why Information Grows lays bare the origins of physical order and economic growth. Situated at the nexus of information theory, physics, sociology, and economics, this book propounds a new theory of how economies can do, not just more, but more interesting things.”


Videos from the NIMBioS Workshop on Information and Entropy in Biological Systems

Videos from the NIMBioS workshop on Information and Entropy in Biological Systems from April 8-10, 2015 are slowly starting to appear on YouTube.

Videos from the April 8-10, 2015, NIMBioS workshop on Information and Entropy in Biological Systems are slowly starting to appear on YouTube.

John Baez, one of the organizers of the workshop, is also going through them and adding some interesting background and links on his Azimuth blog for those who are looking for additional details and depth.

Additional resources from the Workshop:

 


Schools of Thought in the Hard and Soft Sciences

A framework for determining the difference between the hard and soft sciences.

A recent post on one of the blogs at Discover Magazine had me thinking about the shape of science over time.

Neuroscientists don’t seem to disagree on the big issues. Why are there no big ideas in neuroscience?

Neuroskeptic, Where Are The Big Ideas in Neuroscience? (Part 1)

The article made me wonder about the divide between the ‘soft’ and ‘hard’ sciences, and how we might better define and delineate them. Perhaps within a particular field, the greater the proliferation of “schools of thought,” the more likely it is to be a soft science? (Or, mathematically speaking, there’s an inverse relationship between how well supported a field is and the number of schools of thought it has.) I consider a school of thought to be a hypothetical or theoretical structure proposed to help advance the state of the art; adherents join one of several competing camps while evidence is built up (or not) until one side carries the day.

Firmness of Science vs. # of Schools of Thought
A simple linear approximation of the relationship, though something closer to y = 1/x, which is asymptotic to both the x and y axes, is far more realistic.

Theorem: The greater the proliferation of “schools of thought,” the more likely something is to be a soft science.

Generally in most of the hard sciences like physics, biology, or microbiology, there don’t seem to be any opposing or differing schools of thought, while in areas like psychology or philosophy they abound, often with long-running debates between schools and without any hard data or evidence to allow one school to truly win out over another. Perhaps as the structure of a particular science becomes more sound, schools of thought become more difficult to establish?

For some of the hard sciences, it would seem that schools of thought only exist at the bleeding edge of the state-of-the-art where there isn’t yet enough evidence to swing the field one way or another to firmer ground.

Example: Evolutionary Biology

We might consider the area of evolutionary biology in which definitive evidence in the fossil record is difficult to come by, so there’s room for the opposing thoughts for gradualism versus punctuated equilibrium to be individual schools. Outside of this, most of evolutionary theory is so firmly grounded that there aren’t other schools.

Example: Theoretical Physics

The relatively new field of string theory might be considered a school of thought, though there don’t seem to be many opposing schools at the moment. Such a school surely exists, in part, because there isn’t yet any ability to validate the theory against predictions and current data. However, because of its strong supporting mathematical structure, I’ve yet to hear anyone use the concept of a school of thought to describe string theory, which sits in a community that seems to believe it’s a foregone conclusion that it, or something very close to it, represents reality. (Though for a counterpoint, see Lee Smolin’s The Trouble with Physics.)

Example: Mathematics

I can’t recall the concept of a school of thought ever being applied to mathematics, except in the case of the Pythagorean school, which historically is considered to have been almost as much a religion as a science. Because of its theoretical footings, I suppose there may never be competing schools: even in the case of problems like P vs. NP, individuals may have a gut reaction about which way things are leaning, but everyone ultimately knows it’s going to be one or the other (P = NP or P ≠ NP). Many mathematicians also know that it’s useful to try to prove a theorem by day and then to disprove it (or find a counterexample) by night, so even internally and individually they’re self-segregating against creating schools of thought right from the start.

Example: Religion

Looking at the furthest end of the other side of the spectrum, because there is no verifiable way to prove that God exists, there has been an efflorescence of religions of nearly every size and shape since the beginning of humankind. Might we then presume that this is the softest of the ‘sciences’?

What examples or counter examples can you think of?


String Theory, Black Holes, and Information

Amanda Peet presented a lecture entitled "String Theory Legos for Black Holes" at the Perimeter Institute for Theoretical Physics.

Four decades ago, Stephen Hawking posed the information paradox about black holes and quantum theory. It still challenges the imaginations of theoretical physicists today. Yesterday, Amanda Peet (University of Toronto) presented a lecture entitled “String Theory Legos for Black Holes” at the Perimeter Institute for Theoretical Physics. A quick overview/teaser trailer for the lecture follows, along with some additional information and the video of the lecture itself.

The “Information Paradox” with Amanda Peet (teaser trailer)

“Black holes are the ‘thought experiment’ par excellence, where the big three of physics – quantum mechanics, general relativity and thermodynamics – meet and fight it out, dragging in brash newcomers such as information theory and strings for support. Though a unification of gravity and quantum field theory still evades string theorists, many of the mathematical tools and ideas they have developed find applications elsewhere.

One of the most promising approaches to resolving the “information paradox” (the notion that nothing, not even information itself, survives beyond a black hole’s point-of-no-return event horizon) is string theory, a part of modern physics that has wiggled its way into the popular consciousness.

On May 6, 2015, Dr. Amanda Peet, a physicist at the University of Toronto, will describe how the string toolbox allows study of the extreme physics of black holes in new and fruitful ways. Dr. Peet will unpack that toolbox to reveal the versatility of strings and (mem)branes, and will explore the intriguing notion that the world may be a hologram.

Amanda Peet is an Associate Professor of Physics at the University of Toronto. She grew up in the South Pacific island nation of Aotearoa/New Zealand, and earned a B.Sc. (Hons) from the University of Canterbury in NZ and a Ph.D. from Stanford University in the USA. Her awards include a Radcliffe Fellowship from Harvard and an Alfred P. Sloan Foundation Research Fellowship. She was one of the string theorists interviewed in the three-part NOVA PBS TV documentary “The Elegant Universe”.

Web site: http://ap.io/home/.

Dr. Amanda Peet’s Lecture “String Theory Legos for Black Holes”


Richard Dawkins Interview: This Is My Vision Of “Life” | Edge.org

Bookmarked This Is My Vision Of "Life" by John Brockman (edge.org)
The Edge.org's interview with Richard Dawkins.

Richard Dawkins [4.30.15]

“My vision of life is that everything extends from replicators, which are in practice DNA molecules on this planet. The replicators reach out into the world to influence their own probability of being passed on. Mostly they don’t reach further than the individual body in which they sit, but that’s a matter of practice, not a matter of principle. The individual organism can be defined as that set of phenotypic products which have a single route of exit of the genes into the future. That’s not true of the cuckoo/reed warbler case, but it is true of ordinary animal bodies. So the organism, the individual organism, is a deeply salient unit. It’s a unit of selection in the sense that I call a “vehicle”.  There are two kinds of unit of selection. The difference is a semantic one. They’re both units of selection, but one is the replicator, and what it does is get itself copied. So more and more copies of itself go into the world. The other kind of unit is the vehicle. It doesn’t get itself copied. What it does is work to copy the replicators which have come down to it through the generations, and which it’s going to pass on to future generations. So we have this individual replicator dichotomy. They’re both units of selection, but in different senses. It’s important to understand that they are different senses.”

Richard Dawkins

RICHARD DAWKINS is an evolutionary biologist; Emeritus Charles Simonyi Professor of the Public Understanding of Science, Oxford; Author, The Selfish Gene; The Extended Phenotype; Climbing Mount Improbable; The God Delusion; An Appetite For Wonder; and (forthcoming) A Brief Candle In The Dark.

Watch the entire video interview and read the transcript at Edge.org.


NIMBioS Workshop: Information Theory and Entropy in Biological Systems

Web resources for participants in the NIMBioS Workshop on Information Theory and Entropy in Biological Systems.

Over the next few days, I’ll be maintaining a Storify story covering information related to and coming out of the Information Theory and Entropy Workshop being sponsored by NIMBioS at the University of Tennessee, Knoxville.

For those in attendance or participating by watching the live streaming video (or even watching the video after-the-fact), please feel free to use the official hashtag #entropyWS, and I’ll do my best to include your tweets, posts, and material into the story stream for future reference.

For journal articles and papers mentioned in/at the workshop, I encourage everyone to join the Mendeley.com group ITBio: Information Theory, Microbiology, Evolution, and Complexity and add them to the group’s list of papers. Think of it as a collaborative online journal club of sorts.

Those participating in the workshop are also encouraged to take a look at a growing collection of researchers and materials I maintain here. If you have materials or resources you’d like to contribute to the list, please send me an email, submit them via the suggestions/submission form, or add them in the comments section below.

Resources for Information Theory and Biology

RSS Feed for BoffoSocko posts tagged with #ITBio

 


Free E-Book: Neural Networks and Deep Learning by Michael Nielsen

Bookmarked Neural networks and deep learning (neuralnetworksanddeeplearning.com)

Michael A. Nielsen, the author of one of our favorite books on Quantum Computation and Quantum Information, is writing a new book entitled Neural Networks and Deep Learning. He’s been releasing portions of it for free on the internet in draft form every two or three months since 2013. He’s also maintaining an open code repository for the book on GitHub.

Michael A. Nielsen

Introduction to Lie Groups and Lie Algebras (Part 2) | UCLA Extension

Dr. Mike Miller has just opened up registration for the second course in the series. His courses are always clear, entertaining, and invigorating, and I highly recommend them to anyone who is interested in math, science, or engineering.

Dr. Mike Miller, who had previously announced a two quarter sequence of classes on Lie Groups at UCLA, has just opened up registration for the second course in the series. His courses are always clear, entertaining, and invigorating, and I highly recommend them to anyone who is interested in math, science, or engineering.

Philosophy is written in this grand book, the universe which stands continually open to our gaze. But the book cannot be understood unless one first learns to comprehend the language and read the letters in which it is composed. It is written in the language of mathematics, and its characters are triangles, circles and other geometric figures without which it is humanly impossible to understand a single word of it; without these, one wanders about in a dark labyrinth.

Galileo Galilei (1564–1642) in Il Saggiatore (The Assayer)

Prior to the first part of the course, I’d written some thoughts about the timbre and tempo of his lecture style and philosophy and commend those interested to take a peek. I also mentioned some additional resources for the course there as well.  For those who missed the first portion, I’m happy to help fill you in and share some of my notes if necessary. The recommended minimum prerequisites for this class are linear algebra and some calculus.


Introduction to Lie Groups and Lie Algebras (Part 2)

Math X 450.7 / 3.00 units / Reg. # 251580W
Professor: Michael Miller, Ph.D.
Start Date: January 13, 2015
Location: UCLA, 5137 Math Sciences Building
Tuesday, 7-10pm
January 13 – March 24
11 meetings total
Class will not meet on one Tuesday, to be announced.

Register here: https://www.uclaextension.edu/Pages/Course.aspx?reg=251580

Course Description

A Lie group is a differentiable manifold that is also a group for which the product and inverse maps are differentiable. A Lie algebra is a vector space endowed with a binary operation that is bilinear, alternating, and satisfies the so-called Jacobi identity. This course is the second in a 2-quarter sequence that offers an introductory survey of Lie groups, their associated Lie algebras, and their representations. Its focus is split between continuing last quarter’s study of matrix Lie groups and their representations and reconciling this theory with that for the more general manifold setting. Topics to be discussed include the Weyl group, complete reducibility, semisimple Lie algebras, root systems, and Cartan subalgebras. This is an advanced course, requiring a solid understanding of linear algebra, basic analysis, and, ideally, the material from the previous quarter. Internet access required to retrieve course materials.
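For those who would like the axioms from the course description spelled out, the bracket [·,·] on a Lie algebra satisfies the following (a standard formulation, not taken from the course materials):

```latex
\begin{align*}
  [ax + by,\, z] &= a[x,z] + b[y,z] && \text{(bilinearity; similarly in the second slot)} \\
  [x,\, x] &= 0                     && \text{(alternating, which implies } [x,y] = -[y,x]\text{)} \\
  [x,[y,z]] + [y,[z,x]] + [z,[x,y]] &= 0 && \text{(Jacobi identity)}
\end{align*}
```

The Jacobi identity is what makes the bracket behave like a commutator: for matrix Lie groups, taking [X, Y] = XY − YX satisfies all three axioms.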

Recommended Textbook

Hall, Brian. Lie Groups, Lie Algebras, & Representations (Springer, 2004) ISBN: 9781441923134

 

A photograph of Sophus Lie's very full and bushy beard.
“I wouldn’t lie to you. This is Sophus’s beard.”

 


BIRS Workshop on Biological and Bio-Inspired Information Theory | Storify Stream

Over the span of the coming week, I'll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.

Over the span of the coming week, I’ll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.

Editor’s note: On 12/12/17 Storify announced they would be shutting down. As a result, I’m changing the embedded version of the original data served by Storify for an HTML copy which can be found below:

BIRS: Biological and Bio-Inspired Information Theory

A 5 Day workshop on Biology and Information Theory hosted by the Banff International Research Station

  1. Wishing I was at the Gene Regulation and Information Theory meeting starting tomorrow  http://bit.ly/XnHRZs  #ITBio
  2. Mathematical and Statistical Models for Genetic Coding starts today.  http://www.am.hs-mannheim.de/genetic_code_2013.php?id=1  @andreweckford might borrow attendees for BIRS
  3. Mathematical Foundations for Information Theory in Diffusion-Based Molecular Communications  http://bit.ly/1aTVR2c  #ITBio
  4. Bill Bialek giving plenary talk “Information flow & order in real biological networks” at Feb 2014 workshop  http://mnd.ly/19LQH8f  #ITBio
  5. CECAM Workshop: “Entropy in Biomolecular Systems” starts May 14 in Vienna. http://t.co/F4Kn0ICIaT #ITBio http://t.co/Ty8dEIXQUT

    CECAM Workshop: “Entropy in Biomolecular Systems” starts May 14 in Vienna.  http://jhu.md/1faLR8t  #ITBio pic.twitter.com/Ty8dEIXQUT
  6. Last RT: wonder what the weather is going to be like at the end of October for my @BIRS_Math workshop
  7. @JoVanEvery I’m organizing a workshop in Banff in October … hopefully this isn’t a sign of weather to come!
  8. Banff takes its name from the town of Banff, Scotland, not to be confused with Bamff, also Scotland.
  9. Good morning from beautiful Banff. How can you not love the mountains? http://t.co/mxYBNz7yzl

    Good morning from beautiful Banff. How can you not love the mountains? pic.twitter.com/mxYBNz7yzl
  10. “Not an obvious connection between utility and information, just as there is no obvious connection between energy and entropy” @BIRS_Math
  11. Last RT: a lot of discussion of my signal transduction work with Peter Thomas.
  12. Live now: Nicolo Michelusi of @USCViterbi on Stochastic Model for Electron Transfer in Bacterial Cables  http://www.birs.ca/live  #ITBio
  13. Nicolo Michelusi (University of Southern California), A Stochastic Model for Electron Transfer in Bacterial Cables  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271450-Michelusi.mp4 
  14. Listening to the always awesome @cnmirose talk about the ultimate limits of molecular communication.
  15. “Timing is fundamental … subsumes time-varying concentration channel” @cnmirose @BIRS_Math
  16. Standard opening quote of these talks: “I’m not a biologist, but …” @BIRS_Math
  17. Stefan Moser (ETH Zurich), Capacity Bounds of the Memoryless AIGN Channel – a Toy-Model for Molecular Communicat…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271610-Moser.mp4 
  18. Weisi Guo (University of Warwick), Communication Envelopes for Molecular Diffusion and Electromagnetic Wave Propag…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271643-Guo.mp4 
  19. .@ChrisAldrich @andreweckford @Storify @BIRS_Math Sounds like a fascinating workshop on bioinformation theory in Banff.
  20. Toby Berger, winner of the 2002 Shannon award, speaking right now. @BIRS_Math
  21. Naftali Tishby (Hebrew University of Jerusalem), Sensing and acting under information constraints – a principled a…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281032-Tishby.mp4 
  22. “…places such as BIRS and the Banff Centre exist to facilitate the exchange and pursuit of knowledge.” S. Sundaram  http://www.birs.ca/testimonials/#testimonial-1454 
  23. We’re going for a hike tomorrow. Many thanks to Lukas at the @ParksCanada info centre in Banff for helpful advice! @BIRS_Math
  24. Alexander Dimitrov (Washington State University), Invariant signal processing in auditory biological systems  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281416-Dimitrov.mp4 
  25. Joel Zylberberg (University of Washington), Communicating with noisy signals: lessons learned from the mammalian v…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281450-Zylberberg.mp4 
  26. Robert Schober (Universitat Erlangen-Nurnberg), Intersymbol interference mitigation in diffusive molecular communi…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281549-Schober.mp4 
  27. Rudolf Rabenstein (Friedrich-Alexander-Universitat Erlangen-Nurnberg (FAU)), Modelling Molecular Communication Cha…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281627-Rabenstein.mp4 
  28. THis week @BIRS_Math ” Biological and Bio-Inspired Information Theory ” @thebanffcentre #biology #math @NSF
  29. “Your theory might match the data, but the data might be wrong” – Crick @BIRS_Math
  30. So information theory seems to be a big deal in ecology. @BIRS_Math
  31. Tom Schneider (National Institutes of Health), Three Principles of Biological States: Ecology and Cancer  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410290904-Schneider.mp4 
  32. “In biodiversity, the entropy of an ecosystem is the expected … information we gain about an organism by learning its species” @BIRS_Math
  33. Seriously, I’m blown away by this work in information theory in ecology. Huge body of work; I had no idea. @BIRS_Math
  34. I encourage @BIRS_Math attendees at Biological & Bio-Inspired Information Theory to contribute references here:  http://bit.ly/1jQwObk 
  35. Christoph Adami (Michigan State University), Some Information-Theoretic Musings Concerning the Origin and Evolutio…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410291114-Adami.mp4 
  36. .@ChristophAdami talk Some Information-Theoretic Musings Concerning the Origin of Life @BIRS_Math this morning #ITBio http://t.co/VA8komuuSW

    .@ChristophAdami talk Some Information-Theoretic Musings Concerning the Origin of Life @BIRS_Math this morning #ITBio pic.twitter.com/VA8komuuSW
  37. ICYMI @ChristophAdami had great paper: Information-theoretic Considerations on Origin of Life on arXiv  http://bit.ly/1yIhK2Q  @BIRS_Math
  38. Baez has a post on Tishby's talk "Sensing &  Acting Under Information Constraints" http://t.co/t1nPVI1pxa @BIRS_Math http://t.co/dFuiVLFSGC

    Baez has a post on Tishby’s talk “Sensing & Acting Under Information Constraints”  http://bit.ly/1yIDonR  @BIRS_Math pic.twitter.com/dFuiVLFSGC
  39. INFORMATION THEORY is the new central ...

    INFORMATION THEORY is the new central …
  40. I’m listening to a talk on the origin of life at a workshop on Biological and Bio-Inspired Information Theory. …  https://plus.google.com/117562920675666983007/posts/gqFL7XY3quF 
  41. Now accepting applications for the #Research Collaboration Workshop for Women in #MathBio at NIMBioS  http://ow.ly/DzeZ7 
  42. We removed a faulty microphone from our lecture room this morning. We’re now fixing the audio buzz in this week’s videos, and reposting.
  43. Didn’t get enough information theory & biology this week @BIRS_Math? Apply for NIMBioS workshop in April 2015  http://bit.ly/1yIeiWe  #ITBio
  44. Amin Emad (University of Illinois at Urbana-Champaign), Applications of Discrete Mathematics in Bioinformatics  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301329-Emad.mp4 
  45. Paul Bogdan (University of Southern California), Multiscale Analysis Reveals Complex Behavior in Bacteria Populati…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301401-Bogdan.mp4 
  46. Lubomir Kostal (Institute of Physiology, Academy of Sciences of the Czech Republic), Efficient information transmi…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301534-Kostal.mp4 
  47. Banff ☀️❄️🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲❤️
  48. @conservativelez I’m a big fan of your dad’s research & was reminded of much of it via a workshop on Biological Information Theory
  49. @conservativelez Though he may not have been able to attend, he can catch most of the talks online if he’d like  https://www.birs.ca/events/2014/5-day-workshops/14w5170 
  50. Depressed that @BIRS_Math Workshop on Biological & Bio-Inspired Information Theory is over? Relive it here:  http://bit.ly/1rF3G4B  #ITBio
  51. A few thoughts about that workshop while I wait for my flight back to Toronto.
  52. 1/ Everyone I talked to said it was the best workshop they’d ever been to, and they’d like to do a follow-up workshop @BIRS_Math
  53. 2/ There is an amazing diversity of work under the umbrella of “information theory”. @BIRS_Math
  54. 3/ Much of this work is outside the IT mainstream, and an issue is that people use different terms for related concepts. @BIRS_Math
  55. 4/ Some community building is in order. I think this workshop was a good first step. @BIRS_Math
  56. 5/ Many many thanks to @BIRS_Math and huge kudos to @NGhoussoub for excellent service to the Canadian scientific community. BIRS is a gem.
  57. 6/ Also many thanks to the participants for their excellent talks, and to @ChrisAldrich for maintaining a Storify.


Information Theory is the New Central Discipline

Replied to Information Theory is the new central discipline. by Nassim Nicholas Taleb (facebook.com)

INFORMATION THEORY is the new central discipline. This graph was from 20y ago in the seminal book Cover and Thomas, as the field was starting to be defined. Now Information Theory has been expanded to swallow even more fields.

Born in, of all disciplines, Electrical Engineering, the field has progressively infiltrating probability theory, computer science, statistical physics, data science, gambling theory, ruin problems, complexity, even how one deals with knowledge, epistemology. It defines noise/signal, order/disorder, etc. It studies cellular automata. You can use it in theology (FREE WILL & algorithmic complexity). As I said, it is the MOTHER discipline.

I am certain much of Medicine will naturally grow to be a subset of it, both operationally, and in studying how the human body works: the latter is an information machine. Same with linguistics. Same with political “science”, same with… everything.

I am saying this because I figured out what the long 5th volume of the INCERTO will be. Cannot say now with any precision but it has to do with a variant of entropy as the core natural generator of Antifragility.

[Revised to explain that it is not *replacing* other disciplines, just infiltrating them as the point was initially misunderstood…]

Nassim Nicholas Taleb via Facebook

[My comments posted to the original Facebook post follow below.]

I’m coming to this post a bit late as I’m playing a bit of catch up, but agree with it wholeheartedly.

In particular, applications to molecular biology and medicine have really begun to come to a heavy boil in just the past five years. This particular year seems to mark the beginning of the biggest renaissance for the application of information theory to biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s “Symposium on Information Theory in Biology” at Gatlinburg, Tennessee in 1956.

Upcoming/recent conferences/workshops on information theory in biology include:

At the beginning of September, Christoph Adami posted an awesome and very sound paper on arXiv entitled “Information-theoretic considerations concerning the origin of life”  which truly portends to turn the science of the origin of life on its head.

I’ll note in passing, for those interested, that Claude Shannon’s famous master’s thesis at MIT (in which he applied Boolean algebra to electrical circuits, allowing the digital revolution to occur) and his subsequent “A Mathematical Theory of Communication” were so revolutionary that nearly everyone forgets his MIT Ph.D. thesis, “An Algebra for Theoretical Genetics,” which presaged the area of cybernetics and the current applications of information theory to microbiology, and which is probably as seminal as Sir R.A. Fisher’s applications of statistics to science in general and biology in particular.

For those commenting on the post who were interested in a layman’s introduction to information theory, I recommend John Robinson Pierce’s An Introduction to Information Theory: Symbols, Signals and Noise (Dover has a very inexpensive edition.) After this, one should take a look at Claude Shannon’s original paper. (The MIT Press printing includes some excellent overview by Warren Weaver along with the paper itself.) The mathematics in the paper really aren’t too technical, and most of it should be comprehensible by most advanced high school students.
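To give a sense of just how elementary the core mathematics is, here is a minimal sketch (the function name is my own) of Shannon's entropy formula, H(X) = −Σ p·log₂(p), in a few lines of Python:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) in bits.

    Outcomes with zero probability contribute nothing,
    following the convention 0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information per flip...
print(shannon_entropy([0.5, 0.5]))   # 1.0
# ...while a heavily biased coin carries much less.
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.469
```

The biased coin is "less surprising" on average, so observing it conveys less information, which is precisely the intuition the formula captures.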

For those who don’t understand the concept of entropy, I HIGHLY recommend Arieh Ben-Naim’s book Entropy Demystified: The Second Law Reduced to Plain Common Sense with Seven Simulated Games. He really does tear the concept down into its most basic form in a way I haven’t seen others come remotely close to, and which even my mother can comprehend (with no mathematics at all). (I recommend this presentation even to those with Ph.D.s in physics because it is so truly fundamental.)

For more advanced mathematicians, physicists, and engineers, Arieh Ben-Naim does a truly spectacular job of extending E.T. Jaynes’ work on information theory and statistical mechanics, arriving at a more coherent mathematical theory that conjoins the entropy of physics/statistical mechanics with that of Shannon’s information theory, in A Farewell to Entropy: Statistical Thermodynamics Based on Information.

For the advanced readers/researchers interested in more at the intersection of information theory and biology, I’ll also mention that I maintain a list of references, books, and journal articles in a Mendeley group entitled “ITBio: Information Theory, Microbiology, Evolution, and Complexity.”

Venn Diagram of how information theory relates to other fields.
Figure 1.1 [page 2] from
Thomas M. Cover and Joy Thomas’s textbook Elements of Information Theory, Second Edition
(John Wiley & Sons, Inc., 2006) [First Edition, 1991]
Syndicated copies to:

Announcement: Special issue of Entropy: “Information Theoretic Incentives for Cognitive Systems”

Dr. Christoph Salge asked me to cross-post this notice from the Entropy site here.

Editor’s Note: Dr. Christoph Salge asked me to cross-post the following notice from the Entropy site here. I’m sure many of the readers of the blog will find the topic to be of interest.


Logo for the journal Entropy

Dear Colleagues,

In recent years, ideas such as “life is information processing” or “information holds the key to understanding life” have become more common. However, how can information, or more formally Information Theory, increase our understanding of life, or life-like systems?

Information Theory not only has a profound mathematical basis, but also typically provides an intuitive understanding of processes such as learning, behavior, and evolution in terms of information processing.

In this special issue, we are interested in both:

  1. the information-theoretic formalization and quantification of different aspects of life, such as driving forces of learning and behavior generation, information flows between neurons, swarm members and social agents, and information theoretic aspects of evolution and adaptation, and
  2. the simulation and creation of life-like systems with previously identified principles and incentives.

Topics with relation to artificial and natural systems:

  • information theoretic intrinsic motivations
  • information theoretic quantification of behavior
  • information theoretic guidance of artificial evolution
  • information theoretic guidance of self-organization
  • information theoretic driving forces behind learning
  • information theoretic driving forces behind behavior
  • information theory in swarms
  • information theory in social behavior
  • information theory in evolution
  • information theory in the brain
  • information theory in system-environment distinction
  • information theory in the perception action loop
  • information theoretic definitions of life

Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1200 CHF (Swiss Francs).

Deadline for manuscript submissions: 28 February 2015

Special Issue Editors

Guest Editor
Dr. Christoph Salge
Adaptive Systems Research Group, University of Hertfordshire, College Lane, AL10 9AB Hatfield, UK
Website: http://homepages.stca.herts.ac.uk/~cs08abi
E-Mail: c.salge@herts.ac.uk
Phone: +44 1707 28 4490
Interests: Intrinsic Motivation (Empowerment); Self-Organization; Guided Self-Organization; Information-Theoretic Incentives for Social Interaction; Information-Theoretic Incentives for Swarms; Information Theory and Computer Game AI

Guest Editor
Dr. Georg Martius
Cognition and Neurosciences, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
Website: http://www.mis.mpg.de/jjost/members/georg-martius.html
E-Mail: martius@mis.mpg.de
Phone: +49 341 9959 545
Interests: Autonomous Robots; Self-Organization; Guided Self-Organization; Information Theory; Dynamical Systems; Machine Learning; Neuroscience of Learning; Optimal Control

Guest Editor
Dr. Keyan Ghazi-Zahedi
Information Theory of Cognitive Systems, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
Website: http://personal-homepages.mis.mpg.de/zahedi
E-Mail: zahedi@mis.mpg.de
Phone: +49 341 9959 535
Interests: Embodied Artificial Intelligence; Information Theory of the Sensorimotor Loop; Dynamical Systems; Cybernetics; Self-organisation; Synaptic plasticity; Evolutionary Robotics

Guest Editor
Dr. Daniel Polani
Department of Computer Science, University of Hertfordshire, Hatfield AL10 9AB, UK
Website: http://homepages.feis.herts.ac.uk/~comqdp1/
E-Mail: d.polani@herts.ac.uk
Interests: artificial intelligence; artificial life; information theory for intelligent information processing; sensor evolution; collective and multiagent systems


Introduction to Lie Groups and Lie Algebras | UCLA Extension

Looking for some serious entertainment on Tuesday nights this fall? Professor Mike Miller has got you covered!

Exercise Your Brain

As many of you may have already heard, Dr. Mike Miller, a retired mathematician from RAND and long-time math professor at UCLA, is offering a course on Introduction to Lie Groups and Lie Algebras this fall through UCLA Extension. Whether you’re a professional mathematician, engineer, physicist, physician, or a hobbyist interested in mathematics, you’ll be sure to get something interesting out of this course, not to mention the camaraderie of the 20-30 other “regulars” with widely varying backgrounds (actors to surgeons, evolutionary theorists to engineers) who’ve been taking almost everything Mike has offered over the years. (And yes, he’s THAT good; we’re sure you’ll be addicted too.)

“Beginners” Welcome!

Even if it’s been years since you last took calculus or linear algebra, Mike (and the rest of the class) will help you get quickly back up to speed in what is often otherwise a very deep subject. If you’re interested in advanced physics, quantum mechanics, quantum information, or string theory, this is one of the subjects that is de rigueur for delving into them deeply and understanding them better. The topic is also near and dear to the hearts of those in robotics, graphics, 3-D modelling, gaming, and other areas utilizing multi-dimensional rotations. And naturally, it’s simply a beautiful and elegant subject for those who have no need to apply it to anything, but who just want to meander their way through higher mathematics for the fun of it (this group will comprise the majority of the class, by the way).

Whether you’ve been away from serious math for decades, use it every day, or have never gone past calculus or linear algebra, this is bound to be the most entertaining thing you can do with your Tuesday nights in the fall. If you’re not sure what you’re getting into (or are scared a bit by the course description), I highly encourage you to come and join us for at least the first class before you pass up the opportunity. I’ll mention that the greater majority of new students in Mike’s classes join the ever-growing group of regulars who subsequently take almost everything he teaches. (For the reticent, I’ll mention that one of the first courses I took from Mike was Algebraic Topology, which generally requires a few semesters of abstract algebra and a semester of topology as prerequisites. I’d taken neither, but thanks to Mike’s excellent lecture style and his desire to make everything comprehensible, I was able to do exceedingly well in the course.) I’m happy to chat with anyone who may be hesitant. Also keep in mind that you can register to take the class for a grade, pass/fail, or no grade at all, to suit your needs and lifestyle.

My classes have the full spectrum of students from the most serious to the hobbyist to those who are in it for the entertainment and  ‘just enjoy watching it all go by.’

Mike Miller, Ph.D.

As a group, some of us have a collection of a few dozen texts in the area which we’re happy to loan out as well.  In addition to the one recommended text (Mike always gives such comprehensive notes that any text for his classes is purely supplemental at best), several of us have also found some good similar texts:

Given the breadth and diversity of the backgrounds of students in the class, I’m sure Mike will spend some reasonable time at the beginning [or later in the class, as necessary] doing a quick overview of some linear algebra and calculus related topics that will be needed later in the quarter(s).

Further information on the class and a link to register can be found below. If you know of others who might be interested in this, please feel free to forward it along – the more the merrier.

I hope to see you all soon.


Introduction to Lie Groups and Lie Algebras

MATH X 450.6  /  3.00 units /  Reg. # 249254W
Professor: Michael Miller, Ph.D.
Start Date: 9/30/2014
Location: UCLA, 5137 Math Sciences Building
Tuesdays, 7-10pm
September 30 – December 16, 2014
11 meetings total (no mtg 11/11)
Register here: https://www.uclaextension.edu/Pages/Course.aspx?reg=249254

Course Description

A Lie group is a differentiable manifold that is also a group, one for which the product and inverse maps are differentiable. A Lie algebra is a vector space endowed with a binary operation that is bilinear, alternating, and satisfies the so-called Jacobi identity. This course, the first in a two-quarter sequence, is an introductory survey of Lie groups, their associated Lie algebras, and their representations. The first quarter will focus on the special case of matrix Lie groups, including the general linear, special linear, orthogonal, unitary, and symplectic groups. The second quarter will generalize the theory developed to the case of arbitrary Lie groups. Topics to be discussed include compactness and connectedness, homomorphisms and isomorphisms, exponential mappings, the Baker-Campbell-Hausdorff formula, covering groups, and the Weyl group. This is an advanced course, requiring a solid understanding of linear algebra and basic analysis.
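For a concrete feel for those axioms before the first lecture: the commutator [X, Y] = XY - YX of matrices is the canonical example of a Lie bracket, and the bilinear, alternating, and Jacobi properties can all be checked by direct computation. A quick numerical sanity check in Python on 2x2 matrices (a toy illustration of my own, not course material):

```python
# Check the Lie-algebra axioms for the matrix commutator [X, Y] = XY - YX
# on small 2x2 matrices represented as nested lists.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def add(A, B):
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def sub(A, B):
    return [[A[i][j] - B[i][j] for j in range(2)] for i in range(2)]

def bracket(X, Y):
    """The commutator [X, Y] = XY - YX."""
    return sub(matmul(X, Y), matmul(Y, X))

# The standard basis of sl(2): traceless 2x2 matrices.
X = [[0, 1], [0, 0]]
Y = [[0, 0], [1, 0]]
Z = [[1, 0], [0, -1]]

# Alternating: [X, X] = 0.
assert bracket(X, X) == [[0, 0], [0, 0]]

# Jacobi identity: [X,[Y,Z]] + [Y,[Z,X]] + [Z,[X,Y]] = 0.
jacobi = add(add(bracket(X, bracket(Y, Z)),
                 bracket(Y, bracket(Z, X))),
             bracket(Z, bracket(X, Y)))
assert jacobi == [[0, 0], [0, 0]]
print("commutator satisfies the Lie-algebra axioms for these matrices")
```

The three matrices here happen to be the standard basis of the Lie algebra of the special linear group, one of the matrix groups the first quarter covers.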

Recommended Textbook

Hall, Brian. Lie Groups, Lie Algebras, and Representations: An Elementary Introduction (Springer, 2004) ISBN: 9781441923134

Sophus Lie (1842-1899)

If I had a dollar for every time someone invited me to a Lie Algebra class, I’d be a…



Workshop Announcement: Entropy and Information in Biological Systems

I rarely, if ever, reblog anything here, but this particular post from John Baez’s blog Azimuth is so on-topic that attempting to embellish it seems silly.

Entropy and Information in Biological Systems (Part 2)

John Harte, Marc Harper and I are running a workshop! Now you can apply here to attend:

Information and entropy in biological systems, National Institute for Mathematical and Biological Synthesis, Knoxville, Tennessee, Wednesday-Friday, 8-10 April 2015.

Click the link, read the stuff and scroll down to “CLICK HERE” to apply. The deadline is 12 November 2014.

Financial support for travel, meals, and lodging is available for workshop attendees who need it. We will choose among the applicants and invite 10-15 of them.

The idea
Information theory and entropy methods are becoming powerful tools in biology, from the level of individual cells, to whole ecosystems, to experimental design, model-building, and the measurement of biodiversity. The aim of this investigative workshop is to synthesize different ways of applying these concepts to help systematize and unify work in biological systems. Early attempts at “grand syntheses” often misfired, but applications of information theory and entropy to specific highly focused topics in biology have been increasingly successful. In ecology, entropy maximization methods have proven successful in predicting the distribution and abundance of species. Entropy is also widely used as a measure of biodiversity. Work on the role of information in game theory has shed new light on evolution. As a population evolves, it can be seen as gaining information about its environment. The principle of maximum entropy production has emerged as a fascinating yet controversial approach to predicting the behavior of biological systems, from individual organisms to whole ecosystems. This investigative workshop will bring together top researchers from these diverse fields to share insights and methods and address some long-standing conceptual problems.

So, here are the goals of our workshop:

  • To study the validity of the principle of Maximum Entropy Production (MEP), which states that biological systems (and indeed all open, non-equilibrium systems) act to produce entropy at the maximum rate.
  • To familiarize all the participants with applications to ecology of the MaxEnt method: choosing the probabilistic hypothesis with the highest entropy subject to the constraints of our data. We will compare MaxEnt with competing approaches and examine whether MaxEnt provides a sufficient justification for the principle of MEP.
  • To clarify relations between known characterizations of entropy, the use of entropy as a measure of biodiversity, and the use of MaxEnt methods in ecology.
  • To develop the concept of evolutionary games as “learning” processes in which information is gained over time.
  • To study the interplay between information theory and the thermodynamics of individual cells and organelles.

For more details, go here.

If you’ve got colleagues who might be interested in this, please let them know. You can download a PDF suitable for printing and putting on a bulletin board by clicking on this: