I LaTeXed up lecture notes for many of the classes I have taken; feel free to read through them or use them to review. If you find a mistake or typo, please let me know. If you want to look over the .tex source for any of these notes, please send me an email.
via Rama Kunapuli.
Revisited this collection of Richard Feynman's eclectic adventures, and found them more inspiring than ever -- though parts demand a charitable eye.
Module 1: -- Edward Frenkel: The Continents of Mathematics
Module 2: -- Edward Frenkel: Symmetry and Unification
Module 3: -- Edward Frenkel: Galois Groups
Module 4: -- Edward Frenkel: Harmonic Analysis
Squirrels were stealing my bird seed so I solved the problem with mechanical engineering :)
My PhD thesis “At the Interface of Algebra and Statistics” is now on the arXiv! https://t.co/7IwEu8wqQC It uses basic tools in quantum physics to explore mathematical structure that's both algebraic & statistical. Curious? See my new 10m video on YouTube!! https://t.co/zOtBtGeQVK pic.twitter.com/U8X622mtse — Tai-Danae Bradley (@math3ma) April 14, 2020
This thesis takes inspiration from quantum physics to investigate mathematical structure that lies at the interface of algebra and statistics. The starting point is a passage from classical probability theory to quantum probability theory. The quantum version of a probability distribution is a density operator, the quantum version of marginalizing is an operation called the partial trace, and the quantum version of a marginal probability distribution is a reduced density operator. Every joint probability distribution on a finite set can be modeled as a rank one density operator. By applying the partial trace, we obtain reduced density operators whose diagonals recover classical marginal probabilities. In general, these reduced densities will have rank higher than one, and their eigenvalues and eigenvectors will contain extra information that encodes subsystem interactions governed by statistics. We decode this information, show it is akin to conditional probability, and then investigate the extent to which the eigenvectors capture "concepts" inherent in the original joint distribution. The theory is then illustrated with an experiment that exploits these ideas. Turning to a more theoretical application, we also discuss a preliminary framework for modeling entailment and concept hierarchy in natural language, namely, by representing expressions in the language as densities. Finally, initial inspiration for this thesis comes from formal concept analysis, which finds many striking parallels with linear algebra. The parallels are not coincidental, and a common blueprint is found in category theory. We close with an exposition on free (co)completions and how the free-forgetful adjunctions in which they arise strongly suggest that in certain categorical contexts, the "fixed points" of a morphism with its adjoint encode interesting information.
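The classical-to-quantum passage the abstract describes can be sketched numerically. This is a minimal illustration (not code from the thesis), using a made-up 2x3 joint distribution: the joint distribution is modeled as a rank-one density operator, the partial trace produces a reduced density, and its diagonal recovers the classical marginal.

```python
import numpy as np

# A joint distribution p(x, y) on a 2x3 finite set (hypothetical values).
p = np.array([[0.1, 0.2, 0.1],
              [0.3, 0.2, 0.1]])

# Model p as a rank-one density operator |psi><psi|,
# where psi has entries sqrt(p(x, y)).
psi = np.sqrt(p).ravel()
rho = np.outer(psi, psi)          # 6x6 density operator, rank one

# The partial trace over the second factor yields the reduced density on X.
d_x, d_y = p.shape
rho4 = rho.reshape(d_x, d_y, d_x, d_y)
rho_x = np.einsum('iaja->ij', rho4)   # trace out the Y index

# The diagonal of the reduced density recovers the classical marginal p(x),
# while its off-diagonal entries carry the extra statistical information.
print(np.diag(rho_x))   # matches p.sum(axis=1) = [0.4, 0.6]
```

Note that rho_x itself generally has rank greater than one; its eigenvectors are where the abstract locates the "concept"-like information beyond the marginal probabilities.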
After an early breakthrough on light and matter, he became a writer who challenged climate science and pondered space exploration and nuclear warfare.
Bookmarked on March 21, 2020 at 02:39PM
The word dord is a dictionary error in lexicography. It was accidentally created, as a ghost word, by the staff of G. and C. Merriam Company (now part of Merriam-Webster) in the New International Dictionary, second edition (1934). That dictionary defined the term as a synonym for density used in physics and chemistry in the following way: "dord (dôrd), n. Physics & Chem. Density."
Sir Roger Penrose has had a remarkable life. He has contributed an enormous amount to our understanding of general relativity, perhaps more than anyone since Einstein himself — Penrose diagrams, singularity theorems, the Penrose process, cosmic censorship, and the list goes on. He has made important contributions to mathematics, including such fun ideas as the Penrose triangle and aperiodic tilings. He has also made bold conjectures in the notoriously contentious areas of quantum mechanics and the study of consciousness. In his spare time he’s managed to become an extremely successful author, writing such books as The Emperor’s New Mind and The Road to Reality. With far too much that we could have talked about, we decided to concentrate in this discussion on spacetime, black holes, and cosmology, but we made sure to reserve some time to dig into quantum mechanics and the brain by the end.
Maxwell’s Demon is a famous thought experiment in which a mischievous imp uses knowledge of the velocities of gas molecules in a box to decrease the entropy of the gas, which could then be used to do useful work such as pushing a piston. This is a classic example of converting information (what the gas molecules are doing) into work. But of course that kind of phenomenon is much more widespread — it happens any time a company or organization hires someone in order to take advantage of their know-how. César Hidalgo has become an expert in this relationship between information and work, both at the level of physics and how it bubbles up into economies and societies. Looking at the world through the lens of information brings new insights into how we learn things, how economies are structured, and how novel uses of data will transform how we live.
César Hidalgo received his Ph.D. in physics from the University of Notre Dame. He currently holds an ANITI Chair at the University of Toulouse, an Honorary Professorship at the University of Manchester, and a Visiting Professorship at Harvard’s School of Engineering and Applied Sciences. From 2010 to 2019, he led MIT’s Collective Learning group. He is the author of Why Information Grows and co-author of The Atlas of Economic Complexity. He is a co-founder of Datawheel, a data visualization company whose products include the Observatory of Economic Complexity.
I was also piqued at the mention of Lynne Kelly’s work, which I’m now knee-deep into. I suspect it could dramatically expand on what we think of as the capacity of a personbyte, though the limit of knowledge there still exists. The idea of mnemotechniques within indigenous cultures certainly expands on the way knowledge worked in prehistory and what we classically think of and frame as collective knowledge or collective learning.
I also think there are some interesting connections between Dr. Kelly’s mentions of social equity in prehistoric cultures and the work that Hidalgo mentions in the middle of the episode.
There are a small handful of references I’ll want to delve into after hearing this, though it may take time to pull them up unless they’re linked in the show notes.
hat-tip: Complexity Digest for the reminder that this is in my podcatcher. 🔖 November 22, 2019 at 03:28PM
Hard as he tried, Murray Gell-Mann could never make himself into a legend like his rakish colleague and collaborator, Richard Feynman -- even if he was probably the greater physicist.