Introduction to Galois Theory | Coursera

Bookmarked Introduction to Galois Theory by Ekaterina Amerik (Coursera)
A very beautiful classical theory on field extensions of a certain type (Galois extensions) initiated by Galois in the 19th century. Explains, in particular, why it is not possible to solve an equation of degree 5 or more in the same way as we solve quadratic or cubic equations. You will learn to compute Galois groups and (before that) study the properties of various field extensions. We first shall survey the basic notions and properties of field extensions: algebraic, transcendental, finite field extensions, degree of an extension, algebraic closure, decomposition field of a polynomial. Then we shall do a bit of commutative algebra (finite algebras over a field, base change via tensor product) and apply this to study the notion of separability in some detail. After that we shall discuss Galois extensions and Galois correspondence and give many examples (cyclotomic extensions, finite fields, Kummer extensions, Artin-Schreier extensions, etc.). We shall address the question of solvability of equations by radicals (Abel theorem). We shall also try to explain the relation to representations and to topological coverings. Finally, we shall briefly discuss extensions of rings (integral elements, norms, traces, etc.) and explain how to use the reduction modulo primes to compute Galois groups.
I’ve been watching MOOCs for several years and this is one of the few I’ve come across that covers some more advanced mathematical topics. I’m curious to see how it turns out and what type of interest/results it returns.

It’s being offered by National Research University – Higher School of Economics (HSE) in Russia.
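As a tiny taste of the "degree of an extension" material the course description mentions, here is a hedged numerical sketch (my own example, not course material) checking that √2 + √3 generates a degree-4 extension of ℚ:

```python
import math

# Sketch, not course material: a = sqrt(2) + sqrt(3) generates a degree-4
# extension of Q. Squaring gives a^2 = 5 + 2*sqrt(6), so (a^2 - 5)^2 = 24,
# i.e. a is a root of x^4 - 10x^2 + 1 -- its minimal polynomial, whose
# degree 4 equals [Q(sqrt(2), sqrt(3)) : Q].
a = math.sqrt(2) + math.sqrt(3)
residual = a**4 - 10 * a**2 + 1
print(residual)  # ~0, up to floating-point error
```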

[1609.02422] What can logic contribute to information theory?

Bookmarked [1609.02422] What can logic contribute to information theory? by David Ellerman (128.84.21.199)
Logical probability theory was developed as a quantitative measure based on Boole's logic of subsets. But information theory was developed into a mature theory by Claude Shannon with no such connection to logic. But a recent development in logic changes this situation. In category theory, the notion of a subset is dual to the notion of a quotient set or partition, and recently the logic of partitions has been developed in a parallel relationship to the Boolean logic of subsets (subset logic is usually mis-specified as the special case of propositional logic). What then is the quantitative measure based on partition logic in the same sense that logical probability theory is based on subset logic? It is a measure of information that is named "logical entropy" in view of that logical basis. This paper develops the notion of logical entropy and the basic notions of the resulting logical information theory. Then an extensive comparison is made with the corresponding notions based on Shannon entropy.
Ellerman is visiting at UC Riverside at the moment. Given the information theory and category theory overlap, I’m curious if he’s working with John Carlos Baez, or whether Baez is aware of this work.

Based on a cursory look at his website(s), I’m going to have to start following more of this work.
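To make the abstract's central object concrete, here is a minimal sketch of the comparison it describes (the function names are mine, not Ellerman's): logical entropy h(p) = 1 − Σ pᵢ², the probability that two independent draws land in different blocks of a partition, next to Shannon entropy H(p) = −Σ pᵢ log₂ pᵢ.

```python
import math

# Hedged sketch (my names, not Ellerman's code) comparing the two measures
# the abstract discusses, on a probability distribution p over partition blocks.

def logical_entropy(p):
    # Probability that two independent draws fall in *different* blocks.
    return 1.0 - sum(pi * pi for pi in p)

def shannon_entropy(p):
    # Classical Shannon entropy in bits.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
print(logical_entropy(p))  # 0.625
print(shannon_entropy(p))  # 1.5
```

Both are zero exactly when the distribution is concentrated on one block and maximal for the uniform distribution; they simply quantify "distinctions" differently.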

📖 61.0% done with Carioca Fletch (Fletch #7) by Gregory McDonald

The plot seems to have slowed down significantly since the opening, but is just finally getting moving again.

Selfie with author Henry James Korn who reveals details about his next novel

A great lunch with author @henryjameskorn. Heard about his next novel which I hope to read the first draft of this weekend

Instagram filter used: Lark

Photo taken at: Porta Via

I had lunch today with author Henry James Korn who revealed big chunks of the plot of his upcoming novel Zionista to me. I should be getting a copy of the first draft to read over the weekend, and I can’t wait. It sounds like it continues the genius of his political satire in Amerikan Krazy.

A discussion of Grit at Innovate Pasadena
Brian Nolan and Michael Ugino, co-founders of Pasadena-based Sellbrite, talk: “Grit – What it Takes to be a Successful Entrepreneur”

Instagram filter used: Moon

Photo taken at: Cross Campus Pasadena

Hector Zenil

I’ve run across some of his work before, but I ran into some new material by Hector Zenil that will likely interest those following information theory, complexity, and computer science here. I hadn’t previously noticed that he refers to himself on his website as an “information theoretic biologist” — everyone should have that as a title, shouldn’t they? As a result, I’ve also added him to the growing list of ITBio Researchers.

If you’re not following him everywhere (?) yet, start with some of the sites below (or let me know if I’ve missed anything).

Hector Zenil:

His most recent paper on arXiv:
Low Algorithmic Complexity Entropy-deceiving Graphs | .pdf

A common practice in the estimation of the complexity of objects, in particular of graphs, is to rely on graph- and information-theoretic measures. Here, using integer sequences with properties such as Borel normality, we explain how these measures are not independent of the way in which a single object, such as a graph, can be described. From descriptions that can reconstruct the same graph and are therefore essentially translations of the same description, we will see that not only is it necessary to pre-select a feature of interest where there is one when applying a computable measure such as Shannon Entropy, and to make an arbitrary selection where there is not, but that more general properties, such as the causal likeliness of a graph as a measure (opposed to randomness), can be largely misrepresented by computable measures such as Entropy and Entropy rate. We introduce recursive and non-recursive (uncomputable) graphs and graph constructions based on integer sequences, whose different lossless descriptions have disparate Entropy values, thereby enabling the study and exploration of a measure’s range of applications and demonstrating the weaknesses of computable measures of complexity.

Subjects: Information Theory (cs.IT); Computational Complexity (cs.CC); Combinatorics (math.CO)
Cite as: arXiv:1608.05972 [cs.IT] (or arXiv:1608.05972v4 [cs.IT])
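A toy illustration of the abstract's point (my own example, not the paper's construction): the same object can have lossless descriptions with very different Shannon entropy, so the measure depends on the description you happen to pick.

```python
import math
from collections import Counter

# Toy example (mine, not from the paper): an alternating bitstring looks
# maximally "random" symbol-by-symbol, yet its run-length description of
# the very same string is perfectly regular.

def empirical_entropy(seq):
    """Shannon entropy (bits/symbol) of the empirical symbol distribution."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

bits = "01" * 32               # the string "0101...01"
run_lengths = [1] * len(bits)  # its run-length description (plus a start bit)

h_bits = empirical_entropy(bits)         # 1.0 bit/symbol: looks random
h_runs = empirical_entropy(run_lengths)  # 0.0: looks perfectly ordered
```

Both descriptions reconstruct the same string, yet one scores as maximally random and the other as maximally ordered, which is the mismatch the paper exploits for graphs.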

YouTube

Yesterday he also posted two new introductory videos to his YouTube channel. There’s nothing overly technical here, but they’re nice short productions that introduce some of his work. (I wish more scientists communicated like this.) I’m hoping he’ll post them to his blog and write a bit more there in the future as well.

Universal Measures of Complexity

Relevant literature:

Reprogrammable World

Relevant literature:

Cross-boundary Behavioural Reprogrammability Reveals Evidence of Pervasive Turing Universality by Jürgen Riedel, Hector Zenil
Preprint available at http://arxiv.org/abs/1510.01671

Ed.: 9/7/16: Updated videos with links to relevant literature

Randomness And Complexity, from Leibniz To Chaitin | World Scientific Publishing

Bookmarked Randomness And Complexity, from Leibniz To Chaitin (amzn.to)
The book is a collection of papers written by a selection of eminent authors from around the world in honour of Gregory Chaitin’s 60th birthday. This is a unique volume including technical contributions, philosophical papers and essays. Hardcover: 468 pages; Publisher: World Scientific Publishing Company (October 18, 2007); ISBN: 9789812770820