To make sense of President Donald Trump's first year in the White House, many have come to rely on Maggie Haberman. The powerhouse reporter for the New York Times talks with Atlantic editor in chief Jeffrey Goldberg about how her career covering New York City politics for the tabloids has given her a unique view of Trump. To Haberman, Trump's brashness and need for approval are partly products of his distinct experience of New York City.
A fascinating interview to be sure. There’s some subtlety particularly about Donald Trump that is injected here that I wouldn’t have thought about previously. I certainly don’t have more hope as a result, but I do have a lot more nuance in how he functions and interacts with others. There is some particularly fascinating discussion on language/linguistics which impinges on some of the discussion in my article Complexity isn’t a Vice: 10 Word Answers and Doubletalk in Election 2016.
Pinker talks about his then-new book, The Stuff of Thought: Language as a Window into Human Nature, doing what he does best: combining psychology and neuroscience with linguistics. The result is as entertaining as it is insightful.
One of America’s foremost philosophers offers a major new account of the origins of the conscious mind.
How did we come to have minds?
For centuries, this question has intrigued psychologists, physicists, poets, and philosophers, who have wondered how the human mind developed its unrivaled ability to create, imagine, and explain. Disciples of Darwin have long aspired to explain how consciousness, language, and culture could have appeared through natural selection, blazing promising trails that tend, however, to end in confusion and controversy. Even though our understanding of the inner workings of proteins, neurons, and DNA is deeper than ever before, the matter of how our minds came to be has largely remained a mystery.
That is now changing, says Daniel C. Dennett. In From Bacteria to Bach and Back, his most comprehensive exploration of evolutionary thinking yet, he builds on ideas from computer science and biology to show how a comprehending mind could in fact have arisen from a mindless process of natural selection. Part philosophical whodunit, part bold scientific conjecture, this landmark work enlarges themes that have sustained Dennett’s legendary career at the forefront of philosophical thought.
In his inimitable style―laced with wit and arresting thought experiments―Dennett explains that a crucial shift occurred when humans developed the ability to share memes, or ways of doing things not based in genetic instinct. Language, itself composed of memes, turbocharged this interplay. Competition among memes―a form of natural selection―produced thinking tools so well-designed that they gave us the power to design our own memes. The result, a mind that not only perceives and controls but can create and comprehend, was thus largely shaped by the process of cultural evolution.
An agenda-setting book for a new generation of philosophers, scientists, and thinkers, From Bacteria to Bach and Back will delight and entertain anyone eager to make sense of how the mind works and how it came about.
How Donald Trump is leveraging an old Vaudeville trick to heavily contest the presidential election
A Problem with Transcripts
In the past few weeks, I’ve seen dozens of news outlets publish multi-paragraph excerpts of speeches from Donald Trump and have been appalled that I was unable to read them in any coherent way. In the majority of them I could not honestly follow or discern any coherent thought or argument. I was a bit shocked because, in listening to him, he often sounds like he has some kind of point, though he seems to be spouting variations on one of ten one-liners he’s been using for over a year now. There’s apparently a flaw in our primal reptilian brains that tricks us into thinking there’s some sort of substance in his speech when there honestly is none. I’m going to have to spend some time reading more on linguistics and cognitive neuroscience. Maybe Steven Pinker knows of an answer?
The situation got worse this week as I turned to news sources for fact-checking of the recent presidential debate. While it’s nice to have web-based annotation tools like Genius and Hypothes.is to mark up these debates, it becomes another thing altogether to understand the meaning of what’s being said in order to actually attempt to annotate it. I’ve included some links so that readers can attempt the exercise for themselves.
Recent transcripts (some with highlights/annotations):
It’s been a while since Americans were broadly exposed to actual doubletalk. For the most part our national experience with it has been a passing curiosity highlighted by comedians.
n. (NORTH AMERICAN)
a deliberately unintelligible form of speech in which inappropriate, invented or nonsense syllables are combined with actual words. This type of speech is commonly used to give the appearance of knowledge and thereby confuse, amuse, or entertain the speaker’s audience.
another term for doublespeak
see also n. doubletalk 
Since the days of vaudeville (and likely before), comedians have used doubletalk to great effect on stage, in film, and on television. Some comedians who have historically used the technique as part of their acts include Al Kelly, Cliff Nazarro, Danny Kaye, Gary Owens, Irwin Corey, Jackie Gleason, Sid Caesar, Stanley Unwin, and Reggie Watts. I’m including some short video clips below as examples.
A well-known, but foreshortened, form of it was used by Dana Carvey in his Saturday Night Live performances caricaturing George H.W. Bush with a few standard catchphrases surrounded by pablum: “Not gonna do it…”, “Wouldn’t be prudent at this juncture”, and “Thousand Points of Light…”. These snippets, in combination with some creative hand gestures (pointing, lacing fingers together) and a voice melding Mr. Rogers and John Wayne, were the simple constructs that largely transformed a diminutive comedian convincingly into a president.
Doubletalk also has a more “educated” sibling known as technobabble. Engineers are sure to recall a famous (and still very humorous) example of both doubletalk and technobabble in the famed description of the Turboencabulator. (See also, the short videos below.)
Doubletalk comedy examples
Al Kelly on Ernie Kovacs
Rockwell Turbo Encabulator Version 2
And of course doubletalk and technobabble have closely related cousins named doublespeak and politicobabble. These are far more dangerous than the others because they cross the line from comedy into seriousness and are used by people who make decisions affecting hundreds of thousands to millions, if not billions, of people on the planet. I’m sure an archeo-linguist might be able to discern where exactly politicobabble emerged and managed to evolve into a non-comedic form of speech that people take far more seriously than its comedic cousins. One surely suspects some heavy influence from George Orwell’s corpus of work:
While politicobabble is nothing new, I did find a very elucidating passage from the 1992 U.S. Presidential Election cycle which seems to be a major part of the Trump campaign playbook:
In the continuation of the article, Jacobs goes on to give a variety of examples of the term as well as a “translation” guide for some of the common politicobabble words from that particular election. I’ll leave it to the capable hands of others (perhaps in the comments, below?) to come up with the translation guide for our current political climate.
The interesting evolutionary change I’ll note for the current election cycle is that Trump hasn’t delved deeply enough into any of his themes to offend anyone significantly. This has allowed him to stay with the dozen or so themes he started out using, so he hasn’t needed to change them as candidates in campaigns of old did.
Filling in the Blanks
These forms of pseudo-speech are all meant to fool us into thinking that something of substance is being discussed and that a conversation is happening, when in fact nothing is really being communicated at all. Most of the intended meaning of, and reaction to, such speech seems to stem from the demeanor of the speaker as well as, in part, from the reaction of the surrounding interlocutors and audience. In reading Donald Trump transcripts, an entirely different meaning (or lack thereof) is more quickly realized because the surrounding elements which prop up the narrative have been completely stripped away. In the transcript version, gone is the hypnotizing element of the crowd, which is vehemently sure that the emperor is truly wearing clothes.
In many of these transcripts, in fact, I find so little is being said that the listener is actually being forced to piece together the larger story in their head. Being forced to fill in the blanks in this way leaves too much of the communication up to the listener who isn’t necessarily engaged at a high level. Without more detail or context to understand what is being communicated, the listener is far more likely to fill in the blanks to fit a story that doesn’t create any cognitive dissonance for themselves — in part because Trump is usually smiling and welcoming towards his adoring audiences.
One will surely recall that Trump even wanted Secretary Clinton to be happy during the debate when he said, “Now, in all fairness to Secretary Clinton — yes, is that OK? Good. I want you to be very happy. It’s very important to me.” (This question also doubles as an example of a standard psychological sales tactic of attempting to get the purchaser to start by saying ‘yes’ as a means to keep them saying yes while moving them towards making a purchase.)
His method of communicating by leaving large holes in his meaning reminds me of the way our brain smooths out information as indicated in this old internet meme:
I cdn’uolt blveiee taht I cluod aulaclty uesdnatnrd waht I was rdanieg: the phaonmneel pweor of the hmuan mnid. Aoccdrnig to a rseearch taem at Cmabrigde Uinervtisy, it deosn’t mttaer in waht oredr the ltteers in a wrod are, the olny iprmoatnt tihng is taht the frist and lsat ltteer be in the rghit pclae. The rset can be a taotl mses and you can sitll raed it wouthit a porbelm. Tihs is bcuseae the huamn mnid deos not raed ervey lteter by istlef, but the wrod as a wlohe. Scuh a cdonition is arpppoiatrely cllaed typoglycemia.
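Fittingly, the meme’s trick is easy to reproduce. Here is a toy Python sketch (not from any cited research) that shuffles each word’s interior letters while pinning the first and last letters in place:

```python
import random

def scramble_word(word: str) -> str:
    """Shuffle a word's interior letters, keeping the first and last fixed."""
    if len(word) <= 3:
        return word  # nothing to shuffle
    interior = list(word[1:-1])
    random.shuffle(interior)
    return word[0] + "".join(interior) + word[-1]

def scramble_text(text: str) -> str:
    # Naive whitespace tokenization; punctuation is treated as part of a word.
    return " ".join(scramble_word(w) for w in text.split())

print(scramble_text("According to a research team at Cambridge University"))
```

Short words survive untouched, which is part of why the meme stays readable; longer, rarer words scramble far less gracefully. (The attribution to a Cambridge research team is, fittingly, widely reported to be apocryphal.)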
I’m also reminded of the biases and heuristics research carried out in part (and the remainder surveyed) by Daniel Kahneman in his book Thinking, Fast and Slow, in which he discusses the mechanics of how System 1 and System 2 work in our brains. Is Trump taking advantage of deficits in our brains’ language processing, akin to System 1 biases, to win large blocks of votes? Is he creating a virtual real-time Choose-Your-Own-Adventure to subvert the laziness of the electorate? Kahneman would suggest that the combination of what Trump does say and what he doesn’t leaves it up to every individual listener to create their own story. Their System 1 will default to the easiest and most palatable one available: a happy story that fits their own worldview and is likely to encourage them to support Trump.
Ten Word Answers
As an information theorist, I know all too well that there must be a ‘linguistic Shannon limit’ to the amount of semantic meaning one can compress into a single word.  One is ultimately forced to attempt to form sentences to convey more meaning. But usually the less politicians say, the less trouble they can get into — a lesson hard won through generations of political fighting.
I’m reminded of a scene from the television series The West Wing. Season 4, episode 6 (“Game On,” which aired on October 30, 2002 on NBC) had a poignant moment (video clip below) that is germane to our subject:
Moderator: Governor Ritchie, many economists have stated that the tax cut, which is the centerpiece of your economic agenda, could actually harm the economy. Is now really the time to cut taxes?

Governor Ritchie (R-FL): You bet it is. We need to cut taxes for one reason: the American people know how to spend their money better than the federal government does.

Moderator: Mr. President, your rebuttal.

President Bartlet: There it is…
That’s the 10 word answer my staff’s been looking for for 2 weeks. There it is.
10 word answers can kill you in political campaigns — they’re the tip of the sword.
Here’s my question: What are the next 10 words of your answer?
“Your taxes are too high?” So are mine…
Give me the next 10 words: How are we going to do it?
Give me 10 after that — I’ll drop out of the race right now.
Every once in a while — every once in a while, there’s a day with an absolute right and an absolute wrong, but those days almost always include body counts. Other than that there aren’t very many un-nuanced moments in leading a country that’s way too big for 10 words.
I’m the President of the United States, not the president of the people who agree with me. And by the way, if the left has a problem with that, they should vote for somebody else.
As someone who studies information theory and complexity theory and even delves into sub-topics like complexity and economics, I can agree wholeheartedly with the sentiment. Though again, here I can also see the massive gaps between system 1 and 2 that force us to want to simplify things down to such a base level that we don’t have to do the work to puzzle them out.
(And yes, that is Jennifer Aniston’s father playing the moderator.)
One can’t help but wonder why Mr. Trump doesn’t seem to have ever gone past the first ten words. Is it because he isn’t capable? Isn’t interested? Or does he instinctively know better? It would seem that he’s been doing business by using the uncertainty inherent in his speech for decades, always operating on what he meant (or thought he wanted to mean) rather than what the other party heard and thought they understood. If it ain’t broke, don’t fix it.
Idiocracy or Something Worse?
In our increasingly specialized world, people eventually have to give in and quit doing some tasks that everyone used to do for themselves. Yesterday I saw a lifeworn woman in her 70s pushing a wheeled wire basket with a 5-gallon container of water from the store to her home. As she shuffled along, I contemplated Thracians of the fourth century BCE doing the same thing, except they likely carried amphorae, possibly with a yoke, and without the benefit of the $10 manufactured shopping cart. Twenty thousand years before that, people were still carrying their own water, but possibly without even the benefit of earthenware containers. Things in human history have changed very slowly for the most part, but as we continually sub-specialize further and further, we need to remember that we can’t give up one of the primary functions that makes us human: the ability to think deeply and analytically for ourselves.
I suspect that far too many people are too wrapped up in their own lives and problems to listen to more than the ten word answers our politicians are advertising to us. We need to remember to ask for the next ten words and the ten after that.
Otherwise there are two extreme possible outcomes: we’re either at the beginning of what Mike Judge would term Idiocracy, or we’re headed somewhere worse.
Here, one is tempted to quote George Santayana’s famous line (from The Life of Reason, 1905), “Those who cannot remember the past are condemned to repeat it.” However, I far prefer the following as more apropos to our present national situation:
If Cliff Nazarro comes back to run for president, I hope no one falls for his joke just because he wasn’t laughing as he acted it out. If his instructions for fixing the wagon (America) are any indication, the voters who are listening and making the repairs will be in severe pain.
Advances in computing power, natural language processing, and digitization of text now make it possible to study our a culture's evolution through its texts using a "big data" lens. Our ability to communicate relies in part upon a shared emotional experience, with stories often following distinct emotional trajectories, forming patterns that are meaningful to us. Here, by classifying the emotional arcs for a filtered subset of 1,737 stories from Project Gutenberg's fiction collection, we find a set of six core trajectories which form the building blocks of complex narratives. We strengthen our findings by separately applying optimization, linear decomposition, supervised learning, and unsupervised learning. For each of these six core emotional arcs, we examine the closest characteristic stories in publication today and find that particular emotional arcs enjoy greater success, as measured by downloads.
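The arc-extraction step can be sketched as a sliding-window average of per-word valence scores. The tiny lexicon and window size below are invented for illustration; the study itself used the labMT happiness lexicon and much larger windows over each book:

```python
# Toy valence lexicon (1 = saddest, 9 = happiest); invented for illustration.
TOY_VALENCE = {"love": 8.5, "happy": 8.0, "home": 7.0, "war": 2.0, "death": 1.5}

def emotional_arc(words, window=3, neutral=5.0):
    """Average valence over each sliding window; unknown words score neutral."""
    scores = [TOY_VALENCE.get(w.lower(), neutral) for w in words]
    return [sum(scores[i:i + window]) / window
            for i in range(len(scores) - window + 1)]

words = "they found love at home until war and death came".split()
arc = emotional_arc(words)
print(arc)  # rises above neutral early, falls below it at the end
```

The paper’s contribution is then to decompose thousands of such arcs (via techniques like SVD and clustering) into a small set of recurring shapes.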
Big History may indicate why we're losing diversity in the number of languages on Earth.
Yesterday, I saw an interesting linguistic exercise:
I have to imagine that once the conceptualization of language and some basic grammar existed, word generation was a much more common thing than it is now. It’s only been since the time of Noah Webster that humans have been actively standardizing things like spelling. If we can use Papua New Guinea as a model of pre-agrarian society and consider that almost 12% of extant languages on the Earth are spoken in an area about the size of Texas (and with about 1/5th the population of Texas too), then modern societies are actually severely limiting language (creation, growth, diversity, creativity, etc.) [cross reference: A World of Languages – and How Many Speak Them (Infographic)]
Consider that languages are currently going extinct at a rate of about one every 14 weeks, which puts us on course to lose about half of the 7,100 languages on the planet before the end of the century. Collective learning has potentially been growing at the expense of a shrinking body of diverse language! In the paper “Global distribution and drivers of language extinction risk” the authors indicate that, of all the variables tested, economic growth was most strongly linked to language loss.
To help put this exercise into perspective, we can look at the corpus of extant written Latin (a technically dead language):
These numbers become even smaller when considering ancient Greek texts.
Another interesting measurement is the vocabulary of a modern 2-year-old, who typically has a 50–75 word vocabulary, while a 4-year-old has 250–500 words, which is about the level of the exercise.
As a contrast, consider the message in this TED Youth Talk from last year by Erin McKean, which students should be able to relate to:
And of course, there’s the dog Chaser, which 60 Minutes recently reported has a vocabulary of over 1,000 words. (Are we now destroying variants of “dog language” for English too?!)
Hopefully the evolutionary value of the loss of the multiple languages will be more than balanced out by the power of collective learning in the long run.
An infographic from the South China Morning Post has some interesting statistics about which many modern people don’t know (or remember). It’s very interesting to see the distribution of languages and where they’re spoken. Of particular note that most will miss, even from this infographic, is that 839 languages are spoken in Papua New Guinea (11.8% of all known languages on Earth). Given the effects of history and modernity, imagine how many languages there might have been without them.
I’ve been a proponent and user of a variety of mnemonic systems since I was about eleven years old. The two biggest and most useful, in my mind, are commonly known as the “method of loci” and the “major system.” The major system is also variously known as the phonetic number system, the phonetic mnemonic system, or Herigone’s mnemonic system, after the French mathematician and astronomer Pierre Hérigone (1580-1643), who is thought to have originated its use.
The major system generally works by converting numbers into consonant sounds and then into words by adding vowels, under the overarching principle that images (of the words) can be remembered more easily than the numbers themselves. For instance, one could memorize a grocery list of a hundred items by associating each item with the word for its position on the list. If item 22 is lemons, one could translate the number 22 as “nun” within the major system and then picture a nun with lemons: perhaps a nun in full habit taking a bath in lemons, to make the image stick in one’s memory better. Then at the grocery store, arriving at number 22 on the list, one automatically translates 22 to “nun,” which almost immediately conjures the image of the nun bathing in lemons, recalling the item that needed to be remembered. This comes in handy particularly when one needs to remember large lists of items in and out of order.
The following generalized chart, which can be found in a host of books and websites on the topic, is fairly canonical for the overall system:
Numeral: associated consonants (mnemonic for remembering the numeral and consonant relationship)

0: s, z, soft c (“z” is the first letter of zero; the other letters have a similar sound)
1: t, d (t and d have one downstroke and sound similar; some variant systems include “th”)
2: n (n has two downstrokes)
3: m (m has three downstrokes; m looks like a “3” on its side)
4: r (last letter of four; 4 and R are almost mirror images of each other)
5: l (L is the Roman numeral for 50)
6: j, sh, soft g, soft “ch” (/ʃ/, /ʒ/, /tʃ/, /dʒ/; a script j has a lower loop; g is almost a 6 rotated)
7: k, hard c, hard g, hard “ch”, q, qu (capital K “contains” two sevens; some variant systems include “ng”)
8: f, v (a script f resembles a figure-8; v sounds similar, as a voiced f)
9: p, b (p is a mirror-image 9; b sounds similar and resembles a 9 rolled around)
(no value): vowel sounds, w, h, y (w and h are considered half-vowels; these can be used anywhere without changing a word’s number value)
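To make the chart concrete, here is a minimal, letter-based Python sketch of the encoding. One big caveat: the real major system keys on consonant sounds, not spellings, so a lookup table over letters (which, for example, treats every “c” as a hard /k/) is only a rough approximation:

```python
# Letter-based approximation of the major system chart above.
# The real system is phonetic; this table ignores soft c/g, "sh", "ch", etc.
MAJOR = {0: "sz", 1: "td", 2: "n", 3: "m", 4: "r",
         5: "l", 6: "jg", 7: "kcq", 8: "fv", 9: "pb"}
LETTER_TO_DIGIT = {c: str(d) for d, letters in MAJOR.items() for c in letters}

def word_to_number(word: str) -> str:
    """Vowels plus w, h, and y carry no value and are simply skipped."""
    return "".join(LETTER_TO_DIGIT[c] for c in word.lower() if c in LETTER_TO_DIGIT)

print(word_to_number("nun"))    # "22"
print(word_to_number("lemon"))  # "532"
```

Inverting the mapping (number to candidate words) is then just a search of a dictionary for words whose consonant skeleton matches the target digit string.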
There are a variety of ways to use the major system as a code in addition to its mnemonic uses. When I was a youth, I used it to write coded messages and to encrypt a variety of things for personal use. Ever since I originally read Dr. Bruno Furst’s series of booklets entitled You Can Remember: A Home Study Course in Memory and Concentration, I had wanted to spend some time creating an alternate method of writing using the system. Sadly I never made the time for the project, but yesterday I made a very interesting discovery that, to my knowledge, doesn’t seem to have been previously noticed!
My discovery began last week when I read an article in The Atlantic by journalist Dennis Hollier entitled How to Write 225 Words Per Minute with a Pen: A Lesson in the Lost Technology of Shorthand. In the article, which starts off with a mention of the Livescribe pen (one of my favorite tools), Mr. Hollier outlines the use of the Gregg system of shorthand, which was invented by John Robert Gregg in 1888. The description of the method was intriguing enough that I read a dozen additional general articles on shorthand on the internet and purchased a copy of Louis A. Leslie’s two-volume text Gregg Shorthand: Functional Method.
I was shocked, on page x of the front matter, just before the first page of the text, to find the following “Alphabet of Gregg Shorthand”:
Gregg Shorthand uses EXACTLY the same consonant-type breakdown of the alphabet as the major system!
Apparently I wasn’t the first to have the idea to turn the major system into a system of writing. The fact that the consonant breakdowns for the major system coincide almost directly to those for the shorthand method used by Gregg cannot be a coincidence!
The Gregg system works incredibly well precisely because the major system works so well. The biggest difference between the two systems is that Gregg utilizes a series of strokes (circles and semicircles) to indicate particular vowel sounds, which allows for better differentiation of words, something the major system doesn’t generally take into consideration. From an information-theoretic standpoint, this is almost required to make the coding from one alphabet to the other possible, but much as in ancient Hebrew, leaving out the vowels doesn’t remove that much information. Gregg, also like Hebrew, uses dots and dashes above or below certain letters to indicate the precise sound of many of its vowels.
The upside of all of this is that the major system is incredibly easy to learn and use, and from there, learning Gregg shorthand is just a hop, skip, and a jump (heck, it’s really only a hop, because the underlying structure is so similar). Naturally, as with the major system, one must commit some time to practicing to improve speed and accuracy, but the general learning of the system is incredibly straightforward.
Because the associations between the two systems are so similar, I wasn’t too surprised to find that some of the descriptions of why certain strokes were used for certain letters were very similar to the mnemonics for why certain letters were used for certain numbers in the major system.
One thing I have noticed in my studies on these topics is the occasional reference to the letter combinations “NG” and “NK”. I’m curious why these are singled out in some of these systems. I have a strong suspicion that their inclusion or exclusion in various incarnations of their respective systems may be helpful in dating the evolution of these systems over time.
I’m aware that various versions of shorthand have appeared over the centuries, the first recorded being the “Tironian notes” of Marcus Tullius Tiro (103–4 BCE), who apparently used his system to write down the speeches of his master Cicero. I’m now much more curious at what point the concepts for shorthand and the major system crossed paths or converged. My assumption would be that it happened in the late Renaissance, but it would be nice to have the underlying references and support for such a timeline. Perhaps it was with Timothy Bright’s publication of Characterie; An Arte of Shorte, Swifte and Secrete Writing by Character (1588), John Willis’s Art of Stenography (1602), Edmond Willis’s An abbreviation of writing by character (1618), or Thomas Shelton’s Short Writing (1626)? Shelton’s system was certainly very popular and well known because it was used by both Samuel Pepys and Sir Isaac Newton.
Certainly some in-depth research will tell, though if anyone has ideas, please don’t hesitate to share them in the comments.
UPDATE on 7/6/14:
I’m adding a new chart making the correspondence between the major system and Gregg Shorthand more explicit.
B. Furst, You Can Remember: A Home Study Course in Memory and Concentration. Markus-Campbell Co., 1965.
I’ve long been a student of the humanities (and particularly the classics) and have recently begun reviewing over my very old and decrepit knowledge of Latin. It’s been two decades since I made a significant study of classical languages, and lately (as the result of conversations with friends like Dave Harris, Jim Houser, Larry Richardson, and John Kountouris) I’ve been drawn to reviewing them for reading a variety of classical texts in their original languages. Fortunately, in the intervening years, quite a lot has changed in the tools relating to pedagogy for language acquisition.
The biggest change in the intervening time is the spread of the internet, which supplies a broad variety of related websites with not only interesting resources for things like basic reading and writing, but even audio sources, apparently including the nightly news in Latin. There are a variety of blogs on Latin as well as online courseware, podcasts, pronunciation recordings, and even free textbooks. I’ve written briefly about the RapGenius platform before, but I feel compelled to mention it as a potentially powerful resource for annotating classical authors as well (Julius Caesar, Seneca, Ovid, Cicero, et al.). There is a paucity of these sources in a general sense in comparison with other modern languages, but given the size of the niche, there is quite a lot out there, and certainly a mountain in comparison to what existed only twenty years ago.
There has also been a spread of pedagogic aids like flashcard software, including Anki and Mnemosyne, with desktop, web-based, and even mobile versions making learning available in almost any situation. The psychology and learning research behind these technologies has come a long way toward helping students make the best use of their time in learning and retaining material in long-term memory. Simple mobile applications like Duolingo exist for a variety of languages, though none currently exists for classical Latin.
The other great change is the advancement of the digital humanities, which allows for a lot of interesting applications in knowledge acquisition. One particular example I ran across this week was the Dickinson College Commentaries (DCC). Specifically, a handful of scholars have compiled and documented a list of the most common core vocabulary words in Latin (and in Greek) based on their frequency of appearance in extant works. This very specific data is of interest to me in relation to my work in information theory, but it also becomes a tremendously handy tool when attempting to learn and master a language. It is truly impressive that mastering about 250 words of Latin allows one to read and understand 50% of most written Latin, and that knowledge of 1,500 Latin words puts one at the 80% level of vocabulary mastery for most texts. Mastering even a very small list of vocabulary allows one to read a large variety of texts very comfortably. I can’t help but think of the old concept of a concordance (which was generally limited to heavily studied texts like the Bible or possibly Shakespeare), now put on some serious steroids for entire cultures. Another half step and one arrives at the Google Ngram Viewer.
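Those coverage numbers fall out of the highly skewed, roughly Zipfian distribution of word frequencies. As a toy check (assuming an idealized 1/rank frequency distribution over 10,000 word types, not the DCC’s actual corpus counts), a short calculation shows the same shape:

```python
# How many of the most frequent words are needed to cover a target
# fraction of all running words, for a toy Zipf (1/rank) distribution?
def words_needed_for_coverage(freqs, target):
    total = sum(freqs)
    running = 0.0
    for rank, f in enumerate(sorted(freqs, reverse=True), start=1):
        running += f
        if running / total >= target:
            return rank
    return len(freqs)

zipf = [1.0 / k for k in range(1, 10_001)]  # 10,000 word types
print(words_needed_for_coverage(zipf, 0.50))  # on the order of dozens
print(words_needed_for_coverage(zipf, 0.80))  # on the order of a thousand-plus
```

Even this crude model lands in the same ballpark as the DCC’s empirical figures of roughly 250 words for 50% coverage and 1,500 for 80%.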
The best part is that one can, with very little technical knowledge, easily download the DCC Core Latin Vocabulary (itself a huge research undertaking) and upload and share it through the Anki platform, for example, to benefit a fairly large community of other scholars, learners, and teachers. With a variety of easy-to-use tools, shortly it may be even that much easier to learn a language like Latin – potentially to the point that it is no longer a dead language. For those interested, you can find my version of the shared DCC Core Latin Vocabulary for Anki online; the DCC’s Chris Francese has posted details and a version for Mnemosyne already.
[Editor’s note: Anki’s web service occasionally clears decks of cards from their servers, so if you find that the Anki link to the DCC Core Latin is not working, please leave a comment below, and we’ll re-upload the deck for shared use.]
What tools and tricks do you use for language study and pedagogy?
This series of 12 audio lectures is an excellent little overview of Augustine, his life, times, and philosophy. Most of the series focuses on his writings and philosophy as well as their evolution over time, often with discussion of the historical context in which they were created, as well as some useful comparing and contrasting with extant philosophies of the day (and particularly Platonism).
Early in the series there were some interesting and important re-definitions of some contemporary words. Cary pushes them back to an earlier time with slightly different meanings compared to their modern ones which certainly helps to frame the overarching philosophy presented. Without a close study of this vocabulary, many modern readers will become lost or certainly misdirected when reading modern translations. As examples, words like perverse, righteousness, and justice (or more specifically their Latin counterparts) have subtly different meanings in the late Roman empire than they do today, even in modern day religious settings.
My favorite part, however, has to have been the examples discussing mathematics as an extended metaphor for God and divinity to help to clarify some of Augustine’s thought. These were not only very useful, but very entertaining to me.
As an aside for those interested in mnemotechnic tradition, I’ll also mention that I’ve (re)discovered (see the reference to the Tell paper below) an excellent reference to the modern day “memory palace” (referenced most recently in the book Moonwalking with Einstein: The Art and Science of Remembering Everything) squirreled away in Book X of Confessions where Augustine discusses memory as:
Those interested in memes and the history of “memoria ex locis” (a phrase I don’t even find explicitly written in the original Rhetorica ad Herennium) would appreciate an additional reference I subsequently found in the opening (and somewhat poetic) paragraph of a paper written by David Tell on JSTOR. The earliest specific reference to a “memory palace” I’m aware of is Matteo Ricci’s in the 16th century, but certainly other references to the construct may have come earlier. Given that Ricci was a Jesuit priest, it’s nearly certain that he would have been familiar with Augustine’s writings, and it’s possible that his modification of Augustine’s mention brought the concept into its current use. Many will know memory as one of the major underpinnings of rhetoric (of which Augustine was a diligent student) as part of the original trivium.
Some may shy away from Augustine because of the religious overtones which go along with his work, but though there were occasional “preachy sounding” sections in the material, they were present only to clarify the philosophy.
I’d certainly recommend this series of lectures to anyone not closely familiar with Augustine’s work, as it has had a profound and continuing effect on Western philosophy, thought, and politics.
The amount of information carried in the arrangement of words is the same across all languages, even languages that aren't related to each other. This consistency could hint at a single common ancestral language, or universal features of how human brains process speech. "It doesn't matter what language or style you take," said systems biologist…
The research this article is based on is quite interesting for those doing language research.