I’ve long been a student of the humanities (and particularly the classics) and have recently begun reviewing my very old and decrepit knowledge of Latin. It’s been two decades since I made a significant study of classical languages, and lately (as the result of conversations with friends like Dave Harris, Jim Houser, Larry Richardson, and John Kountouris) I’ve been drawn back to them in order to read a variety of classical texts in their original languages. Fortunately, in the intervening years, quite a lot has changed in the tools relating to pedagogy for language acquisition.
The biggest change in the intervening time is the spread of the internet, which supplies a broad variety of related websites offering not only interesting resources for basic reading and writing, but even audio sources, apparently including the nightly news in Latin. There are a variety of blogs on Latin as well as online courseware, podcasts, pronunciation recordings, and even free textbooks. I’ve written briefly about the RapGenius platform before, but I feel compelled to mention it as a potentially powerful resource for annotating classical texts (Julius Caesar, Seneca, Ovid, Cicero, et al.) as well. There is a paucity of these sources in a general sense in comparison with other modern languages, but given the size of the niche, there is quite a lot out there, and certainly a mountain in comparison to what existed only twenty years ago.
There has also been a spread of pedagogic aids like flashcard software, including Anki and Mnemosyne, with desktop, web-based, and even mobile versions making learning available in almost any situation. The psychology and learning research behind these types of technologies has come a long way toward helping students make the best use of their time in learning and retaining material in long-term memory. Simple mobile applications like Duolingo exist for a variety of languages, though one doesn’t exist for classical Latin (yet).
The other great change is the advancement of the digital humanities, which allows for a lot of interesting applications of knowledge acquisition. One in particular that I ran across this week was the Dickinson College Commentaries (DCC). Specifically, a handful of scholars have compiled and documented a list of the most common core vocabulary words in Latin (and in Greek) based on their frequency of appearance in extant works. This very specific data is of interest to me in relation to my work in information theory, but it also becomes a tremendously handy tool when attempting to learn and master a language. It is a truly impressive fact that memorizing and mastering about 250 words in Latin allows one to read and understand 50% of most written Latin; further, knowledge of 1,500 Latin words will put one at the 80% level of vocabulary mastery for most texts. Mastering even a very small list of vocabulary allows one to read a large variety of texts very comfortably. I can only think of the old concept of a concordance (which was generally limited to heavily studied texts like the Bible or possibly Shakespeare) which has now been put on some serious steroids for entire cultures. Another half step and one arrives at the Google Ngram Viewer.
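These coverage figures fall directly out of a word-frequency list: sort the lemmas by corpus frequency and accumulate their counts, and you can see what fraction of running text the top N words account for. A minimal sketch in Python, using invented counts purely for illustration (the DCC’s actual, carefully lemmatized frequency data is the real research product):

```python
# Toy illustration of cumulative text coverage from a word-frequency list.
# These counts are invented for the example; the DCC list is built from
# real corpus frequencies of lemmatized forms.
freq = {"et": 19000, "qui": 15000, "sum": 14000, "in": 11000,
        "is": 9000, "non": 8000, "hic": 7000, "dico": 5000}

def coverage(freq, top_n):
    """Fraction of running text covered by the top_n most frequent words."""
    counts = sorted(freq.values(), reverse=True)
    return sum(counts[:top_n]) / sum(counts)

for n in (1, 4, 8):
    print(f"top {n}: {coverage(freq, n):.1%}")
```

With a real frequency list in place of the toy dictionary, the same few lines reproduce the “250 words cover 50%” style of statistic.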
The best part is that one can, with very little technical knowledge, easily download the DCC Core Latin Vocabulary (itself a huge research undertaking) and upload and share it through the Anki platform, for example, to benefit a fairly large community of other scholars, learners, and teachers. With a variety of easy-to-use tools, it may shortly become that much easier to learn a language like Latin – potentially to the point that it is no longer a dead language. For those interested, you can find my version of the shared DCC Core Latin Vocabulary for Anki online; the DCC’s Chris Francese has posted details and a version for Mnemosyne already.
[Editor’s note: Anki’s web service occasionally clears decks of cards from their servers, so if you find that the Anki link to the DCC Core Latin is not working, please leave a comment below, and we’ll re-upload the deck for shared use.]
What tools and tricks do you use for language study and pedagogy?
I’ve been a proponent and user of a variety of mnemonic systems since I was about eleven years old. The two biggest and most useful in my mind are commonly known as the “method of loci” and the “major system.” The major system is also variously known as the phonetic number system, the phonetic mnemonic system, or Hérigone’s mnemonic system after French mathematician and astronomer Pierre Hérigone (1580-1643), who is thought to have originated its use.
The major system generally works by converting numbers into consonant sounds and then into words by adding vowels, under the overarching principle that images (of the words) can be remembered more easily than the numbers themselves. For instance, one could memorize a grocery list of a hundred items by associating each item with the word for its position on the list. If item 22 is lemons, one translates the number 22 as “nun” within the major system and then pictures a nun with lemons – perhaps a nun in full habit taking a bath in lemons, to make the image stick in one’s memory better. Then at the grocery store, on arriving at number 22 on the list, one automatically translates 22 back to “nun,” which almost immediately conjures the image of a nun taking a bath in lemons, yielding the item that needed to be remembered. This comes in handy particularly when one needs to remember large lists of items in and out of order.
The following generalized chart, which can be found in a host of books and websites on the topic, is fairly canonical for the overall system:
Numeral | Associated consonant sounds | Mnemonic for remembering the numeral and consonant relationship
0 | s, z, soft c | “z” is the first letter of zero; the other letters have a similar sound
1 | t, d | t & d have one downstroke and sound similar (some variant systems include “th”)
2 | n | n has two downstrokes
3 | m | m has three downstrokes; m looks like a “3” on its side
4 | r | last letter of four; 4 and R are almost mirror images of each other
5 | l | L is the Roman numeral for 50
6 | j, sh, soft g, soft “ch” (/ʃ/ /ʒ/ /tʃ/ /dʒ/) | a script j has a lower loop; g is almost a 6 rotated
7 | k, hard c, hard g, hard “ch”, q, qu | capital K “contains” two sevens (some variant systems include “ng”)
8 | f, v | script f resembles a figure-8; v sounds similar (v is a voiced f)
9 | p, b | p is a mirror-image 9; b sounds similar and resembles a 9 rolled around
– | vowel sounds, w, h, y | w and h are considered half-vowels; these can be used anywhere without changing a word’s number value
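The chart above lends itself to a direct implementation. A minimal sketch in Python – note that a faithful version of the major system maps *sounds* rather than spellings, so the letter-based lookup below is my own simplification for illustration:

```python
# Major-system sketch: read off the digit string a word encodes.
# NOTE: the real system works on consonant SOUNDS (e.g. hard vs. soft
# c/g); this letter-based table is a simplification for illustration.
DIGIT_OF = {
    's': 0, 'z': 0,           # z is the first letter of zero
    't': 1, 'd': 1,           # one downstroke
    'n': 2,                   # two downstrokes
    'm': 3,                   # three downstrokes
    'r': 4,                   # last letter of four
    'l': 5,                   # Roman numeral for 50
    'j': 6,                   # script j has a lower loop
    'k': 7, 'q': 7, 'g': 7, 'c': 7,  # hard sounds assumed here
    'f': 8, 'v': 8,           # script f resembles a figure-8
    'p': 9, 'b': 9,           # p is a mirror-image 9
    # vowels, w, h, y carry no number value
}

def word_to_number(word):
    """Return the digit string a word encodes, e.g. 'nun' -> '22'."""
    return ''.join(str(DIGIT_OF[ch]) for ch in word.lower() if ch in DIGIT_OF)

print(word_to_number("nun"))    # the grocery-list example above
print(word_to_number("lemon"))
```

Going the other direction (number to candidate peg words) is just a dictionary search for words whose encoding matches the target digits.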
There are a variety of ways to use the major system as a code in addition to its uses in mnemonic settings. When I was a youth, I used it to write coded messages and to encrypt a variety of things for personal use. After I had originally read Dr. Bruno Furst’s series of booklets entitled You Can Remember: A Home Study Course in Memory and Concentration,1 I had always wanted to spend some time creating an alternate method of writing using the system. Sadly I never made the time for the project, but yesterday I made a very interesting discovery that, to my knowledge, doesn’t seem to have been previously noticed!
My discovery began last week when I read an article in The Atlantic by journalist Dennis Hollier entitled How to Write 225 Words Per Minute with a Pen: A Lesson in the Lost Technology of Shorthand.2 In the article, which starts off with a mention of the Livescribe pen (one of my favorite tools), Mr. Hollier outlines the use of the Gregg System of Shorthand, which was invented by John Robert Gregg in 1888. The description of the method was intriguing enough that I read a dozen additional general articles on shorthand on the internet and purchased a copy of Louis A. Leslie’s two-volume text Gregg Shorthand: Functional Method.3
I was shocked, on page x of the front matter, just before the first page of the text, to find the following “Alphabet of Gregg Shorthand”:
Gregg Shorthand uses EXACTLY the same consonant-type breakdown of the alphabet as the major system!
Apparently I wasn’t the first to have the idea to turn the major system into a system of writing. The fact that the consonant breakdowns for the major system coincide almost directly with those for the shorthand method used by Gregg cannot be a coincidence!
The Gregg system works incredibly well precisely because the major system works so well. The biggest difference between the two systems is that Gregg utilizes a series of strokes (circles and semicircles) to indicate particular vowel sounds, allowing better differentiation of words – something the major system doesn’t generally take into consideration. From an information-theoretic standpoint, this is almost required to make the coding from one alphabet to the other possible, but much like ancient Hebrew, leaving out the vowels doesn’t remove that much information. Gregg, also like Hebrew, uses dots and dashes above or below certain letters to indicate the precise sound of many of its vowels.
The upside of all of this is that the major system is incredibly easy to learn and use, and from there, learning Gregg shorthand is just a hop, skip, and a jump – heck, it’s really only just a hop because the underlying structure is so similar. Naturally, as with the major system, one must commit some time to practicing it to improve speed and accuracy, but the general learning of the system is incredibly straightforward.
Because the associations between the two systems are so similar, I wasn’t too surprised to find that some of the descriptions of why certain strokes were used for certain letters were very similar to the mnemonics for why certain letters were used for certain numbers in the major system.
One thing I have noticed in my studies on these topics is the occasional reference to the letter combinations “NG” and “NK,” and I’m curious why these are singled out in some of these systems. I have a strong suspicion that their inclusion or exclusion in various incarnations of their respective systems may be helpful in dating the evolution of these systems over time.
I’m aware that various versions of shorthand have appeared over the centuries, the first recorded having been the “Tironian Notes” of Marcus Tullius Tiro (103-4 BCE), who apparently used his system to write down the speeches of his master Cicero. I’m now much more curious about the point at which the concepts of shorthand and the major system crossed paths or converged. My assumption would be that it happened in the late Renaissance, but it would be nice to have the underlying references and support for such a timeline. Perhaps it was with Timothy Bright’s publication of Characterie; An Arte of Shorte, Swifte and Secrete Writing by Character (1588),4 John Willis’s Art of Stenography (1602),5 Edmond Willis’s An abbreviation of writing by character (1618),6 or Thomas Shelton’s Short Writing (1626)?7 Shelton’s system was certainly very popular and well known because it was used by both Samuel Pepys and Sir Isaac Newton.
Certainly some in-depth research will tell, though if anyone has ideas, please don’t hesitate to share them in the comments.
UPDATE on 7/6/14:
I’m adding a new chart making the correspondence between the major system and Gregg Shorthand more explicit.
Furst B. You Can Remember: A Home Study Course in Memory and Concentration. Markus-Campbell Co.; 1965.
Click the link, read the stuff and scroll down to “CLICK HERE” to apply. The deadline is 12 November 2014.
Financial support for travel, meals, and lodging is available for workshop attendees who need it. We will choose among the applicants and invite 10-15 of them.
Information theory and entropy methods are becoming powerful tools in biology, from the level of individual cells, to whole ecosystems, to experimental design, model-building, and the measurement of biodiversity. The aim of this investigative workshop is to synthesize different ways of applying these concepts to help systematize and unify work in biological systems. Early attempts at “grand syntheses” often misfired, but applications of information theory and entropy to specific highly focused topics in biology have been increasingly successful. In ecology, entropy maximization methods have proven successful in predicting the distribution and abundance of species. Entropy is also widely used as a measure of biodiversity. Work on the role of information in game theory has shed new light on evolution. As a population evolves, it can be seen as gaining information about its environment. The principle of maximum entropy production has emerged as a fascinating yet controversial approach to predicting the behavior of biological systems, from individual organisms to whole ecosystems. This investigative workshop will bring together top researchers from these diverse fields to share insights and methods and address some long-standing conceptual problems.
So, here are the goals of our workshop:
To study the validity of the principle of Maximum Entropy Production (MEP), which states that biological systems – and indeed all open, non-equilibrium systems – act to produce entropy at the maximum rate.
To familiarize all the participants with applications to ecology of the MaxEnt method: choosing the probabilistic hypothesis with the highest entropy subject to the constraints of our data. We will compare MaxEnt with competing approaches and examine whether MaxEnt provides a sufficient justification for the principle of MEP.
To clarify relations between known characterizations of entropy, the use of entropy as a measure of biodiversity, and the use of MaxEnt methods in ecology.
To develop the concept of evolutionary games as “learning” processes in which information is gained over time.
To study the interplay between information theory and the thermodynamics of individual cells and organelles.
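The MaxEnt method mentioned in these goals – choosing the probabilistic hypothesis with the highest entropy subject to the constraints of one’s data – can be sketched very compactly. The following Python example uses Jaynes’s classic dice problem (a die constrained to have mean 4.5) rather than ecological data; the bisection solver is my own simple choice, not anything prescribed by the workshop:

```python
import math

# MaxEnt sketch: among all distributions on die faces 1..6 with a
# prescribed mean, find the one of maximum entropy.  The solution has
# the Gibbs form p_i ∝ exp(lam * x_i); since the constrained mean is
# monotone increasing in lam, we can solve for lam by bisection.
def maxent_dist(values, target_mean, lo=-50.0, hi=50.0, tol=1e-12):
    values = list(values)

    def dist(lam):
        w = [math.exp(lam * v) for v in values]
        s = sum(w)
        return [x / s for x in w]

    def mean(lam):
        return sum(v * p for v, p in zip(values, dist(lam)))

    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    return dist((lo + hi) / 2)

p = maxent_dist(range(1, 7), 4.5)
entropy = -sum(q * math.log(q) for q in p)
print([round(q, 4) for q in p], round(entropy, 4))
```

With the mean pushed above the uniform value of 3.5, the resulting probabilities increase monotonically toward the high faces – exactly the kind of constrained-but-least-committed hypothesis the MaxEnt principle is after, and the same machinery that underlies species-abundance predictions in ecology.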
Stephen Greenblatt provides an interesting synthesis of history and philosophy, and his love of the humanities certainly shines through. The book stands as an almost over-exciting commercial, not only for reading Lucretius’s “De Rerum Natura” (“On the Nature of Things”), but for motivating the reader to actually go out and learn Latin to appreciate it properly.
I would have loved more direct analysis and evidence of the immediate impact of Lucretius in the 1400s, as well as a longer in-depth analysis of the continuing impact through the 1700s.
The first half of the book is excellent at painting a vivid portrait of the life and times of Poggio Bracciolini which one doesn’t commonly encounter. I’m almost reminded of Stacy Schiff’s Cleopatra: A Life, though Greenblatt has far more historical material with which to paint the picture. I may also be biased in that I’m more interested in the mechanics of the scholarship behind the resurgence of the classics in the Renaissance than in that particular political portion of the first century BCE. Though my background on the history of the time periods involved is reasonably advanced, I fear that Greenblatt may be leaving out a tad too much for the broader reading public, who may not be so well versed. The fact that he does bring so many clear specifics to the forefront may more than compensate for this, however.
Greenblatt includes lots of interesting tidbits and some great history. I wish it had continued on longer… I’d love to have the spare time to lose myself in the extensive bibliography. Though the footnotes, bibliography, and index account for about 40% of the book, the average reader should take a reasonable look at the quarter or so of the footnotes, which add some interesting additional background and subtleties to the text as well as to some of the translations that are discussed therein.
I am definitely very interested in the science behind textual preservation, which is presented as the underlying motivation for the action in this book. I wish that Greenblatt had covered some of these aspects in the same vivid detail he exhibited for other portions of the story. Perhaps summarizing some more of the relevant scholarship involved in transmitting and restoring old texts, as presented in Bart Ehrman and Bruce Metzger’s The Text of the New Testament: Its Transmission, Corruption & Restoration, would have been a welcome addition given the audience of the book. It might also have given a more nuanced picture of the character of the Church and its predicament as presented in the text.
Though I only caught one small reference to modern-day politics (a prison statistic for America which was obscured in a footnote), I find myself wishing that Greenblatt had spent at least a few paragraphs or even a short chapter drawing direct parallels to our present-day political landscape. I understand why he didn’t broach the subject, as it would tend to date an otherwise timeless-feeling text and generally serve to dissuade a portion of his readership – in particular, the portion which most needs to read such a book. I can certainly see a strong need for another short burst of popularity for “On the Nature of Things” to push back against the anti-science and overly pro-religion climate we’re facing in American politics.
For those interested in the topic, I might suggest that this text has some flavor of Big History in its DNA. It covers not only a fairly significant chunk of recorded human history, but has some broader influential philosophical themes that underlie a potential change in the direction of history which we’ve been living for the past 300 years. There’s also an intriguing overlap of multidisciplinary studies going on in terms of the history, science, philosophy, and technology involved in the multiple time periods discussed.
My response to his post with some thoughts of my own follows:
This is an interesting and very germane review. As someone who’s both worked in the entertainment industry and followed the MOOC (massively open online courseware) revolution over the past decade, I very often consider the physical production value of TGC’s offerings and have been generally pleased at their steady improvement over time. Not only do they offer some generally excellent content, but they’re entertaining and pleasing to watch. From a multimedia perspective, I’m always amazed at what they offer and that generally the difference between the video and audio-only versions isn’t as drastic as one might otherwise expect. Though there are times that I think TGC might include some additional graphics, maps, etc., either in the course itself or in the booklets, I’m impressed that the courses still function exceptionally well without them.
Within the MOOC revolution, Sue Alcock’s Coursera course Archaeology’s Dirty Little Secrets is still by far the best-produced multimedia course I’ve come across, and it’s going to take a lot of serious effort for other courses to come up to this level of production. It’s one of the few courses which I think rivals The Teaching Company’s offerings thus far. The increased competition in the MOOC space is going to eventually encroach on the business model of TGC, and I’m curious to see how that will evolve and how it will benefit students. Will TGC be forced to offer online fora for students to interact with each other the way most MOOCs do? Will MOOCs be forced to drastically increase their production quality to the level of TGC? Will certificates or diplomas be offered for courseware? Will the subsequent models be free (like most MOOCs now), paid like TGC, or some mixture of the two?
One area in which neither platform seems to be doing very well at present is offering more advanced coursework. Naturally the primary difficulty is in having enough audience to justify the production effort. The audience for a graduate-level topology class is simply far smaller than for introductory courses in history or music appreciation, but those types of courses will eventually have to exist to make the enterprises sustainable – in addition to the fact that they add real value to society. Another difficulty is that advanced coursework usually requires significant work outside of the lecture environment – readings, homework, etc. MOOCs seem to have a slight upper hand here, while TGC has generally relied on all of the significant material being offered in a lecture, with the suggestion of reading their accompanying booklets and possibly offering supplementary bibliographies. When are we going to start seeing coursework at the upper-level undergraduate or graduate level?
The nice part is that with evolving technology and capabilities, there are potentially new pedagogic methods that will allow easier teaching of some material that may not have been possible previously. (For some brief examples, see this post I wrote last week on Latin and the digital humanities.) In particular, I’m sure many of us have been astounded and pleased at how Dr. Greenberg managed the supreme gymnastics of offering “Understanding the Fundamentals of Music” without delving into traditional music theory and written notation, but will he be able to actually offer that in new and exciting ways to increase our levels of understanding of music and then spawn off another 618 lectures that take us all further and deeper into his exciting world? Perhaps it comes in the form of a multimedia mobile app? We’re all waiting with bated breath, because regardless of how he pulls it off, we know it’s going to be educational, entertaining, and truly awe-inspiring.
Following my commentary, Scott Ableman, the Chief Marketing Officer for TGC, responded with the following, which I find very interesting:
I recently ran across this TED talk and felt compelled to share it. It really highlights some of my own personal thoughts on how science should be taught and done in the modern world. It also overlaps much of the reading I’ve been doing lately on innovation and creativity. If these don’t get you to watch, then perhaps mentioning that Alon manages to apply comedy and improvisation techniques to science will.
Uri Alon was already one of my scientific heroes, but this adds a lovely garnish.