As many may know or have already heard, Dr. Mike Miller, a retired mathematician from RAND and long-time math professor at UCLA, is offering a course on Introduction to Lie Groups and Lie Algebras this fall through UCLA Extension. Whether you’re a professional mathematician, engineer, physicist, physician, or even a hobbyist interested in mathematics, you’re sure to get something interesting out of this course, not to mention the camaraderie of the 20-30 other “regulars” with widely varying backgrounds (actors to surgeons, evolutionary theorists to engineers) who’ve been taking almost everything Mike has offered over the years. (And yes, he’s THAT good; we’re sure you’ll be addicted too.)
Even if it’s been years since you last took calculus or linear algebra, Mike (and the rest of the class) will help you get back up to speed quickly so you can delve into what is often otherwise a very deep subject. If you’re interested in advanced physics, quantum mechanics, quantum information, or string theory, this topic is de rigueur for delving deeply into those fields and understanding them better. It is also near and dear to the hearts of those working in robotics, graphics, 3-D modelling, gaming, and other areas that utilize multi-dimensional rotations. And naturally, it’s simply a beautiful and elegant subject for those who have no need to apply it to anything, but who just want to meander their way through higher mathematics for the fun of it (this group will comprise the majority of the class, by the way).
Whether you’ve been away from serious math for decades, use it every day, or have never gone past calculus or linear algebra, this is bound to be the most entertaining thing you can do with your Tuesday nights in the fall. If you’re not sure what you’re getting into (or are a bit scared by the course description), I highly encourage you to come and join us for at least the first class before you pass up the opportunity. I’ll mention that the great majority of new students in Mike’s classes join the ever-growing group of regulars who subsequently take almost everything he teaches. (For the reticent, I’ll mention that one of the first courses I took from Mike was Algebraic Topology, which generally requires a few semesters of abstract algebra and a semester of topology as prerequisites. I’d taken neither, but thanks to Mike’s excellent lecture style and his desire to make everything comprehensible, I was able to do exceedingly well in the course.) I’m happy to chat with anyone who may be hesitant. Also keep in mind that you can register to take the class for a grade, pass/fail, or no grade at all, to suit your needs and lifestyle.
As a group, some of us have a collection of a few dozen texts in the area which we’re happy to loan out as well. In addition to the one recommended text (Mike always gives such comprehensive notes that any text for his classes is purely supplemental at best), several of us have also found some good similar texts:
Given the breadth and diversity of the backgrounds of students in the class, I’m sure Mike will spend some reasonable time at the beginning [or later in the class, as necessary] doing a quick overview of some linear algebra and calculus related topics that will be needed later in the quarter(s).
Further information on the class and a link to register can be found below. If you know of others who might be interested in this, please feel free to forward it along – the more the merrier.
I hope to see you all soon.
Introduction to Lie Groups and Lie Algebras
MATH X 450.6 / 3.00 units / Reg. # 249254W
Professor: Michael Miller, Ph.D.
Start Date: 9/30/2014
Location: UCLA, 5137 Math Sciences Building
September 30 – December 16, 2014
11 meetings total (no meeting 11/11)
Register here: https://www.uclaextension.edu/Pages/Course.aspx?reg=249254
A Lie group is a differentiable manifold that is also a group, for which the product and inverse maps are differentiable. A Lie algebra is a vector space endowed with a binary operation that is bilinear, alternating, and satisfies the so-called Jacobi identity. This course, the first in a two-quarter sequence, is an introductory survey of Lie groups, their associated Lie algebras, and their representations. The first quarter will focus on the special case of matrix Lie groups, including the general linear, special linear, orthogonal, unitary, and symplectic groups. The second quarter will generalize the theory developed to the case of arbitrary Lie groups. Topics to be discussed include compactness and connectedness, homomorphisms and isomorphisms, exponential mappings, the Baker-Campbell-Hausdorff formula, covering groups, and the Weyl group. This is an advanced course, requiring a solid understanding of linear algebra and basic analysis.
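For concreteness, the bracket axioms mentioned in the course description can be written out explicitly (this is the standard textbook formulation, not taken from the course materials): for elements X, Y, Z of the algebra and scalars a, b,

```latex
\begin{align*}
  [aX + bY, Z] &= a[X, Z] + b[Y, Z] && \text{(bilinearity, similarly in the second slot)} \\
  [X, X] &= 0 && \text{(alternating, hence } [X, Y] = -[Y, X]\text{)} \\
  [X, [Y, Z]] + [Y, [Z, X]] + [Z, [X, Y]] &= 0 && \text{(Jacobi identity)}
\end{align*}
```

The canonical example is the space of n×n matrices with the commutator bracket [X, Y] = XY − YX, which is exactly the setting of the matrix Lie groups covered in the first quarter.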
In all the sadness of the passing of Robin Williams, I nearly forgot I’d “written” a short joke for him just after I’d first moved to Hollywood.
Killing some time just before I started work at Creative Artists Agency, I finagled my way into a rough-cut screening of Robin Williams’s iconoclastic role in PATCH ADAMS on the Universal lot. Following the screening, I had the pleasure of chatting with [read: bum-rushed like a crazy fan] Tom Shadyac for a few minutes on the way out. I told him that, as a recent grad of Johns Hopkins University who had spent a LOT of time in hospitals, I thought they were missing their obligatory hospital-gown joke. But to give it a karate chop (and because I’d just graduated relatively recently), they should put it into the graduation at the “end” and close on a high note.
I didn’t see or hear anything about it until many months later when I went to Mann’s Chinese Theater for the premiere and saw the final cut of the ending of the film, which I’ve clipped below. Just for today, I’m wearing the same red foam clown nose that I wore to the premiere that night.
I recently ran across this TED talk and felt compelled to share it. It really highlights some of my own personal thoughts on how science should be taught and done in the modern world. It also overlaps much of the reading I’ve been doing lately on innovation and creativity. If these don’t get you to watch, then perhaps mentioning that Alon manages to apply comedy and improvisation techniques to science will.
Uri Alon was already one of my scientific heroes, but this adds a lovely garnish.
My response to his post with some thoughts of my own follows:
This is an interesting and very germane review. As someone who’s both worked in the entertainment industry and followed the MOOC (massively open online courseware) revolution over the past decade, I very often consider the physical production value of TGC’s (The Great Courses’) offerings and have been generally pleased at their steady improvement over time. Not only do they offer some generally excellent content, but they’re entertaining and pleasing to watch. From a multimedia perspective, I’m always amazed at what they offer, and the difference between the video and audio-only versions isn’t as drastic as one might otherwise expect. Though there are times when I think TGC might include some additional graphics, maps, etc., either in the course itself or in the booklets, I’m impressed that the courses still function exceptionally well without them.
Within the MOOC revolution, Sue Alcock’s Coursera course Archaeology’s Dirty Little Secrets is still by far the best-produced multimedia course I’ve come across. It’s going to take a lot of serious effort for other courses to reach this level of production; it’s one of the few courses which I think rivals The Teaching Company’s offerings thus far. Unfortunately, the increased competition in the MOOC space is going to eventually encroach on the business model of TGC, and I’m curious to see how that will evolve and how it will benefit students. Will TGC be forced to offer online fora for students to interact with each other the way most MOOCs do? Will MOOCs be forced to drastically increase their production quality to the level of TGC? Will certificates or diplomas be offered for courseware? Will the subsequent models be free (like most MOOCs now), paid like TGC, or some mixture of the two?
One area which neither platform seems to be doing very well at present is offering more advanced coursework. Naturally the primary difficulty is in having enough audience to justify the production effort. The audience for a graduate level topology class is simply far smaller than introductory courses in history or music appreciation, but those types of courses will eventually have to exist to make the enterprises sustainable – in addition to the fact that they add real value to society. Another difficulty is that advanced coursework usually requires some significant work outside of the lecture environment – readings, homework, etc. MOOCs seem to have a slight upper hand here while TGC has generally relied on all of the significant material being offered in a lecture with the suggestion of reading their accompanying booklets and possibly offering supplementary bibliographies. When are we going to start seeing course work at the upper-level undergraduate or graduate level?
The nice part is that with evolving technology and capabilities, there are potentially new pedagogic methods that will allow easier teaching of some material that may not have been possible previously. (For some brief examples, see this post I wrote last week on Latin and the digital humanities.) In particular, I’m sure many of us have been astounded and pleased at how Dr. Greenberg managed the supreme gymnastics of offering “Understanding the Fundamentals of Music” without delving into traditional music theory and written notation, but will he be able to offer that in new and exciting ways to increase our levels of understanding of music, and then spawn off another 618 lectures that take us all further and deeper into his exciting world? Perhaps it comes in the form of a multimedia mobile app? We’re all waiting with bated breath, because regardless of how he pulls it off, we know it’s going to be educational, entertaining, and truly awe-inspiring.
Following my commentary, Scott Ableman, the Chief Marketing Officer for TGC, responded with the following, which I find very interesting:
Stephen Greenblatt provides an interesting synthesis of history and philosophy, and his love of the humanities certainly shines through. The book stands as an almost over-exciting commercial not only for reading Lucretius’s “De Rerum Natura” (“On the Nature of Things”), but also for motivating the reader to actually go out and learn Latin to appreciate it properly.
I would have loved more direct analysis and evidence of the immediate impact of Lucretius in the 1400s, as well as a longer in-depth analysis of the continuing impact through the 1700s.
The first half of the book is excellent at painting a vivid portrait of the life and times of Poggio Bracciolini, one that isn’t commonly encountered. I’m almost reminded of Stacy Schiff’s Cleopatra: A Life, though Greenblatt has far more historical material with which to paint the picture. I may also be biased in that I’m more interested in the mechanics of the scholarship behind the resurgence of the classics in the Renaissance than in that particular political portion of the first century BCE. Though my background on the history of the time periods involved is reasonably advanced, I fear that Greenblatt may be leaving out a tad too much for the broader reading public, who may not be so well versed. The fact that he brings so many clear specifics to the forefront may more than compensate for this, however.
Greenblatt includes lots of interesting tidbits and some great history. I wish it had continued on longer… I’d love to have the spare time to lose myself in the extensive bibliography. Though the footnotes, bibliography, and index account for about 40% of the book, the average reader should take a reasonable look at the quarter or so of the footnotes, which add interesting additional background and subtleties to the text, as well as to some of the translations discussed therein.
I am definitely very interested in the science behind textual preservation, which is presented as the underlying motivation for the action in this book. I wish that Greenblatt had covered some of these aspects in the same vivid detail he exhibited for other portions of the story. Perhaps summarizing some more of the relevant scholarship involved in transmitting and restoring old texts, as presented in Bart Ehrman and Bruce Metzger’s The Text of the New Testament: Its Transmission, Corruption & Restoration, would have been a welcome addition given the audience of the book. It might also have presented a more nuanced picture of the character of the Church and its predicament as presented in the text.
Though I only caught one small reference to modern-day politics (a prison statistic for America, tucked away in a footnote), I find myself wishing that Greenblatt had spent at least a few paragraphs or even a short chapter drawing direct parallels to our present-day political landscape. I understand why he didn’t broach the subject: it would tend to date an otherwise timeless-feeling text and generally serve to dissuade a portion of his readership, in particular the portion which most needs to read such a book. I can certainly see a strong need for another short burst of popularity for “On the Nature of Things” to counter the anti-science and overly pro-religion climate we’re facing in American politics.
For those interested in the topic, I might suggest that this text has some flavor of Big History in its DNA. It covers not only a fairly significant chunk of recorded human history, but has some broader influential philosophical themes that underlie a potential change in the direction of history which we’ve been living for the past 300 years. There’s also an intriguing overlap of multidisciplinary studies going on in terms of the history, science, philosophy, and technology involved in the multiple time periods discussed.
Click the link, read the stuff and scroll down to “CLICK HERE” to apply. The deadline is 12 November 2014.
Financial support for travel, meals, and lodging is available for workshop attendees who need it. We will choose among the applicants and invite 10-15 of them.
Information theory and entropy methods are becoming powerful tools in biology, from the level of individual cells, to whole ecosystems, to experimental design, model-building, and the measurement of biodiversity. The aim of this investigative workshop is to synthesize different ways of applying these concepts to help systematize and unify work in biological systems. Early attempts at “grand syntheses” often misfired, but applications of information theory and entropy to specific highly focused topics in biology have been increasingly successful. In ecology, entropy maximization methods have proven successful in predicting the distribution and abundance of species. Entropy is also widely used as a measure of biodiversity. Work on the role of information in game theory has shed new light on evolution. As a population evolves, it can be seen as gaining information about its environment. The principle of maximum entropy production has emerged as a fascinating yet controversial approach to predicting the behavior of biological systems, from individual organisms to whole ecosystems. This investigative workshop will bring together top researchers from these diverse fields to share insights and methods and address some long-standing conceptual problems.
So, here are the goals of our workshop:
To study the validity of the principle of Maximum Entropy Production (MEP), which states that biological systems – and indeed all open, non-equilibrium systems – act to produce entropy at the maximum rate.
To familiarize all the participants with applications to ecology of the MaxEnt method: choosing the probabilistic hypothesis with the highest entropy subject to the constraints of our data. We will compare MaxEnt with competing approaches and examine whether MaxEnt provides a sufficient justification for the principle of MEP.
To clarify relations between known characterizations of entropy, the use of entropy as a measure of biodiversity, and the use of MaxEnt methods in ecology.
To develop the concept of evolutionary games as “learning” processes in which information is gained over time.
To study the interplay between information theory and the thermodynamics of individual cells and organelles.
I’ve been a proponent and user of a variety of mnemonic systems since I was about eleven years old. The two biggest and most useful, in my mind, are commonly known as the “method of loci” and the “major system.” The major system is also variously known as the phonetic number system, the phonetic mnemonic system, or Hérigone’s mnemonic system, after the French mathematician and astronomer Pierre Hérigone (1580-1643), who is thought to have originated its use.
The major system generally works by converting numbers into consonant sounds and then into words by adding vowels, under the overarching principle that images (of the words) are easier to remember than the numbers themselves. For instance, one could memorize a grocery list of a hundred items by associating each item with the word for its position on the list. If item 22 on the list is lemons, one translates the number 22 as “nun” within the major system and then pictures a nun with lemons (perhaps a nun in full habit taking a bath in lemons, to make the image stick in one’s memory better). Then at the grocery store, on arriving at number 22 on the list, one automatically translates 22 to “nun,” which almost immediately conjures the image of a nun bathing in lemons, recalling the item that needed to be remembered. This comes in handy particularly when one needs to remember large lists of items in and out of order.
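The letter-to-digit mapping can be coded directly. Here is a toy sketch of my own (not from the original system’s literature) that works letter-by-letter and ignores the sound-based subtleties, such as soft vs. hard c/g and digraphs like “sh” and “ch,” which the full system handles phonetically:

```python
# Toy sketch of the major-system mapping: letters to digits.
# Phonetic edge cases (soft/hard c and g, "sh", "ch", "th") are ignored.
DIGIT_FOR_LETTER = {
    's': 0, 'z': 0,                   # 0
    't': 1, 'd': 1,                   # 1
    'n': 2,                           # 2
    'm': 3,                           # 3
    'r': 4,                           # 4
    'l': 5,                           # 5
    'j': 6,                           # 6 (also sh, soft g, soft ch phonetically)
    'k': 7, 'c': 7, 'g': 7, 'q': 7,   # 7 (hard sounds)
    'f': 8, 'v': 8,                   # 8
    'p': 9, 'b': 9,                   # 9
}

def word_to_number(word: str) -> str:
    """Translate a word into its major-system digit string.
    Vowels and the half-vowels w, h, y carry no digit."""
    return ''.join(str(DIGIT_FOR_LETTER[ch])
                   for ch in word.lower()
                   if ch in DIGIT_FOR_LETTER)

print(word_to_number("nun"))    # 22, as in the grocery-list example above
print(word_to_number("lemon"))  # 532
```

Running the encoder in reverse (picking a memorable word for a given digit string) is the harder, more artful half of the system, which is why published word lists for 00-99 are so common.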
The following generalized chart, which can be found in a hoard of books and websites on the topic, is fairly canonical for the overall system:
Numeral | Associated letters/sounds | Mnemonic for remembering the numeral and consonant relationship
0 | s, z, soft c | “z” is the first letter of zero; the other letters have a similar sound
1 | t, d | t & d have one downstroke and sound similar (some variant systems include “th”)
2 | n | n has two downstrokes
3 | m | m has three downstrokes; m looks like a “3” on its side
4 | r | last letter of four; 4 and R are almost mirror images of each other
5 | l | L is the Roman numeral for 50
6 | j, sh, soft g, soft “ch” (/ʃ/ /ʒ/ /tʃ/ /dʒ/) | a script j has a lower loop; g is almost a 6 rotated
7 | k, hard c, hard g, hard “ch”, q, qu | capital K “contains” two sevens (some variant systems include “ng”)
8 | f, v | script f resembles a figure-8; v sounds similar (v is a voiced f)
9 | p, b | p is a mirror-image 9; b sounds similar and resembles a 9 rolled around
(unassigned) | vowel sounds, w, h, y | w and h are considered half-vowels; these can be used anywhere without changing a word’s number value
There are a variety of ways to use the major system as a code in addition to its mnemonic uses. When I was a youth, I used it to write coded messages and to encrypt a variety of things for personal use. After originally reading Dr. Bruno Furst’s series of booklets entitled You Can Remember: A Home Study Course in Memory and Concentration,[1] I had always wanted to spend some time creating an alternate method of writing using the system. Sadly I never made the time for the project, but yesterday I made a very interesting discovery that, to my knowledge, doesn’t seem to have been noticed before!
My discovery began last week when I read an article in The Atlantic by journalist Dennis Hollier entitled How to Write 225 Words Per Minute with a Pen: A Lesson in the Lost Technology of Shorthand.[2] In the article, which starts off with a mention of the Livescribe pen (one of my favorite tools), Mr. Hollier outlines the Gregg system of shorthand, which was invented by John Robert Gregg in 1888. The description of the method was intriguing enough that I read a dozen additional general articles on shorthand on the internet and purchased a copy of Louis A. Leslie’s two-volume text Gregg Shorthand: Functional Method.[3]
I was shocked, on page x of the front matter, just before the first page of the text, to find the following “Alphabet of Gregg Shorthand”:
Gregg Shorthand uses EXACTLY the same consonant-type breakdown of the alphabet as the major system!
Apparently I wasn’t the first to have the idea of turning the major system into a system of writing. The fact that the consonant breakdowns for the major system coincide almost exactly with those of Gregg’s shorthand method cannot be a coincidence!
The Gregg system works incredibly well precisely because the major system works so well. The biggest difference between the two systems is that Gregg utilizes a series of strokes (circles and semicircles) to indicate particular vowel sounds, allowing better differentiation of words, which the major system doesn’t generally take into consideration. From an information-theoretic standpoint, this is almost required to make the coding from one alphabet to the other possible, but much like ancient Hebrew, leaving out the vowels doesn’t remove that much information. Gregg, like Hebrew, also uses dots and dashes above or below certain letters to indicate the precise sound of many of its vowels.
The upside of all of this is that the major system is incredibly easy to learn and use, and from there, learning Gregg shorthand is just a hop, skip, and a jump; heck, it’s really only a hop because the underlying structure is so similar. Naturally, as with the major system, one must commit some time to practicing it to improve speed and accuracy, but the general learning of the system is incredibly straightforward.
Because the associations between the two systems are so similar, I wasn’t too surprised to find that some of the descriptions of why certain strokes were used for certain letters were very similar to the mnemonics for why certain letters were used for certain numbers in the major system.
One thing I have noticed in my studies on these topics is the occasional reference to the letter combinations “NG” and “NK,” and I’m curious why these are singled out in some of these systems. I have a strong suspicion that their inclusion or exclusion in various incarnations of the respective systems may be helpful in dating the evolution of these systems over time.
I’m aware that various versions of shorthand have appeared over the centuries, the first recorded being the “Tironian notes” of Marcus Tullius Tiro (103-4 BCE), who apparently used his system to write down the speeches of his master Cicero. I’m now much more curious at what point the concepts of shorthand and the major system crossed paths or converged. My assumption would be that it happened in the late Renaissance, but it would be nice to have the underlying references and support for such a timeline. Perhaps it was with Timothy Bright’s publication of Characterie; An Arte of Shorte, Swifte and Secrete Writing by Character (1588),[4] John Willis’s Art of Stenography (1602),[5] Edmond Willis’s An Abbreviation of Writing by Character (1618),[6] or Thomas Shelton’s Short Writing (1626)?[7] Shelton’s system was certainly very popular and well known, because it was used by both Samuel Pepys and Sir Isaac Newton.
Certainly some in-depth research will tell, though if anyone has ideas, please don’t hesitate to indicate your ideas in the comments.
UPDATE on 7/6/14:
I’m adding a new chart making the correspondence between the major system and Gregg Shorthand more explicit.
[1] Furst, B. You Can Remember: A Home Study Course in Memory and Concentration. Markus-Campbell Co.; 1965.
I’ve long been a student of the humanities (and particularly the classics) and have recently begun reviewing over my very old and decrepit knowledge of Latin. It’s been two decades since I made a significant study of classical languages, and lately (as the result of conversations with friends like Dave Harris, Jim Houser, Larry Richardson, and John Kountouris) I’ve been drawn to reviewing them for reading a variety of classical texts in their original languages. Fortunately, in the intervening years, quite a lot has changed in the tools relating to pedagogy for language acquisition.
The biggest change in the intervening time is the spread of the internet, which supplies a broad variety of related websites with not only interesting resources for things like basic reading and writing, but even audio sources, apparently including the nightly news in Latin. There are a variety of blogs on Latin, as well as online courseware, podcasts, pronunciation recordings, and even free textbooks. I’ve written briefly about the RapGenius platform before, but I feel compelled to mention it as a potentially powerful resource as well for annotating classical authors (Julius Caesar, Seneca, Ovid, Cicero, et al.). There is a paucity of these sources compared with other modern languages, but given the size of the niche, there is quite a lot out there, and certainly a mountain in comparison to what existed only twenty years ago.
There has also been a spread of pedagogic aids like flashcard software including Anki and Mnemosyne with desktop, web-based, and even mobile-based versions making learning available in almost any situation. The psychology and learning research behind these types of technologies has really come a long way toward assisting students to best make use of their time in learning and retaining what they’ve learned in long term memory. Simple mobile applications like Duolingo exist for a variety of languages – though one doesn’t currently exist for classical Latin (yet).
The other great change is the advancement of the digital humanities, which allows for a lot of interesting applications of knowledge acquisition. One in particular that I ran across this week was the Dickinson College Commentaries (DCC). Specifically, a handful of scholars have compiled and documented a list of the most common core vocabulary words in Latin (and in Greek) based on their frequency of appearance in extant works. This very specific data is of interest to me in relation to my work in information theory, but it also becomes a tremendously handy tool when attempting to learn and master a language. It is a truly impressive fact that simply memorizing and mastering about 250 words of Latin allows one to read and understand 50% of most written Latin; further, knowledge of 1,500 Latin words puts one at the 80% level of vocabulary mastery for most texts. Mastering even a very small vocabulary list allows one to read a large variety of texts very comfortably. I can only think of the old concept of a concordance (generally limited to heavily studied texts like the Bible or possibly Shakespeare), which has now been put on some serious steroids for entire cultures. Another half step and one arrives at the Google Ngram Viewer.
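The coverage arithmetic behind those figures is easy to state computationally. Here is a hedged sketch of my own, using an invented toy word list (the real DCC core vocabulary and actual Latin frequency counts would be substituted in):

```python
from collections import Counter

# Fraction of running text covered by the N most frequent words.
# The corpus below is an invented toy example, not real frequency data.
def coverage(freqs: Counter, top_n: int) -> float:
    """Fraction of all tokens accounted for by the top_n most common words."""
    total = sum(freqs.values())
    covered = sum(count for _, count in freqs.most_common(top_n))
    return covered / total

corpus = "et in arcadia ego et tu brute et cetera in vino veritas".split()
freqs = Counter(corpus)
print(round(coverage(freqs, 2), 2))  # the two most common words already cover 0.42
```

Because word frequencies in natural language are so heavily skewed (roughly Zipfian), the coverage curve climbs steeply at first, which is exactly why a few hundred core words go such a long way.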
The best part is that one can, with very little technical knowledge, easily download the DCC Core Latin Vocabulary (itself a huge research undertaking) and upload and share it through the Anki platform, for example, to benefit a fairly large community of other scholars, learners, and teachers. With a variety of easy-to-use tools, it may shortly become that much easier to learn a language like Latin, potentially to the point that it is no longer a dead language. For those interested, you can find my version of the shared DCC Core Latin Vocabulary for Anki online; the DCC’s Chris Francese has posted details and a version for Mnemosyne already.
[Editor’s note: Anki’s web service occasionally clears decks of cards from their servers, so if you find that the Anki link to the DCC Core Latin is not working, please leave a comment below, and we’ll re-upload the deck for shared use.]
What tools and tricks do you use for language study and pedagogy?
I’ll be the first to admit that I’m a reading junkie, but unfortunately there isn’t (yet) a 12-step program to help me. I love reading lots of different types of things across an array of platforms (books, newspapers, magazines, computer, web, phone, tablet, apps) and topics (fiction and non-fiction, and especially history, biography, economics, popular science, etc.). My biggest problem, and one others surely face, is time.
There are so many things I want to read, and far too little time to do it in. Over the past several years, I’ve spent an almost unreasonable amount of time thinking about what I consume and (possibly more importantly) how to intelligently consume more of it. I’ve spent so much time delving into it that I’ve befriended a professor and fellow renaissance man (literally and figuratively) who gave me a personal thank you in his opening to a best-selling book entitled “The Thinking Life: How to Thrive in an Age of Distraction.”
At least twice a year I look at my reading consumption and work on how to improve it, all the while trying to maintain a level of quality and usefulness in what I’m consuming and why I’m consuming it.
I continually subscribe to new and interesting sources.
I close off subscriptions to old sources that I find uninteresting, repetitive (goodbye echo chamber), and those that are (or become) generally useless.
I carefully monitor the huge volumes of junk email that end up in my inbox and trim down on the useless material that I never seem to read, so that I’ll have more time to focus on what is important.
I’ve taken up listening to audiobooks to better utilize my time in the car while commuting.
I’ve generally quit reading large swaths of social media for their general inability to uncover truly interesting sources.
I’ve used some portions of social media to find other interesting people collating and curating areas I find interesting, but which I don’t have the time to read through everything myself. Why waste my time reading hundreds of articles, when I can rely on a small handful of people to read them and filter out the best of the best for myself? Twitter lists in particular are an awesome thing.
I’ve given up on things like “listicles” or stories from internet click farm sources like BuzzFeed which can have some truly excellent linkbait-type headlines, but I always felt like I’ve completely wasted my time clicking through to them.
A New Solution
About six months ago, in the mountain of tech journalism I love reading, I ran across a site-launch notice about a tech start-up called Spritz, which promised a radically different solution for the other side of the coin of my reading problem: speeding the entire process up! Unfortunately, despite a few intriguing samples at the time (and some great details on the problem and their solution), they weren’t actually delivering a product.
Well, all that seems to have changed in the past few weeks. I’ve waited somewhat patiently and occasionally checked back on their progress, but following a recent mention on Charlie Rose and some serious digging around on the broader internet, I’ve found some worthwhile tools that have sprouted out of their efforts. Most importantly, Spritz itself now has a bookmarklet that seems to deliver on the promise of improving my reading speed for online content. With the bookmarklet installed, one can go to almost any web article, click on the bookmarklet, and then sit back and just read at almost any desired speed. Their technology uses a modified version of a 1970s technique known as Rapid Serial Visual Presentation (RSVP) to speed up your reading, but does so in a way that is easier to effectuate with web and mobile technologies. Essentially, they present words serially in the same position on your screen, with an optimized center mass, so that one’s eyes stay still while reading instead of making the saccadic eye movements that occur in typical reading and slow the process down.
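As a toy illustration of the RSVP idea (my own sketch; Spritz’s actual pivot algorithm is proprietary and surely more sophisticated), one can pad each word so that a “pivot” letter near its center always lands in the same screen column:

```python
import time

# Toy RSVP display: words are shown one at a time, aligned so that a
# pivot letter near each word's center stays in a fixed column.
def pivot_index(word: str) -> int:
    """Pick a recognition point slightly left of the word's center
    (a hypothetical heuristic, not Spritz's real formula)."""
    return (len(word) - 1) // 3

def align(word: str, column: int = 10) -> str:
    """Left-pad the word so its pivot letter lands at `column`."""
    return ' ' * (column - pivot_index(word)) + word

def rsvp(text: str, wpm: int = 300) -> None:
    """Print the words of `text` serially at the requested words-per-minute rate."""
    for word in text.split():
        print(align(word))
        time.sleep(60.0 / wpm)

rsvp("one could read text one word at a time like this", wpm=600)
```

Since the eye never has to travel, the per-word display interval (60/wpm seconds here) becomes the only limit on speed, which is the core of the RSVP claim.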
As a biomedical engineer, I feel compelled to note an interesting physiologic phenomenon: if one sits in a rotatable chair and spins with one’s eyes closed and one’s fingers lightly placed on one’s eyelids, one will feel the eyes’ saccades even though one isn’t actually seeing anything.
Spritz also allows one to create an account and log in so that the service will remember your previously set reading speed. Their website does such a great job of explaining their concept, I’ll leave it to the reader to take a peek; but you may want to visit their bookmarklet page directly, as their own website didn’t seem to have a link to it initially.
Naturally, Spritz’s solution is not a catch-all for everything I’d like to read, but it covers an interesting subcategory that will make my reading easier. Though trying to speed read journal articles, textbooks, and other technical literature isn’t the best idea in the world, Spritz will help me plow through more fiction and more leisurely types of magazine and online articles that are of general interest. I generally enjoy and appreciate these types of journalism and work, but just can’t always justify taking the time away from more academic pursuits to delve into them. Some will still require further thought after the fact to extract their full value, but at least I can cover the additional ground without spending all the additional time to do so. I find I can easily double or triple my usual reading speed without any real loss of comprehension.
In the week or so since installing my several new speed reading bookmarklets, I’ve begun using them almost religiously in my daily reading regimen.
I’ll also note in passing that some studies suggest that this type of reading modality has helped those who face difficulties with dyslexia.
Speed Reading Competition
Naturally, since this is a great idea, there’s a bit of competition in the speed reading arena.
There are a small handful of web and app technologies which are built upon the RSVP concept:
Clayton Morris has developed an iOS application called ReadQuick, which is based on the same concept as Spritz, but is only available as an app and not on the web.
Rich Jones has developed a program called OpenSpritz. His version is open source and has an Android port for mobile.
There’s also another similar bookmarklet called Squirt which also incorporates some nice UI tweaks and some of the technology from Readability as well.
For those wishing to Spritz .pdf or .txt documents, one can upload them using Readsy, which uses Spritz’s open API to enable this type of functionality.
There are also a variety of similar free apps in the Google Play store which follow the RSVP technology model.
Those on the Amazon Kindle Fire (or Android) platform will appreciate the Balto App, which utilizes RSVP and is not only one of the more fully functional apps in the space, but also has the ability to unpack Kindle-formatted books (i.e., deal with Amazon’s DRM) to allow speed reading Kindle books. While there is a free version, the $1.99 paid version is well worth the price for the additional perks.
On and off for the past couple of years, I’ve also used a web service and app called Readfa.st, which is a somewhat useful, but generally painful, way to improve one’s speed reading. It also has a handy bookmarklet, but just wasn’t as useful as I had always hoped it might be. It’s interesting, but not as interesting or as useful as Spritz (and other RSVP technology) in my opinion, since it feels more fatiguing to read in this manner.
Bookmarklet Junkie Addendum
In addition to the handful of speed reading bookmarklets I’ve mentioned above, I’ve got over 50 bookmarklets in a folder on my web browser toolbar. I easily use about a dozen on a daily basis. Bookmarklets make my internet world much prettier, nicer, and cleaner with a range of simple clever code. Many are for URL shortening, sharing content to a variety of social networks quickly, but a large number of the ones I use are for reading-related tasks which I feel compelled to include here: web clippers for Evernote and OneNote, Evernote’s Clearly, Readability, Instapaper, Pocket, Mendeley (for reading journal articles), and GoodReads.
Do you have a favorite speed reading application (or bookmarklet)?
Eating at In-N-Out has always been a religious experience for me, but today, to mix things up when ordering lunch, I tried making my order by number, but not In-N-Out’s traditional #1, #2, or #3 system.
I got myself
a Nahum 1:7
a Revelation 3:20 with cheese
two Proverbs 24:16s
two John 3:16s
and a Chocolate Proverbs 3:5.
“What?!” you ask. “I’m all too aware of In-N-Out’s ‘Secret menu’ and have heard of a 4×4 and even a mythical 20×20, but what is a Nahum 1:7?!”
In-N-Out aficionados have probably noticed that the company prints references to Bible verses with just the book, chapter, and verse on their burger wrappers, fry containers, and on the bottom of their cups, so why not order this way as well?
For those not in-the-know, here’s the “translation” to help make your next meal more religious than it already was:
Products and Bible Verses
Burger and cheeseburger wrappers:
I’ll note a few interesting things:
The verse for the hamburger is about dining together with others – this is always important.
If you substitute the product the wrappers contain for the words “Lord,” “God,” and “Son,” there is a certain sense of poetic verisimilitude in the new verses: their shakes apparently have a heavenly thickness, the double-double sounds like it will fill you up, and the sugary sodas will give you everlasting life. I wonder what would happen if we transubstantiated a hamburger bun?
Animal Style Anyone?
Now if only there were a special chapter and verse for getting my burger “animal style!”
Genesis 7:2 perhaps?
This might be far preferable to Exodus 22:19:
But let’s be honest, with all the fat, salt, sugar, and cholesterol in a good-ol’ traditional #1, I’m going to die sooner than later whether it comes animal style or not.
I’m curious how many In-N-Out employees know their product so well that they can take orders this way.
Not long ago, my alma mater Johns Hopkins University announced the creation of a task force on Academic Freedom. Since then, I’ve corresponded with the group on a few occasions and in the spirit of my notes to them, I thought I’d share some of those thoughts with others in the academy, science writers/communicators, and even the general public who may also find them useful. Toward that end, below is a slightly modified version of my two main emails to the task force. [They’ve been revised marginally for their appearance and readability in this format and now also include section headings.] While I’m generally writing about Johns Hopkins as an example, I’m sure that the majority of it also applies to the rest of the academy.
On a personal note, the first email has some interesting thoughts and background, while the second email has some stronger broader recommendations.
My First Thoughts to the Task Force
Matthew Green’s Blog and Questions of National Security
Early in September 2013, a rather large PR nightmare was created for the university (especially as regards poor representation within the blogosphere and social media) when interim Dean of the Whiting School of Engineering Andrew Douglas requested to have professor Matthew Green’s web presence modified in relation to an allegedly anti-NSA post on it. Given the volume of NSA-related privacy news at the time (and since, given the ongoing Edward Snowden case), the story was certainly blown out of proportion. But the Green/NSA story is also one of the most highlighted cases relating to academic freedom in higher education in the last several years, and I’m sure it may be the motivating force behind why the task force was created in the first place. (If you or the task force are unaware of the issues in that case, a quick web search will turn them up; one of the foremost followers of the controversy was Ars Technica, which provided this post with most of the pertinent information. Alternately, take a look at what journalism professor Jay Rosen had to say on the issue in the Guardian.) I’m sure you can find a wealth of additional reportage from the Hopkins Office of News and Information, which maintains its daily digests of “Today’s News” from around that time period.
In my mind, much of the issue and the outpouring of poor publicity, which redounded to the university, resulted from the media getting information about the situation via social media before the internal mechanisms of the university had the chance to look at the issue in detail and provide a more timely resolution. [Rumors via social media will certainly confirm Mark Twain’s aphorism that “A lie can travel half way around the world while the truth is putting on its shoes.”]
While you’re mulling over the issue of academic freedom, I would highly suggest you all closely consider the increased impact of the internet, and particularly social media, with regard to any policies which are proposed going forward. As the volunteer creator and initial maintainer of much of Hopkins’ social media presence on Facebook, Twitter, and many other platforms for their first five years of existence (JHU was the first university in these areas of social media, and most other major institutions followed our early lead), I have a keen insight into how these tools impact higher education. With easy-to-use blogging platforms and social media (Matthew Green had both a personal blog hosted outside the University and one mirrored through the University, as well as a Twitter account), professors now have a much larger megaphone and constituency than they’ve had at any time in the preceding 450 years of the academy. This fact creates unique problems as it relates to the university, its image, how it functions, and how its professoriate interacts with relation to academic freedom, which is a far different animal than it was even 17 years ago at the dawn of the internet age. Things can obviously become sticky quickly, as evinced in the Green/APL situation, which was exacerbated by the APL’s single source of income at a time when the NSA and privacy were foremost in the public eye.
What are Some of the Issues for Academic Freedom in the Digital Age?
Consider the following:
How should/shouldn’t the university regulate the border of social media and internet presence at the line between personal/private lives and professional lives?
How can the university help to promote/facilitate the use of the internet/social media to increase the academic freedom of its professoriate and simultaneously lower the technological hurdles as well as the generational hurdles faced by the academy? (I suspect that few on the task force have personal blogs or twitter accounts, much less professional blogs hosted by the university beyond their simple “business card” information pages through their respective departments.)
How should the university handle issues like the Matthew Green/APL case so that comments via social media don’t gain steam and blow up in the media before the university has a chance to handle them internally? (As I recall, there were about two news cycles of JHU saying “no comment” and resulting bad press which reached the level of national attention prior to a resolution.)
How can the university help to defuse the issues which led up to the Green/APL incident before they happen?
I hope that the task force is able to spend some time with Dr. Green discussing his case and how it was handled.
Personal Reputation on the Internet in a Connected Age
I also suggest that the students on the task force take a peek into the case file of JHU’s Justin Park from 2007, which has become a textbook case for expression on the internet and in social media and its consequences (while keeping in mind that a social/cultural issue, rather than malice or base racism, was the root cause of the incident – an aspect that wasn’t and isn’t highlighted in extant internet reportage – Susan Boswell [long-time Dean of Student Life] and Student Activities head Robert Turner can shed more light on the situation). Consider what the university would have done had Justin Park been a professor instead of a student. What role did communication technology and the internet play in how these situations played out, compared to how they would have been handled when Dr. Grossman was a first-year professor just starting out? [Editor’s note: Dr. Grossman is an incredible thought leader, but most of his life and academic work occurred prior to the internet age. Though unconfirmed, I suspect that his internet experience, or even experience with email, is exceedingly limited.]
In a related issue on academic freedom and the internet, I also hope you’re addressing, or at least touching on, the topic of academic samizdat, so that the university can put forward a clear (and thought-leading) policy on where we stand there as well. I could certainly make a case that the university come out strongly in favor of professors maintaining the ability to more easily self-publish without detriment to their subsequent publication chances in major journals (and resultant potential detriment to the arc of their careers), but the political ramifications in this changing landscape are certainly subtle, given that the university deals with both major sides: it is the employer of the faculty while simultaneously being one of the major customers of the institutionalized research publishing industry. As I currently view the situation, self-publishing and the internet will likely win the day over the major publishers, which puts the university in the position of pressing the issue in a positive light to its own ends and that of increasing knowledge for the world. I’m sure Dean Winston Tabb [Dean of the Sheridan Libraries at Johns Hopkins] and his excellent staff could provide the task force with some useful insight on this topic. Simultaneously, how can the increased areas of academic expression/publication (for example, the rapidly growing but still relatively obscure area known as the “digital humanities”) be institutionalized such that publication in what have previously been non-traditional areas is included more formally in promotion decisions? If professors can be incentivized to use some of their academic freedom and expanded opportunities to both their own and the university’s benefit, then certainly everyone wins. Shouldn’t academic freedom also include the freedom of where and when to publish without detriment to one’s future career – particularly in an increasingly rapidly shifting landscape of publication choices and outlets?
The Modern Research University is a Content Aggregator and Distributor (and Should Be Thought of as Such)
Taking the topic several steps further: given the value of the professoriate and their intellectual creations and content, couldn’t/shouldn’t the university create a customized platform to assist its employees in disseminating and promoting their own work? As an example, consider the volume of work being generated every year by those within the university: approximately 16,000–20,000 journal articles, as well as thousands of articles written for newspapers (the NY Times, Wall Street Journal, etc.), magazines, and other outlets, academic or otherwise. In a time of decreasing cost of content distribution, universities no longer need to rely on major journals, magazines, television stations, cable/satellite television, et al. to distribute their “product.” (To put things in perspective, I could build the infrastructure to start a 24/7 streaming video service equivalent to both a television station and a major newspaper in my garage for a capital cost of about $10,000.) Why not bring it all in-house, with the benefit of academic flexibility as an added draw to better support the university and its mission? (Naturally, this could all be cross-promoted to other outlets after the fact for additional publicity.) At a time when MOOCs (massive open online courses) are eroding some of the educational mission within higher education and journals are facing increased financial pressures, perhaps there should be a new model of the university as a massive content/information creation engine and distributor for the betterment of humanity? And isn’t that what Johns Hopkins already is at heart? We’re already one of the largest knowledge creators on the planet; why are we not also simultaneously one of the largest knowledge disseminators – particularly at a time when it is inexpensive to do so, and becoming cheaper by the day?
[Email closing formalities removed]
Expanded Thoughts on Proactive Academic Freedom
Reframing What Academic Freedom Means in the Digital Age
[Second email opening removed]
Upon continued thought and reading on the topic of academic freedom as well as the associated areas of technology, I might presuppose (as most probably do) that the committee will be looking more directly at the concept of preventing the university from impeding the freedom of its faculty and what happens in those situations where action ought to be taken for the benefit of the wider community (censure, probation, warnings, etc.). If it hasn’t been brought up as a point yet, I think one of the most positive things the university could do to improve not only academic freedom, but the university’s position in relation to its competitive peers, is to look at the opposite side of the proverbial coin and actually find a way for the university to PROACTIVELY help promote the voices of its faculty and assist them in broadening their reach.
I touched upon the concept tangentially in my first email (see above), but thought it deserved some additional emphasis, examples to consider, and some possible recommendations. Over the coming decades, the aging professoriate will slowly retire to be replaced with younger faculty who grew up completely within the internet age and who are far more savvy about it as well as the concepts of Web 2.0, the social web and social media. More will be literate in how to shoot and edit short videos and how to post them online to garner attention, readership, and acceptance for their ideas and viewpoints.
The recent PBS Frontline documentary “Generation Like” features a handful of pre-teens and teens who are internet sensations garnering hundreds of thousands to millions of views of their content online. But imagine for a minute: a savvy professoriate that could do something similar with their academic thought, engaging hundreds, thousands, or millions on behalf of Johns Hopkins. Or consider the agency portrayed in the documentary [about 30 minutes in] that helps these internet sensations, and what would happen if that type of functionality were taken on by the Provost’s office.
I would presuppose that with a cross-collaboration of the Provost’s office, the Sheridan Libraries, the Film & Media Studies Department, the Digital Media Center, and the Communications Office, as an institution we should be able to help better train faculty who are not already using these tools to improve their web presence and reach.
What “Reach” Do Academics Really Have?
I’ve always been struck by my conversations with many professors about the reach of their academic work. I can cite the particular experience of Dr. P.M. Forni, in the Department of Romance Languages at Krieger, who told me that he’s written dozens of academic papers and journal articles, most of which have “a [collective] readership of at most 11 people on the planet” – primarily because academic specialties have become so niche. He was completely dumbfounded by the expanded reach he had in writing a mainstream book on the topic of civility, heavily influenced by his academic research and background, and by the even more drastically expanded reach provided to him by appearing on the Oprah Winfrey show shortly after its release. Certainly his experience is not a common one, but there is a vast area in between that is being lost, not only by individual professors, but by the university by extension. Since you’re likely aware of the general numbers of people reading academic papers, I won’t bore you, but for the benefit of those on the committee I’ll quote a recent article from Pacific Standard Magazine and provide an additional reference from Physics World, 2007:
Some Examples of Increased Reach in the Academy
To provide some examples and simple statistics on where something like this might go, allow me to present the following brief references:
As a first example, written by an academic about academia, I suggest you take a look at a recent blog post, “Why academics should blog and an update on readership,” by Artem Kaznatcheev, a researcher in computer science and psychology at McGill University, posting on a shared blog named “Theory, Evolution, and Games Group.” He provides a clear and interesting motivation in the first major portion of his essay, and then, unwittingly (for my example), he shows some basic statistics indicating a general minimum readership of 2,000 people, which occasionally goes as high as 8,000. (Knowing how his platform operates and the baseline statistics it provides, his actual readership is likely higher.) If one skims through the blog, it’s obvious that he’s not providing infotainment-type material like one would find on TMZ, Buzzfeed, or major media outlets, but genuine academic thought – AND MANAGING TO REACH A SIZEABLE AUDIENCE! Even better, I would posit that his blog enriches not only himself and his colleagues in the academy, but a reasonable number of people outside of it, and therefore the world.
Another example of an even more technical academic blog can be found in that of Dr. Terence Tao, a Fields Medal winner (the mathematical equivalent of the Nobel prize) and mathematics professor at UCLA. You’ll note that it’s far more technical and rigorous than Dr. Kaznatcheev’s, and though I don’t have direct statistics to back it up, I can posit, based on the number of comments his blog receives, that his active readership is even much higher. Dr. Tao uses his blog not only to expound upon his own work, but to post content for classes, to post portions of a book in progress, and to promote the general mathematics research community. (I note that the post he made on 3/19 already, within a day, has 11 comments by people who’ve read it closely enough to suggest typography changes, as well as sparked some actual conversation on a topic that requires an education to at least the level of a master’s degree in mathematics.)
Business Insider recently featured a list of 50 scientists to follow on social media (Twitter, Facebook, Tumblr, YouTube, and blogs amongst others). While there are a handful of celebrities and science journalists, many of those featured are professors or academics of one sort or another and quite a few of them are Ph.D. candidates (the beginning of the upcoming generation of tech-savvy future faculty I mentioned). Why aren’t there any JHU professors amongst those on this list?
As another clear example, consider the recent online video produced by NPR’s “Science Friday” show featuring research about water flowing uphill via the Leidenfrost effect. Not only is it generally interesting research work, but this particular piece is a great advertisement for the University of Bath, a great teaching tool for students, and a showcase of both the research itself and the involvement of undergraduates in it. Though I’ll admit that producing these types of vignettes is not necessarily simple, imagine the potential effect on the awareness of the university’s output if we could do this with even 10% of the academic research paper output. Imagine these types of videos as inspiring tools to assist in gaining research funding from government agencies, or as fundraising tools for Alumni and Development relations. And how much better that they could be easily shared and spread organically on the web, not necessarily by the JHU corporate umbrella, but by its faculty, students, alumni, and friends.
How Does the Academy Begin Accomplishing All of This?
To begin, I’ll mention that Keswick’s new video lab, the Digital Media Center at Homewood, and a few others like them are a great start, but they are just the tip of the iceberg (and it is somewhat unfortunate that faculty from other divisions will have to travel to use the Keswick facility, if they’re even notionally aware of it and its capabilities).
I recall Mary Spiro, a communications specialist/writer with the Institute of NanoBioTechnology, doing a test-pilot Intersession program in January about 4 years ago in which she helped teach a small group of researchers how to shoot and edit their own films about their research or even tours through their lab. Something like this program could be improved, amplified, and rolled out on a much larger basis. It could also be integrated or dovetailed, in part, with the Digital Media Center and the Film and Media Studies program at Krieger to assist researchers in their work.
The Sheridan Libraries provide teaching/training on academic tools like the bibliographic programs Mendeley, RefWorks, and Zotero, but they could extend this to social media, blogging, or tools like FigShare, GitHub, and others.
Individual departments or divisions could adopt and easily maintain free content management platforms like WordPress and Drupal (I might even specifically look at the pre-configured product for academia known as OpenScholar; for example, take a look at Harvard’s version). This would make it much easier for even non-technically-minded faculty to come up to speed by removing the initial trouble of starting a blog. It also has the side benefit of allowing the university to assist in ongoing maintenance, backup, hosting, look and feel, branding, and web optimization. (As a simple example, and not meant to embarrass them: despite the fact that the JHU Math Department may have been one of the first departments in the university to be on the web, it’s a travesty that their website looks almost exactly as it did 20 years ago, and has less content on it than Terence Tao’s personal blog, which he maintains as a one-man operation. I’m sure that some of the issue is political in the way the web has grown up over time at Hopkins, but the lion’s share is technology- and management-based.)
The Provost’s office, in conjunction with IT and the Sheridan Libraries, could invest some time and energy into compiling resources and vetting them for ease of use, best practices, and use cases, and then providing summaries of these tools to the faculty so that each faculty member need not re-invent the wheel each time, but can get up and running more quickly. This type of resource needs to be well advertised and made idiot-proof (for lack of better terminology) to ease faculty access and adoption. Online resources like the Chronicle of Higher Education’s ProfHacker blog can be mined for interesting tools and use cases, for example.
I know portions of these types of initiatives are already brewing in small individual pockets around the university, but they need to be brought together and empowered as a group instead of as individuals working separately in a vacuum. In interacting with people across the institution, this technology area seems to be one of those that has been left behind in the “One Hopkins” initiative. One of the largest hurdles is teaching old dogs new tricks, to put it colloquially, but the barriers to understanding and comprehending these new digital tools are coming down drastically by the day. As part of the social contract in the university’s granting and promoting academic freedom, the faculty should be better encouraged (though certainly not forced) to exercise it. There are already mandatory annual seminars on topics like sexual harassment; should there not be mandatory technology trainings as well?
To briefly recap, it would be phenomenal to see the committee make not only their base recommendations on what most consider academic freedom, but to further make a group of strong recommendations about the University proactively teaching, training, and providing a broader array of tools to encourage the active expression of the academic freedom that is provided within Hopkins’ [or even all of the Academy’s] mighty walls.
[Email closing removed]
I certainly welcome any thoughts or comments others may have on these topics. Please feel free to add them in the comments below.
On Friday, I had an excellent and stimulating conversation with Arieh Ben-Naim about his recent writing and work, and he mentioned in passing that he had been invited to a conference relating to entropy and biology in Vienna. A quick web search turned it up, and not having heard about it myself yet, I thought I’d pass it along to others who are regular readers and interested in the area.
The workshop on “Entropy in Biomolecular Systems” is being hosted by the Centre Européen de Calcul Atomique et Moléculaire (CECAM).
I’ll note that the registration deadline is on April 21 with a payment deadline of April 30, so check in quickly if you haven’t already.
The summary from the workshop website states:
This workshop brings together the world’s experts to address the challenges of determining the entropy of biomolecular systems, either by experiment or computer simulation. Entropy is one of the main driving forces for any biological process such as binding, folding, partitioning and reacting. Our deficient understanding of entropy, however, means that such important processes remain controversial and only partially understood. Contributions of water, ions, cofactors, and biomolecular flexibility are actively examined but yet to be resolved. The state-of-the-art of each entropy method will be presented and explained, highlighting its capabilities and deficiencies. This will be followed by intensive discussion on the main areas that need improving, leading to suitable actions and collaborations to address the main biological and industrial questions.
Further details on the workshop can be found on the CECAM website.
USC’s Viterbi School of Engineering has provided the following abstract for the talk:
Entropy, introduced by Shannon in 1948, arises naturally as a universal measure of information in single-source compression, randomness extraction, and random number generation. In distributed systems, such as communication networks, multiprocessors, distributed storage, and sensor networks, there are multiple correlated sources to be processed jointly. The information that is common between these sources can be utilized, for example, to reduce the amount of communication needed for compression, computing, simulation, and secret key generation. My talk will focus on the question of how such common information should be measured. While our understanding of common information is far from complete, I will aim to demonstrate the richness of this question through the lens of network information theory. I will show that, depending on the distributed information processing task considered, there can be several well-motivated measures of common information. Along the way, I will present some of the key models, ideas, and tools of information theory, which invite further investigation into this intriguing subject. Some parts of this talk are based on recent joint work with Gowtham Kumar and Cheuk Ting Li and on discussions with Young-Han Kim.
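For readers less familiar with the quantity at the heart of the abstract, Shannon entropy for a discrete distribution can be computed in a few lines. This is just the standard textbook definition in bits, nothing specific to the talk itself:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum over x of p(x) * log2(p(x)), in bits.
    Zero-probability outcomes contribute nothing, so they are skipped."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # a fair coin carries 1.0 bit
print(entropy([0.25] * 4))    # a uniform choice among four outcomes: 2.0 bits
```

The common-information measures the talk surveys generalize this single-source quantity to multiple correlated sources, which is where the interesting subtleties arise.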
Biography: Abbas El Gamal is the Hitachi America Professor in the School of Engineering and Chair of the Department of Electrical Engineering at Stanford University. He received his Ph.D. degree in electrical engineering from Stanford University in 1978. He was an Assistant Professor in the Department of Electrical Engineering at the University of Southern California (USC) from 1978 to 1980. His research interests and contributions have spanned the areas of information theory, wireless networks, CMOS imaging sensors and systems, and integrated circuit design and design automation. He has authored or coauthored over 200 papers and 30 patents in these areas. He is coauthor of the book Network Information Theory (Cambridge Press 2011). He has won several honors and awards, including the 2012 Claude E. Shannon Award and the 2004 Infocom best paper award. He is a member of the National Academy of Engineering and a Fellow of the IEEE. He has been active in several IEEE societies, including serving on the Board of Governors of the IT Society, where he is currently its President. He cofounded and/or served in various leadership roles at several semiconductor, EDA, and biotechnology companies.
The American Society for Cybernetics is currently planning their 50th Anniversary Conference. Entitled “Living in Cybernetics”, it will be held between August 3 and August 9, 2014, at George Washington University in Washington D.C. For more registration and additional details please visit the conference website.