Hopkins in Hollywood | Johns Hopkins Alumni Event on 1-12-17

Join students and alumni from the Film and Media Studies Program in Culver City

I’ve been invited to participate in a panel discussion as part of an Intersession course by the Johns Hopkins Film and Media Studies Program. I hope fellow alumni in the entertainment and media sectors will come out and join us in Culver City on Thursday.

Join the Hopkins in Hollywood Affinity Group (AEME LA) as they welcome Linda DeLibero, Director of the JHU Film and Media Studies Program, and current students of the program for a dynamic evening of networking which features an alumni panel of industry experts.

Open to alumni, students, and friends of Hopkins, this event is sponsored by Donald Kurz (A&S ’77), Johns Hopkins University Emeritus Trustee and School of Arts and Sciences Advisory Board Member, and the Hopkins in Hollywood (AEME LA) Affinity Group.

Event Date: Thursday, January 12, 2017
Start Time: 6:30pm
End Time: 8:30pm


Donald Kurz, A&S ’77

Donald Kurz is Chairman and CEO of Omelet LLC, an innovative new media and marketing services firm based in Los Angeles. Previously, Mr. Kurz was co-founder and CEO of the hedge fund Artemis Capital Partners. Between 1990 and 2005, he was Chairman, President, and CEO of EMAK Worldwide, Inc., a global, NASDAQ-traded firm providing strategic and marketing services to Fortune 500 companies internationally. Mr. Kurz’s 25 years of senior leadership experience include management positions with Willis Towers Watson, PwC, and the J.C. Penney Company. He is a Trustee Emeritus of Johns Hopkins University, having served for 12 years on the Hopkins board. He received an MBA from the Columbia University Graduate School of Business and a BA from Johns Hopkins University.


Jason Altman, A&S ’99

Jason Altman is an Executive Producer at Activision, working on the Skylanders franchise and new development projects. Prior to Activision, he spent five years at Ubisoft Paris in various leadership roles, most recently as Executive Producer of Just Dance, the #1 music video game franchise. He is a veteran game producer who loves the industry and is a proud graduate of the media studies program at Johns Hopkins.


Paul Harris Boardman, A&S ’89

Paul Boardman wrote The Exorcism of Emily Rose (2005) and Devil’s Knot (2014), both of which he also produced, and Deliver Us From Evil (2014), which he also executive produced. In 2008, Paul produced The Day the Earth Stood Still for Fox, and he did production rewrites on Poltergeist, Scream 4, The Messengers, and Dracula 2000, as well as writing and directing the second unit for Hellraiser: Inferno (2000) and writing Urban Legends: Final Cut (2000). Paul has written screenplays for various studios and production companies, including Trimark, TriStar, Phoenix Pictures, Miramax/Dimension, Disney, Bruckheimer Films, IEG, APG, Sony, Lakeshore, Screen Gems, Universal, and MGM.


Devon Chivvis, A&S ’96

Devon Chivvis is a showrunner/director/producer of narrative and non-fiction television and film. Inspired by a life-long passion for visual storytelling combined with a love of adventure and the exploration of other cultures, Devon has made travel a priority through her work in film and television. Devon holds a B.A. from Johns Hopkins University in International Relations and French, with a minor in Italian.


Chris Aldrich, Engr ’96

Chris began his career while at Hopkins, where he ran several campus movie groups and oversaw more than $200,000 in renovations to Shriver Hall, including a new screen, sound system, and 35mm projection. He also ran the 29th Annual Milton S. Eisenhower Symposium, “Framing Society: A Century of Cinema,” on the 100th anniversary of the moving picture.

Following Hopkins, he joined Creative Artists Agency, where he worked in Motion Picture Talent and on music crossover. He later joined Davis Entertainment, with a deal at 20th Century Fox, where he worked on the productions of Heartbreakers, Dr. Dolittle 2, and Behind Enemy Lines, as well as the acquisition and development of Alien vs. Predator, Paycheck, Flight of the Phoenix, Garfield, The Man from U.N.C.L.E., I, Robot, and countless others.

Missing the faster pace of representation, he later joined Writers & Artists Agency for several years working in their talent, literary, and book departments. Since that time he’s had his own management company focusing on actors, writers, authors, and directors. Last year he started Boffo Socko Books, an independent publishing company and recently put out the book Amerikan Krazy.

Source: Hopkins in Hollywood | Johns Hopkins Alumni


Register Here

More information: Office of Alumni Relations
800-JHU-JHU1 (548-5481)

Part of the course:

The Entertainment Industry in Contemporary Hollywood

Students will have the opportunity to spend one week in Los Angeles with Film and Media Studies Director Linda DeLibero. Students will meet and network with JHU alums in the entertainment industry, as well as heads of studios and talent agencies, screenwriters, directors, producers, and various other individuals in film and television. The fee associated with this intersession course is $1,400 (financial support is available for those who qualify). Permission of Linda DeLibero is required. Film and Media Studies seniors and juniors will be given preference for the eight available slots, followed by senior minors. Students are expected to arrive in Los Angeles on January 8. The actual course runs January 9–13, with lodging check-in on January 8 and check-out on January 14.

Course Number: AS.061.377.60
Credits: 1
Distribution: H
Days:  Monday 1/9/2017 – Friday 1/13/2017
Times:  M – TBA | Tu- TBA | W- TBA | Th- TBA | F- TBA
Instructor: Linda DeLibero

Omelet LLC, 3540 Hayden Ave, Culver City, CA 90232


Stop Publishing Web Pages | Anil Dash

Stop Publishing Web Pages by Anil Dash (anildash.com)

Stop Publishing Web Pages

Most users on the web spend most of their time in apps. The most popular of those apps, like Facebook, Twitter, Gmail, Tumblr and others, are primarily focused on a single, simple stream that offers a river of news which users can easily scroll through, skim over, and click on to read in more depth.

Most media companies on the web spend all of their effort putting content into content management systems which publish pages. These pages work essentially the same way that pages have worked since the beginning of the web, with a single article or post living at a particular address, and then tons of navigation and cruft (and, usually, advertisements) surrounding that article.

Users have decided they want streams, but most media companies are insisting on publishing more and more pages. And the systems which publish the web are designed to keep making pages, not to make customized streams.

It’s time to stop publishing web pages.

But I’m Reading This On A Web Page Right Now!

Obviously, I’ve written this in an old-style content publishing system, and this piece lives on my website as an old-fashioned HTML page. But if I had my preference, I’d write up an article like this, and it’d seamlessly glide into a clean, simple stream of my writing, organized by topic and sorted with the newest stuff on top. Blogs have always worked this way, but they were shoehorning this stream-like behavior into the best representation possible under the old page model.

I don’t have a tool I can use to run my website which will output a stream that works the right way. “What about using Tumblr to publish your blog?” you ask. Well, besides the fact that my site would have to run on their infrastructure, individual tumblr-style blogs don’t allow you as a reader to personalize or customize the types of content in the stream, the way you would be choosing people to follow on Tumblr, Facebook or Twitter. You can’t choose to follow just the music-related posts on my blog, ignoring the ones about technology.

This isn’t just about how the content looks, it’s also about how it works. The smart, responsive, dynamic apps most of us use on the web everyday have all kinds of subtle but powerful bits of functionality which appear as we hover or click on items in a stream. Meanwhile, our pages are still piling a row of awkward-looking share-button cruft at the bottom.

Also: Dollars

The vast majority of advertising online is dependent on a page-view model that users have overwhelmingly decided to abandon. Facebook, Twitter, Tumblr and others will succeed by making in-stream advertisements that fit in with the native content of their networks. Meanwhile, page-based sites are cramming every corner and bit of white space on their sites with ads that only ever decrease in effectiveness until they are made even larger and more intrusive every few years.

Stream-based content naturally flows across different devices and media, from tiny phones to tablets to giant desktop monitors, with an adaptivity that works naturally hand-in-hand with responsive design. Page based ads basically have to be reimagined on each platform, and fundamentally don’t work in mobile form factors.

Streams of content can easily be read in friendly native apps on mobile platforms with the content flowing through simple APIs. Pages get squeezed into faux-mobile app experiences that look just enough like native apps to be frustrating and annoying when they don’t perform correctly. Pages tell users there’s no mobile version of this story available, or accidentally redirect an interested user to the site’s homepage, from where they quickly depart. Pages stop your flow.

Pages vs. Streams

Let’s Fix This

So: Start publishing streams. Start moving your content management system towards a future where it outputs content to simple APIs, which are consumed by stream-based apps that are either HTML5 in the browser and/or native clients on mobile devices. Insert your advertising into those streams using the same formats and considerations that you use for your own content. Trust your readers to know how to scroll down and skim across a simple stream, since that’s what they’re already doing all day on the web. Give them the chance to customize those streams to include (or exclude!) just the content they want.

Pay attention to the fact that all the links you click on Twitter, on Facebook, on Pinterest, all take you out of the simple flow of those apps and into a jarring, cluttered experience where the most appealing option is the back button. Stop being one of those dead-end experiences and start being more like what users have repeatedly demonstrated they prefer.

And if you’re smugly thinking “oh, we’re an app — he’s only talking about publishing content, so we don’t have to pay attention”, then you should get to work, too. Except for power tools which need to make use of the screen in a particular way, most of our other apps are going to be rearranged into streams, too.

Further Reading

  • From ten years ago, Stories and Tools (Michael Sippey, now Twitter’s head of consumer product, liked this piece so much back then that he republished it.)
  • At Activate, we created a presentation called “What Matters” at the end of last year; It starts by offering some data about use of page-based sites vs. stream-based sites by web users.

Source: Stop Publishing Web Pages – Anil Dash

Reply to Antonio Sánchez-Padial about webmentions for academic research

a tweet by Antonio Sánchez-Padial (Twitter)

Many academics are using academic related social platforms (silos) like Mendeley, Academia.edu, Research Gate and many others to collaborate, share data, and publish their work. (And should they really be trusting that data to those outside corporations?)

A few particular examples: I follow physicist John Carlos Baez and mathematician Terry Tao, who both have one or more academic blogs on various topics which they POSSE to several social silos, including Google+ and Twitter. While they get some high-quality response to posts natively, some of their conversations are forked/fragmented across those other silos. It would be far more useful if they were using webmentions (and Brid.gy) so that all of that conversation was aggregated back to their original posts. If they supported webmentions directly, I suspect that some of their collaborators would post their responses on their own sites and send them after publication as comments. (This also helps to protect primacy and the integrity of the original responses, since a receiving site could otherwise moderate them out of existence, delete them outright, or even modify them!)

It’s pretty common for researchers to self-publish their work (sometimes known as academic samizdat) on their own site and then cross-publish to a pre-print server (like arXiv.org) prior to publishing in a (preferably) major journal. But there’s really no reason they shouldn’t just use their own personal websites, or online research journals like yours, to publish their work and then use that to collect direct comments, responses, and replies to it. Except possibly where research requires hosting uber-massive data sets, which may be bandwidth-limiting (or highly expensive) at the moment, there’s no reason why researchers shouldn’t self-host (and thereby own) all of their work.

Instead of publishing to major journals, which are all generally moving to an online subscription/readership model anyway, they might publish to topic specific hubs (akin to pre-print servers or major publishers’ websites). This could be done in much the same way many Indieweb users publish articles/links to IndieWeb News: they publish the piece on their own site and then syndicate it to the hub by webmention using the hub’s endpoint. The hub becomes a central repository of the link to the original as well as making it easier to subscribe to updates via email, RSS, or other means for hundreds or even thousands of researchers in the given area. Additional functionality could be built into these to support popularity measures as well to help filter some of the content on a weekly or monthly basis, which is essentially what many publishers are doing now.
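The hub-syndication step described above relies on the Webmention protocol’s two moving parts: the receiver advertises an endpoint, and the sender POSTs its `source` and `target` URLs to that endpoint. As a rough sketch of the discovery half (the function name is my own, and this simplified version only checks the HTTP `Link` header; a full implementation must also scan the target page’s HTML for `<link>`/`<a>` rel values and resolve relative URLs):

```python
import re

def parse_webmention_endpoint(link_header):
    """Extract a webmention endpoint from an HTTP Link header value.

    Simplified sketch: a real sender must also check <link> and <a>
    tags in the target page's HTML and resolve relative URLs.
    """
    for part in link_header.split(","):
        # Each part looks like: <https://example.com/endpoint>; rel="webmention"
        m = re.search(r'<([^>]+)>\s*;\s*rel="?([^";]*)"?', part.strip())
        if m and "webmention" in m.group(2).split():
            return m.group(1)
    return None

# Sending is then a single form-encoded POST of `source` (the post doing
# the citing) and `target` (the post being cited) to that endpoint.
```

For example, `parse_webmention_endpoint('<https://hub.example/wm>; rel="webmention"')` returns `"https://hub.example/wm"`.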

In the end, citation metrics could be measured directly on the author’s original page by the number of incoming webmentions it has received, since others referencing the work would be linking to it and thereby sending webmentions. (PLOS ONE does something kind of like this by showing related tweets which mention particular papers: here’s an example.)

Naturally there is some fragility in some of this, and protective archival measures should be taken to preserve sites beyond their authors’ lives, but much of this could be done by institutional repositories like university libraries, which do much of this type of work already.

I’ve been meaning to write up a much longer post about how to use some of these types of technologies to completely revamp academic publishing; perhaps I should finish that soon. Hopefully the above will give you a bit of an idea of what could be done.


Reply to Manton Reece: This morning I launched the Kickstarter project for Micro.blog. Really happy with the response. Thank you, everyone!

Manton Reece by Manton Reece (manton.org)

Manton, I’ve been following your blog and your indieweb efforts for creating a microblogging platform for a while. I’m excited to see your Kickstarter effort doing so well this afternoon!

As a fellow IndieWeb proponent, and since I know how much work such an undertaking can be, I’m happy to help you with the e-book and physical book portions of your project on a voluntary basis if you’d like. I’ve got a small publishing company set up to handle the machinery of such an effort as well as being able to provide services that go above and beyond the usual low-level services most self-publishing services might provide. Let me know if/how I can help.


Chris Aldrich is reading “What to expect when you’re publishing on Amazon Kindle Store”

Nothing new or interesting here.



Chris Aldrich is reading “Lulu.com launches academic service suite – Glasstree”


Weekly Recap: Interesting Articles 7/24-7/31 2016

Some of the interesting things I saw and read this week

Went on vacation or fell asleep at the internet wheel this week? Here’s some of the interesting stuff you missed.

Science & Math


Indieweb, Internet, Identity, Blogging, Social Media



Some Thoughts on Academic Publishing and “Who’s downloading pirated papers? Everyone” from Science | AAAS

Who's downloading pirated papers? Everyone by John Bohannon (Science | AAAS)
An exclusive look at data from the controversial web site Sci-Hub reveals that the whole world, both poor and rich, is reading pirated research papers.

Sci Hub has been in the news quite a bit over the past half a year and the bookmarked article here gives some interesting statistics. I’ll preface some of the following editorial critique with the fact that I love John Bohannon’s work; I’m glad he’s spent the time to do the research he has. Most of the rest of the critique is aimed at the publishing industry itself.

From a journalistic standpoint, I find it disingenuous that the article didn’t actually hyperlink to Sci Hub. Neither did it link out (or provide a full quote) to Alicia Wise’s Twitter post(s) nor link to her rebuttal list of 20 ways to access their content freely or inexpensively. Of course both of these are editorial related, and perhaps the rebuttal was so flimsy as to be unworthy of a link from such an esteemed publication anyway.

Sadly, Elsevier’s list of 20 ways of free/inexpensive access doesn’t really provide any simple coverage for graduate students or researchers in poorer countries, who are the likeliest groups of people using Sci Hub, unless they fraudulently claim membership in a class to which they don’t belong; and is that morally any better than the original theft? It’s almost assuredly never used by patients, who seem to be covered under one of the options, because the option to do so is painfully undiscoverable behind their typical $30/paper paywalls. Their patchwork of free-access options is difficult enough to discern on its own, and one must keep in mind that this is just one of dozens of publishers a researcher must navigate to find the one thing they’re looking for right now (not to mention the thousands of times they need to do this throughout a year, much less a career).

Consider this experiment, which could be a good follow-up to the article: is it easier to find and download a paper by title/author/DOI via Sci Hub (a minute) versus through any of the other publishers’ platforms with a university subscription (several minutes) or without a subscription (an hour or more, sometimes days)? Just consider the time it would take to dig up every one of 30 references in an average journal article: maybe half an hour via Sci Hub versus the days and/or weeks it would take to jump through the multiple hoops to first discover, read about, and then gain access to and download them from over 14 providers (and this presumes the others provide some type of “access” like Elsevier).

Those who lived through the Napster revolution in music will realize that the dead simplicity of that system is primarily what helped kill the music business compared to the ecosystem that exists now, with easy access through multiple streaming sites (Spotify, Pandora, etc.) or inexpensive paid options like iTunes. If the publishing business doesn’t want to get completely killed, it’s going to need to create the iTunes of academia. I suspect they’ll have internal bean-counters watching the percentage of the total (now apparently 5%) and will probably only do something before it passes a much larger threshold, though I imagine that they’re really hoping the number stays stable, which would signal that they needn’t be concerned. They’re far more likely to continue to maintain their status quo practices.

Some of this ease-of-access argument is truly borne out by the statistics of open access papers which are downloaded by Sci Hub–it’s simply easier to both find and download them that way compared to traditional methods; there’s one simple pathway for both discovery and download. Surely the publishers, without colluding, could come up with a standardized method or protocol for finding and accessing their material cheaply and easily?

“Hart-Davidson obtained more than 100 years of biology papers the hard way—legally with the help of the publishers. ‘It took an entire year just to get permission,’ says Thomas Padilla, the MSU librarian who did the negotiating.” John Bohannon in Who’s downloading pirated papers? Everyone

Personally, I use relatively advanced tools like LibX, which happens to be offered by my institution and which I feel isn’t very well known, and it still takes me longer to find and download a paper than it would via Sci Hub. God forbid some enterprising hacker were to create a LibX community version for Sci Hub. Come to think of it, why haven’t any of the dozens of publishers built and supported simple tools like LibX which make their content easy to access? If we consider the analogy of academic papers to the introduction of machine guns in World War I, why should modern researchers still be using single-load rifles against an enemy that has access to nuclear weaponry?

My last thought here comes on the heels of the two tweets from Alicia Wise mentioned, but not shown in the article:

She mentions that the New York Times charges more than Elsevier does for a full subscription. This is tremendously disingenuous, as Elsevier is but one of dozens of publishers to which one would have to subscribe to have access to the full panoply of material researchers typically need. Further, neither Elsevier nor its competitors make their material as easy to find and access as the New York Times does. Nor do they discount access in an attempt to find the subscription price their users would find financially acceptable. Case in point: while I often read the New York Times, I rarely go over their monthly limit of articles, so I don’t need any type of paid subscription. Solely because they made me an interesting offer to subscribe for 8 weeks for 99 cents, I took them up on it and renewed that deal for another 8 weeks. Not finding it worth the full $35/month price point, I attempted to cancel. I had to cancel the subscription via phone, but why? The NYT customer rep made me no fewer than 5 different offers at ever-decreasing price points (including the 99 cents for 8 weeks I had been getting!) to try to keep my subscription. Neither Elsevier nor any of its competitors has ever tried (much less so hard) to earn my business. (I’ll further posit that this is because it’s easier to fleece at the institutional level with bulk negotiation, a model not too dissimilar to the textbook business pressuring professors on textbook adoption rather than trying to sell directly to the end consumer, the student, which I’ve written about before.)

(Trigger alert: apophasis to come.) And none of this even mentions the quality control that is (or isn’t) put into the journals or papers themselves. Fortunately one needn’t go further than Bohannon’s other writings, like Who’s Afraid of Peer Review? Then there are the hordes of articles on poor research design, misuse of statistical analysis, and inability to repeat experiments. Not to give them any ideas, but lately it seems like Elsevier buying the Enquirer and charging $30 per article might not be a bad business decision. Maybe they just don’t want to play second banana to TMZ?

Interestingly, there’s a survey at the end of the article which indicates some additional sources of academic copyright infringement. I do have to wonder how the data from the survey will be used. There’s always the possibility that logged-in users indicating they’re circumventing copyright are opening themselves up to litigation.

I also found the concept of using the massive data store as a means of applied corpus linguistics for science an entertaining proposition. This type of research could mean great things for science communication in general. I have heard of people attempting to do such meta-analysis to guide the purchase of potential intellectual property for patent trolling as well.

Finally, for those who haven’t done it (ever or recently), I’ll recommend that it’s certainly well worth their time and energy to attend one or more of the many 30-60 minute sessions most academic libraries offer at the beginning of their academic terms to train library users on research tools and methods. You’ll save yourself a huge amount of time.


Thoughts on “Some academics remain skeptical of Academia.edu” | University Affairs

This morning I ran across a tweet from colleague Andrew Eckford:

His response was probably innocuous enough, but I thought the article should be put to task a bit more.

“35 million academics, independent scholars and graduate students as users, who collectively have uploaded some eight million texts”

35 million users is an okay number, but their engagement must be spectacularly bad if only 8 million texts are available. How many researchers do you know who’ve published only a quarter of an article anywhere, much less gotten tenure?

“the platform essentially bans access for academics who, for whatever reason, don’t have an Academia.edu account. It also shuts out non-academics.”

They must have changed this, as pretty much anyone with an email address (including non-academics) can create a free account and use the system. I’m fairly certain that the platform was always open to the public from the start, but the article doesn’t seem to question the statement at all. If we want to argue about shutting out non-academics or even academics in poorer countries, let’s instead take a look at “big publishing” and their $30+/paper paywalls and publishing models, shall we?

“I don’t trust academia.edu”

Given his following discussion, I can only imagine what he thinks of big publishers in academia and that debate.

“McGill’s Dr. Sterne calls it “the gamification of research,”

Most research is too expensive to really gamify in such a simple manner. Many researchers are publishing to either get or keep their jobs and don’t have much time, information, or knowledge to try to game their reach in these ways. And if anything, the institutionalization of “publish or perish” has already accomplished far more “gamification”, Academia.edu is just helping to increase the reach of the publication. Given that research shows that most published research isn’t even read, much less cited, how bad can Academia.edu really be? [Cross reference: Reframing What Academic Freedom Means in the Digital Age]

If we look at Twitter and the blogging world as an analogy with Academia.edu and researchers, Twitter had a huge ramp up starting in 2008 and helped bloggers obtain eyeballs/readers, but where is it now? Twitter, even with a reasonable business plan is stagnant with growing grumblings that it may be failing. I suspect that without significant changes that Academia.edu (which is a much smaller niche audience than Twitter) will also eventually fall by the wayside.

The article rails against not knowing what the business model is or what’s happening with the data. I suspect that the platform itself doesn’t have a very solid business plan and they don’t know what to do with the data themselves except tout the numbers. I’d suspect they’re trying to build “critical mass” so that they can cash out by selling to one of the big publishers like Elsevier, who might actually be able to use such data. But this presupposes that they’re generating enough data; my guess is that they’re not. And on that subject, from a journalistic viewpoint, where’s the comparison to the rest of the competition including ResearchGate.net or Mendeley.com, which in fact was purchased by Elsevier? As it stands, this simply looks like a “hit piece” on Academia.edu, and sadly not a very well researched or reasoned one.

In sum, the article sounds to me like a bunch of Luddites running around yelling “fire”, particularly when I’d imagine that most referred to in the piece feed into the more corporate side of publishing in major journals rather than publishing it themselves on their own websites. I’d further suspect they’re probably not even practicing academic samizdat. It feels to me like the author and some of those quoted aren’t actively participating in the social media space to be able to comment on it intelligently. If the paper wants to pick at the academy in this manner, why don’t they write an exposé on the fact that most academics still have websites that look like they’re from 1995 (if, in fact, they have anything beyond their University’s mandated business card placeholder) when there are a wealth of free and simple tools they could use? Let’s at least build a cart before we start whipping the horse.

For academics who really want to spend some time and thought on a potential solution to all of this, I’ll suggest that they start out by owning their own domain and own their own data and work. The #IndieWeb movement certainly has an interesting philosophy that’s a great start in fixing the problem; it can be found at http://www.indiewebcamp.com.


Moneyball for Book Publishers: A Detailed Look at How We Read

A reader analytics company in London wants to use data on our reading habits to transform how publishers acquire, edit and market books.

likes Moneyball for Book Publishers: A Detailed Look at How We Read – The New York Times



Git and Version Control for Novelists, Screenwriters, Academics, and the General Public

Revision (or version) control is used in tracking changes in computer programs, but it can easily be used for tracking changes in almost any type of writing from novels, short stories, screenplays, legal contracts, or any type of textual documentation.

Marginalia and Revision Control

At the end of April, I read an article entitled “In the Margins” in the Johns Hopkins University Arts & Sciences magazine.  I was particularly struck by the comments of eminent scholar Jacques Neefs on page thirteen (or paragraph 20) about computers making marginalia a thing of the past:

Neefs believes contemporary literature is losing a valuable component in an age when technology often precludes and trumps the need to save manuscripts or rough drafts. But it is not something that keeps him up at night. ‘The modern technique of computers and everything makes [marginalia] a thing of the past,’ he says. ‘There’s a new way of creation. Some would say it’s tragic, but something new has been invented. I don’t consider it tragic. There are still great writers who write and continue to have a way to keep the process.’

Jacques Neefs (Image courtesy of Johns Hopkins University)

I actually think that he may be completely wrong and that current technology allows us to keep far more marginalia! (Has anyone heard of digital exhaust?) The bigger issue may be that many writers just don’t know how to keep a better running log of their work to maintain all the relevant marginalia they’re actually producing. (Of course there’s also the subsequent broader librarian’s “digital dilemma” of maintaining formats for the future. As an example, think about how easy or hard it might be for you to read that ubiquitous 3.5 inch floppy disk you used in 1995.)

As a technologist who has spent many years in the entertainment industry, I feel compelled to point everyone toward the concept of revision control (or version control) from the realm of computer science. Though it's primarily used to track changes in computer programs, and is often a tool used by large teams of programmers, it can very easily be used to track changes in almost any type of writing: novels, short stories, screenplays, legal contracts, or nearly any other textual document.

Example Use Cases for Revision Control


As a direct example, I'm using what is known as a Git repository to track every change I make in a textbook I'm currently writing. I can literally go back and view every change I've made since beginning the project, so though I'm directly revising one (or more) text files, all of my "marginalia" and revisions are saved and available. Currently I'm only doing it for my own reference and as an additional backup, not supposing that anyone other than myself or an editor may ever want to peruse it. If I were working in conjunction with others, there are also ways to track the changes, edits, or notes that they (perhaps an editor or collaborator) might make.

In addition to the general backup of the project (in case of catastrophic computer failure), I also have the ability to go back and find that paragraph (or those pages) I deleted in haste last week but now desperately want back, instead of having to recreate them de novo.
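
As a sketch of how such a recovery works (with hypothetical file names and phrasing), Git can both locate the commit where text disappeared and restore the older version:

```shell
# Build a tiny repository containing a paragraph that gets hastily deleted
mkdir recovery-demo && cd recovery-demo && git init -q
git config user.name "Author" && git config user.email "author@example.com"
printf "Keep this paragraph.\nA hastily deleted paragraph.\n" > draft.txt
git add draft.txt && git commit -qm "Draft with both paragraphs"
printf "Keep this paragraph.\n" > draft.txt
git commit -qam "Trim the second paragraph"

# A week later: find which commit touched the text, then restore that version
git log -S "hastily deleted" --oneline   # the "pickaxe" search lists commits adding/removing the phrase
git checkout HEAD~1 -- draft.txt         # bring back the file as it was before the deletion
```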

Because it’s all digital, future scholars also won’t have problems parsing my handwriting issues as has occasionally come up in differentiating Mary Shelley’s writing from that of her husband in digital projects like the Shelley Godwin Archive. The fact that all changes are tracked and placed in a tree-like structure will indicate who wrote what and when and will indicate which changes were ultimately accepted and merged into the final version.

Screenplays in Hollywood

One particular use case I can easily see for such technology is tracking changes in screenplays over time. I'm honestly shocked that production companies, or even more likely studios, don't use such technology to follow changes in drafts over time. Such tracking would certainly make Writers Guild of America (WGA) arbitrations much easier, as literally every contribution to a script could be traced to give screenwriters appropriate credit. The end result, with the easy ability to time-machine one's way back into older drafts, is truly lovely, and the output gives far more information about changes in a script than the traditional and all-too-simple (*), which screenwriters use to indicate that something (anything) changed on a specific line, or the different colored pages used on scripts during production.

I can also picture future screenwriters using services like GitHub as platforms for storing and distributing their screenplays to potential agents, managers, and producers.

Redlining Legal Documents

Having seen thousands of legal agreements go back and forth over the years, I see revision control as a natural tool for tracking the redlining and changes of legal documents as they evolve before they are finally (or even never) executed. I have to imagine that being able to abstract out the appropriate metadata may, in the long run, actually help attorneys, agents, etc. become better negotiators, but something like this is a project for another day.


Revision Control for Academics

In addition to direct research for projects undertaken by scholars like Neefs, academics should look into using revision control in their own daily work and writing. While writing a book, paper, journal article, essay, or monograph (or, for graduate students, a thesis), one could use a Git repository not only to save and back up all of one's own work, but also to preserve for future scholars the "marginalia" created while working out one's written thoughts in digital form.

I can easily picture Git as a very simple “next step” in furthering the concept of the digital humanities as well as in helping to bridge the gap between C.P. Snow’s “two cultures.” (I’d also suggest that revision control is a relatively simple step one could take before learning a particular programming language, which I think should be a mandatory tool in everyone’s daily toolbox regardless of their field(s) of interest.)

Git Logo 

Start Using Revision Control

“But how do I get started?” you ask.

Know going in that it may take part of a day to get things set up and running, but once you've started with the basics, things are actually pretty easy, and you can continue to learn the more advanced subtleties as you progress. Once things are working smoothly, the additional overhead won't be much more than the old habit of hitting Ctrl-S to save one of your Word documents in the time before auto-save became ubiquitous.

First, choose one of the myriad revision control systems that exist. For the sake of brevity in this short introductory post, I'll simply suggest taking a very close look at Git because of its ubiquity and popularity in the computer science world and the tremendous amount of free information and support available for it from a variety of sites on the internet. Git has versions for all major operating systems (Windows, MacOS, and Linux), and its relatively long and robust life within the computer science community means that it's very stable and offers many resources for the uninitiated to draw upon.

Once one has Git installed and has begun using it, I'd then recommend linking one's local copy of the repository to a cloud storage solution like GitHub or BitBucket. While GitHub is certainly one of the most popular Git-related services out there (because it acts, in part, as the hub for a large portion of the open internet and thus promotes sharing), I often recommend BitBucket, as it allows free, unlimited, private but still shareable repositories, while GitHub requires a small subscription fee for keeping one's work private. Having a repository in the cloud helps tremendously in that your work is available and downloadable from almost anywhere, and it also serves as a de facto backup solution.
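
Linking a local repository to a hosted copy takes only two commands. In this sketch the "cloud" is simulated with a local bare repository so everything runs on one machine; in practice the URL in `git remote add` would point at GitHub or BitBucket:

```shell
# A bare repository standing in for a hosted service like GitHub or BitBucket
mkdir cloud-demo && cd cloud-demo
git init -q --bare cloud.git

# A local working repository with one committed draft
mkdir book && cd book && git init -q
git config user.name "Author" && git config user.email "author@example.com"
echo "Chapter 1" > chapter1.txt
git add chapter1.txt && git commit -qm "First draft"

# Connect to the "cloud" and push; with GitHub this would be e.g.
#   git remote add origin https://github.com/yourname/book.git
git remote add origin ../cloud.git
git push -q -u origin HEAD        # the full history is now backed up off this machine
```

After this one-time setup, backing up new work is simply a matter of committing and running `git push`.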

I’ve recently been playing around with version control to help streamline the writing/editing process for a book I’ve been writing. Though Git and it’s variants probably seem more daunting than they should to the everyday user, they really represent a very powerful tool. I’ve spent less than two days learning the basics of both Git and hosted repositories (GitHub and Bitbucket), and it has been more than well worth the minor effort.

There is a huge wealth of information on revision control in general, and on installing and using Git in particular, available on the internet, including full textbooks. For complete beginners, I'd recommend starting with The Chronicle's "A Gentle Introduction to Version Control." Keep in mind that though some of these resources look highly technical, it's because many try to enumerate every function one could potentially desire, when even the basic core functionality is more than enough to begin with. (I could analogize it to learning to drive a car versus reading the full manual so that you know how to take the engine apart and put it back together from scratch. To start with revision control, you only need to learn to "drive.") Professors might also look to their institutional libraries, which may host small sessions on learning such tools, or enlist the help of colleagues or students in the computer science department. For others, I'd recommend taking a look at Git's primary website. BitBucket has an excellent step-by-step tutorial (and troubleshooting guide) for setting up the requisite software and using it.

What do you use for revision control?

I’ll welcome any thoughts, experiences, or additional resources one might want to share with others in the comments.

Rap Genius, a Textual Annotation Browser for Education, Digital Humanities, Science, and Publishing

Since the beginning of January, I've come back to regularly browsing and using the website Rap Genius. I'm sure that some of the education uses, including poetry and annotations of classics, existed the last time I visited, but I was very interested to see some of the scientific journal article uses which I hadn't seen before. Very quickly browsing around opened up a wealth of ideas for using the platform within the digital humanities as well as for a variety of educational uses.

Rap Genius logo

Overview of Rap Genius

Briefly, the Rap Genius website was originally set up as an innovative lyrics service allowing users not only to upload song lyrics, but to mark them up with annotations on the meanings of words and phrases and to provide information about the pop-culture references within the lyrics themselves. (It's not too terribly different from Google's now-defunct Sidewiki or the impressive Highbrow textual annotation browser, but has some subtle differences as well as improvements.)

Users can use not only text, but photos, video, and even audio to supplement the listings. Built-in functionality includes the ability to link works to the popular audio services SoundCloud and Spotify, as well as YouTube. Alternatively, one might think of it as VH1's "Pop-up Video," but for text on the internet. Ultimately the site expanded to include the topics of rock, poetry, and news. The rock section is fairly straightforward, following the format of the rap section, while the poetry section includes not only works of poetry (from The Rime of the Ancient Mariner to the King James version of The Bible), but also plays (the works of William Shakespeare) and complete novels (like F. Scott Fitzgerald's The Great Gatsby). News includes articles as well as cultural touchstones like the 2013 White House Correspondents' Dinner speech and the recent State of the Union. All of the channels within the Rap Genius platform share the same types of functionality, applied to slightly different categories to help differentiate the content and make things easier to find. Eventually there may be a specific "Education Genius" (or other) landing page(s) to split out the content, depending on user needs.

Even at first blush, I can see this type of website functionality being used in a variety of educational settings, including open access journals, classroom use, close readings, MOOCs, publishing in general, and even maintaining simple-to-use websites for classes. The best part is that the ecosystem is actively growing and expanding, with the recent release of an iPhone app and the announcement of a major deal with Universal to license music lyrics.

General Education Use

To begin with, Rap Genius' YouTube channel includes an excellent short video on how Poetry Genius might be used in a classroom setting to facilitate close readings. In addition to the ability to make annotations, the site can be used to maintain a class-specific website (no need for other blogging platforms like WordPress or Blogger for things like this anymore), with nice additions like a class roster built right in. Once material begins to be posted, students and teachers alike have a broad set of tools to add content, make annotations, ask questions, and provide answers in an almost real-time setting.

Screen capture from Poetry Genius featuring The Great Gatsby

MOOC Use Cases

Given the rapid growth of the MOOC revolution (massive open online courses) over the past several years, one of the remaining difficulties in administering such a class hinges not only on easily providing audiovisual content to students, but on allowing them a means of easily interacting with it and with each other in the learning process. Poetry Genius (aka Education Genius) has a very interesting angle on solving both of these problems, and, in fact, I can easily see the current version of the platform replacing competing platforms like Coursera, EdX, and Udacity wholesale.

Currently most MOOCs provide some type of simple topic-based threaded fora in which students post comments, questions, and answers. In many MOOCs this format becomes ungainly because of the size of the class (10,000+ students) and the quality of the content being placed into it. Many students simply eschew the fora because the time commitment per unit of knowledge/value gained is not worth their while. Within the Poetry Genius platform, students can comment directly on the material, ask questions, or even propose improvements, and the administrators (the professor or teaching assistants in this case) can accept, reject, or send feedback requests to students to amend their work and add it to the larger annotated work. Fellow classmates can also vote individual comments up or down.

As I was noticing the interesting education-related functionality of the Rap Genius platform, I ran across what is presumably the first MOOC attempting to integrate the platform into its pedagogical structure. Dr. Laura Nasrallah's HarvardX course "Early Christianity: The Letters of Paul," which started in January, asks students to create Poetry Genius accounts to read and comment on the biblical texts that are a part of the course. The difficulty in attempting to use Poetry Genius for this course is the thousands of "me-too" posters who are simply making what one might consider throw-away commentary rather than the intended "close reading" commentary of a more academic environment. (This type of posting is also seen in many of the fora-based online courses.) Not enough students are contributing substantial material, and when they are, it needs to be edited and curated into the main post more quickly to provide greater value to students as they're reading along. Thus when 20,000 students jump into the fray, there's too much initial chaos, and the value being extracted upon initial use is fairly limited – particularly if one is browsing through dozens of useless comments. It's not until after the fact – once comments have been accepted and curated – that the real value will emerge. The course staff will have to spend more time doing this curation in real time to provide greater value to the students in the class, particularly given the high number of people without intense scholarly training jumping into the system and filling it with generally useless commentary. In internet parlance, the Poetry Genius site is experiencing the "Robert Scoble Effect." (By way of explanation, Robert Scoble is a technology journalist/pundit/early-adopter with a massive follower base. His power-user approach and his large following can drastically change his experience with web-based technology compared to that of the common everyday user, and in the early days of the social media movement they could often bring down new services entirely.)

Typically with the average poem or rap song, the commentary grows slowly and organically and is edited along the way. In a MOOC setting with potentially hundreds of thousands of students, the commentary is a massive fire hose, seemingly useless without immediate real-time editing. Poetry Genius may need a slightly different model for larger MOOC-style courses versus the smaller classroom settings seen in high school or college (10-100 students). In the particular case of "The Letters of Paul," if the course staff had gone into the platform first and seeded some of the readings with their own sample commentary as a model, students would have had a clearer sense of what was expected. I understand Dr. Nasrallah and her teaching assistants are in the system annotating as well, but it should be more obvious which annotations are theirs, to help better guide the "discussion" and act as a model. Certainly the materials generated on Poetry Genius will be much more useful for students who take the course in future iterations. Naturally, Poetry Genius exists primarily for annotation, and I'm sure the creators will be tweaking classroom-specific use as the platform grows and user needs change.

As a contrast to the HarvardX class, and for an additional example, one can also take a peek at Cathy Davidson’s Rap Genius presence for her Coursera class “The History and Future (Mostly) of Higher Education.”

Open Access Journal Use

In my mind, this type of platform could easily and usefully be used for publishing open access journal articles. In fact, one could use the platform to self-publish journal articles and leave them open to ongoing peer review. Sadly, at present there seems to be only a small handful of examples on the site, including a PLOS ONE article, which gives a reasonable sense of the functionality that is possible. Any author could annotate and footnote their own article as well as include a wealth of photos, graphs, and tables, giving a much more multimedia view into their work. Following this, any academic with an account could annotate the text with questions, problems, or suggestions, all of which can be voted up or down as well as remedied within the text itself. Other articles could also directly cross-reference specific sections of previously posted articles.

Individual labs or groups with “journal clubs” could certainly join in the larger public commentary and annotation on a particular article, but higher level administrative accounts within the system can also create a proverbial clean slate on an article and allow members to privately post up their thoughts and commentaries which are then closed to the group and not visible to the broader public. (This type of functionality can be useful for Mrs. Smith’s 10th grade class annotating The Great Gatsby so that they’re not too heavily influenced by the hundreds or possibly thousands of prior comments within a given text as they do their own personal close readings.) One may note that some of this type of functionality can already be seen in competitive services like Mendeley, but the Rap Genius platform seems to take the presentation and annotation functionalities to the next level. For those with an interest in these types of uses, I recommend Mendeley’s own group: Reinventing the Scientific Paper.

A Rap Genius representative indicated they were pursuing opportunities with JSTOR that might expand on these types of uses.


Self-Publishing

Like many social media sites and platforms, including WordPress, Tumblr, and Twitter, Rap Genius gives its users the ability to self-publish almost any type of content. I can see some excellent cross-promotional opportunities between large MOOC-type classes and the site. For example, professors/teachers who have written their own custom textbooks for MOOCs (e.g., Keith Devlin's Introduction to Mathematical Thinking course at Stanford via Coursera) could post the entire text on the Poetry Genius site and use it not only to correct mistakes/typos and make improvements over time, but also to discover what isn't clear to students, who can make comments, ask questions, etc. There's also the possibility that advanced students can actively help clarify portions themselves when there are 10,000+ students and just one or two professors along with one or two teaching assistants. Within or without the MOOC movement, this type of annotation setup may work well to allow authors to tentatively publish, edit, and modify their textbooks, novels, articles, journal articles, monographs, or even Ph.D. theses. I'm particularly reminded of Kathleen Fitzpatrick's open writing/editing of her book Planned Obsolescence via MediaCommons. Academics could certainly look at the Rap Genius platform as a simpler, more user-friendly version of this type of process.

Other Uses

I’m personally interested in being able to annotate science and math related articles and have passed along some tips for the Rap Genius team to include functionality like mathjax to be able to utilize Tex/LaTeX related functionality for typesetting mathematics via the web in the future.

Naturally, there are a myriad of other functionalities that can be built into this type of platform – I’m personally waiting for a way to annotate episodes of “The Simpsons”, so I can explain all of the film references and in-jokes to friends who laugh at their jokes, but never seem to know why – but I can’t write all of them here myself.

Interested users can easily sign up for a general Rap Genius account and dig right into the interface.  Those interested in education-specific functionality can request to be granted an “Educator Account” within the Rap Genius system to play around with the additional functionality available to educators. Every page in the system has an “Education” link at the top for further information and details. There’s also an Educator’s Forum [requires free login] for discussions relating specifically to educational use of the site.

Are there particular (off-label) applications you think you might be able to use the Rap Genius platform for? Please add your comments and thoughts below.

How to Sidestep Mathematical Equations in Popular Science Books

In the publishing industry there is a general rule of thumb that every mathematical equation included in a popular science book will cut its readership in half – presumably in a geometric progression. This typically means that including even a handful of equations gives you an effective readership of zero – something no author, and certainly no editor or publisher, wants.

I suspect that there is a corollary: every picture included in the text helps to increase your readership, though possibly not by as large a proportion.

In any case, while reading Melanie Mitchell's text Complexity: A Guided Tour [Oxford University Press, 2009] this weekend, I noticed that, in what appears to be a concerted effort to include an equation without technically writing it into the text, and to simultaneously increase readership by including a picture, she cleverly used a photo of Boltzmann's tombstone in Vienna! Most fans of thermodynamics will immediately recognize Boltzmann's equation for entropy, S = k log W, which appears engraved on the tombstone above his bust.

Page 51 of Melanie Mitchell's book "Complexity: A Guided Tour"
Page 51 of Melanie Mitchell’s book “Complexity: A Guided Tour” featuring Boltzmann’s tombstone in Vienna.
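
For readers who don't recognize the engraving: S is the entropy of a macrostate, k is Boltzmann's constant, and W is the number of microstates consistent with that macrostate (this is the standard textbook reading of the symbols, not something drawn from Mitchell's book). In LaTeX form:

```latex
S = k \log W
```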

I hope that future mathematicians, scientists, and engineers will keep this in mind and have their tombstones engraved with key formulae to assist future authors in doing the same – hopefully this will help to increase the amount of mathematics that is deemed “acceptable” by the general public.

Academy of Motion Picture Arts & Sciences study on The Digital Dilemma

With a slight nod toward the Academy's announcement of the Oscar nominees this morning, there's something more interesting which they've recently released that hasn't gotten nearly as much press, but promises to be much more vital in the long run.


As books enter the digital age and we watch rich media like video and audio continue to converge into e-book formats, with announcements last week like Apple's foray into digital publishing, the ability to catalog, maintain, and store many types of digital media is becoming an increasing problem. Last week the Academy released part two of their study on strategic issues in archiving and accessing digital motion picture materials, in a report entitled The Digital Dilemma 2. Many of you will find it interesting/useful, particularly in light of the Academy's description:

The Digital Dilemma 2 reports on digital preservation issues facing communities that do not have the resources of large corporations or other well-funded institutions: independent filmmakers, documentarians and nonprofit audiovisual archives.

Clicking on the image of the report below provides some additional information as well as the ability (with a simple login) to download a .pdf copy of their entire report.


There is also a recent Variety article which gives a more fully fleshed out overview of many of the issues at hand.

In the meanwhile, if you’re going to make a bet in this year’s Oscar pool, perhaps putting your money on the “Digital Dilemma” might be more useful than on Brad Pitt for Best Actor in “Moneyball”?