👓 I worked in a video store for 25 years. Here’s what I learned as my industry died. | Vox

I worked in a video store for 25 years. Here’s what I learned as my industry died. by Dennis Perkins (Vox)

Some interesting analysis of what we’re losing with the death of video stores. In particular, we’re losing some of the same kinds of recommendations and serendipity we’re losing with the rise of e-books and the declining use of libraries and librarians. Losing well-curated collections is a big issue as we replace them with streaming services, which don’t seem to have the same curatorial business models.

I particularly enjoyed this quote:

A great video store’s library of films is like a little bubble outside the march of technology or economics, preserving the fringes, the forgotten, the noncommercial, or the straight-up weird. Championed by a store’s small army of film geeks, such movies get more traffic than they did in their first life in the theater, or any time since. Not everything that was on VHS made the transition to DVD, and not every movie on DVD is available to stream. The decision to leave a movie behind on the next technological leap is market-driven, which makes video stores the last safety net for things our corporate overlords discard.


Hopkins in Hollywood | Johns Hopkins Alumni Event on 1-12-17

Join students and alumni from the Film and Media Studies Program in Culver City

I’ve been invited to participate in a panel discussion as part of an Intersession course by the Johns Hopkins Film and Media Studies Program. I hope fellow alumni in the entertainment and media sectors will come out and join us in Culver City on Thursday.


Join the Hopkins in Hollywood Affinity Group (AEME LA) as they welcome Linda DeLibero, Director of the JHU Film and Media Studies Program, and current students of the program for a dynamic evening of networking which features an alumni panel of industry experts.

Open to alumni, students, and friends of Hopkins, this event is sponsored by Donald Kurz (A&S ’77), Johns Hopkins University Emeritus Trustee and School of Arts and Sciences Advisory Board Member, and the Hopkins in Hollywood (AEME LA) Affinity Group.

Event Date: Thursday, January 12, 2017
Start Time: 6:30pm
End Time: 8:30pm

Panelists

Donald Kurz, A&S ’77
Moderator

Donald Kurz is Chairman and CEO of Omelet LLC, an innovative new media and marketing services firm based in Los Angeles.   Previously, Mr. Kurz was co-founder and CEO of hedge fund Artemis Capital Partners.  Between 1990 and 2005, Mr. Kurz was Chairman, President, and CEO of EMAK Worldwide, Inc, a global, NASDAQ-traded company providing Fortune 500 companies with strategic and marketing services internationally. Mr. Kurz’s 25 years’ experience in senior leadership includes management positions with Willis Towers Watson, PwC, and the J.C. Penney Company. Mr. Kurz is a Trustee Emeritus of the Johns Hopkins University, having served for 12 years on the Hopkins board.  He received an MBA from the Columbia University Graduate School of Business and a BA from Johns Hopkins University.


Jason Altman, A&S ’99

Jason Altman is an Executive Producer at Activision, working on the Skylanders franchise and new development projects. Prior to Activision, he spent five years at Ubisoft Paris in different leadership roles, most recently as the Executive Producer of Just Dance, the #1 music video game franchise. He is a veteran game producer who loves the industry, and is a proud graduate of the media studies program at Johns Hopkins.


Paul Harris Boardman, A&S ’89

Paul Boardman wrote The Exorcism of Emily Rose (2005) and Devil’s Knot (2014), both of which he also produced, and Deliver Us From Evil (2014), which he also executive produced.  In 2008, Paul produced The Day the Earth Stood Still for Fox, and he did production rewrites on Poltergeist, Scream 4, The Messengers, and Dracula 2000, as well as writing and directing the second unit for Hellraiser:  Inferno (2000) and writing Urban Legends:  Final Cut (2000).  Paul has written screenplays for various studios and production companies, including Trimark, TriStar, Phoenix Pictures, Miramax/Dimension, Disney, Bruckheimer Films, IEG, APG, Sony, Lakeshore, Screen Gems, Universal and MGM.


Devon Chivvis, A&S ’96

Devon Chivvis is a showrunner/director/producer of narrative and non-fiction television and film. Inspired by a life-long passion for visual storytelling combined with a love of adventure and the exploration of other cultures, Devon has made travel a priority through her work in film and television. Devon holds a B.A. from Johns Hopkins University in International Relations and French, with a minor in Italian.


Chris Aldrich, Engr ’96

Chris started his career at Hopkins, where he ran several movie groups on campus and was responsible for over $200,000 of renovations in Shriver Hall, including a new screen, sound system, and 35mm projection. He also ran the 29th Annual Milton S. Eisenhower Symposium, “Framing Society: A Century of Cinema,” on the 100th anniversary of the moving picture.

Following Hopkins, he joined Creative Artists Agency, where he worked in Motion Picture Talent and also did music-crossover work. He later joined Davis Entertainment, which had a deal at 20th Century Fox, where he worked on the productions of Heartbreakers, Dr. Dolittle 2, and Behind Enemy Lines, as well as the acquisition and development of Alien v. Predator, Paycheck, Flight of the Phoenix, Garfield, The Man from U.N.C.L.E., I, Robot, and countless others.

Missing the faster pace of representation, he later joined Writers & Artists Agency for several years, working in their talent, literary, and book departments. Since that time he’s had his own management company focusing on actors, writers, authors, and directors. Last year he started Boffo Socko Books, an independent publishing company, which recently put out the book Amerikan Krazy.

Source: Hopkins in Hollywood | Johns Hopkins Alumni


Register Here

More information: Office of Alumni Relations
800-JHU-JHU1 (548-5481)
alumevents@jhu.edu

Part of the course:

The Entertainment Industry in Contemporary Hollywood

Students will have the opportunity to spend one week in Los Angeles with Film and Media Studies Director Linda DeLibero. Students will meet and network with JHU alums in the entertainment industry, as well as heads of studios and talent agencies, screenwriters, directors, producers, and various other individuals in film and television. The associated fee for this intersession course is $1400 (financial support is available for those who qualify). Permission of Linda DeLibero is required. Film and Media Studies seniors and juniors will be given preference for the eight available slots, followed by senior minors. Students are expected to arrive in Los Angeles on January 8. The actual course runs January 9-13, with lodging check-in on January 8 and check-out on January 14.

Course Number: AS.061.377.60
Credits: 1
Distribution: H
Days:  Monday 1/9/2017 – Friday 1/13/2017
Times:  M – TBA | Tu- TBA | W- TBA | Th- TBA | F- TBA
Instructor: Linda DeLibero

Omelet LLC, 3540 Hayden Ave, Culver City, CA 90232


Recap of Our Little Free Library Grand Opening

A big "Thank You!" to all those who helped make our Little Free Library Grand Opening so successful.

First, a major note of thanks to everyone who helped make the launch of Little Free Library Branch #8424 a fantastic success. Everyone’s support and encouragement are truly appreciated.

When I was setting up, I naturally brought a book to read, but I did it mostly thinking that only two people might actually stop by. (Hey, I’ll be the first to admit that this is a pretty nerdy and very local pursuit. It’s easy to click “like” on a post; it’s a whole other thing to visit a small neighborhood library, even one with free oatmeal cookies.) Fortunately and very pleasantly, there was a steady stream of people from start to finish, so much so that, as the host, I didn’t get to chat with the visitors as much as I would have liked. Apologies to those I couldn’t chat with more, and even more so to those who heard answers to the same questions multiple times.

In the end, we had over 20 people and a few pets stop by our little event.

A Double Drive-by… booking?!

Under the heading of “Only in LA,” I’ll mention that the highlight of our grand opening was what I can only describe as a double “drive-by booking.” Fortunately no one was hurt.

About 20 minutes into the event a car drove up with two bibliophiles. They each had a book to donate, but apparently didn’t have the time to park and actually stop for a glass of tea or any cookies. So they simply dropped off their books anonymously and immediately drove off into the sunset. A few minutes later, another car drove up and did the same thing: they donated a book, said hello, and then proceeded on their way without joining the party!  Maybe they had an important book signing or a library event to rush off to? Maybe the library police were chasing them for late fines?  The mafia probably would have cried foul, as they didn’t technically put a foot on the curb or call us out, but hopefully this is as dangerous as things get in the Little Free Library world. One of the donated books had its South Pasadena Library serial number filed off, possibly to keep it from being traced, but authorities are working diligently on the case.

As if the double drive-by weren’t odd enough, we also had a minivan drive by with a brief stop to ask what was going on. The driver mentioned that the carful of passengers happened to include two librarians, so apparently we’ll have to keep our eyes peeled for possible additional drive-by bookings.

Thanks for the Donations!

Special thanks go to Adam and Darren, who dropped off three books, and to Delilah from down the street, who was responsible for our first children’s book donation. And we can’t forget the massive donation of eight books of literary fiction from Jeffrey Stewart, the largest single donation of the day. Several other neighbors dropped books off, and many browsed and found something interesting to take with them. I have to admit that I’m glad I live in a neighborhood with such great taste in books.

The award for the longest-distance donation goes to Samantha Marks, who donated a signed copy of her new book A Fatal Family Secret, which she shipped from Ellicott City, Maryland just in time for the Grand Opening. It counts as the newest book in our collection, as it was just published in May. Since it was checked out almost as soon as it entered the collection, it also rates as our quickest checkout; those in a rush may want to pick up a copy at Amazon or other fine booksellers.

In all we had a total of 26 donations for our Grand Opening, bringing our running total to 49 books so far.

As a special mention, the award for the furthest distance traveled to make our grand opening goes to Jocelyn, who came from London on her way to Oklahoma!

Again, a big “Thank You!” to everyone who helped to make our Grand Opening such a lovely success! We look forward to seeing everyone come back soon!

Website and Social Media

For those who weren’t able to stop by, we’re now open 24/7, 365 days a year. You can visit our branch online at its own website or through your favorite social media platform.


The oatmeal cookies and iced tea we served at the opening.

Little Free Library #8424 Progress

My Little Free Library is getting closer to launch...

Almost from the moment I saw my first Little Free Library, I decided that I wanted to host one of my very own, so I registered with the intent of building one in my free time. The registration arrived and I’d drafted some very serious custom plans, but I’d just never gotten around to purchasing the supplies and building it.

Recently I saw something a bit more quirky and interesting than my original plans that I could upcycle, so I made the purchase (happy belated birthday to me)!  It’s got two spacious shelves and two doors, including a glass-fronted one, with capacity for at least 6 linear feet of books. We’re nearly ready to go.

Little Free Library #8424 (prelaunch)

I’m hoping to get some mounting materials and have the library up and running soon.  My plan is to specialize in literary fiction, though I’m sure we’ll also stock a fair amount of popular science and nonfiction, as well as thriller, mystery, and suspense.

Invitations to the “launch” party should be coming shortly! If you’ve got some books you’d like to donate toward the cause, let me know in the comments below. Be sure to include a BookCrossing ID number on them if you’d like to track where your favorite objects head off to in the future.


Machiavelli in Hollywood | Gavin Polone’s ‘Textbook’ on the Entertainment Industry

A series of articles by producer Gavin Polone can serve as an excellent introduction to the business of Hollywood.

Dearth of (Great) Textbooks on The Entertainment Business

Having previously taught several classes on the business of the entertainment industry, I was never quite able to find even a mediocre textbook for such a class. There are a handful that will give one an overview of the nuts and bolts, and one or two that provide some generally useful numbers (see the syllabi from those classes), but none comes close to conveying the philosophy of how the business works in a short span of time.

A Short-Term Solution

To remedy this problem, I’ve always pointed people to producer and ex-agent Gavin Polone, who wrote a series of articles for New York Magazine/Vulture. I’ve recently gone through and linked to all forty-four of the articles he produced in that series, in chronological order, from 9/21/11 to 5/7/14.

I’ve aggregated the series via Readlists.com, so one can click on each of the articles individually. Better yet, for students and teachers alike, one can click on the “export” link and very easily download them all in most e-book formats (including Kindle, iPad, etc.) for one’s reading and studying convenience.

My hope is that, for others, they may together form an excellent starter textbook on how the entertainment business works and, more importantly, how successful people in the business think. For those who need more, Gavin is also an occasional contributor to the Hollywood Reporter. (And, as a note for those not trained in the classics and prone to modern-day stereotypes, I’ll make the caveat that I use the title “Machiavelli” above with the utmost reverence and honor.)

I’m still slowly but surely making progress on my own all-encompassing textbook, but until then, I hope others find this series of articles as interesting and useful as I have.


Gavin Polone is an agent turned manager turned producer. His production company, Pariah, has brought you such movies and TV shows as Panic Room, Zombieland, Gilmore Girls, and Curb Your Enthusiasm. Follow him on Twitter @gavinpolone


Git and Version Control for Novelists, Screenwriters, Academics, and the General Public

Revision (or version) control is used to track changes in computer programs, but it can easily be used to track changes in almost any type of writing: novels, short stories, screenplays, legal contracts, or any other textual documentation.

Marginalia and Revision Control

At the end of April, I read an article entitled “In the Margins” in the Johns Hopkins University Arts & Sciences magazine.  I was particularly struck by the comments of eminent scholar Jacques Neefs on page thirteen (or paragraph 20) about computers making marginalia a thing of the past:

Neefs believes contemporary literature is losing a valuable component in an age when technology often precludes and trumps the need to save manuscripts or rough drafts. But it is not something that keeps him up at night. ‘The modern technique of computers and everything makes [marginalia] a thing of the past,’ he says. ‘There’s a new way of creation. Some would say it’s tragic, but something new has been invented. I don’t consider it tragic. There are still great writers who write and continue to have a way to keep the process.’

Photo looking over the shoulder of Jacques Neefs onto the paper he's been studying on the table in front of him.
Jacques Neefs (Image courtesy of Johns Hopkins University)

I actually think that he may be completely wrong and that current technology actually allows us to keep far more marginalia! (Has anyone heard of digital exhaust?) The bigger issue may be that many writers just don’t know how to keep a better running log of their work to maintain all the relevant marginalia they’re actually producing. (Of course, there’s also the broader librarian’s “digital dilemma” of maintaining formats for the future. As an example, think about how easy or hard it might be for you to read that ubiquitous 3.5 inch floppy disk you used in 1995.)

As a technologist who has spent many years in the entertainment industry, I feel compelled to point everyone towards the concept of revision control (or version control) from the realm of computer science.  Though it’s primarily used for tracking changes in computer programs and is often a tool used by large teams of programmers, it can very easily be used for tracking changes in almost any type of writing: novels, short stories, screenplays, legal contracts, or any other textual documentation.

Example Use Cases for Revision Control

Publishing

As a direct example, I’m using what is known as a Git repository to track every change I make in a textbook I’m currently writing.  I can literally go back and view every change I’ve made since beginning the project, so though I’m directly revising one (or more) text files, all of my “marginalia” and revisions are saved and available.  Currently I’m only doing it for my own reference and for additional backup, not supposing that anyone other than myself or an editor may ever want to peruse it.  If I were working in conjunction with others, there are ways for me to track the changes, edits, or notes that others (perhaps an editor or collaborator) might make.

In addition to the general backup of the project (in case of catastrophic computer failure), I also have the ability to go back and find that paragraph (or those multiple pages) I deleted in haste last week, but desperately want back now, instead of having to recreate them de novo.
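For the curious, here’s a minimal sketch of what that workflow looks like at the command line. (The file name novel.txt is just a stand-in for whatever you’re writing, and the version ID in the last command is a placeholder for one that git log would actually show you.)

    # turn the folder holding your manuscript into a Git repository (one time only)
    git init

    # record a snapshot of the manuscript along with a note to your future self
    git add novel.txt
    git commit -m "Rough draft of chapter 3"

    # review the full history of snapshots and notes
    git log

    # see exactly what has changed since the last snapshot
    git diff

    # pull the text of an older version back out (a1b2c3d is a placeholder ID from git log)
    git show a1b2c3d:novel.txt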

Because it’s all digital, future scholars also won’t have problems parsing my handwriting issues as has occasionally come up in differentiating Mary Shelley’s writing from that of her husband in digital projects like the Shelley Godwin Archive. The fact that all changes are tracked and placed in a tree-like structure will indicate who wrote what and when and will indicate which changes were ultimately accepted and merged into the final version.

Screenplays in Hollywood

One particular use case I can easily see for such technology is tracking changes in screenplays over time.  I’m honestly shocked that production companies, and even more so studios, don’t use such technology to follow changes in drafts over time. In the end, doing such tracking would certainly make Writers Guild of America (WGA) arbitrations much easier, as literally every contribution to a script can be tracked to give screenwriters appropriate credit. The end result, with the easy ability to time-machine one’s way back into older drafts, is truly lovely, and the output gives so much more information about changes in the script compared to the traditional and all-too-simple asterisk (*) which screenwriters use to indicate that something/anything changed on a specific line, or the different colored pages which are used on scripts during production.
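As a rough illustration of how simple this could be (script.fountain is an invented file name here), just two commands cover most of what an arbitration would need:

    # show, line by line, who last changed each line of the script and when
    git blame script.fountain

    # list every revision that ever touched the script, with author, date, and the changes themselves
    git log --follow -p script.fountain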

I can also picture future screenwriters using services like GitHub as platforms for storing and distributing their screenplays to potential agents, managers, and producers.

Redlining Legal Documents

Having seen thousands of legal agreements go back and forth over the years, I see revision control as a natural tool for tracking the redlining of legal documents as they evolve before they are finally (or even never) executed. I have to imagine that being able to abstract out the appropriate metadata may, in the long run, actually help attorneys, agents, etc. become better negotiators, but something like this is a project for another day.
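As a small, hedged example of how close this already is to standard practice (contract.txt is an invented file name), Git can produce a word-by-word comparison that reads much like a traditional redline:

    # compare the two most recent versions of a contract, word by word rather than line by line
    git diff --word-diff HEAD~1 HEAD contract.txt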

Academia

In addition to direct research for projects being undertaken by academics like Neefs, academics should look into using revision control in their own daily work and writings.  While writing a book, paper, journal article, essay, monograph, etc. (or, for graduate students, a thesis), one could use one’s own Git repository not only to save and back up all of one’s work, but also to preserve it for the future scholars who come later and would not otherwise have access to the “marginalia” one creates while working out written thoughts in digital form.

I can easily picture Git as a very simple “next step” in furthering the concept of the digital humanities as well as in helping to bridge the gap between C.P. Snow’s “two cultures.” (I’d also suggest that revision control is a relatively simple step one could take before learning a particular programming language, which I think should be a mandatory tool in everyone’s daily toolbox regardless of their field(s) of interest.)


Start Using Revision Control

“But how do I get started?” you ask.

Know going in that it may take part of a day to get things set up and running, but once you’ve started with the basics, things are actually pretty easy, and you can continue to learn the more advanced subtleties as you progress.  Once things are working smoothly, the additional overhead you’ll be expending won’t be much more than the old method of hitting Ctrl-S to save one of your old Word documents in the time before auto-save became ubiquitous.

One should start by choosing among the myriad revision control systems that exist.  For the sake of brevity in this short introductory post, I’ll simply suggest that users take a very close look at Git because of its ubiquity and popularity in the computer science world and the tremendously large amount of free information and support available for it from a variety of sites on the internet. Git has versions for all major operating systems (Windows, macOS, and Linux), and it has had a relatively long and robust life within the computer science community, meaning that it’s very stable and has many resources for the uninitiated to draw upon.
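Installation is usually a one-line affair, followed by telling Git who you are so that your revisions are attributed correctly. (Windows users can grab an installer from the Git website; the name and email below are, of course, placeholders.)

    # macOS, using the Homebrew package manager
    brew install git

    # Debian/Ubuntu Linux
    sudo apt-get install git

    # one-time setup: attach your name and email to all future revisions
    git config --global user.name "Jane Author"
    git config --global user.email "jane@example.com"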

Once one has Git installed and has begun using it, I’d then recommend linking one’s local copy of the repository to a cloud storage solution like either GitHub or BitBucket.  While GitHub is certainly one of the most popular Git-related services out there (because it acts, in part, as the hub for a large portion of the open internet and thus promotes sharing), I often recommend BitBucket, as it allows free unlimited private but still shareable repositories, while GitHub requires a small subscription fee to keep one’s work private. Having a repository in the cloud will help tremendously in that your work will be available and downloadable from almost anywhere, and it also serves as a de facto backup solution for your work.
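Once you’ve created an empty repository on either service’s website, connecting your local copy to it takes only two commands. (The URL below is a made-up example; the service will show you the real one to copy.)

    # point the local repository at the copy hosted in the cloud
    git remote add origin https://bitbucket.org/janeauthor/novel.git

    # upload your local history; after this first time, a plain "git push" suffices
    git push -u origin master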

I’ve recently been playing around with version control to help streamline the writing/editing process for a book I’ve been writing. Though Git and its variants probably seem more daunting than they should to the everyday user, they really represent a very powerful tool. I’ve spent less than two days learning the basics of both Git and hosted repositories (GitHub and BitBucket), and it has been more than worth the minor effort.

There is a huge wealth of information on revision control in general, and on installing and using Git, available on the internet, including full textbooks. For complete beginners, I’d recommend starting with The Chronicle’s “A Gentle Introduction to Version Control.” Keep in mind that though some of these resources look highly technical, it’s because many try to enumerate every function one could potentially desire, when even just the basic core functionality is more than enough to begin with. (I could analogize it to learning to drive a car versus reading the full manual so that you know how to take the engine apart and put it back together from scratch. To start with revision control, you only need to learn to “drive.”) Professors might also avail themselves of their local institutional libraries, which may host small sessions on learning such tools, or ask for help from colleagues or students in the computer science department. For others, I’d recommend taking a look at Git’s primary website. BitBucket has an excellent step-by-step tutorial (and troubleshooting guide) for setting up the requisite software and using it.

What do you use for revision control?

I’ll welcome any thoughts, experiences, or additional resources one might want to share with others in the comments.

John C. Malone on Assets in the Entertainment Industry

John C. Malone (1941 – ), American business executive, landowner, and philanthropist
at Sun Valley Conference 2012, quoted in New York Times


Academy of Motion Picture Arts & Sciences study on The Digital Dilemma

With a slight nod toward the Academy’s announcement of the Oscar nominees this morning, there’s something more interesting which they’ve recently released that hasn’t gotten nearly as much press, but promises to be much more vital in the long run.


As books enter the digital age and rich media like video and audio continue to converge into e-book formats, with announcements last week like Apple’s foray into digital publishing, the ability to catalog, maintain, and store many types of digital media is becoming an increasing problem.  Last week the Academy released part two of its study on strategic issues in archiving and accessing digital motion picture materials, in a report entitled The Digital Dilemma 2. Many of you will find it interesting and useful, particularly in light of the Academy’s description:

The Digital Dilemma 2 reports on digital preservation issues facing communities that do not have the resources of large corporations or other well-funded institutions: independent filmmakers, documentarians and nonprofit audiovisual archives.

The report’s web page provides some additional information as well as the ability (with a simple login) to download a .pdf copy of the entire report.


There is also a recent Variety article which gives a more fully fleshed-out overview of many of the issues at hand.

In the meantime, if you’re going to make a bet in this year’s Oscar pool, perhaps putting your money on the “Digital Dilemma” might be more useful than on Brad Pitt for Best Actor in “Moneyball”?

Barnes & Noble Board Would Face Tough Choices in a Buyout Vote | Dealbook

Barnes & Noble Faces Tough Choices in a Buyout Vote by Steven Davidoff Solomon (DealBook)
If Leonard Riggio, Barnes & Noble's chairman, joins Liberty Media's proposed buyout of his company, the board needs to decide how to handle his 30 percent stake before shareholders vote on the deal.

This story from the New York Times’ DealBook is a good quick read on some of the details and machinations of the Barnes & Noble buyout. Perhaps additional analysis from a game-theoretic viewpoint would yield new insight?


IPTV primer: an overview of the fusion of TV and the Internet | Ars Technica

IPTV primer: an overview of the fusion of TV and the Internet by Iljitsch Van Beijnum (Ars Technica)

This brief overview of IPTV is about as concise as they get. It’s recommended for entertainment executives who need to get caught up on the space, as well as for people who are contemplating “cutting the cable cord.” There’s still a lot of room for improvement in the area…

Profound as it may be, the Internet revolution still pales in comparison to that earlier revolution that first brought screens into millions of homes: the TV revolution. Americans still spend more of their non-sleep, non-work time watching TV than on any other activity. And now the immovable object (the couch potato) and the irresistible force (the business-model-destroying Internet) are colliding.

For decades, the limitations of technology only allowed viewers to watch TV programs as they were broadcast. Although limiting, this way of watching TV has the benefit of simplicity: the viewer only has to turn on the set and select a channel. They then get to see what was deemed broadcast-worthy at that particular time. This is the exact opposite of the Web, where users type a search query or click a link and get their content whenever they want. Unsurprisingly, TV over the Internet, a combination that adds Web-like instant gratification to the TV experience, has seen an enormous growth in popularity since broadband became fast enough to deliver decent quality video. So is the Internet going to wreck TV, or is TV going to wreck the Internet? Arguments can certainly be made either way.

The process of distributing TV over a data network such as the Internet, a process often called IPTV, is a little more complex than just sending files back and forth. Unless, that is, a TV broadcast is recorded and turned into a file. The latter, file-based model is one that Apple has embraced with its iTunes Store, where shows are simply downloaded like any other file. This has the advantage that shows can be watched later, even when there is no longer a network connection available, but the download model doesn’t exactly lend itself to live broadcasts—or instant gratification, for that matter.

Streaming

Most of the new IPTV services, like Netflix and Hulu, and all types of live broadcasts use a streaming model. Here, the program is sent out in real time. The computer—or, usually by way of a set-top box, the TV—decodes the incoming stream of audio and video and then displays it pretty much immediately. This has the advantage that the video starts within seconds. However, it also means that the network must be fast enough to carry the audio/video at the bitrate that it was encoded with. The bitrate can vary a lot depending on the type of program—talking heads compress a lot better than car crashes—but for standard definition (SD) video, think two megabits per second (Mbps).

To get a sense just how significant this 2Mbps number is, it’s worth placing it in the context of the history of the Internet, as it has moved from transmitting text to images to audio and video. A page of text that takes a minute to read is a few kilobytes in size. Images are tens to a few hundred kilobytes. High quality audio starts at about 128 kilobits per second (kbps), or about a megabyte per minute. SD TV can be shoehorned in some two megabits per second (Mbps), or about 15 megabytes per minute. HDTV starts around 5Mbps, 40 megabytes per minute. So someone watching HDTV over the Internet uses about the same bandwidth as half a million early-1990s text-only Web surfers. Even today, watching video uses at least ten times as much bandwidth as non-video use of the network.

In addition to raw capacity, streaming video also places other demands on the network. Most applications communicate through TCP, a layer in the network stack that takes care of retransmitting lost data and delivering data to the receiving application in the right order. This is despite the fact that the IP packets that do TCP’s bidding may arrive out of order. And when the network gets congested, TCP’s congestion control algorithms slow down the transmission rate at the sender, so the network remains usable.

However, for real-time audio and video, TCP isn’t such a good match. If a fraction of a second of audio or part of a video frame gets lost, it’s much better to just skip over the lost data and continue with what follows, rather than wait for a retransmission to arrive. So streaming audio and video tended to run on top of UDP rather than TCP. UDP is the thinnest possible layer on top of IP and doesn’t care about lost packets and such. But UDP also means that TCP’s congestion control is out the door, so a video stream may continue at full speed even though the network is overloaded and many packets—also from other users—get lost. However, more advanced streaming solutions are able to switch to lower quality video when network conditions worsen. And Apple has developed a way to stream video using standard HTTP on top of TCP, by splitting the stream into small files that are downloaded individually. Should a file fail to download because of network problems, it can be skipped, continuing playback with the next file.
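To make that last idea concrete, here is a rough sketch of how such a segmented stream can be prepared today. Assuming a video file already encoded in web-friendly codecs (the input file name is a placeholder), a tool like ffmpeg can split it into a playlist of small files that any ordinary web server can then hand out over plain HTTP:

    # split input.mp4 into roughly 10-second pieces plus an index playlist,
    # copying the existing audio/video rather than re-encoding it
    ffmpeg -i input.mp4 -c copy -hls_time 10 -hls_list_size 0 -f hls playlist.m3u8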

Where are the servers? Follow the money

Like any Internet application, streaming of TV content can happen from across town or across the world. However, as the number of users increases, the costs of sending such large amounts of data over large distances become significant. For this reason, content delivery networks (CDNs), of which Akamai is probably the most well-known, try to place servers as close to the end-users as possible, either close to important interconnect locations where lots of Internet traffic comes together, or actually inside the networks of large ISPs.

Interestingly, it appears that CDNs are actually paying large ISPs for this privilege. This makes the IPTV business a lot like the cable TV business. On the Internet, the assumption is that both ends (the consumer and the provider of over-the-Internet services) pay their own ISPs for the traffic costs, and the ISPs just transport the bits and aren’t involved otherwise. In the cable TV world, this is very different. An ISP provides access to the entire Internet; a cable TV provider doesn’t provide access to all possible TV channels. Often, the cable companies pay for access to content.

A recent dispute between Level3 and Comcast can be interpreted as evidence of a power struggle between the CDNs and the ISPs in the IPTV arena.

Walled gardens


So far, we’ve only been looking at IPTV over the public Internet. However, many ISPs around the world already provide cable-like service on top of ADSL or Fiber-To-The-Home (FTTH). With such complete solutions, the ISPs can control the whole service, from streaming servers to the set-top box that decodes the IPTV data and delivers it to a TV. This “walled garden” type of IPTV typically provides a better and more TV-like experience—changing channels is faster, image quality is better, and the service is more reliable.

Such an IPTV Internet access service is a lot like what cable networks provide, but there is a crucial difference: with cable, the bandwidth of the analog cable signal is split into channels, which can be used for analog or digital TV broadcasts or for data. TV and data don’t get in each other’s way. With IPTV, on the other hand, TV and Internet data are communicating vessels: what is used by one is unavailable to the other. And to ensure a good experience, IPTV packets are given higher priority than other packets. When bandwidth is plentiful, this isn’t an issue, but when a network fills up to the point that Internet packets regularly have to take a backseat to IPTV packets, this could easily become a network neutrality headache.

Multicast to the rescue

Speaking of networks that fill up: for services like Netflix or Hulu, where everyone is watching their own movie or their own show, streaming makes a lot of sense. Not so much with live broadcasts. If 30 million people were to tune into Dancing with the Stars using streaming, that means 30 million copies of each IPTV packet must flow down the tubes. That’s not very efficient, especially given that routers and switches have the capability to take one packet and deliver a copy to anyone who’s interested. This ability to make multiple copies of a packet is called multicast, and it occupies territory between broadcasts, which go to everyone, and regular communications (called unicast), which go to only one recipient. Multicast packets are addressed to a special group address. Only systems listening for the right group address get a copy of the packet.

Multicast is already used in some private IPTV networks, but it has never gained traction on the public Internet. Partially, this is a chicken/egg situation, where there is no demand because there is no supply and vice versa. But multicast is also hard to make work as the network gets larger and the number of multicast groups increases. However, multicast is very well suited to broadcast type network infrastructures, such as cable networks and satellite transmission. Launching multiple satellites that just send thousands of copies of the same packets to thousands of individual users would be a waste of perfectly good rockets.

Peer-to-peer and downloading


Multicast works well for a relatively limited number of streams that are each watched by a reasonably sized group of people—but having very many multicast groups takes up too much memory in routers and switches. For less popular content, there’s another delivery method that requires no or few streaming servers: peer-to-peer streaming. This was the technology used by the Joost service in 2007 and 2008. With peer-to-peer streaming, all the systems interested in a given stream get blocks of audio/video data from upstream peers, and then send those on to downstream peers. This approach has two downsides: the bandwidth of the stream has to be limited to fit within the upload capacity of most peers, and changing channels is a very slow process because a whole new set of peers must be contacted.

For less time-critical content, downloading can work very well. Especially in a form like podcasts, where an RSS feed allows a computer to download new episodes of shows without user intervention. It’s possible to imagine a system where regular network TV shows are made available for download one or two days before they air—but in encrypted form. Then, “airing” the show would just entail distributing the decryption keys to viewers. This could leverage unused network capacity at night. Downloads might also happen using IP packets with a lower priority, so they don’t get in the way of interactive network use.
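As a rough sketch of how little machinery the download model needs (the feed URL is a placeholder, and GNU grep is assumed), a nightly cron job could poll a feed and fetch any new episodes in one pipeline:

    # pull the RSS feed, extract each episode's enclosure URL, and
    # download anything not already on disk into ~/Podcasts
    curl -s https://example.com/show/feed.xml \
      | grep -oP '(?<=enclosure url=")[^"]+' \
      | xargs -n1 wget -nc -P ~/Podcasts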

IP addresses and home networks

A possible issue with IPTV could be the extra IP addresses required. There are basically two approaches to handling this issue: the one where the user is in full control, and the one where an IPTV service provider (usually the ISP) has some control. In the former case, streaming and downloading happens through the user’s home network and no extra addresses are required. However, wireless home networks may not be able to provide bandwidth with enough consistency to make streaming work well, so pulling Ethernet cabling may be required.

When the IPTV provider provides a set-top box, it’s often necessary to address packets toward that set-top box, so the box must be addressable in some way. This can eat up a lot of addresses, which is a problem in these IPv4-starved times. For really large ISPs, the private address ranges in IPv4 may not even be sufficient to provide a unique address to every customer. Issues in this area are why Comcast has been working on adopting IPv6 in the non-public part of its network for many years. When an IPTV provider provides a home gateway, this gateway is often outfitted with special quality-of-service mechanisms that make (wireless) streaming work better than run-of-the-mill home gateways that treat all packets the same.

Predicting the future

Converging to a single IP network that can carry the Web, other data services, telephony, and TV seems like a no-brainer. The phone companies have been working on this for years because that will allow them to buy cheap off-the-shelf routers and switches, rather than the specialty equipment they use now. So it seems highly likely that in the future, we’ll be watching our TV shows over the Internet—or at least over an IP network of some sort. The extra bandwidth required is going to be significant, but so far, the Internet has been able to meet all challenges thrown at it in this area. Looking at the technologies, it would make sense to combine nightly pushed downloads for popular non-live content, multicast for popular live content, and regular streaming or peer-to-peer streaming for back catalog shows and obscure live content.

However, the channel flipping model of TV consumption has proven to be quite popular over the past half century, and many consumers may want to stick with it—for at least part of their TV viewing time. If nothing else, this provides an easy way to discover new shows. The networks are also unlikely to move away from this model voluntarily, because there is no way they’ll be able to sell 16 minutes of commercials per hour using most of the other delivery methods. However, we may see some innovations. For instance, if you stumble upon a show in progress, wouldn’t it be nice to be able to go back to the beginning? In the end, TV isn’t going anywhere, and neither is the Internet, so they’ll have to find a way to live together.

Correction: The original article incorrectly stated that cable providers get paid by TV networks. For broadcast networks, cable operators are required by the law’s “must carry” provisions to carry all of the TV stations broadcast in a market. Ars regrets the error.

Failings and Opportunities of the Publishing Industry in the Digital Age

On Sunday, the Los Angeles Times printed a story about the future of reading entitled "Book publishers see their role as gatekeepers shrink." The article covers most of the story fairly well, but leaves out some fundamental pieces of the business ...

On Sunday, the Los Angeles Times printed a story about the future of reading entitled “Book publishers see their role as gatekeepers shrink.” 

The article covers most of the story fairly well but leaves out some fundamental pieces of the business picture.  It discusses a few particular cases of some very well-known authors in the publishing world, including the likes of Stephen King, Seth Godin, Paulo Coelho, Greg Bear, and Neal Stephenson, and how new digital publishing platforms are slowly changing the publishing business.

Indeed, many authors are bypassing traditional publishing routes and self-publishing their works directly online, and many are taking a much larger slice of the financial rewards in doing so.

The article, however, completely fails to mention or address how new online methods will handle editorial and publicity functions differently than they’re handled now, and the publishing business, both now and in the future, relies significantly on both.

It is interesting, and not a little ironic, to note that the newspaper business in which this particular article finds its outlet has changed possibly more drastically than the book publishing business. If one reads the article online, one is forced to click through four different pages, on each of which a minimum of five different (and, in my opinion, terrifically intrusive) ads appear. Without getting into the details of advertising, it is even more interesting that many of these ads are served up by Google Ads based on keywords, so three just on the first page were specifically publishing related.

Two of the ads solicit people to self-publish their own work. One touts how easy it is to publish, while the other glosses over the publicity portion with a glib statement offering an additional “555 Book Promotion Tips”! (I’m personally left wondering whether there can possibly be so many book promotion tips.)


Following the link in the third ad on the first page to its advertised site, one discovers that it states:

Learning how to publish a children’s book is no child’s play.

From manuscript editing to book illustration, distribution and marketing – a host of critical decisions can make or break your publishing venture.

Fortunately, you can skip the baby steps and focus on what authors like you do best-crafting the best children’s book possible for young inquisitive minds. Leave the rest to us.

Count on the collective publishing and book marketing expertise of a children book publisher with over thirteen years’ experience. We have helped over 20,000 independent authors fulfill their dream of publication.

Take advantage of our extensive network of over 25,000 online bookstores and retailers that include such names Amazon, Barnes & Noble and Borders, among thousands of others.

Tell us about your Children’s book project and we will send you a Free Children’s Book Publishing Guide to start you off on your publishing adventure!


Although I find the portion about “baby steps” particularly entertaining, the first thing I’ll note is that the typical person is likely more readily equipped to distribute and market a children’s book than to craft one. Sadly, however, there are very few who are capable of any of these tasks at a particularly high level, which is why there are relatively few new children’s books on the market each year and the majority of sales are older tried-and-true titles.

I hope the average reader sees the above come-on as the twenty-first-century equivalent of the snake oil salesman, tempting the typical wanna-be author to call about their so-called “Free” Children’s Book Publishing Guide. I’m sure recipients of the guide end up paying the publisher to get their book out the door, and more likely than not it doesn’t end up in mainstream brick-and-mortar establishments like Barnes & Noble or Borders, but only sells a handful of copies in easy-to-reach online venues like Amazon. I might suggest that the majority of sales will come directly from the author and his or her friends and family. I would further argue that neither now nor in the immediate or even distant future will many aspiring authors self-publish much of anything and manage to make even a modest living by doing so.

Of course, all of the above raises the question of why exactly people need or want a traditional publisher. What role or function do publishers actually perform for the business, and why might they still be around in the future?

Typical publishing houses perform three primary functions: filtering/editing material, distributing material, and promoting material. The current significant threat to the publishing business from online retailers like Amazon.com, Barnes & Noble, Borders, and even the recently launched Google Books is the distribution platforms themselves.  It certainly doesn’t take much to strike low-cost deals with online retailers to distribute books, and even less when they’re distributed as e-books, which cuts out the most significant cost in the business — that of the paper to print them on. This leaves traditional publishing houses with two remaining functions: filtering/editing material and the promotion/publicity function.

The Los Angeles Times article certainly doesn’t state it, but everyone you meet on the street could tell you that writers like Stephen King don’t really need any more publicity than what they’ve got already. Their fan followings are so significantly large that they only need to tell two people online that they’ve got a new book and they’ll sell thousands of copies of any book they release. In fact, I might wager that Stephen King could release ten horrific (don’t mistake this for horror) novels before their low quality would likely begin to significantly erode his sales numbers.  If he’s releasing them on Amazon.com and keeping 70% of the income compared to the average 6-18% most writers are receiving, he’s in phenomenally good shape. (I’m sure given his status and track record in the publishing business, he’s receiving a much larger portion of his book sales from his publisher than 18% by the way; I’d also be willing to bet if he approached Amazon directly, he could get a better distribution deal than the currently offered 70/30 split.)

What will eventually sway the majority of the industry is when completely unknown new writers can publish into these electronic platforms and receive the marketing push they need to become the next Stephen King or Neal Stephenson. At the moment, none of the major e-book publishing platforms are giving much, if any, of this type of publicity to any of their new authors, and many aren’t even giving it to the major writers. Thus, currently, even the major writers are relying primarily on their traditional publishers for publicity to push their sales.

I will admit that when 80% of all readers are online, consuming their reading material in e-book format and utilizing the full support of social media and the cross-collateralization of the best portion of their word-of-mouth, perhaps authors won’t need as much PR help. But until that day, platforms will need to significantly ramp it up. Financially, one wonders what a platform like Amazon.com will charge for a front-and-center advertisement to push sales of a new best-seller. Will they be looking for a 50/50 split on those sales? Exclusivity in their channel? This is where the business will become even more dicey. Suddenly authors who think they’re shedding the chains of their current publishers will be shackling themselves with newer and more significant manacles and leg irons.

The last piece of the business that needs to be subsumed is the editorial portion of the manufacturing process.  Agents and editors serve a significant role in that they filter out thousands and thousands of terrifically unreadable books. In fact, one might argue that even now they’re letting far too many marginal books through the system and into the market.

Considering the millions of books housed in the Library of Congress and their general circulation, one might realize that a tenth of a percent or less of books receive all the attention. Certainly classic authors like William Shakespeare and Charles Dickens are more widely read than the millions of nearly unknown writers whose work takes up just as much shelf space in that esteemed library.

Most houses publish on the order of ten to a hundred titles per year, but they rely heavily on only one or two of them being major hits, to cover not only the cost of the total failures but also to provide the company with some semblance of profit.  (This model is not unlike the way the feature film business works in Hollywood: if you throw enough spaghetti at the wall, something is bound to stick.)

The question then becomes: “how does the e-publishing business accomplish this editing and publicity in a better and less expensive way?” This question needs to be looked at from a pre-publication as well as a post-publication perspective.

From the pre-publication viewpoint, the Los Angeles Times article interestingly mentions that many authors appreciate having a “conversation” with their readers and allowing it to inform their work. However, creators of the stature of Stephen King cannot possibly take in and consume criticism from their thousands of fans in any reasonable way, not to mention the detriment to their output if they were forced to read and deal with all that criticism and feedback.  Even authors of smaller stature often find it overwhelming to take in criticism from their agents, editors, and even a small handful of close friends, family, and colleagues.  A quick look at the acknowledgement sections of a few dozen books generally reveals fewer than 10 people being thanked, much less hundreds of names from the general reading public – people the author neither knows well nor trusts implicitly.

From the post-publication perspective, both print-on-demand and e-book formats excise one of the largest costs in the supply-chain-management portion of the publishing world, but staff costs and salaries are certainly very close in line after them.  One might argue that social media is the answer here, and that we can rely on services like LibraryThing, GoodReads, and others to supply this editorial/publicity process, with broad sampling and positive and negative reviews eventually winning the day to cross good but unknown writers into the popular consciousness. This may sound reasonable on the surface, but take a look at similarly large recommendation services in the social media space like Yelp. These services already have hundreds of thousands of users, but they’re not nearly as useful as they need to be from a recommendation perspective, and they’re not terrifically reliable in that they’re very often easily gamed. (Consider the number of positive reviews on Yelp that are most likely written by the proprietors of the establishments themselves.) This outlet for editorial work certainly has the potential to improve in the coming years, but it will still be quite some time before it has any possibility of ousting the current editorial and filtering regime.

From a mathematical and game-theoretic perspective, one must also consider how many people are going to subject themselves (willingly and for free) to some really bad reading material and then bother to write either a good or bad review of their experience. This is particularly true when the vast majority of readers are more than content to ride the coattails of the “suckers” who do the majority of the review work.

There are certainly a number of other factors at play in the publishing business as it changes form, but those discussed above are significant in its continuing evolution.  Given the state of technology and its speed, if people feel that the traditional publishing world will collapse, then we should take its evolution to the nth degree. By such an argument, even platforms like Amazon and Google Books will eventually need to narrow their financial split with authors down to infinitesimal margins, as authors should be able to control every portion of their work without any interlopers taking a cut of their proceeds. We’ll leave the discussion of whether all of this might fit into the concept of the tragedy of the commons for a future date.

Information Flow in Hollywood is Changing Rapidly as Alyssa Milano’s Representation Drops the Ball

There are many in the industry who have Twitter and Facebook accounts, but generally they shy away from using them, particularly when it relates to their daily workflow.  Naturally there are instances when representatives and business affairs executives will post the occasional congratulatory note, but typically nothing relevant or revealing is ever said.

But tonight Twitter began to change the landscape of how Hollywood, and in particular the representation segment, does its day-to-day business.

It began with the news that Alyssa Milano’s ABC series ROMANTICALLY CHALLENGED, which premiered on April 19th earlier this year, had been cancelled. Michael Ausiello of the Ausiello Files for Entertainment Weekly broke the story online at 7:44 pm (Pacific) and tweeted out the news. Alyssa Milano saw the news on Twitter about an hour later, and at 8:45 pm, she tweeted out her disappointment to the world.

Her agent/manager is going to have a fire to put out tomorrow, if it doesn’t burn itself into oblivion tonight!  If anything, her agent could have, and should have, been among the first to know, generally being informed by the studio executive in charge of the project or potentially by the producer of the show, who would also have been in that first round to know about the cancellation. And following the news from the network, Alyssa should have been notified immediately.

Typically this type of news is treated like a pure commodity within the representation world. If a competing agent, particularly one who wanted a client like Alyssa to move to their agency, dug up the early news, they would call her at home, break the bad news first, and fault the current representative for dropping the ball and not doing their job.  Further, the agent would likely put together a group of several new scripts (which the servicing agent either wouldn’t have access to or wouldn’t have sent her) and have them sent over for her immediate consideration.  Suddenly there’s an unhappy client who is seriously considering taking her business across the street.

The major difference here is that it isn’t a competing agent breaking the bad news, but the broader internet! Despite the brevity of her fewer than 140 characters, it’s quite obvious that Ms. Milano is both shocked and a bit upset at the news.  We cannot imagine that she’s happy with the source of the news; it’s very likely that her representation got an upset call this evening, and they’re currently scurrying to verify the news and then put out the subsequent fire.

Beyond this frayed relationship, there is also the subsequent strain on the relationships between representation and the overseeing studio executive(s), studio/network chief, and potentially further between the Agency and the Network over what is certain to be one of the more expensive television talent deals in the business right now.

We’re sure there will be a few more agents, managers, and attorneys who sign up for Twitter accounts tomorrow and begin monitoring their clients’ brands more closely on the real-time web.

[As a small caveat to all of this, keep in mind that the show was picked up in early August last year and only aired four episodes premiering in April of this year, so from a technical point of view, the show’s cancellation isn’t a major surprise simply given the timing of the pick-up and the premiere, the promotional push behind the show, or the show’s ratings. Nevertheless, this is sure to have an effect on the flow of business.]
