Interesting to see this roll out as Facebook is having some serious data collection problems. This looks a bit like a means for Google to directly link users with the content they’re consuming online and then leverage it much the same way that Facebook did with apps and companies like Cambridge Analytica.
Highlights, Quotes, & Marginalia
Paying for a subscription is a clear indication that you value and trust your subscribed publication as a source. So we’ll also highlight those sources across Google surfaces
So Subscribe with Google will also allow you to link subscriptions purchased directly from publishers to your Google account—with the same benefits of easier and more persistent access.
you can then use “Sign In with Google” to access the publisher’s products, but Google does the billing, keeps your payment method secure, and makes it easy for you to manage your subscriptions all in one place.
I immediately wonder who owns my related subscription data. Does the publisher only see me as a lumped Google proxy, or do they get my name, email address, credit card information, and other details?
How will publishers be able (or not) to contact me? What effect will this have on potential customer retention?
Our tech columnist tried to skip digital news for a while. His old-school experiment led to three main conclusions.
A somewhat link-baity headline, but overall a nice little article with some generally solid advice. I always thought that even the daily paper moved at too quick a pace; I’d much prefer a weekly or monthly magazine that does a solid recap of all the big stories and things one ought to know. That way the stories would have had some time to simmer and all the details time to come out. Kind of like reading longer-form non-fiction about periods of history, just done on a somewhat shorter timescale.
Last month, in its second round of layoffs in as many years, comedy hub Funny or Die reportedly eliminated its entire editorial team following a trend of comedy websites scaling back, shutting down, or restructuring their business model away from original online content.
Hours after CEO Mike Farah delivered the news via an internal memo, Matt Klinman took to Twitter, writing, “Mark Zuckerberg just walked into Funny or Die and laid off all my friends.” It was a strong sentiment for the longtime comedy creator, who started out at UCB and The Onion before launching Pitch, the Funny or Die-incubated joke-writing app, in 2017.
This article really has so much in it. It also contains a microcosm of what’s been happening in journalism recently as well. I have a feeling that if outlets like Funny or Die were to go back and own their original content, there would still be a way for them to exist; we just need to evolve the internet away from the centralized direction we’ve been moving in for the past decade and change.
Highlights, Quotes, & Marginalia
eliminated its entire editorial team following a trend of comedy websites scaling back, shutting down, or restructuring their business model away from original online content. Hours after CEO Mike Farah delivered the news via an internal memo, Matt Klinman took to Twitter, writing, “Mark Zuckerberg just walked into Funny or Die and laid off all my friends.” It was a strong sentiment for the longtime comedy creator, who started out at UCB and The Onion before launching Pitch, the Funny or Die-incubated joke-writing app, in 2017.
“Mark Zuckerberg just walked into Funny or Die and laid off all my friends.”
The whole story is basically that Facebook gets so much traffic that they started convincing publishers to post things on Facebook. For a long time, that was fine. People posted things on Facebook, then you would click those links and go to their websites. But then, gradually, Facebook started exerting more and more control of what was being seen, to the point that they, not our website, essentially became the main publishers of everyone’s content. Today, there’s no reason to go to a comedy website that has a video if that video is just right on Facebook. And that would be fine if Facebook compensated those companies for the ad revenue that was generated from those videos, but because Facebook does not pay publishers, there quickly became no money in making high-quality content for the internet.
Facebook has created a centrally designed internet. It’s a lamer, shittier looking internet.
The EU has a bunch of laws kicking in to keep this in check — one is algorithmic transparency, where these places need to tell me why they are showing me something.
If someone at Facebook sees this, I want them to know, if they care at all about the idea that was the internet, they need to start thinking through what they are doing. Otherwise, then you’re just like Lennie from Of Mice and Men — a big dumb oaf crushing the little mouse of the internet over and over and not realizing it.
And I want it to feel that way to other people so that when they go to a cool website, they are inspired: They see human beings putting love and care into something.
Facebook is essentially running a payola scam where you have to pay them if you want your own fans to see your content.
It’s like if The New York Times had their own subscriber base, but you had to pay the paperboy for every article you wanted to see.
And then it becomes impossible to know what a good thing to make is anymore.
This is where webmentions on sites can become valuable. People posting “read” posts or “watch” posts (or even comments) indicating that they saw something could be the indicator to the originating site that something is interesting/valuable and could be displayed by that site. (This is kind of like follower counts, but for individual pieces of content, so naturally one would need to be careful about gaming.)
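To make the mechanics concrete: Webmention (a W3C spec) works by POSTing two form-encoded URLs, `source` and `target`, to an endpoint the target site advertises. Here’s a minimal sketch in Python; the URLs are hypothetical and for illustration only:

```python
from urllib.parse import urlencode

def build_webmention(source: str, target: str) -> bytes:
    """Build the form-encoded body for a Webmention notification.

    Per the W3C Webmention spec, the sender POSTs `source` (the URL of
    the page containing the "read"/"watch" post) and `target` (the URL
    of the original article) to the webmention endpoint the target site
    advertises via an HTTP Link header or <link rel="webmention">.
    """
    return urlencode({"source": source, "target": target}).encode("utf-8")

# Hypothetical URLs; a real sender would first discover the endpoint,
# then POST this body to it with
# Content-Type: application/x-www-form-urlencoded.
body = build_webmention(
    source="https://reader.example/read/funny-or-die-piece",
    target="https://publisher.example/original-article",
)
print(body.decode("utf-8"))
```

On the receiving end, the publisher would verify that the source page really links to the target, and could then tally verified mentions per URL as exactly the kind of per-post “read” signal described above.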
Here’s another analogy, and I learned this in an ecology class: In the 1800s (or something), there were big lords, or kings or something, who had giant estates with these large forests. And there were these foresters who had this whole notion of how to make a perfectly designed forest, where the trees would be pristinely manicured and in these perfect rows, and they would get rid of all the gross stuff and dirt. It was just trees in a perfect, human-devised formation that you could walk through. Within a generation, these trees were emaciated and dying. Because that’s how a forest works — it needs to be chaotic. It needs bugs and leaves, it makes the whole thriving ecosystem possible. That’s what this new internet should be. It won’t survive as this human-designed, top-down thing that is optimized for programmatic ads. It feels like a desert. There’s no nutrition, there’s no opportunity to do anything cool.
Recommending things for people is a personal act, and there are people who are good at it. There are critics. There are blogs. It’s not beneficial to us to turn content recommendations over to an algorithm, especially one that’s been optimized for garbage.
the internet was a better place 3-4 years ago. It used to be fruitful, but it’s like a desert now.
RSVPed Interested in Attending https://www.rjionline.org/events/dodging-the-memory-hole-2017
Please join us at Dodging the Memory Hole 2017: Saving Online News on Nov. 15-16 at the Internet Archive headquarters in San Francisco. Speakers, panelists and attendees will explore solutions to the most urgent threat to cultural memory today — the loss of online news content. The forum will focus on progress made in and successful models of long-term preservation of born-digital news content. Journalistic content published on websites and through social media channels is ephemeral and easily lost in a tsunami of digital content. Join professional journalists, librarians, archivists, technologists and entrepreneurs in addressing the urgent need to save the first rough draft of history in digital form.
The two-day forum — funded by the Donald W. Reynolds Journalism Institute and an Institute of Museum and Library Services grant awarded to the Journalism Digital News Archive, UCLA Library and the Educopia Institute — will feature thought leaders, stakeholders and digital preservation practitioners who are passionate about preserving born-digital news. Sessions will include speakers, multi-member panels, lightning round speakers and poster presenters examining existing initiatives and novel practices for protecting and preserving online journalism.
I attended this conference at UCLA in Fall 2016; it was fantastic! I highly recommend it to journalists, coders, Indieweb enthusiasts, publishers, and others interested in the related topics covered.
So you're sitting near the gate at Philadelphia International Airport, waiting for your plane. After you read your newspaper (I hope) and finish making calls on your cellphone, check emails and Snapchat (millennials only), you look at the wall-mounted TV screen, and there's CNN.
When you walk through the terminal changing planes in Chicago, there's CNN. And when you reach your final destination, San Francisco, the airport screens are showing CNN -- not Fox, not MSNBC, not ESPN.
Tweetstorms have been getting a horrific reputation lately. But used properly, they can sometimes have an excellent and beneficial effect. In fact, recently I’ve seen some journalists using them for both marketing and on-the-spot analysis in their areas of expertise. Even today Aram Zucker-Scharff, a journalism critic, suggested in his own tweetstorm that this UI form may have an interesting use case in relation to news outlets like CNN, which make multiple changes to a news story that lives at one canonical (and often not quickly enough archived) URL, but which is unlikely to be visited multiple times:
Why not publish a sequence of small stories that connect together rather than one big one on the same URL that keeps changing?
— Aram Zucker-Scharff (@Chronotope) February 10, 2017
A newsstorm-type user experience could better lay out the ebb and flow of a particular story over time and prevent the loss of data, context, and even timeframe that otherwise occurs on news websites that regularly update content at the same URL. (Though there are a few tools in this space, like Memento, which could potentially be useful.)
It’s possible that tweetstorms could even be useful for world leaders who lack the focus to read full sentences formed into paragraphs, and possibly even multiple paragraphs that run long enough to comprise articles, research documents, or even books. I’m not holding my breath though.
Technical problems for tweetstorms
But the big problem with tweetstorms–even when they’re done well and without manthreading–is actually publishing them quickly and fluidly, without breaking one’s train of thought between one tweet and the next.
Noter Live–the solution!
Last week this problem just disappeared: I think Noter Live has become the best-in-class tool for tweetstorms.
Noter Live was already the go-to tool for live tweeting at conferences, symposia, workshops, political debates, public fora, and even live cultural events like the Super Bowl or the Academy Awards. But with a few simple tweaks Kevin Marks, the king of covering conferences live on Twitter, has just updated it so that one can strip off the name of the speaker, letting an individual type in their own stream of consciousness simply and easily.
But wait! It has an all-important added bonus feature in addition to the fact that it automatically creates the requisite linked string of tweets for easier continuous threaded reading on Twitter…
When you’re done with your screed, which you probably wrote in pseudo-article form anyway, you can cut it out of the Noter Live app, dump it into your blog (you remember?–that Twitter-like app you’ve got that lets you post things longer than 140 characters at a time?), and voila! The piece of writing that probably should have been a blog post anyway can easily be archived for future generations in a far more readable and useful format! And for those who’d prefer a fancier version, it can also automatically add additional markup, microformats, and even Hovercards!
Bonus tip: after you’ve saved the entire stream on your own site, why not tweet out the URL permalink to the post as the last in the series? It’ll probably be just the payoff that those who read through a string of 66 tweets over the span of 45 minutes were waiting for!
So the next time you’re at a conference or just in the mood to rant, remember Noter Live is waiting for you.
Aside: I really wonder why Twitter hasn’t created the ability (UX/UI) to easily embed an entire tweetstorm in one click. It would be a great boon to online magazines and newspapers, which increasingly cut and paste tweets to build articles around. Instead most sites just do an atrocious job of cutting and pasting dozens to hundreds of tweets in a long line to try to tell these stories.