Black-box algorithms are simply the bane of the modern web. How hard would it be to give us manual, granular control over our own feeds? That’s really the next killer app. If the rise of the independent and decentralized web isn’t the thing that kills social media, it’s going to be a company that figures out how to act more human and gives people the ability to control what they read.
Although I'm not a practicing Catholic anymore, old habits die hard. I plan to reduce my time on social media this Lenten season. Less time here and more on my blogs:
Personal Blog: http://bryanruby.com/
Fifty-Two Posts a Year: http://fiftytwoposts.com
I like the concept of this. Lent done #IndieWeb style.
I don’t post “notes” to Facebook often, but a few weeks ago I noticed that several pieces like this one, which I’d published a while back, were apparently unpublished by the platform. I hadn’t seen or heard anything from Facebook about them being unpublished or having issues, so I didn’t realize there was a problem until I randomly stumbled back across my notes page.
They did have a bit of UI for contesting the decision and republishing them, so I clicked on it. Apparently this puts the notes into some sort of limbo “review” process, but it’s been a few weeks now and there’s been no response about either of them. They’re still both sitting unseen in my dashboard with sad notices above them saying:
There is no real indication of whether they’ll ever come back online. Currently my only option is to delete them. There’s also no indication, clear or otherwise, of which community standard they may have violated.
I can’t imagine how either of the posts may have run afoul of their community standards, or why “notes” in particular seem to be more prone to this sort of censorship in comparison with typical status updates. I’m curious if others have had this same experience?
This is just another excellent example of why you shouldn’t trust third parties you have no control over to publish your content on the web. Fortunately I’ve got my own website with the original versions of these posts, which are freely readable. If you’ve experienced this or other pernicious problems in social media, I recommend you take a look at the helpful IndieWeb community, which has some excellent ideas and lots of help for re-exerting control over your online presence.
Notes on Facebook were an early 2009-era attempt by Facebook to host more blog-like content. They included a rather clean posting interface, reminiscent of Medium’s, that also allowed one to include images and even hyperlinks in posts.
The note post type has long since fallen by the wayside, and I rarely, if ever, come across people using it in the wild anymore, despite the fact that it’s a richer experience than a traditional status update. I suspect Facebook’s black-box algorithm doesn’t encourage its use. I might posit that it’s discouraged because, unlike most Facebook functionality, hyperlinks in notes on desktop browsers physically take one out of the Facebook experience and into new windows!
The majority of notes about me are spammy chain mail posts like “25 Random Things About Me”, which also helpfully included written instructions for how to actually use notes.
25 Random Things About Me
Rules: Once you’ve been tagged, you are supposed to write a note with 25 random things, facts, habits, or goals about you. At the end, choose 25 people to be tagged. You have to tag the person who tagged you. If I tagged you, it’s because I want to know more about you.
(To do this, go to “notes” under tabs on your profile page, paste these instructions in the body of the note, type your 25 random things, tag 25 people (in the right hand corner of the app) then click publish.)
Most of my published notes were experiments in syndicating my content from my own blog to Facebook (via POSSE). At the time, the engagement didn’t seem much different than posting raw text as status updates, so I abandoned it. Perhaps I’ll try again with this post to see what happens? I did rather like the ability to actually have links to content and other resources in my posts there.
Just as I was getting sick last week, Colin Walker wrote, “There has to be a better way to subscribe to sites.” He’s definitely hit the nail right on the head. The process is currently painful and disorganized; it’s also built on technology that’s almost two decades old and, at best, difficult for newcomers.
I’ve always posited that one of the reasons social media silos have been so successful is that they’ve built some fantastic readers. Sure, their UIs are cleaner and dead simple, but to a great extent 95% of their product is an evolved feed reader while the other 5% is a simple posting interface that makes it easy to interact. By comparison, most CMSes are almost entirely about the posting interface and spend very little time, if any, worrying about providing a reading experience.
The IndieWeb has been making some serious strides on making cross-site interactions easier with the Webmention and Micropub protocols, but the holy grail is still out there: allowing people to have an integrated feed reader built into their website (or alternately a standalone feed reader that’s tightly integrated with their site via Micropub or other means).
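To make the Webmention part of this concrete, here’s a minimal sketch, using only Python’s standard library, of the two steps a sending client performs per the W3C spec: discover the endpoint the target page advertises, then POST a simple form-encoded source/target pair to it. The function and class names are my own for illustration; real clients also check HTTP `Link` headers, which this sketch omits.

```python
# Sketch: the sending side of Webmention (endpoint discovery + payload).
# Assumes the target advertises its endpoint in its HTML via
# <link rel="webmention" href="..."> or <a rel="webmention" href="...">.
from html.parser import HTMLParser
from urllib.parse import urljoin


class EndpointFinder(HTMLParser):
    """Finds the first <link> or <a> element with rel="webmention"."""

    def __init__(self):
        super().__init__()
        self.endpoint = None

    def handle_starttag(self, tag, attrs):
        if self.endpoint is not None or tag not in ("link", "a"):
            return
        a = dict(attrs)
        rels = (a.get("rel") or "").split()
        if "webmention" in rels and "href" in a:
            self.endpoint = a["href"]


def discover_endpoint(page_html, page_url):
    """Return the absolute Webmention endpoint URL, or None if not advertised."""
    finder = EndpointFinder()
    finder.feed(page_html)
    if finder.endpoint is None:
        return None
    return urljoin(page_url, finder.endpoint)


def webmention_payload(source, target):
    """The form-encoded body a client POSTs to the discovered endpoint."""
    return {"source": source, "target": target}
```

In practice one would then POST `webmention_payload(...)` to the discovered URL with any HTTP client; the receiving site verifies that the source really links to the target before displaying anything.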
For those watching the space with as much interest as I have, there are a couple of interesting tools available and a few on the immediate horizon that are sure to make the process a whole lot easier and spark a new renaissance in the open web.
SubToMe: a Universal Subscribe Button
First, for a relatively simple one-size-fits-all subscribe button, I recommend people take a look at SubToMe, which touts itself as a “Universal Follow button” because it “makes it easy for people to follow web sites, because browsers don’t do it.” The button is fairly straightforward and has an awful lot of flexibility built in. In the simplest sense it has some solid feed detection, so it finds the available feeds on a web page and then offers the user a handful of recommended major readers. With two clicks, one can almost immediately subscribe to nearly any feed in their reader of choice.
Publishers can quickly install a simple button on their sites. They can further provide a list of specific feeds they want to advertise, and they can even recommend a particular feed reader if they choose.
For consumers, the service provides a simple browser bookmarklet so that if a site doesn’t have a button, they can click a subscribe button in their browser. Then click on a provider. Done. One can also choose a preferred provider to shorten the process.
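The feed detection a button like this depends on is itself fairly simple: a page advertises its feeds with `<link rel="alternate">` tags in its head. Here’s a minimal sketch of that autodiscovery step in Python’s standard library (the function names are my own, not SubToMe’s actual code):

```python
# Sketch: RSS/Atom feed autodiscovery, the kind of detection a
# universal subscribe button performs before offering readers.
from html.parser import HTMLParser
from urllib.parse import urljoin

# MIME types conventionally used to advertise feeds.
FEED_TYPES = {"application/rss+xml", "application/atom+xml"}


class FeedFinder(HTMLParser):
    """Collects feed URLs from <link rel="alternate" type=...> tags."""

    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        a = dict(attrs)
        if (a.get("rel") == "alternate"
                and a.get("type") in FEED_TYPES
                and "href" in a):
            self.feeds.append(a["href"])


def find_feeds(page_html, page_url):
    """Return the absolute URLs of all feeds the page advertises."""
    finder = FeedFinder()
    finder.feed(page_html)
    return [urljoin(page_url, href) for href in finder.feeds]
```

With the advertised feeds in hand, a button or bookmarklet only has to hand the chosen URL off to the user’s preferred reader.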
Microsub: a New Spec for Feed Readers

Since last June there’s been a quietly growing new web spec called Microsub that will assuredly shake up the subscription and reader spaces. In short, it provides a standardized way for clients to consume and interact with feeds collected by a server.
While it gets pretty deep pretty quickly, the spec is meant to help decouple some of the heavy architecture of building a feed reader. In some way it’s analogous to the separation of content and display that HTML and CSS allows, but applied to the mechanics of feed readers and how readers display their content.
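To make that decoupling concrete, here’s a toy sketch of the request/response shapes the spec describes: the client sends tiny action-based queries (`action=channels`, `action=timeline&channel=...`) and gets structured items back, while all the heavy feed fetching and parsing stays on the server. The in-memory “server” below is purely illustrative, not part of any real implementation, and it omits the spec’s other actions (follow, search, and so on).

```python
# Toy sketch of Microsub-style request dispatch: the client only ever
# deals in small action-based queries; it never parses feeds itself.
# That separation is the decoupling the spec is after.

def microsub_server(params, store):
    """Dispatch a Microsub-style request against an in-memory store.

    `store` maps channel uids to lists of timeline items, standing in
    for the feeds a real server would have fetched and normalized.
    """
    action = params.get("action")
    if action == "channels":
        # List the channels (groups of followed feeds) the user has.
        return {"channels": [{"uid": uid, "name": uid.title()}
                             for uid in store]}
    if action == "timeline":
        # Return the already-fetched items for one channel.
        channel = params.get("channel", "default")
        return {"items": store.get(channel, [])}
    return {"error": "unsupported action in this sketch"}


# A reader client's entire interaction reduces to calls like these:
store = {"home": [{"type": "entry", "name": "Hello Microsub"}]}
channels = microsub_server({"action": "channels"}, store)
timeline = microsub_server({"action": "timeline", "channel": "home"}, store)
```

In the real protocol these are HTTP requests to a discovered Microsub endpoint, but the shape of the exchange is the point: any client that speaks these few actions can front any conforming server.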
There are already a few interesting projects, by the names of Together and Indigenous, that are taking advantage of the architecture.
I’m eager to see how it all dovetails together to make a more integrated reading and posting interface, as well as how individual CMSes might leverage the idea to build integrated reader interfaces into their products. I can’t wait for the day when my own personal website is compatible with Microsub, so that I can use any Microsub client to read my timeline and follow people.
I’m also sure that decoupling the display of posts from the actual fetching of remote feeds will make it easier to build reader clients in general. I hope this has a Cambrian-explosion effect on the state of the art of feed readers.
Last month, in its second round of layoffs in as many years, comedy hub Funny or Die reportedly eliminated its entire editorial team following a trend of comedy websites scaling back, shutting down, or restructuring their business model away from original online content.
Hours after CEO Mike Farah delivered the news via an internal memo, Matt Klinman took to Twitter, writing, “Mark Zuckerberg just walked into Funny or Die and laid off all my friends.” It was a strong sentiment for the longtime comedy creator, who started out at UCB and The Onion before launching Pitch, the Funny or Die-incubated joke-writing app, in 2017.
This article really has so much in it. It also contains a microcosm of what’s been happening in journalism recently. I have a feeling that if outlets like Funny or Die were to go back and own their original content, there would still be a way for them to exist; we just need to evolve the internet away from the centralized direction it’s been moving in for the past decade and change.
Highlights, Quotes, & Marginalia
eliminated its entire editorial team following a trend of comedy websites scaling back, shutting down, or restructuring their business model away from original online content. Hours after CEO Mike Farah delivered the news via an internal memo, Matt Klinman took to Twitter, writing, “Mark Zuckerberg just walked into Funny or Die and laid off all my friends.” It was a strong sentiment for the longtime comedy creator, who started out at UCB and The Onion before launching Pitch, the Funny or Die-incubated joke-writing app, in 2017.
“Mark Zuckerberg just walked into Funny or Die and laid off all my friends.”
The whole story is basically that Facebook gets so much traffic that they started convincing publishers to post things on Facebook. For a long time, that was fine. People posted things on Facebook, then you would click those links and go to their websites. But then, gradually, Facebook started exerting more and more control of what was being seen, to the point that they, not our website, essentially became the main publishers of everyone’s content. Today, there’s no reason to go to a comedy website that has a video if that video is just right on Facebook. And that would be fine if Facebook compensated those companies for the ad revenue that was generated from those videos, but because Facebook does not pay publishers, there quickly became no money in making high-quality content for the internet.
Facebook has created a centrally designed internet. It’s a lamer, shittier looking internet.
The EU has a bunch of laws kicking in to keep this in check — one is algorithmic transparency, where these places need to tell me why they are showing me something.
If someone at Facebook sees this, I want them to know, if they care at all about the idea that was the internet, they need to start thinking through what they are doing. Otherwise, then you’re just like Lennie from Of Mice and Men — a big dumb oaf crushing the little mouse of the internet over and over and not realizing it.
And I want it to feel that way to other people so that when they go to a cool website, they are inspired: They see human beings putting love and care into something.
Facebook is essentially running a payola scam where you have to pay them if you want your own fans to see your content.
It’s like if The New York Times had their own subscriber base, but you had to pay the paperboy for every article you wanted to see.
And then it becomes impossible to know what a good thing to make is anymore.
This is where webmentions on sites can become valuable. People posting “read” posts or “watch” posts (or even comments) indicating that they saw something could be the indicator to the originating site that something is interesting/valuable and could be displayed by that site. (This is kind of like follower counts, but for individual pieces of content, so naturally one would need to be careful about gaming.)
Here’s another analogy, and I learned this in an ecology class: In the 1800s (or something), there were big lords, or kings or something, who had giant estates with these large forests. And there were these foresters who had this whole notion of how to make a perfectly designed forest, where the trees would be pristinely manicured and in these perfect rows, and they would get rid of all the gross stuff and dirt. It was just trees in a perfect, human-devised formation that you could walk through. Within a generation, these trees were emaciated and dying. Because that’s how a forest works — it needs to be chaotic. It needs bugs and leaves, it makes the whole thriving ecosystem possible. That’s what this new internet should be. It won’t survive as this human-designed, top-down thing that is optimized for programmatic ads. It feels like a desert. There’s no nutrition, there’s no opportunity to do anything cool.
Recommending things for people is a personal act, and there are people who are good at it. There are critics. There are blogs. It’s not beneficial to us to turn content recommendations over to an algorithm, especially one that’s been optimized for garbage.
the internet was a better place 3-4 years ago. It used to be fruitful, but it’s like a desert now.
Since the old Lanyrd site was back up over the weekend, I went in and saved all of the old data I wanted from it before it shuts down again (there’s no news on when that may happen). Sadly there is no direct export, but I was able to save pages individually and/or save them to the Internet Archive.
One thing we very much believe in is that you should own your own data. As such, we didn’t want to just suck your data into Notist and leave it at that. Instead, we’ve built a tool that gives you access to the content as HTML and JSON, ready for you to take away today.
Today’s web is very different from what it was 8 years ago. We’ve said it several times: publishing and consuming content are new frontiers for most of the web giants like Facebook, Google or Apple. We consume the web from mobile devices, we discover content on silo-ed social networks and, more importantly, the base metaphor for the web is shifting from “space” to “time”.
Superfeedr, the open web’s leading feed API and PubSubHubbub hub has been an independent player for 8 years. Superfeedr exists in order to enable people to exchange information on the web more freely and easily. Today, we’re excited to announce Superfeedr has been acquired by Medium. In many ways, it’s a very natural fit: Medium wants to create the best place to publish, distribute and consume content on the web. Together, we are hoping to keep Medium the company a leader in good industry practices, and Medium the network a place where this conversation can gain even more traction.
I consider myself a member of the open web community and very friendly with the goals of the IndieWeb community. I too wish for a world where web giants have less power and where the user is in control of more of their data. Yet, I now work for a large (the largest?) publishing platform. It is not often easy to reconcile, but one thing that I can tell you for a fact is that your data is, on average, safer on large hosting provider than it is on your small indie site.
I’d be curious to see more concrete numbers behind these claims, though I suspect that for “mature” sites it may actually be the case. Some of the small, middling platforms, however… The other side of the coin, though, is that when airplanes do crash, the death toll is large, and the same is true when major silos fail.
While he mentions personal sites disappearing, that’s typically something the site owner can at least make a conscious choice to do, and they can mothball the data for later use. With a silo death, users have no choice and often can’t get any of their data out at all.
This just goes to point out that we need better solutions for both openness and longevity. How much of what I write online will survive the next 500+ years? More or less than what Copernicus or Newton wrote? (Of course, who will care is an entirely different question…)
I hope that perhaps Medium opens up in the future to do some of the functionality that he mentions.