YouTube’s C.E.O. spends her days contemplating condoms and bestiality, talking advertisers off the ledge and managing a property the size of Netflix.
Obviously they took the easy route. You may need to measure what matters, but getting to that goal by any means necessary, or via indefensible shortcuts, is the fallacy here. They could have kept that North Star; it's the means they used to reach it that were wrong.
This is another great example of tech ignoring basic ethics to get to a monetary goal. (Another good one is Mark Zuckerberg's "connecting people" mantra when what it should be is "connecting people for good" or "creating positive connections".)
This is a great summation of the issue.
Just hit that sparkle, fam
Apparently so many people have been using shortcuts like "filter:follows -filter:replies" over the past few months that they've decided to fix their UI.
Of course the article indicates that the motivator seems to be higher engagement (a.k.a. clicks for advertising) rather than simply making a stronger and more usable product:
Keith Coleman, vice president of product at Twitter, told The Verge that in tests, users who had access to the easy toggle participated in more conversations than average.
As an interesting aside, I'll note that just a few months ago YouTube allowed people to do embeds with several options, but they've recently removed the option to prevent their player from recommending additional videos once you're done. Thus the embedding site is still co-opted to some extent by YouTube and their vexing algorithmic recommendations.
In a similar vein, audio is also an issue, but at least an easier and much lower bandwidth one. I've been running some experiments lately on my own website by regularly posting what I'm listening to as a "faux-cast" and embedding the original audio. I've also been doing it pointedly as a means of helping others discover good content, because in some sense I can say I love the most recent NPR podcast or click like on it somewhere, but I'm fairly sure that doesn't have as much weight or value as my explicitly saying, "I've actually put my time and attention on the line and actually listened to this particular episode." I think having and indicating skin-in-the-game can make a tremendous difference in these areas.

Similarly, sites like Twitter don't really have a good bookmarking feature, so readers don't know if the sharing user actually read any of an article or just the headline. Posting these things separately on my own site as either reads or bookmarks allows me to differentiate between the two specifically and semantically, both for others' benefit as well as, and possibly most importantly, for my own (future self).
I’ve spent some time this morning thinking about the deplatforming of the abhorrent social media site Gab.ai by Google, Apple, Stripe, PayPal, and Medium following the Tree of Life shooting in Pennsylvania. I’ve created a deplatforming page on the IndieWeb wiki with some initial background and history. I’ve also gone back and tagged (with “deplatforming”) a few articles I’ve read or podcasts I’ve listened to recently that may have some interesting bearing on the topic.
The particular design question I’m personally looking at is roughly:
How can we reshape the web and social media in a way that allows individuals and organizations a platform for their own free speech and communication without accelerating or amplifying the voices of the abhorrent fringes of people espousing broadly anti-social values like virulent discrimination, racism, fascism, etc.?
In some sense, the advertising driven social media sites like Facebook, Twitter, et al. have given the masses the equivalent of not simply a louder voice within their communities, but potential megaphones to audiences previously far, far beyond their reach. When monetized against the tremendous value of billions of clicks, there is almost no reason for these corporate giants to filter or moderate socially abhorrent content. Their unfiltered and unregulated algorithms compound the issue from a societal perspective. I look at it in some sense as the equivalent of the advent of machine guns and ultimately nuclear weapons in 20th century warfare and their extreme effects on modern society.
The flip side of the coin is to give users the ability to better control and/or filter what they're presented on platforms and thus consuming, so solutions can address both the output and the input stages.
Comments and additions to the page (or even here below) particularly with respect to positive framing and potential solutions on how to best approach this design hurdle for human communication are more than welcome.
Deplatforming or no platform is a form of banning in which a person or organization is denied the use of a platform (physical or increasingly virtual) on which to speak.
In addition to the banning of those with socially unacceptable viewpoints, there has been a long history of marginalized voices (particularly trans, LGBTQ, sex workers, etc.) being deplatformed in systematic ways.
The banning can be from any of a variety of spaces, ranging from physical meeting spaces or lectures and journalistic coverage in newspapers or television to domain name registration, web hosting, and even specific social media platforms like Facebook or Twitter. Some have used these terms as narrowly as in relation to having their Twitter "verified" status removed.
“We need to puncture this myth that [deplatforming]’s only affecting far-right people. Trans rights activists, Black Lives Matter organizers, LGBTQI people have been demonetized or deranked. The reason we’re talking about far-right people is that they have coverage on Fox News and representatives in Congress holding hearings. They already have political power.” — Deplatforming Works: Alex Jones says getting banned by YouTube and Facebook will only make him stronger. The research says that’s not true. in Motherboard 2018-08-10
Glenn Beck parted ways with Fox News in what some consider to have been a network deplatforming. He ultimately moved to his own platform consisting of his own website.
Milo Yiannopoulos, the former Breitbart personality, was permanently banned from Twitter in 2016 for inciting targeted harassment campaigns against actress Leslie Jones. He resigned from Breitbart over comments he made about pedophilia on a podcast. These also resulted in the termination of a book deal with Simon & Schuster as well as the cancellation of multiple speaking engagements at universities.
The Daily Stormer
Neo-Nazi site The Daily Stormer was deplatformed by Cloudflare in the wake of 2017’s “Unite the Right” rally in Charlottesville. Following criticism, Matthew Prince, Cloudflare CEO, announced that he was ending the Daily Stormer’s relationship with Cloudflare, which provides services for protecting sites against distributed denial-of-service (DDoS) attacks and maintaining their stability.
Alex Jones and his Infowars were deplatformed by Apple, Spotify, YouTube, and Facebook in late summer 2018 for his network’s false claims about the Newtown shooting.
- Deplatforming Works in Motherboard
Gab.ai was deplatformed from PayPal, Stripe, Medium †, Apple, and Google as a result of its providing a platform for alt-right and racist groups as well as the shooter in the Tree of Life Synagogue shooting in October 2018.
Gab.com is under attack. We have been systematically no-platformed by App Stores, multiple hosting providers, and several payment processors. We have been smeared by the mainstream media for defending free expression and individual liberty for all people and for working with law enforcement to ensure that justice is served for the horrible atrocity committed in Pittsburgh. Gab will continue to fight for the fundamental human right to speak freely. As we transition to a new hosting provider Gab will be inaccessible for a period of time. We are working around the clock to get Gab.com back online. Thank you and remember to speak freely.
—from the Gab.ai homepage on 2018-10-29
- Gab, the Social Media Site for the Alt-Right, Gets Deplatformed
- Microsoft threatened to stop hosting the alt-right’s favorite social network in Quartz 2018-08-10
- Face the Racist Nation from On The Media | WNYC Studios on 2018-08-31 includes a segment about deplatforming racist groups from news coverage in the early 1900’s.
- ‘By Whatever Means Necessary’: The Origins of the ‘No Platform Policy’ on Hatful of History on 2015-11-03
- Deplatforming Works: Alex Jones says getting banned by YouTube and Facebook will only make him stronger. The research says that’s not true. in Motherboard 2018-08-10
- Study finds Reddit’s controversial ban of its most toxic subreddits actually worked TechCrunch on 2017-09-11
- The case for quarantining extremist ideas by Joan Donovan and Danah Boyd in The Guardian on 2018-06-01
- You Can’t Stay Here: The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech Proc. ACM Hum.-Comput. Interact., Vol. 1, No. 2, Article 31. Publication date: November 2017.
Researchers have long known that local actors—as well as Russia—use manipulative tactics to spread information online. With Facebook suspending a slew of domestic accounts, a difficult reckoning is upon us.
We need something in the digital world that helps put the brakes on gossip and falsehoods, much the same way real-life social networks tend to slow these things down. Online social networks that gamify and monetize based on clicks using black box algorithms are destroying some of the fabric of our society.
In the past, lies could travel across the world before the truth had a chance to put on its breeches, but their ability to do so now is even worse. We need to figure out a way to flip the script.
In the meantime, the company says it’s fixing its timeline settings.