The Twitter-like platform Gab has been forced offline, as their payment providers, hosting provider and domain provider all told them their business was no longer welcome. The platform is home to people with extremist views claiming their freedom of speech is under threat. At issue is of course wher...
They can spew hate amongst themselves for eternity, but without amplification it won’t thrive. ❧
This is a key point. Social media and the way it (and its black box algorithms) amplifies almost anything for the benefit of clicks towards advertising is one of its most toxic features. Too often the extreme voice draws the most attention instead of being moderated down by more civil and moderate society.
I think this is a false dilemma, Bernd.
I’d say that it would be great if those extremists came to see a distributed tool like Mastodon as the only remaining viable platform for them. It would not suppress their speech, but it would deny them the amplification they now enjoy by being highly visible on mainstream platforms, which gives them the illusion that they are indeed mainstream. It will be much easier to convince instance moderators, if that is needed at all, not to federate with the instances of those guys, reducing them ever more to their own bubble. They can spew hate amongst themselves for eternity, but without amplification it won’t thrive. Jotted down some thoughts on this earlier in “What does Gab’s demise mean for federation?”
The alternative social media network that was reportedly used by the suspect in the deadly shooting at a Pittsburgh synagogue is now down. Gab.com is a social network that touts itself as an alternative to Twitter and Facebook to give conservatives a platform for free speech. But it also has been criticized for providing a platform for anti-Semitism and white nationalism. The site has come in for increased scrutiny since the shooting.
I’ve spent some time this morning thinking about the deplatforming of the abhorrent social media site Gab.ai by Google, Apple, Stripe, PayPal, and Medium following the Tree of Life shooting in Pennsylvania. I’ve created a deplatforming page on the IndieWeb wiki with some initial background and history. I’ve also gone back and tagged (with “deplatforming”) a few articles I’ve read or podcasts I’ve listened to recently that may have some interesting bearing on the topic.
The particular design question I’m personally looking at is roughly:
How can we reshape the web and social media in a way that allows individuals and organizations a platform for their own free speech and communication without accelerating or amplifying the voices of the abhorrent fringes of people espousing broadly anti-social values like virulent discrimination, racism, fascism, etc.?
In some sense, the advertising driven social media sites like Facebook, Twitter, et al. have given the masses the equivalent of not simply a louder voice within their communities, but potential megaphones to audiences previously far, far beyond their reach. When monetized against the tremendous value of billions of clicks, there is almost no reason for these corporate giants to filter or moderate socially abhorrent content. Their unfiltered and unregulated algorithms compound the issue from a societal perspective. I look at it in some sense as the equivalent of the advent of machine guns and ultimately nuclear weapons in 20th century warfare and their extreme effects on modern society.
The flip side of the coin is to give users better tools to control and/or filter what they’re presented on platforms and thus consuming, so solutions can relate to both the output as well as the input stages.
Comments and additions to the page (or even here below), particularly with respect to positive framing and potential solutions for how best to approach this design hurdle for human communication, are more than welcome.
Deplatforming or no platform is a form of banning in which a person or organization is denied the use of a platform (physical or increasingly virtual) on which to speak.
In addition to the banning of those with socially unacceptable viewpoints, there has been a long history of marginalized voices (particularly trans, LGBTQ, sex workers, etc.) being deplatformed in systematic ways.
The banning can be from any of a variety of spaces, ranging from physical meeting spaces and lectures, to journalistic coverage in newspapers or on television, to domain name registration, web hosting, and even specific social media platforms like Facebook or Twitter. Some have used these terms as narrowly as in relation to having their Twitter “verified” status removed.
“We need to puncture this myth that [deplatforming]’s only affecting far-right people. Trans rights activists, Black Lives Matterorganizers, LGBTQI people have been demonetized or deranked. The reason we’re talking about far-right people is that they have coverage on Fox News and representatives in Congress holding hearings. They already have political power.” — Deplatforming Works: Alex Jones says getting banned by YouTube and Facebook will only make him stronger. The research says that’s not true. in Motherboard 2018-08-10
Glenn Beck parted ways with Fox News in what some consider to have been a network deplatforming. He ultimately moved to his own platform consisting of his own website.
Milo Yiannopoulos, the former Breitbart personality, was permanently banned from Twitter in 2016 for inciting targeted harassment campaigns against actress Leslie Jones. He resigned from Breitbart over comments he made about pedophilia on a podcast. These also resulted in the termination of a book deal with Simon & Schuster as well as the cancellation of multiple speaking engagements at Universities.
The Daily Stormer
Neo-Nazi site The Daily Stormer was deplatformed by Cloudflare in the wake of 2017’s “Unite the Right” rally in Charlottesville. Following criticism, Matthew Prince, Cloudflare CEO, announced that he was ending the Daily Stormer’s relationship with Cloudflare, which provides services for protecting sites against distributed denial-of-service (DDoS) attacks and maintaining their stability.
Alex Jones and his Infowars were deplatformed by Apple, Spotify, YouTube, and Facebook in late summer 2018 for his network’s false claims about the Newtown shooting.
- Deplatforming Works in Motherboard
Gab.ai was deplatformed from PayPal, Stripe, Medium †, Apple, and Google as a result of its providing a platform for alt-right and racist groups as well as for the shooter in the Tree of Life Synagogue shooting in October 2018.
Gab.com is under attack. We have been systematically no-platformed by App Stores, multiple hosting providers, and several payment processors. We have been smeared by the mainstream media for defending free expression and individual liberty for all people and for working with law enforcement to ensure that justice is served for the horrible atrocity committed in Pittsburgh. Gab will continue to fight for the fundamental human right to speak freely. As we transition to a new hosting provider Gab will be inaccessible for a period of time. We are working around the clock to get Gab.com back online. Thank you and remember to speak freely.
—from the Gab.ai homepage on 2018-10-29
- Gab, the Social Media Site for the Alt-Right, Gets Deplatformed
- Microsoft threatened to stop hosting the alt-right’s favorite social network in Quartz 2018-08-10
- Face the Racist Nation from On The Media | WNYC Studios on 2018-08-31 includes a segment about deplatforming racist groups from news coverage in the early 1900s.
- ‘By Whatever Means Necessary’: The Origins of the ‘No Platform Policy’ on Hatful of History on 2015-11-03
- Deplatforming Works: Alex Jones says getting banned by YouTube and Facebook will only make him stronger. The research says that’s not true. in Motherboard 2018-08-10
- Study finds Reddit’s controversial ban of its most toxic subreddits actually worked in TechCrunch on 2017-09-11
- The case for quarantining extremist ideas by Joan Donovan and Danah Boyd in The Guardian on 2018-06-01
- You Can’t Stay Here: The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech in Proc. ACM Hum.-Comput. Interact., Vol. 1, No. 2, Article 31, November 2017
- web hosting
- shadow banning
- demonetization – a practice (particularly leveled at YouTube) of preventing users and voices from monetizing their channels. This can have a chilling effect on people who rely on traffic for income to support their work.
"I am really pleased to see different sites deciding not to privilege aggressors' speech over their targets'," Phillips said. "That tends to be the default position in so many online 'free speech' debates which suggest that if you restrict aggressors' speech, you're doing a disservice to America—a position that doesn't take into account the fact that antagonistic speech infringes on the speech of those who are silenced by that kind of abuse." ❧
New Data & Society report recommends editorial “better practices” for reporting on online bigots and manipulators; interviews journalists on accidental amplification of extreme agendas
This report draws on in-depth interviews by scholar Whitney Phillips to showcase how news media was hijacked from 2016 to 2018 to amplify the messages of hate groups.
Offering extremely candid comments from mainstream journalists, the report provides a snapshot of an industry caught between the pressure to deliver page views, the impulse to cover manipulators and “trolls,” and the disgust (expressed in interviewees’ own words) of accidentally propagating extremist ideology.
After reviewing common methods of “information laundering” of radical and racist messages through the press, Phillips uses journalists’ own words to propose a set of editorial “better practices” intended to reduce manipulation and harm.
As social and digital media are leveraged to reconfigure the information landscape, Phillips argues that this new domain requires journalists to take what they know about abuses of power and media manipulation in traditional information ecosystems, and to apply and adapt that knowledge to networked actors, such as white nationalist networks online.
This work is the first practitioner-focused report from Data & Society’s Media Manipulation Initiative, which examines how groups use the participatory culture of the internet to turn the strengths of a free society into vulnerabilities.
Why the troll problem is actually a culture problem: how online trolling fits comfortably within today's media landscape.
Internet trolls live to upset as many people as possible, using all the technical and psychological tools at their disposal. They gleefully whip the media into a frenzy over a fake teen drug crisis; they post offensive messages on Facebook memorial pages, traumatizing grief-stricken friends and family; they use unabashedly racist language and images. They take pleasure in ruining a complete stranger's day and find amusement in their victim's anguish. In short, trolling is the obstacle to a kinder, gentler Internet. To quote a famous Internet meme, trolling is why we can't have nice things online. Or at least that's what we have been led to believe. In this provocative book, Whitney Phillips argues that trolling, widely condemned as obscene and deviant, actually fits comfortably within the contemporary media landscape. Trolling may be obscene, but, Phillips argues, it isn't all that deviant. Trolls' actions are born of and fueled by culturally sanctioned impulses—which are just as damaging as the trolls' most disruptive behaviors.
Phillips describes, for example, the relationship between trolling and sensationalist corporate media—pointing out that for trolls, exploitation is a leisure activity; for media, it's a business strategy. She shows how trolls, “the grimacing poster children for a socially networked world,” align with social media. And she documents how trolls, in addition to parroting media tropes, also offer a grotesque pantomime of dominant cultural tropes, including gendered notions of dominance and success and an ideology of entitlement. We don't just have a trolling problem, Phillips argues; we have a culture problem. This Is Why We Can't Have Nice Things isn't only about trolls; it's about a culture in which trolls thrive.
This book explores the weird and mean and in-between that characterize everyday expression online, from absurdist photoshops to antagonistic Twitter hashtags to deceptive identity play.
Whitney Phillips and Ryan M. Milner focus especially on the ambivalence of this expression: the fact that it is too unwieldy, too variable across cases, to be essentialized as old or new, vernacular or institutional, generative or destructive. Online expression is, instead, all of the above. This ambivalence, the authors argue, hinges on available digital tools. That said, there is nothing unexpected or surprising about even the strangest online behavior. Ours is a brave new world, and there is nothing new under the sun – a point necessary to understanding not just that online spaces are rife with oddity, mischief, and antagonism, but why these behaviors matter.
The Ambivalent Internet is essential reading for students and scholars of digital media and related fields across the humanities, as well as anyone interested in mediated culture and expression.
Recently the concept of ‘no platform’ was in the news again when there were attempts to cancel a talk by Germaine Greer at Cardiff University. While there is no doubt that the use of ‘no platform’ has expanded since its first use in the 1970s, the term is bandied about in the media with little definition and understanding of how it was developed as a specific response to the fascism of the National Front (and later the British National Party). This post looks back at the origins of the term and how it was developed into a practical anti-fascist strategy.
hat tip: Kevin Marks
It seems like just the other day that Reddit finally banned a handful of its most hateful and deplorable subreddits, including r/coontown and r/fatpeoplehate. The move was, at the time, derided by some as pointless, akin to shooing criminals away from one neighborhood only to trouble another. But a…
“We need to puncture this myth that it’s only affecting far-right people. Trans rights activists, Black Lives Matter organizers, LGBTQI people have been demonetized or deranked. The reason we’re talking about far-right people is that they have coverage on Fox News and representatives in Congress holding hearings. They already have political power.” ❧