The cover of my book, Alt-America: The Rise of the Radical Right, is now being equated with the actual hate speech that I report on.
Ed and Brian Krassenstein are banned for life after ‘operating multiple fake accounts and purchasing account interactions,’ a Twitter spokesman said.
"Let's replace the shadows that Twitter and Facebook and Google have been on the media with some business-model fundamentals. As 2018 has shown, they've offered us a lot more heartache than it feels like they're actually worth."
This is a very staid and sober statement about the ills of social media platforms (aka silos) and a proposed way forward for 2019. His argument is tremendously bolstered by the fact that he’s actually got his own website where he’s hosting and distributing his own content.
Ernie, should you see this, I’d welcome you to come join a rapidly growing group of creators who have been doing almost exactly what you’ve prescribed. We’re amassing a wealth of knowledge, tools, code, and examples at Indieweb.org to help you and others on their journey to better owning and controlling their online identities in almost exactly the way you describe in your article. Both individually and together we’re trying to build websites that allow all the functionality of the platforms, but in a way that is both easy and beautiful for everyone to manage and use. Given the outlet for your piece, I’ll also mention that there’s a specific page for IndieWeb and Journalism.
I’d invite you to join the online chat and add yourself as an example to any of the appropriate pages, including perhaps for Craft. Also feel free to discuss your future plans and ask for any help or support you’d like to see for improving your own website. Together I hope we can all make your prediction for 2019 a reality.
Highlights, Quotes, Annotations, & Marginalia
But what if, in 2019, we take a step back and decide not to let the platform decide how to run the show? ❧
January 09, 2019 at 07:55AM
I’ve been working on a redesign of my site recently, using a more robust CMS, and the advantages of controlling the structure of the platform soup-to-nuts are obvious, even if it requires more upfront work. ❧
January 09, 2019 at 07:57AM
2019 is the year when publishers — whether big ones like Axios or the Los Angeles Times or tiny ones like mine or Judd Legum’s Popular Information — move away from letting someone else call all the shots. Or, at least, they should. ❧
January 09, 2019 at 08:01AM
Platforms are the key to influence in the modern era. We’ve spent years being burned by them and complaining about them for either doing too much or not enough. But what if, in 2019, we take a step back and decide not to let the platform decide how to run the show? Great points by Ernie Smith in a...
Foldable Phone, Online Civility
- The Samsung Developers Conference Keynote features a foldable phone, SmartThings IoT, and Bixby innovations.
- Android will support foldable phones.
- Google employees stage a walkout over sexual harassment
- Tim Berners-Lee's Contract for the Web
- How to encourage civility online
- YouTube Content ID
- Facebook and "White Genocide"
- Young people are deleting Facebook in droves
- Facebook's holiday pop-up store
- Everybody gets free Amazon shipping
- Amazon's new HQ2(s)
- 8 new Chromebook features
- Google Home Hub teams up with Sephora
- Ajit Pai's FCC is hopping mad about robocalls
Picks of the Week
- Jeff's Number: Black Friday home tech deals
- Stacey's Thing: Extinct cables, Alexa Christmas Lights
Leo Laporte doesn’t talk about it directly within an IndieWeb-specific framework, but he has an interesting discussion about YouTube Content ID that touches on the ideas of journalism and the IndieWeb, particularly as they relate to video, streaming video, and YouTube Live.
While most people are forced to rely on Google as their silo of choice for video, and specifically live-streaming video, he points out a painful single point of failure in its system: copyright rules and Google’s automatic filters can get a user or content creator permanently banned. Worse, as Leo indicates, this ban can also extend to related Google accounts (YouTube, Gmail, etc.), leaving one open to the potential chilling effects of intimidation, censorship, and deplatforming.
Leo discusses the fact that he’s not as beholden to YouTube because he streams and hosts all of his content on his own website and only uses silos like YouTube for ancillary distribution. In IndieWeb parlance this practice is known as POSSE (Publish on your Own Site, Syndicate Elsewhere), and it prevents his journalism, commentary, and even his business from being ravaged by the whims of corporate entities whose rules he can’t control directly.
The discussion starts at 1:05:11 into the episode and goes for about 10 minutes for those who are interested in this particular sub-topic.
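The POSSE flow described above can be sketched in a few lines. This is a hypothetical illustration, not Leo’s actual setup: the endpoint URLs and the injected `post_fn` hook are placeholders, though the shape — post to your own site first, then push copies that link back to the canonical permalink — follows the POSSE pattern (real IndieWeb sites typically use Micropub for posting and each silo’s own API for syndication).

```python
# Hypothetical POSSE sketch: endpoints are placeholders, not real services.
OWN_SITE = "https://example.com/micropub"            # canonical home for content
SYNDICATION_TARGETS = ["https://silo.example/api"]   # silos only get copies

def build_post(content, canonical_url=None):
    """Build a Micropub-style h-entry payload.

    For a silo copy, canonical_url points back to the original post,
    so the copy always references its home on your own site.
    """
    payload = {"type": ["h-entry"], "properties": {"content": [content]}}
    if canonical_url:
        payload["properties"]["url"] = [canonical_url]
    return payload

def posse(content, post_fn):
    """Publish on your Own Site, Syndicate Elsewhere.

    post_fn(endpoint, payload) -> permalink is injected so the flow
    can be exercised without any network access.
    """
    # 1. The canonical copy lives on your own site.
    permalink = post_fn(OWN_SITE, build_post(content))
    # 2. Each silo receives a copy that links back to the canonical URL.
    for target in SYNDICATION_TARGETS:
        post_fn(target, build_post(content, canonical_url=permalink))
    return permalink
```

The key design point is the ordering: the silo copies are derivatives of the canonical post, so if a silo deplatforms you, only the copies disappear.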
The Twitter-like platform Gab has been forced offline, as their payment providers, hosting provider and domain provider all told them their business was no longer welcome. The platform is home to people with extremist views claiming their freedom of speech is under threat. At issue is of course wher...
This is a key point. Social media and the way its black-box algorithms amplify almost anything for the sake of advertising clicks is one of its most toxic features. Too often the extreme voice draws the most attention instead of being moderated down by a more civil and moderate society.
I think this is a false dilemma, Bernd.
I’d say that it would be great if those extremists came to see a distributed tool like Mastodon as the only remaining viable platform for them. It would not suppress their speech, but it would deny them the amplification they now enjoy by being highly visible on mainstream platforms, which gives them the illusion that they are indeed mainstream. It will be much easier to convince instance moderators, if that’s even needed, not to federate with those guys’ instances, reducing them ever more to their own bubble. They can spew hate amongst themselves for eternity, but without amplification it won’t thrive. I jotted down some thoughts on this earlier in “What does Gab’s demise mean for federation?“
The alternative social media network that was reportedly used by the suspect in the deadly shooting at a Pittsburgh synagogue is now down. Gab.com is a social network that touts itself as an alternative to Twitter and Facebook to give conservatives a platform for free speech. But it also has been criticized for providing a platform for anti-Semitism and white nationalism. The site has come in for increased scrutiny since the shooting.
I’ve spent some time this morning thinking about the deplatforming of the abhorrent social media site Gab.ai by Google, Apple, Stripe, PayPal, and Medium following the Tree of Life shooting in Pennsylvania. I’ve created a deplatforming page on the IndieWeb wiki with some initial background and history. I’ve also gone back and tagged (with “deplatforming”) a few articles I’ve read or podcasts I’ve listened to recently that may have some interesting bearing on the topic.
The particular design question I’m personally looking at is roughly:
How can we reshape the web and social media in a way that allows individuals and organizations a platform for their own free speech and communication without accelerating or amplifying the voices of the abhorrent fringes of people espousing broadly anti-social values like virulent discrimination, racism, fascism, etc.?
In some sense, the advertising driven social media sites like Facebook, Twitter, et al. have given the masses the equivalent of not simply a louder voice within their communities, but potential megaphones to audiences previously far, far beyond their reach. When monetized against the tremendous value of billions of clicks, there is almost no reason for these corporate giants to filter or moderate socially abhorrent content. Their unfiltered and unregulated algorithms compound the issue from a societal perspective. I look at it in some sense as the equivalent of the advent of machine guns and ultimately nuclear weapons in 20th century warfare and their extreme effects on modern society.
The flip side of the coin is to give users the ability to better control and filter what platforms present to them, and thus what they consume, so solutions can address the input stage as well as the output stage.
Comments and additions to the page (or even here below) particularly with respect to positive framing and potential solutions on how to best approach this design hurdle for human communication are more than welcome.
Deplatforming or no platform is a form of banning in which a person or organization is denied the use of a platform (physical or increasingly virtual) on which to speak.
In addition to the banning of those with socially unacceptable viewpoints, there has been a long history of marginalized voices (particularly trans, LGBTQ, sex workers, etc.) being deplatformed in systematic ways.
The banning can be from any of a variety of spaces, ranging from physical meeting spaces or lectures and journalistic coverage in newspapers or on television, to domain name registration, web hosting, and even specific social media platforms like Facebook or Twitter. Some have used these terms as narrowly as in relation to having their Twitter “verified” status removed.
“We need to puncture this myth that [deplatforming]’s only affecting far-right people. Trans rights activists, Black Lives Matter organizers, LGBTQI people have been demonetized or deranked. The reason we’re talking about far-right people is that they have coverage on Fox News and representatives in Congress holding hearings. They already have political power.” — Deplatforming Works: Alex Jones says getting banned by YouTube and Facebook will only make him stronger. The research says that’s not true. in Motherboard 2018-08-10
Glenn Beck parted ways with Fox News in what some consider to have been a network deplatforming. He ultimately moved to his own platform consisting of his own website.
Milo Yiannopoulos, the former Breitbart personality, was permanently banned from Twitter in 2016 for inciting targeted harassment campaigns against actress Leslie Jones. He resigned from Breitbart over comments he made about pedophilia on a podcast. These also resulted in the termination of a book deal with Simon & Schuster as well as the cancellation of multiple speaking engagements at universities.
The Daily Stormer
Neo-Nazi site The Daily Stormer was deplatformed by Cloudflare in the wake of 2017’s “Unite the Right” rally in Charlottesville. Following criticism, Matthew Prince, Cloudflare CEO, announced that he was ending the Daily Stormer’s relationship with Cloudflare, which provides services for protecting sites against distributed denial-of-service (DDoS) attacks and maintaining their stability.
Alex Jones and his Infowars were deplatformed by Apple, Spotify, YouTube, and Facebook in the late summer of 2018 for his network’s false claims about the Newtown shooting.
- Deplatforming Works in Motherboard
Gab.ai was deplatformed from PayPal, Stripe, Medium †, Apple, and Google as a result of its providing a platform for alt-right and racist groups as well as the shooter in the Tree of Life Synagogue shooting in October 2018.
Gab.com is under attack. We have been systematically no-platformed by App Stores, multiple hosting providers, and several payment processors. We have been smeared by the mainstream media for defending free expression and individual liberty for all people and for working with law enforcement to ensure that justice is served for the horrible atrocity committed in Pittsburgh. Gab will continue to fight for the fundamental human right to speak freely. As we transition to a new hosting provider Gab will be inaccessible for a period of time. We are working around the clock to get Gab.com back online. Thank you and remember to speak freely.
—from the Gab.ai homepage on 2018-10-29
- Gab, the Social Media Site for the Alt-Right, Gets Deplatformed
- Microsoft threatened to stop hosting the alt-right’s favorite social network in Quartz 2018-08-10
- Face the Racist Nation from On The Media | WNYC Studios on 2018-08-31 includes a segment about deplatforming racist groups from news coverage in the early 1900s.
- ‘By Whatever Means Necessary’: The Origins of the ‘No Platform Policy’ on Hatful of History on 2015-11-03
- Deplatforming Works: Alex Jones says getting banned by YouTube and Facebook will only make him stronger. The research says that’s not true. in Motherboard 2018-08-10
- Study finds Reddit’s controversial ban of its most toxic subreddits actually worked in TechCrunch on 2017-09-11
- The case for quarantining extremist ideas by Joan Donovan and Danah Boyd in The Guardian on 2018-06-01
- You Can’t Stay Here: The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech Proc. ACM Hum.-Comput. Interact., Vol. 1, No. 2, Article 31. Publication date: November 2017.
"I am really pleased to see different sites deciding not to privilege aggressors' speech over their targets'," Phillips said. "That tends to be the default position in so many online 'free speech' debates which suggest that if you restrict aggressors' speech, you're doing a disservice to America—a position that doesn't take into account the fact that antagonistic speech infringes on the speech of those who are silenced by that kind of abuse." ❧
"Now that we've done it I feel like a complete ass for waiting so long. So will everyone else who makes the same call."
New Data & Society report recommends editorial “better practices” for reporting on online bigots and manipulators; interviews journalists on accidental amplification of extreme agendas
This report draws on in-depth interviews by scholar Whitney Phillips to showcase how news media was hijacked from 2016 to 2018 to amplify the messages of hate groups.
Offering extremely candid comments from mainstream journalists, the report provides a snapshot of an industry caught between the pressure to deliver page views, the impulse to cover manipulators and “trolls,” and the disgust (expressed in interviewees’ own words) of accidentally propagating extremist ideology.
After reviewing common methods of “information laundering” of radical and racist messages through the press, Phillips uses journalists’ own words to propose a set of editorial “better practices” intended to reduce manipulation and harm.
As social and digital media are leveraged to reconfigure the information landscape, Phillips argues that this new domain requires journalists to take what they know about abuses of power and media manipulation in traditional information ecosystems, and apply and adapt that knowledge to networked actors, such as white nationalist networks online.
This work is the first practitioner-focused report from Data & Society’s Media Manipulation Initiative, which examines how groups use the participatory culture of the internet to turn the strengths of a free society into vulnerabilities.
Why the troll problem is actually a culture problem: how online trolling fits comfortably within today's media landscape.
Internet trolls live to upset as many people as possible, using all the technical and psychological tools at their disposal. They gleefully whip the media into a frenzy over a fake teen drug crisis; they post offensive messages on Facebook memorial pages, traumatizing grief-stricken friends and family; they use unabashedly racist language and images. They take pleasure in ruining a complete stranger's day and find amusement in their victim's anguish. In short, trolling is the obstacle to a kinder, gentler Internet. To quote a famous Internet meme, trolling is why we can't have nice things online. Or at least that's what we have been led to believe. In this provocative book, Whitney Phillips argues that trolling, widely condemned as obscene and deviant, actually fits comfortably within the contemporary media landscape. Trolling may be obscene, but, Phillips argues, it isn't all that deviant. Trolls' actions are born of and fueled by culturally sanctioned impulses—which are just as damaging as the trolls' most disruptive behaviors.
Phillips describes, for example, the relationship between trolling and sensationalist corporate media—pointing out that for trolls, exploitation is a leisure activity; for media, it's a business strategy. She shows how trolls, “the grimacing poster children for a socially networked world,” align with social media. And she documents how trolls, in addition to parroting media tropes, also offer a grotesque pantomime of dominant cultural tropes, including gendered notions of dominance and success and an ideology of entitlement. We don't just have a trolling problem, Phillips argues; we have a culture problem. This Is Why We Can't Have Nice Things isn't only about trolls; it's about a culture in which trolls thrive.
This book explores the weird and mean and in-between that characterize everyday expression online, from absurdist photoshops to antagonistic Twitter hashtags to deceptive identity play.
Whitney Phillips and Ryan M. Milner focus especially on the ambivalence of this expression: the fact that it is too unwieldy, too variable across cases, to be essentialized as old or new, vernacular or institutional, generative or destructive. Online expression is, instead, all of the above. This ambivalence, the authors argue, hinges on available digital tools. That said, there is nothing unexpected or surprising about even the strangest online behavior. Ours is a brave new world, and there is nothing new under the sun – a point necessary to understanding not just that online spaces are rife with oddity, mischief, and antagonism, but why these behaviors matter.
The Ambivalent Internet is essential reading for students and scholars of digital media and related fields across the humanities, as well as anyone interested in mediated culture and expression.