An exercise I gave my students helps illustrate the risks to privacy in our everyday, offline lives.
On the morning of October 21, 2017, the budding New York choreographer Jinah Parker was sitting in bed, her husband lying alongside, when she opened her email and found a deeply unsettling, one-paragraph message about her debut dance production.
The show was called SHE, a Choreoplay, an off-off-Broadway interpretative dance in which four women vividly monologize rape and abuse.
Parker wrote and directed. Her newlywed husband, Kevin Powell, was the producer. In 1992, as a tenacious 26-year-old activist, he appeared on the inaugural season of MTV’s genre-defining reality show, The Real World. In the decades since, he’d become a prolific public speaker, author of 13 books, and a two-time congressional candidate.
Powell also has a history of violence. He assaulted women in college and once shoved a girlfriend into a bathroom door. Now he’s a sophist of male fragility, and an essential component of his activist repertoire is to engage in public reflection—usually with equal parts self-effacement and self-righteousness—upon this personal shame.
It would seem that this couple got just what they had coming to them, though it’s a bit disingenuous that they can go to crowdfunding platforms to spread the blame around. I’m hoping that it was only the people to whom they spread their invective who ended up helping to foot part of their bill.
Smartphone apps track a staggering amount of data about our whereabouts every day. That data has become a hot commodity.
No meme account is safe—not even @God.
“We are our own BuzzFeed,” said Declan Mortimer, a 16-year-old who ran the @ComedySlam account, with more than 11 million followers. Kaamil Lakhani and Jonathan Foley, who work together on @SocietyFeelings, said they were even in the process of building a dedicated website, as accounts such as @Daquan have already done.
Despite the Christmas setback, most meme account holders mentioned in this article said that they weren’t planning to abandon the platform anytime soon. But the incident served as an acute reminder of how quickly they can lose it all and be forced to start from scratch. “We’re playing on rented property,” said Goswami, “and that’s just so apparent now more than ever before.”
Micro.blog now has three distinct styles of usernames to make the platform more compatible with other services:
- Micro.blog usernames, e.g. @you. These are simple usernames for @-mentioning someone else in the Micro.blog community.
- Mastodon usernames, e.g. @email@example.com. When you search Micro.blog ...
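As a toy sketch of how those two username shapes differ, here's a small heuristic classifier. This is my own illustration, not Micro.blog's actual parsing logic, and it only covers the two styles quoted above:

```python
import re

# Toy heuristic (not Micro.blog's actual parser) distinguishing the two
# username styles described above: a simple @handle versus a
# Mastodon-style @user@domain address.
def username_style(handle):
    # e.g. "@you" — a bare community handle
    if re.fullmatch(r"@[A-Za-z0-9_]+", handle):
        return "micro.blog"
    # e.g. "@email@example.com" — user plus a domain with a dot
    if re.fullmatch(r"@[^@\s]+@[^@\s]+\.[^@\s]+", handle):
        return "mastodon"
    return "unknown"

print(username_style("@you"))                # micro.blog
print(username_style("@email@example.com"))  # mastodon
```

The design point is simply that the presence of a second `@` plus a domain is what routes a mention out of the local community and toward another federated service.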
It also sounds very much like Kevin Marks’ Distributed Verification scheme, which uses the rel="me" attribute on web pages and for which he built a Chrome browser extension. Kevin also recently reported that Mastodon now supports this verification scheme in one of its most recent updates, which should be available on instances that update regularly. The benefit is that this scheme already exists, is relatively well supported, has parsers available, and is actually working on the open web. It’s also truly distributed in that it doesn’t rely on any central provisioning authorities that require ongoing maintenance or that could provide a monopoly on such a service.
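The core of the rel="me" scheme is just a reciprocity check: page A links to page B with rel="me", and page B links back to A the same way. Here's a minimal sketch of that check using only Python's standard library; the function names are my own illustration (assuming the two HTML documents have already been fetched), not an API from Kevin Marks' extension or Mastodon:

```python
# Minimal sketch of reciprocal rel="me" verification. Assumes the HTML
# for both pages is already in hand; function names are illustrative.
from html.parser import HTMLParser

class RelMeParser(HTMLParser):
    """Collect href values of <a> and <link> tags carrying rel="me"."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag in ("a", "link"):
            a = dict(attrs)
            rels = (a.get("rel") or "").lower().split()
            if "me" in rels and a.get("href"):
                # Normalize trailing slashes for comparison
                self.links.add(a["href"].rstrip("/"))

def rel_me_links(html):
    p = RelMeParser()
    p.feed(html)
    return p.links

def mutually_verified(url_a, html_a, url_b, html_b):
    """True only when each page claims the other via rel="me"."""
    return (url_b.rstrip("/") in rel_me_links(html_a)
            and url_a.rstrip("/") in rel_me_links(html_b))

# Hypothetical pages: a profile linking to a homepage, and vice versa.
profile = '<a rel="me" href="https://example.com/">My site</a>'
homepage = '<link rel="me" href="https://social.example/@me">'
print(mutually_verified("https://social.example/@me", profile,
                        "https://example.com/", homepage))  # True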
Some more details about a proposed solution for MoodleNet that could solve some problems around decentralised identity.
(Mind you, since you can self-host Mastodon, you should really verify links yourself instead of relying on a cosmetic feature, as I could have just faked that with a bit of CSS.) ;)
The particular design question I’m personally looking at is roughly:
How can we reshape the web and social media in a way that allows individuals and organizations a platform for their own free speech and communication without accelerating or amplifying the voices of the abhorrent fringes of people espousing broadly anti-social values like virulent discrimination, racism, fascism, etc.?
In some sense, the advertising-driven social media sites like Facebook, Twitter, et al. have given the masses not simply a louder voice within their communities, but potential megaphones to audiences previously far, far beyond their reach. When monetized against the tremendous value of billions of clicks, there is almost no reason for these corporate giants to filter or moderate socially abhorrent content. Their unfiltered and unregulated algorithms compound the issue from a societal perspective. I look at it in some sense as the equivalent of the advent of machine guns and ultimately nuclear weapons in 20th-century warfare and their extreme effects on modern society.
The flip side of the coin is also potentially to allow users the ability to better control and/or filter out what they’re presented on platforms and thus consuming, so solutions can relate to both the output as well as the input stages.
Comments and additions to the page (or even here below) particularly with respect to positive framing and potential solutions on how to best approach this design hurdle for human communication are more than welcome.
Deplatforming or no platform is a form of banning in which a person or organization is denied the use of a platform (physical or increasingly virtual) on which to speak.
In addition to the banning of those with socially unacceptable viewpoints, there has been a long history of marginalized voices (particularly trans, LGBTQ, sex workers, etc.) being deplatformed in systematic ways.
The banning can be from any of a variety of spaces, ranging from physical meeting spaces or lectures and journalistic coverage in newspapers or on television, to domain name registration, web hosting, and even specific social media platforms like Facebook or Twitter. Some have used these terms as narrowly as in relation to having their Twitter “verified” status removed.
“We need to puncture this myth that [deplatforming]’s only affecting far-right people. Trans rights activists, Black Lives Matter organizers, LGBTQI people have been demonetized or deranked. The reason we’re talking about far-right people is that they have coverage on Fox News and representatives in Congress holding hearings. They already have political power.” — Deplatforming Works: Alex Jones says getting banned by YouTube and Facebook will only make him stronger. The research says that’s not true. in Motherboard 2018-08-10
Glenn Beck parted ways with Fox News in what some consider to have been a network deplatforming. He ultimately moved to his own platform consisting of his own website.
Milo Yiannopoulos, the former Breitbart personality, was permanently banned from Twitter in 2016 for inciting targeted harassment campaigns against actress Leslie Jones. He resigned from Breitbart over comments he made about pedophilia on a podcast. These also resulted in the termination of a book deal with Simon & Schuster as well as the cancellation of multiple speaking engagements at universities.
The Daily Stormer
Neo-Nazi site The Daily Stormer was deplatformed by Cloudflare in the wake of 2017’s “Unite the Right” rally in Charlottesville. Following criticism, Matthew Prince, Cloudflare CEO, announced that he was ending the Daily Stormer’s relationship with Cloudflare, which provides services for protecting sites against distributed denial-of-service (DDoS) attacks and maintaining their stability.
Alex Jones and his Infowars were deplatformed by Apple, Spotify, YouTube, and Facebook in late summer 2018 for his network’s false claims about the Newtown shooting.
- Deplatforming Works in Motherboard
Gab.ai was deplatformed from PayPal, Stripe, Medium †, Apple, and Google as a result of its providing a platform to alt-right and racist groups as well as to the shooter in the Tree of Life Synagogue shooting in October 2018.
Gab.com is under attack. We have been systematically no-platformed by App Stores, multiple hosting providers, and several payment processors. We have been smeared by the mainstream media for defending free expression and individual liberty for all people and for working with law enforcement to ensure that justice is served for the horrible atrocity committed in Pittsburgh. Gab will continue to fight for the fundamental human right to speak freely. As we transition to a new hosting provider Gab will be inaccessible for a period of time. We are working around the clock to get Gab.com back online. Thank you and remember to speak freely.
—from the Gab.ai homepage on 2018-10-29
- Gab, the Social Media Site for the Alt-Right, Gets Deplatformed
- Microsoft threatened to stop hosting the alt-right’s favorite social network in Quartz 2018-08-10
- Face the Racist Nation from On The Media | WNYC Studios on 2018-08-31 includes a segment about deplatforming racist groups from news coverage in the early 1900s.
- ‘By Whatever Means Necessary’: The Origins of the ‘No Platform Policy’ on Hatful of History on 2015-11-03
- Deplatforming Works: Alex Jones says getting banned by YouTube and Facebook will only make him stronger. The research says that’s not true. in Motherboard 2018-08-10
- Study finds Reddit’s controversial ban of its most toxic subreddits actually worked TechCrunch on 2017-09-11
- The case for quarantining extremist ideas by Joan Donovan and Danah Boyd in The Guardian on 2018-06-01
- You Can’t Stay Here: The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech Proc. ACM Hum.-Comput. Interact., Vol. 1, No. 2, Article 31. Publication date: November 2017.
This book explores the weird and mean and in-between that characterize everyday expression online, from absurdist photoshops to antagonistic Twitter hashtags to deceptive identity play.
Whitney Phillips and Ryan M. Milner focus especially on the ambivalence of this expression: the fact that it is too unwieldy, too variable across cases, to be essentialized as old or new, vernacular or institutional, generative or destructive. Online expression is, instead, all of the above. This ambivalence, the authors argue, hinges on available digital tools. That said, there is nothing unexpected or surprising about even the strangest online behavior. Ours is a brave new world, and there is nothing new under the sun – a point necessary to understanding not just that online spaces are rife with oddity, mischief, and antagonism, but why these behaviors matter.
The Ambivalent Internet is essential reading for students and scholars of digital media and related fields across the humanities, as well as anyone interested in mediated culture and expression.
The course is titled 'E-Learning 3.0' and could be subtitled 'Distributed Learning Technology'. This is a course about the next generation of learning technology. It's a broad and challenging domain that I've broken down into the following topics: data, cloud, graph, community, identity, resources, recognition, experience, agency.
I'm designing the course so that each week is one of these self-contained topics. This topic can then be approached from different directions, at different levels. The content is a starting point. I will provide a series of reflections. But I will be learning about each of these topics along with everyone else.
Surveillance capitalism turns a profit by making people more comfortable with discrimination
Facebook’s use of “ethnic affinity” as a proxy for race is a prime example. The platform’s interface does not offer users a way to self-identify according to race, but advertisers can nonetheless target people based on Facebook’s ascription of an “affinity” along racial lines. In other words, race is deployed as an externally assigned category for purposes of commercial exploitation and social control, not part of self-generated identity for reasons of personal expression. The ability to define one’s self and tell one’s own stories is central to being human and how one relates to others; platforms’ ascribing identity through data undermines both. ❧
October 15, 2018 at 09:34PM