Read Amazon Drivers Are Hanging Smartphones in Trees to Get More Work by Spencer Soper (Bloomberg)
Someone seems to have rigged Amazon's system to get orders first
Operation reflects ferocious rivalry for gigs in a bad economy

They believe an unidentified person or entity is acting as an intermediary between Amazon and the drivers and charging drivers to secure more routes, which is against Amazon’s policies. 

Surely this would be the case, as someone would likely need to watch the phones in the tree to ensure they aren't stolen. That may represent a larger potential loss than the potential gain.
Annotated on September 11, 2020 at 08:39AM

A Flex driver who has been monitoring the activity said the company needs to take steps to make sure all drivers are treated fairly. “Amazon knows about it,” the driver said, “but does nothing.”

Orders don’t necessarily need to be proximity-based at the level of 20 feet, so Amazon should be able to make assignments at the level of several miles to prevent something like this (a rough sketch of the idea follows below the annotation).
Annotated on September 11, 2020 at 08:42AM
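To make that concrete, here is a minimal sketch. This is purely hypothetical and not Amazon's actual Flex dispatch logic; the function names and the five-mile radius are assumptions for illustration only. The point is that once eligibility is keyed to a multi-mile radius rather than fine-grained GPS proximity, hanging a phone in a tree by the station door buys nothing.

```python
import math

EARTH_RADIUS_MILES = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(a))

def eligible_for_offer(driver_lat, driver_lon, station_lat, station_lon,
                       radius_miles=5.0):
    """Hypothetical coarse eligibility check: any driver within a multi-mile
    radius of the delivery station competes on equal footing, so parking a
    phone twenty feet from the door confers no advantage over sitting in a
    parking lot a few miles away."""
    return haversine_miles(driver_lat, driver_lon,
                           station_lat, station_lon) <= radius_miles
```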

Read Harmful speech as the new porn by Jeff Jarvis (BuzzMachine)
In 1968, Lyndon Johnson appointed a National Commission on Obscenity and Pornography to investigate the supposed sexual scourge corrupting America’s youth...
I kept getting interrupted while reading this; I started around 8:30 or so… A fascinating look, and Jeff lays out some compelling thoughts here.

📑 Highlights and Annotations

For the parallels between the fight against harmful and hateful speech online today and the crusade against sexual speech 50 years ago are stunning: the paternalistic belief that the powerless masses (but never the powerful) are vulnerable to corruption and evil with mere exposure to content; the presumption of harm without evidence and data; cries calling for government to stamp out the threat; confusion about the definitions of what’s to be forbidden; arguments about who should be responsible; the belief that by censoring content other worries can also be erased.

Annotated on January 18, 2020 at 08:29AM

One of the essays comes from Charles Keating, Jr., a conservative whom Nixon added to the body after having created a vacancy by dispatching another commissioner to be ambassador to India. Keating was founder of Citizens for Decent Literature and a frequent filer of amicus curiae briefs to the Supreme Court in the Ginzberg, Mishkin, and Fanny Hill obscenity cases. Later, Keating was at the center of the 1989 savings and loan scandal — a foretelling of the 2008 financial crisis — which landed him in prison. Funny how our supposed moral guardians — Nixon or Keating, Pence or Graham — end up disgracing themselves; but I digress.

Annotated on January 18, 2020 at 08:40AM

The fear then was the corruption of the masses; the fear now is microtargeting drilling directly into the heads of a strategic few.

Annotated on January 18, 2020 at 08:42AM

McCarthy next asks: “Who selects what is to be recorded or transmitted to others, since not everything can be recorded?” But now, everything can be recorded and transmitted. That is the new fear: too much speech.

Annotated on January 18, 2020 at 08:42AM

Many of the book’s essayists defend freedom of expression over freedom from obscenity. Says Rabbi Arthur Lelyveld (father of Joseph, who would become executive editor of The New York Times): “Freedom of expression, if it is to be meaningful at all, must include freedom for ‘that which we loathe,’ for it is obvious that it is no great virtue and presents no great difficulty for one to accord freedom to what we approve or to that to which we are indifferent.” I hear too few voices today defending speech of which they disapprove.

I might take issue with this statement and possibly with a piece of Jarvis’s argument here. I agree that the idea of “too much speech” is a moral panic, because humans have a hard limit on how much they can individually consume.

The issue I see is that while anyone can say almost anything, the problem arises when a handful of monopolistic players like Facebook or YouTube use algorithms to programmatically entice people to click on and consume fringe content in mass quantities, which subtly but assuredly nudges the populace and electorate in an unnatural direction. Most of the history of human society and interaction has tended toward a centralizing consensus around which we can manage to cohere. The large-scale effects of algorithm-driven companies putting a heavy hand on the scales are sure to create unintended consequences, and they can do it at scales the Johnson and Nixon administrations could only have wished for.

If we look at this as an analogy to the evolution of weaponry, I might suggest we’ve just passed beyond the era of single-shot handguns and into the era of machine guns. What is society to do when the next evolution arrives and we enter the era of social media atomic weapons?
Annotated on January 18, 2020 at 10:42AM

Truth is hard.

Annotated on January 18, 2020 at 10:42AM

As an American and a staunch defender of the First Amendment, I’m allergic to the notion of forbidden speech. But if government is going to forbid it, it damned well better clearly define what is forbidden or else the penumbra of prohibition will cast a shadow and chill on much more speech.

Perhaps it’s not so much what people are saying as that platforms are accelerating it algorithmically. It’s one thing for someone to foment sedition, praise Hitler, or yell their religious screed on the public street corner. The problem comes when powerful interests in the form of governments, corporations, or others provide them with megaphones and tacitly force audiences to listen.

When Facebook or YouTube optimize for clicks keyed on social and psychological constructs using fringe content, we’re essentially saying that machines, bots, and extreme fringe elements are not only people, but that they have free speech rights and can be given the reach and exposure of major national newspapers and national television in the media model of the ’80s.

I highly suspect that if real people’s social media reach were linear and unaccelerated by algorithms, we wouldn’t be in the morass we’re generally seeing on many platforms. (A toy comparison of the two approaches is sketched below the annotation.)
Annotated on January 18, 2020 at 11:08AM
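For illustration only, here is a toy model contrasting a linear, chronological feed with an engagement-weighted one. This is not any platform's actual ranking code; the Post fields and the ranking rule are assumptions meant only to show how optimizing for engagement amplifies whatever provokes the strongest reaction.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: float   # seconds since epoch
    engagement: float  # clicks, shares, outrage reactions, watch time, etc.

def chronological_feed(posts):
    """Linear reach: every post surfaces once, in time order,
    no matter how provocative it is."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_ranked_feed(posts):
    """Hypothetical engagement-optimized ranking: whatever draws the most
    clicks rises to the top, so provocative fringe content is amplified
    far beyond its author's organic, linear reach."""
    return sorted(posts, key=lambda p: (p.engagement, p.timestamp), reverse=True)
```

The only difference between the two functions is the sort key, which is rather the point: the "acceleration" lives entirely in that one choice.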

“Privacy in law means various things,” he writes; “and one of the things it means is protection from intrusion.” He argues that in advertising, open performance, and public-address systems, “these may validly be regulated” to prevent porn from being thrust upon the unsuspecting and unwilling. It is an extension of broadcast regulation.
And that is something we grapple with still: What is shown to us, whether we want it shown to us, and how it gets there: by way of algorithm or editor or bot. What is our right not to see?

Privacy as a “freedom from” is an important thing. I like this idea.
Annotated on January 18, 2020 at 11:20AM

The Twenty-Six Words that Created the Internet is Jeff Kosseff’s definitive history and analysis of the current fight over Section 230, the fight over who will be held responsible to forbid speech. In it, Kosseff explains how debate over intermediary liability, as this issue is called, stretches back to a 1950s court fight, Smith v. California, about whether an L.A. bookseller should have been responsible for knowing the content of every volume on his shelves.

For me this is probably the key idea. Facebook doesn’t need to be responsible for everything that their users post, but when they cross the line into actively, algorithmically promoting and pushing that content into their users’ feeds for active consumption, then they **do** have a responsibility for that content.

By analogy, imagine the trusted local bookstore mentioned. There may be millions of books there, and the customer who walks in has the choice to make their selection in some logical manner. But if the bookseller has the secret ability to consistently walk up to children and put porn into their hands, or to actively herd them into the adult sections to force that exposure on them (and can do it without anyone else realizing it), then that is the problem. Society at large would find this even more reprehensible if it realized that local governments or political parties could pay the bookseller to do it.

In case the reader isn’t following the analogy, this is exactly what some social platforms like Facebook are allowing our politicians to do. They’re taking payment from politicians to actively lie, tell untruths, and create fear in a highly targeted manner, without the rest of society seeing or hearing those messages. Some of these messages are of the sort that, had they been picked up on an open microphone and broadcast outside the private group they were intended for, would have been career-ending events.

Without this sort of accountability, we’re actively stifling conversation in the public sphere and actively empowering the fringes. This kind of targeted fringecasting prevents social cohesion, consensus, and compromise, and instead pulls us apart.

Perhaps the answer for Facebook is to let them take the political ad money for these niche ads, but then require that the ads be broadcast to everyone on the platform rather than cast only to the small niche audience. Then we could all see who our politicians really are.
Annotated on January 18, 2020 at 11:50AM

Of course, it’s even more absurd to expect Facebook or Twitter or Youtube to know and act on every word or image on their services than it was to expect bookseller Eleazer Smith to know the naughty bits in every book on his shelves.

Here’s the point! We shouldn’t expect them to know, but similarly, if they don’t know, then they should not be allowed to arbitrarily privilege some messages over others in how those messages are distributed on the platform. Why is YouTube accelerating messages about Nazis instead of videos of my ham sandwich at lunch? Because they’re making money on the Nazis.
Annotated on January 18, 2020 at 12:07PM

there must be other factors that got us Trump

Primarily, people not really knowing how racist and horrible he really was, in addition to his inability to think clearly, logically, or linearly. He espoused a dozen or so simple aphorisms like “Build the wall,” but was absolutely unable to articulate a plan that went beyond the aphorism: how it would be implemented and funded, and what short- and long-term issues would result. He had none of those details, which many others presumed would be worked out by smart and intelligent people rather than by the “just do it” managerial style he has been shown to espouse.

Too many Republicans, particularly toward the end, said, “He’s not really that bad,” and now that he’s in power and more authoritarian than they expected, they are too weak to admit their mistake.
Annotated on January 18, 2020 at 12:28PM

Axel Bruns’ dismantling of the filter bubble.

research to read
Annotated on January 18, 2020 at 12:45PM

“To affirm freedom is not to applaud that which is done under its sign,” Lelyveld writes.

Annotated on January 18, 2020 at 12:51PM

Read As it settles into Vox, Recode is starting a new project to help people feel power over algorithms (Nieman Lab)
"It's about cutting through the apathy that a lot of people have about tech because it feels mysterious, letting people know there are decisions and changes you can make to your behavior that will feel empowering to people."
Listened to Computers Judge What Makes The Perfect Radio Voice by Audie Cornish from All Things Considered | NPR

A few weeks ago, All Tech Considered asked the audience to send voice samples to analyze. Those samples were put through an algorithm to figure out what kind of voice would make an appealing radio host. NPR's Audie Cornish explains how this experiment turned out.

hat tip: What do authority and curiosity sound like on the radio? NPR has been expanding that palette from its founding | NiemanLab
Read What Happened to Tagging? by Aaron Davis (Read Write Collect)
Alexandra Samuel reflects on tagging and its origins as a backbone to the social web. Along with RSS, tags allowed users to connect and collate content using such tools as feed readers. This all changed with the advent of social media and the algorithmically curated news feed. Samuel wonders if we h...

Alexandra Samuel reflects on tagging and its origins as a backbone of the social web. Along with RSS, tags allowed users to connect and collate content using tools such as feed readers. This all changed with the advent of social media and the algorithmically curated news feed.

Tags were used for discovery of specific types of content. Who needs that now that our new overlords of artificial intelligence and algorithmic feeds can tell us what we want to see?!

Of course we still need tags!!! How are you going to know serendipitously that you need more poetry in your life until you run into the tag on a service like IndieWeb.xyz? An algorithmic feed is unlikely to notice–or at least in my decade of living with them I’ve yet to run into poetry in one.
–December 04, 2019 at 10:56AM

A chat about algorithms and colorectal cancer

Friend: I spent all day researching marketing for colorectal cancer for work. Now my social media feeds are overflowing with healthcare ads for cancer screenings and treatment.
Me: Social silo algorithms will get you every time with surveillance capitalism.
Friend:
Me:
Friend: THEY ARE LITERALLY ‘UP MY BUTT’!!!

👓 Universal yanks TWiT’s ‘Tech News Today’ episode from YouTube due to Mega Video clip | VentureBeat

Read Universal yanks TWiT’s ‘Tech News Today’ episode from YouTube due to Mega Video clip (VentureBeat)
Universal Music Group has taken action to remove a recent episode of Tech News Today from YouTube because it contained clips of a MegaUpload video that Universal claims violates its copyright agreements. Tech News Today is a web show hosted on Leo Laporte’s TWiT’s web TV news network. In the yanked episode, the show’s hosts …

👓 Amazon Workers Are Listening to What You Tell Alexa | Bloomberg

Read Amazon Workers Are Listening to What You Tell Alexa by Matt Day , Giles Turner , and Natalia Drozdiak (Bloomberg)
A global team reviews audio clips in an effort to help the voice-activated assistant respond to commands.

👓 How Math Can Be Racist: Giraffing | 0xabad1dea

Read How Math Can Be Racist: Giraffing (0xabad1dea)
Well, any computer scientist or experienced programmer knows right away that being “made of math” does not demonstrate anything about the accuracy or utility of a program. Math is a lot more of a social construct than most people think. But we don’t need to spend years taking classes in algorithms to understand how and why the types of algorithms used in artificial intelligence systems today can be tremendously biased. Here, look at these four photos. What do they have in common?

📑 Three things about Readers during IndieWebCamp Nürnberg | Seblog

Annotated Three things about Readers during IndieWebCamp Nürnberg by Sebastiaan Andeweg (seblog.nl)
I have a problem with algorithms that sort my posts by parameters I don’t know about, made by people who want to sell my attention to others.  

📑 Reply to Ben Werdmüller | Interdependent Thoughts

Annotated Reply to Ben Werdmüller by Ton Zijlstra (Interdependent Thoughts)
They can spew hate amongst themselves for eternity, but without amplification it won’t thrive.  
This is a key point. Social media and the way it (and its black box algorithms) amplifies almost anything for the benefit of clicks towards advertising is one of its most toxic features. Too often the extreme voice draws the most attention instead of being moderated down by more civil and moderate society.

Deplatforming and making the web a better place

I’ve spent some time this morning thinking about the deplatforming of the abhorrent social media site Gab.ai by Google, Apple, Stripe, PayPal, and Medium following the Tree of Life shooting in Pennsylvania. I’ve created a deplatforming page on the IndieWeb wiki with some initial background and history. I’ve also gone back and tagged (with “deplatforming”) a few articles I’ve read or podcasts I’ve listened to recently that may have some interesting bearing on the topic.

The particular design question I’m personally looking at is roughly:

How can we reshape the web and social media in a way that allows individuals and organizations a platform for their own free speech and communication without accelerating or amplifying the voices of the abhorrent fringes of people espousing broadly anti-social values like virulent discrimination, racism, fascism, etc.?

In some sense, the advertising driven social media sites like Facebook, Twitter, et al. have given the masses the equivalent of not simply a louder voice within their communities, but potential megaphones to audiences previously far, far beyond their reach. When monetized against the tremendous value of billions of clicks, there is almost no reason for these corporate giants to filter or moderate socially abhorrent content.  Their unfiltered and unregulated algorithms compound the issue from a societal perspective. I look at it in some sense as the equivalent of the advent of machine guns and ultimately nuclear weapons in 20th century warfare and their extreme effects on modern society.

The flip side of the coin is to give users the ability to better control and/or filter what they’re presented with on these platforms, and thus what they’re consuming; solutions can address the input stage as well as the output stage. A small sketch of the input-side idea follows.
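As a minimal sketch of that input-stage control, assuming a hypothetical feed of posts represented as simple dictionaries rather than any particular platform's API, a reader-side filter might look like this:

```python
def filtered_feed(posts, muted_terms=(), muted_authors=()):
    """Reader-side filtering sketch: the consumer, not the platform's ranking
    algorithm, decides what gets dropped from their own feed. `posts` is
    assumed to be a list of dicts with 'author' and 'text' keys."""
    terms = [t.lower() for t in muted_terms]
    return [
        p for p in posts
        if p["author"] not in muted_authors
        and not any(t in p["text"].lower() for t in terms)
    ]

# Example usage: mute an author and a topic, regardless of how heavily
# a platform might otherwise promote them.
feed = [
    {"author": "alice", "text": "New poetry zine is out!"},
    {"author": "spammy", "text": "You won't BELIEVE this outrage..."},
]
print(filtered_feed(feed, muted_terms=["outrage"], muted_authors=["spammy"]))
```

The design choice here is that the filter runs on the reader's side of the pipe, which is exactly the kind of control feed readers and IndieWeb tools already make possible.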

Comments and additions to the page (or even here below) particularly with respect to positive framing and potential solutions on how to best approach this design hurdle for human communication are more than welcome.


Deplatforming

Deplatforming or no platform is a form of banning in which a person or organization is denied the use of a platform (physical or increasingly virtual) on which to speak.

In addition to the banning of those with socially unacceptable viewpoints, there has been a long history of marginalized voices (particularly trans, LGBTQ, sex workers, etc.) being deplatformed in systematic ways.

The banning can occur in any of a variety of spaces, ranging from physical meeting spaces or lectures and journalistic coverage in newspapers or on television, to domain name registration, web hosting, and even specific social media platforms like Facebook or Twitter. Some have used the term as narrowly as having one’s Twitter “verified” status removed.

“We need to puncture this myth that [deplatforming]’s only affecting far-right people. Trans rights activists, Black Lives Matter organizers, LGBTQI people have been demonetized or deranked. The reason we’re talking about far-right people is that they have coverage on Fox News and representatives in Congress holding hearings. They already have political power.” — Deplatforming Works: Alex Jones says getting banned by YouTube and Facebook will only make him stronger. The research says that’s not true. in Motherboard, 2018-08-10

Examples

Glenn Beck

Glenn Beck parted ways with Fox News in what some consider to have been a network deplatforming. He ultimately moved to a platform of his own, built around his own website.

Reddit Communities

Reddit has previously banned several communities on its platform. Many of the individual users decamped to Voat, which, like Gab, could potentially face its own subsequent deplatforming.

Milo Yiannopoulos

Milo Yiannopoulos, the former Breitbart personality, was permanently banned from Twitter in 2016 for inciting targeted harassment campaigns against actress Leslie Jones. He later resigned from Breitbart over comments he made about pedophilia on a podcast, which also resulted in the termination of a book deal with Simon & Schuster and the cancellation of multiple speaking engagements at universities.

The Daily Stormer

Neo-Nazi site The Daily Stormer was deplatformed by Cloudflare in the wake of 2017’s “Unite the Right” rally in Charlottesville. Following criticism, Cloudflare CEO Matthew Prince announced that he was ending the Daily Stormer’s relationship with Cloudflare, which provides services for protecting sites against distributed denial-of-service (DDoS) attacks and maintaining their stability.

Alex Jones/Infowars

Alex Jones and his Infowars were deplatformed by Apple, Spotify, YouTube, and Facebook in late summer 2018 for his network’s false claims about the Newtown shooting.

Gab

Gab.ai was deplatformed by PayPal, Stripe, Medium, Apple, and Google as a result of providing a platform for alt-right and racist groups, as well as for the shooter in the Tree of Life synagogue shooting in October 2018.

Gab.com is under attack. We have been systematically no-platformed by App Stores, multiple hosting providers, and several payment processors. We have been smeared by the mainstream media for defending free expression and individual liberty for all people and for working with law enforcement to ensure that justice is served for the horrible atrocity committed in Pittsburgh. Gab will continue to fight for the fundamental human right to speak freely. As we transition to a new hosting provider Gab will be inaccessible for a period of time. We are working around the clock to get Gab.com back online. Thank you and remember to speak freely.

—from the Gab.ai homepage on 2018-10-29

History

Articles

Research

See Also

  • web hosting
  • why
  • shadow banning
  • NIPSA
  • demonetization – a practice (particularly leveled at YouTube) of preventing users and voices from monetizing their channels. This can have a chilling effect on people who rely on traffic for income to support their work (see also 1)

👓 How Students Engage with News: Five Takeaways for Educators, Journalists, and Librarians | Project Information Literacy Research Institute

Read How Students Engage with News: Five Takeaways for Educators, Journalists, and Librarians [.pdf] by Alison J. Head, John Wihbey, P. Takis Metaxas, Margy MacMillan, and Dan Cohen (Project Information Literacy Research Institute)
Abstract: The News Study research report presents findings about how a sample of U.S. college students gather information and engage with news in the digital age. Results are included from an online survey of 5,844 respondents and telephone interviews with 37 participants from 11 U.S. colleges and universities selected for their regional, demographic, and red/blue state diversity. A computational analysis was conducted using Twitter data associated with the survey respondents and a Twitter panel of 135,891 college-age people. Six recommendations are included for educators, journalists, and librarians working to make students effective news consumers. To explore the implications of this study’s findings, concise commentaries from leading thinkers in education, libraries, media research, and journalism are included.
A great little paper about how teens and college students are finding, reading, sharing, and generally interacting with news. There’s some nice overlap here between the topics of journalism and education, which I find completely fascinating. In general, however, I think students are misreporting their habits in a few places, so I’m glad a portion of the paper also looks at data from Twitter in the wild to see what real-world use actually looks like.

Perhaps there are some interesting segments and even references relevant to the topics of education and IndieWeb for Greg McVerry‘s recent project?

As I read this, I can’t help but think of some things I’ve seen Michael Caulfield writing about news and social media over the past several months. As I look, I notice that he’s already read and written a bit about a press release for this particular paper. I’ll have to take a look at his take on it tomorrow. I’m particularly interested in any insights he’s got on lateral reading and fake news above and beyond his prior thoughts.

Perhaps I missed it, reading so late at night, but another potentially good source for this paper’s recommendations section would be Caulfield’s book Web Literacy for Student Fact-Checkers.

Highlights, Quotes, Annotations, & Marginalia

The purpose of this study was to better understand the preferences, practices, and motivations of young news consumers, while focusing on what students actually do, rather than what they do not do.  

October 22, 2018 at 08:28PM

YouTube (54%), Instagram (51%) or Snapchat (55%)  

I’m curious which sources in particular they’re using on these platforms. Snapchat was growing as a news source a year ago, but I’ve heard those numbers are declining. What is the general quality of these sources?

For example, getting news from television can range from PBS NewsHour and the cable news networks (more traditional sources) to comedy shows like Stephen Colbert’s and The Daily Show with Trevor Noah, which have some underlying news within the comedy but are far from traditional sources.
October 22, 2018 at 08:35PM

Some students (28%) received news from podcasts in the preceding week.  

October 22, 2018 at 08:38PM

news is stressful and has little impact on the day-to-day routines — use it for class assignments, avoid it otherwise.” While a few students like this one practiced news abstinence, such students were rare.  

This sounds a bit like my college experience, though I didn’t avoid it because of stressful news (and there wasn’t social media yet). I generally missed it because I didn’t subscribe directly to publications or watch much television. Most of my news consumption was the local college newspaper.
October 22, 2018 at 08:46PM

But on the Web, stories of all kinds can show up anywhere and information and news are all mixed together. Light features rotate through prominent spots on the “page” with the same weight as breaking news, sports coverage, and investigative pieces, even on mainstream news sites. Advertorial “features” and opinion pieces are not always clearly identified in digital spaces.  

This difference is one of the things I miss about reading a particular newspaper and experiencing the outlet’s own curation of its stories. Perhaps I should spend more time looking at the “front page” of various news sites?
October 22, 2018 at 08:57PM

Some (36%) said they agreed that the threat of “‘fake news’ had made them distrust the credibility of any news.” Almost half (45%) lacked confidence with discerning “real news” from “fake news,” and only 14% said they were “very confident” that they could detect “fake news.”  

These numbers are insane!
October 22, 2018 at 09:04PM

As a matter of recourse, some students in the study “read the news laterally,” meaning they used sources elsewhere on the Internet to compare versions of a story in an attempt to verify its facts, bias, and ultimately, its credibility.25  

This reminds me how much I miss the old daily analysis that Slate used to do of the day’s top news stories in various outlets in their Today’s Papers segment.
October 22, 2018 at 09:15PM

Some respondents, though not all, did evaluate the veracity of news they shared on social media. More (62%) said they checked to see how current an item was, while 59% read the complete story before sharing and 57% checked the URL to see where a story originated (Figure 7). Fewer read comments about a post (55%) or looked to see how many times an item was tweeted or shared (39%).  

I’m not sure I believe these self-reported numbers at all. 59% read the complete story before sharing?! 57% checked the URL? I’ll bet that many of them couldn’t even define what a URL is.
October 22, 2018 at 10:00PM

information diet  

October 22, 2018 at 11:02PM

At the tactical level, there are likely many small things that could be tested with younger audiences to help them better orient themselves to the crowded news landscape. For example, some news organizations are more clearly identifying different types of content such as editorials, features, and backgrounders/news analysis.57 More consistent and more obvious use of these typological tags would help all news consumers, not just youth, and could also travel with content as it is posted and shared in social media. News organizations should engage more actively with younger audiences to see what might be helpful.  

October 22, 2018 at 11:37PM

When news began moving into the first digital spaces in the early 1990s, pro-Web journalists touted the possibilities of hypertext links that would give news consumers the context they needed. Within a couple of years, hypertext links slowly began to disappear from many news stories. Today, hypertext links are all but gone from most mainstream news stories.  

October 22, 2018 at 11:38PM

“Solutions journalism” is another promising trend that answers some of the respondents’ sense of helplessness in the face of the barrage of crisis coverage.62  

October 22, 2018 at 11:40PM

👓 What We Learned from Studying the News Consumption Habits of College Students | Dan Cohen

Read What We Learned from Studying the News Consumption Habits of College Students by Dan Cohen (Dan Cohen)
Over the last year, I was fortunate to help guide a study of the news consumption habits of college students, and coordinate Northeastern University Library’s services for the study, including great work by our data visualization specialist Steven Braun and necessary infrastructure from our digital team, including Sarah Sweeney and Hillary Corbett. “How Students Engage with News,” out today as both a long article and accompanying datasets and media, provides a full snapshot of how college students navigate our complex and high-velocity media environment.

Highlights, Quotes, Annotations, & Marginalia

Side note: After recently seeing Yale Art Gallery’s show “Seriously Funny: Caricature Through the Centuries,” I think there’s a good article to be written about the historical parallels between today’s visual memes and political cartoons from the past.  

This also makes me think back to other entertainments of the historically poor, including the use/purpose of stained glass windows in church, supposedly as a means of entertaining the illiterate masses who couldn’t read the Latin Vulgate.
October 22, 2018 at 08:07PM

nearly 6,000 students from a wide variety of institutions  

Institutions = colleges/universities? Or are we also considering less educated youth as well?
October 22, 2018 at 08:08PM

A more active stance by librarians, journalists, educators, and others who convey truth-seeking habits is essential.  

In some sense these people can also be viewed as aggregators and curators of sorts. How can their work be aggregated and used to compete with the poor algorithms of social media?
October 22, 2018 at 08:11PM