Read Harmful speech as the new porn by Jeff Jarvis (BuzzMachine)
In 1968, Lyndon Johnson appointed a National Commission on Obscenity and Pornography to investigate the supposed sexual scourge corrupting America’s youth...
I kept getting interrupted while reading this. I started around 8:30 or so… A fascinating look and set of thoughts from Jeff here.

📑 Highlights and Annotations

For the parallels between the fight against harmful and hateful speech online today and the crusade against sexual speech 50 years ago are stunning: the paternalistic belief that the powerless masses (but never the powerful) are vulnerable to corruption and evil with mere exposure to content; the presumption of harm without evidence and data; cries calling for government to stamp out the threat; confusion about the definitions of what’s to be forbidden; arguments about who should be responsible; the belief that by censoring content other worries can also be erased.

Annotated on January 18, 2020 at 08:29AM

One of the essays comes from Charles Keating, Jr., a conservative whom Nixon added to the body after having created a vacancy by dispatching another commissioner to be ambassador to India. Keating was founder of Citizens for Decent Literature and a frequent filer of amicus curiae briefs to the Supreme Court in the Ginzberg, Mishkin, and Fanny Hill obscenity cases. Later, Keating was at the center of the 1989 savings and loan scandal — a foretelling of the 2008 financial crisis — which landed him in prison. Funny how our supposed moral guardians — Nixon or Keating, Pence or Graham — end up disgracing themselves; but I digress.

Annotated on January 18, 2020 at 08:40AM

The fear then was the corruption of the masses; the fear now is microtargeting drilling directly into the heads of a strategic few.

Annotated on January 18, 2020 at 08:42AM

McCarthy next asks: “Who selects what is to be recorded or transmitted to others, since not everything can be recorded?” But now, everything can be recorded and transmitted. That is the new fear: too much speech.

Annotated on January 18, 2020 at 08:42AM

Many of the book’s essayists defend freedom of expression over freedom from obscenity. Says Rabbi Arthur Lelyveld (father of Joseph, who would become executive editor of The New York Times): “Freedom of expression, if it is to be meaningful at all, must include freedom for ‘that which we loathe,’ for it is obvious that it is no great virtue and presents no great difficulty for one to accord freedom to what we approve or to that to which we are indifferent.” I hear too few voices today defending speech of which they disapprove.

I might take issue with this statement, and possibly with a piece of Jarvis’ argument here. I agree that the idea of “too much speech” is a moral panic, since humans have a hard limit on how much they can individually consume.

The issue I see is that while anyone can say almost anything, the problem arises when a handful of monopolistic players like Facebook or YouTube can use algorithms to programmatically entice people to click on and consume fringe content in mass quantities, which subtly but assuredly nudges the populace and electorate in an unnatural direction. Most of the history of human society and interaction has tended toward a centralizing consensus around which we can manage to cohere. The large-scale effects of algorithm-driven companies putting a heavy hand on the scales are sure to create unintended consequences, and they’re able to do it at scales that the Johnson and Nixon administrations could only wish they had access to.

If we look at this as an analogy to the evolution of weaponry, I might suggest we’ve just passed from the era of single-shot handguns into the era of machine guns. What is society to do when the next evolution occurs, into the era of social media atomic weapons?
Annotated on January 18, 2020 at 10:42AM

Truth is hard.

Annotated on January 18, 2020 at 10:42AM

As an American and a staunch defender of the First Amendment, I’m allergic to the notion of forbidden speech. But if government is going to forbid it, it damned well better clearly define what is forbidden or else the penumbra of prohibition will cast a shadow and chill on much more speech.

Perhaps it’s not so much what people are saying as that platforms are accelerating it algorithmically? It’s one thing for someone to foment sedition, praise Hitler, or yell their religious screed on the public street corner. The problem comes when powerful interests in the form of governments, corporations, or others provide them with megaphones and tacitly force audiences to listen.

When Facebook or YouTube optimize for clicks keyed on social and psychological constructs using fringe content, we’re essentially saying that machines, bots, and extreme fringe elements are not only people, but that they have free speech rights, and that they can be given priority with the reach and exposure of major national newspapers and national television in the media model of the ’80s.

I highly suspect that if real people’s social media reach were linear and unaccelerated by algorithms, we wouldn’t be in the morass we’re generally seeing on many platforms. (The toy sketch below illustrates the difference.)
Annotated on January 18, 2020 at 11:08AM
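
To make that linear-versus-accelerated distinction concrete, here’s a toy sketch in Python. It is my own illustration, not any platform’s actual ranking code; the accounts, follower counts, engagement rates, and the ten rounds of re-ranking are all invented assumptions.

```python
# Toy model: "linear" reach (impressions track follower count) versus an
# engagement-optimized feed that compounds reach with an engagement score.
# All accounts and numbers below are invented for illustration.

posts = [
    # (label, followers, assumed engagement rate per exposure)
    ("mainstream news recap", 50_000, 0.02),
    ("photo of my lunch", 300, 0.01),
    ("fringe conspiracy rant", 800, 0.30),  # small account, outrage-heavy
]

def linear_reach(followers: int, engagement: float) -> float:
    """Chronological model: reach tracks audience size only
    (the engagement rate is deliberately ignored)."""
    return float(followers)

def amplified_reach(followers: int, engagement: float) -> float:
    """Engagement-optimized model: the feed keeps reinserting posts that
    provoke clicks, so reach compounds with the engagement rate over
    (here) ten assumed rounds of re-ranking."""
    return followers * (1 + engagement) ** 10

for label, followers, engagement in posts:
    lin = linear_reach(followers, engagement)
    amp = amplified_reach(followers, engagement)
    print(f"{label:24s} linear={lin:>9,.0f} "
          f"amplified={amp:>9,.0f} boost={amp / lin:4.1f}x")
```

The exact numbers are meaningless; the point is that once reach compounds with engagement, the small outrage-heavy account earns a roughly 14x multiplier while the mainstream and mundane posts stay essentially linear.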

“Privacy in law means various things,” he writes; “and one of the things it means is protection from intrusion.” He argues that in advertising, open performance, and public-address systems, “these may validly be regulated” to prevent porn from being thrust upon the unsuspecting and unwilling. It is an extension of broadcast regulation.
And that is something we grapple with still: What is shown to us, whether we want it shown to us, and how it gets there: by way of algorithm or editor or bot. What is our right not to see?

Privacy as “freedom from” is an important thing. I like this idea.
Annotated on January 18, 2020 at 11:20AM

The Twenty-Six Words that Created the Internet is Jeff Kosseff’s definitive history and analysis of the current fight over Section 230, the fight over who will be held responsible to forbid speech. In it, Kosseff explains how debate over intermediary liability, as this issue is called, stretches back to a 1950s court fight, Smith v. California, about whether an L.A. bookseller should have been responsible for knowing the content of every volume on his shelves.

For me this is probably the key idea. Facebook doesn’t need to be responsible for everything that their users post, but when they cross the line into actively, algorithmically promoting and pushing that content into their users’ feeds for active consumption, then they **do** have a responsibility for that content.

By analogy, imagine the trusted local bookstore mentioned. There may be millions of books there, and the user has the choice, when they walk in, to make their selection in some logical manner. But if the bookseller has the secret ability to consistently walk up to children and put porn into their hands, or to actively herd them into the adult sections to force that exposure on them (and to do it without anyone else realizing it), then that is the problem. Society at large would think this even more reprehensible if it realized that local governments or political parties could pay the bookseller to do it.

In case the reader isn’t following the analogy, this is exactly what some social platforms like Facebook are allowing our politicians to do. They’re taking payment from politicians to actively lie, tell untruths, and create fear in a highly targeted manner without the rest of society seeing or hearing those messages. Some of these messages are of the sort that, had they been picked up on an open microphone and broadcast outside the private group they were intended for, would have been career-ending events.

Without that responsibility, we’re actively stifling conversation in the public sphere and actively empowering the fringes. This sort of targeted fringecasting is preventing social cohesion, consensus, and compromise, and is instead pulling us apart.

Perhaps the answer for Facebook is to let them take the political ad money for these niche ads, but then require them to broadcast those ads to everyone on the platform rather than just to the small niche audience? Then we could all see who our politicians really are.
Annotated on January 18, 2020 at 11:50AM

Of course, it’s even more absurd to expect Facebook or Twitter or Youtube to know and act on every word or image on their services than it was to expect bookseller Eleazer Smith to know the naughty bits in every book on his shelves.

Here’s the point! We shouldn’t expect them to know, but similarly, if they don’t know, then they should not be allowed to arbitrarily privilege some messages over others in how those messages are distributed on the platform. Why is YouTube accelerating messages about Nazis instead of videos of my ham sandwich at lunch? It’s because they’re making money on the Nazis.
Annotated on January 18, 2020 at 12:07PM

there must be other factors that got us Trump

Primarily people not really knowing how racist and horrible he really was, in addition to his inability to think clearly, logically, or linearly. He espoused a dozen or so simple aphorisms like “Build the wall,” but was absolutely unable to articulate a plan that went beyond the aphorism: how it would be implemented or funded, or what short- and long-term issues would result. Many presumed those details would be worked out by smart and intelligent people, rather than left to the “just do it” managerial style he has been shown to espouse.

Too many Republicans, particularly at the end, said “he’s not really that bad,” and now that he’s in power and more authoritarian than they expected, they’re too weak to admit their mistake.
Annotated on January 18, 2020 at 12:28PM

Axel Bruns’ dismantling of the filter bubble.

research to read
Annotated on January 18, 2020 at 12:45PM

“To affirm freedom is not to applaud that which is done under its sign,” Lelyveld writes.

Annotated on January 18, 2020 at 12:51PM

Read Introducing the idea of ‘hyperobjects’ by Timothy Morton (High Country News)
A new way of understanding climate change and other phenomena.

We are obliged to do something about them, because we can think them.

Annotated on January 15, 2020 at 08:56AM

It’s very difficult to talk about something you cannot see or touch, yet we are obliged to do so, since global warming affects us all.

It’s also difficult to interact with those things when we’re missing the words and vocabulary to talk about them intelligently.
Annotated on January 15, 2020 at 09:00AM

Timothy Morton is Rita Shea Guffey Chair in English at Rice University in Houston. He is the author of Realist Magic: Objects, Ontology, Causality and Hyperobjects: Philosophy and Ecology after the End of the World.

want to read these
Annotated on January 15, 2020 at 10:10AM

Or global warming. I can’t see or touch it. What I can see and touch are these raindrops, this snow, that sunburn patch on the back of my neck. I can touch the weather. But I can’t touch climate. So someone can declare: “See! It snowed in Boise, Idaho, this week. That means there’s no global warming!” We can’t directly see global warming, because it’s not only really widespread and really really long-lasting (100,000 years); it’s also super high-dimensional. It’s not just 3-D. It’s an incredibly complex entity that you have to map in what they call a high-dimensional phase space: a space that plots all the states of a system. In so doing, we are only following the strictures of modern science, laid down by David Hume and underwritten by Immanuel Kant. Science can’t directly point to causes and effects: That would be metaphysical, equivalent to religious dogma. It can only see correlations in data. This is because, argues Kant, there is a gap between what a thing is and how it appears (its “phenomena”) that can’t be reduced, no matter how hard we try. We can’t locate this gap anywhere on or inside a thing. It’s a transcendental gap. Hyperobjects force us to confront this truth of modern science and philosophy.

A short and very cogent argument here.
Annotated on January 15, 2020 at 10:07AM

Hat tip: Ethan Marcotte #

Quoted from email about "Policy change in regards to Social Media use for social learning from Centre for Innovation, Leiden University" by Tanja de Bie, Community Manager (Centre for Innovation, Leiden University via Coursera)

The Centre for Innovation of Leiden University has always strongly supported social or collaborative learning in online learning:  the interaction between learners facilitating learners, whether that is in discussion forums, peer review assignments or in our Facebook groups, contributes to a deeper understanding of subjects, and prepares learners to apply their knowledge.

However, the Centre for Innovation has a responsibility to our teachers, learners and volunteers, under GDPR and our own Privacy Policy. Based on this we conducted a review of different platforms that we made use of for collaborative, social learning and have decided to move away from those that do not allow us to meet our obligations and promises to those in our care.

Therefore we have decided to close all Facebook groups, Whatsapp groups and Instagram accounts currently under control of the Centre for Innovation, per the 29th of March 2019, and have adjusted our courses accordingly.

You can direct any questions or remarks in regards to this policy to MOOC@sea.leidenuniv.nl.

Kind Regards,

On behalf of Centre for Innovation, Leiden University,

Tanja de Bie, Community Manager

At least part of Leiden University is apparently making the moral and ethical call to close all their Facebook-related properties. Kudos! They’ve already got a great website; perhaps they’ll move a bit more toward the IndieWeb?

👓 The story behind the gas lamps and leeries in ‘Mary Poppins Returns’ | Business Insider

Read The real story behind the gas lamps and lamplighters in 'Mary Poppins Returns' (Business Insider)
In 'Mary Poppins Returns', Lin-Manuel Miranda plays a lamplighter. Here's the history behind the lamps and the profession.

The Victorian periodical The Westminster Review wrote that the introduction of gas lamps would do more to eliminate immorality and criminality on the streets than any number of church sermons.  

🎧 The Daily: Paul Ryan’s Exit Interview | New York Times

Listened to The Daily: Paul Ryan’s Exit Interview by Michael Barbaro from New York Times

As speaker of the House, the Republican lawmaker should be at the peak of his powers. Instead, he’s walking away.

👓 Everything bad about Facebook is bad for the same reason | Quartz

Read Everything bad about Facebook is bad for the same reason by Nikhil Sonnad (Quartz)
The philosophy of Hannah Arendt points to the banal evil beneath Facebook's many mistakes.

We definitely need some humanity and morality in our present mess. More and more I really want to rage-quit Facebook for what it’s doing to the world, but I would like to have all my friends and family follow me out.

👓 Abortion is Immoral, Except When It Comes to My Mistresses | McSweeney’s

Read Abortion is Immoral, Except When It Comes to My Mistresses by Devorah Blachor (McSweeney's)
TIM: Life begins at conception. Pregnancy is a gift from God, which is why I’m cosponsoring this anti-abortion legislation after asking my lover to have an abortion. I’m 65 and she’s 32, but you probably figured that out already.

🎧 ‘The Daily’: The C.I.A.’s Moral Reckoning | New York Times

Listened to ‘The Daily’: The C.I.A.’s Moral Reckoning by Michael Barbaro from nytimes.com

Gina Haspel, President Trump’s pick for C.I.A. director, faced the Senate Intelligence Committee for the first time on Wednesday as her confirmation hearings began. Lawmakers addressed her with an unusual line of questioning: What is your moral character?

On today’s episode:

• Matthew Rosenberg joins us from Washington, where he covers intelligence and national security for The New York Times.

Background reading:

• Ms. Haspel defended the C.I.A.’s torture of terrorism suspects after the Sept. 11 attacks, but vowed that she would not start another interrogation program.

• Among the issues raised in the hearing were Ms. Haspel’s involvement in a black site in Thailand where Qaeda suspects were tortured, her role in carrying out an order to destroy videotapes of C.I.A. interrogations, and her willingness to defy a president who has supported waterboarding.

We’ve recently seen the head of the F.B.I. ousted because he ostensibly wouldn’t take a loyalty oath and refused to close an investigation. Could this happen again? Could it be far worse?

They stopped far too short here of opening up questions harkening back to the Third Reich, when Hitler and his government commanded people to commit genocide. We all know there’s a line one can’t cross while using the defense that “I was commanded to by the authorities.”

So the real question is: will Haspel stand up to Trump to prevent moral atrocities he may want to inflict, whether in areas like torture or, perhaps, far worse?

🎧 ‘The Daily’: A ‘Big Price to Pay’ in Syria | The New York Times

Listened to ‘The Daily’: A ‘Big Price to Pay’ in Syria by Michael Barbaro from nytimes.com

After a suspected chemical attack in Syria, President Trump said Iran and Russia were responsible for backing “Animal Assad.” But Damascus may view the United States as being focused on a different fight.

President Trump has warned that there will be a “big price to pay” after yet another suspected chemical weapons attack in Syria.

But the suspicion that Syria continues to use those weapons suggests it views the United States as being focused on a different fight.



On today’s episode:

• Ben Hubbard, who covers the Middle East for The New York Times.

Background reading:

• Dozens suffocated in Syria after a reported chemical attack on a rebel-held suburb of Damascus.

• Trump sought a way out of Syria, but the latest attack is pulling him back in.

• There have been similar deadly assaults for years, including one in 2013 that killed more than 1,400.

Listening to this a few days on, it sounds as though Trump has even more bluster than Obama but is doing roughly the same thing. Yet again, small countries that should know far better continue to tread on their own people. Sadly, America is doing it too, just with far more sophisticated weapons. If we can’t figure out right and wrong at the big, obvious scale, how can we have proper morality at the smaller and more subtle scales?

Reposted A post by Amit Gawande (Musings. Et Al.)
I am tired of listening to comments on Facebook, mentioning it’s already too late or no point now or the platform is too valuable. We need to stop this. We are conveying to owners & other parties involved that don’t worry. It doesn’t matter how bad you screw up. You own us.

📺 Zeynep Tufekci: Machine intelligence makes human morals more important | TED

Watched Machine intelligence makes human morals more important by Zeynep Tufekci from ted.com

Machine intelligence is here, and we're already using it to make subjective decisions. But the complex way AI grows and improves makes it hard to understand and even harder to control. In this cautionary talk, techno-sociologist Zeynep Tufekci explains how intelligent machines can fail in ways that don't fit human error patterns -- and in ways we won't expect or be prepared for. "We cannot outsource our responsibilities to machines," she says. "We must hold on ever tighter to human values and human ethics."