Replied to Un podcast bien fait by Stéphane Deschamps (nota-bene.org)
Quand c’est bien fait, il faut le dire aussi.
Pardon the English, parce que mon français est très mauvais

You indicate at the bottom of the post (the rough English translation is mine):

Bonus : c’est bien plus facile pour moi d’ajouter un texte à wallabag (au hasard) que de stocker un fichier audio pour une « consommation » facile. L’audio me demande toute une mise en œuvre assez pénible, pas le texte.
Bonus: It is much easier for me to add text to wallabag (at random) than to store an audio file for easy “consumption”. The audio requires quite a painful implementation, but not so for the text.

If you’re a fan of Wallabag for bookmarking text for later, you might appreciate using Huffduffer.com for your audio. It has a simple bookmarklet that will pull audio files, text, and tags from webpages and save them to your account. Your account then has a variety of iTunes audio feeds that you can subscribe to in your podcatcher of choice so that you can listen to the audio at your convenience later. If your podcatcher supports it, you can play it back at speeds that suit you (vite, donc).


Read Eliminating the Human by David Byrne (MIT Technology Review)
We are beset by—and immersed in—apps and devices that are quietly reducing the amount of meaningful interaction we have with each other.
This piece makes a fascinating point about people and interactions. It’s the sort of thing that many in the design and IndieWeb communities should read and think about as they work.

I came to it via an episode of the podcast The Happiness Lab.

The consumer technology I am talking about doesn’t claim or acknowledge that eliminating the need to deal with humans directly is its primary goal, but it is the outcome in a surprising number of cases. I’m sort of thinking maybe it is the primary goal, even if it wasn’t consciously aimed at.

Annotated on January 22, 2020 at 10:35AM

Most of the tech news we get barraged with is about algorithms, AI, robots, and self-driving cars, all of which fit this pattern. I am not saying that such developments are not efficient and convenient; this is not a judgment. I am simply noticing a pattern and wondering if, in recognizing that pattern, we might realize that it is only one trajectory of many. There are other possible roads we could be going down, and the one we’re on is not inevitable or the only one; it has been (possibly unconsciously) chosen.

Annotated on January 22, 2020 at 10:36AM

What I’m seeing here is the consistent “eliminating the human” pattern.

This seems as apt a name as any.
Annotated on January 22, 2020 at 10:39AM

“Social” media: This is social interaction that isn’t really social. While Facebook and others frequently claim to offer connection, and do offer the appearance of it, the fact is a lot of social media is a simulation of real connection.

Perhaps this is one of the things I like most about the older blogosphere and its more recent renaissance with the IndieWeb idea of Webmentions, a W3C recommendation for online interactions? While many of the interactions I get are small nods in the vein of likes, favorites, or reposts, some of them are longer, more visceral interactions.

My favorite just this past week was a piece I’d worked on for a few days that elicited a short burst of excitement from someone who, just a few minutes later, wrote a reply almost as long as my piece itself.

To me this was completely worth the effort and the work, not because of the many other smaller interactions, but because of the human interaction that resulted. Not to mention that I’m still thinking out a reply several days later.

This sort of human social interaction also seems to be at the heart of what Manton Reece is doing with micro.blog. By leaving out things like reposts and traditional “likes”, he’s really creating a human connection network to fix what traditional corporate social media silos have done to us. This past week’s episode of Micro Monday underlines this for us. (#)
Annotated on January 22, 2020 at 10:52AM

Antonio Damasio, a neuroscientist at USC wrote about a patient he called Elliot, who had damage to his frontal lobe that made him unemotional. In all other respects he was fine—intelligent, healthy—but emotionally he was Spock. Elliot couldn’t make decisions. He’d waffle endlessly over details. Damasio concluded that although we think decision-making is rational and machinelike, it’s our emotions that enable us to actually decide.

Annotated on January 22, 2020 at 10:56AM

And in the meantime, if less human interaction enables us to forget how to cooperate, then we lose our advantage.

It may seem odd, but I think a lot of the success of the IndieWeb movement and community is exactly this: a group of people has come together to work and interact and increase our abilities to cooperate to make something much bigger, more diverse, and more interesting than any of us could have done separately.
Annotated on January 22, 2020 at 10:58AM

Remove humans from the equation, and we are less complete as people and as a society.

Annotated on January 22, 2020 at 10:59AM

A version of this piece originally appeared on his website, davidbyrne.com.

This piece is so philosophical that it seems oddly trivial to see this note here and immediately think about POSSE and syndication.
Annotated on January 22, 2020 at 11:01AM

Read Paywall blockers: how publishers should prepare for this changing technology by Mary-Katharine Phillips (Twipe)
With more than a quarter of all readers globally using ad blockers, the news media industry has had to come up with new ways to overcome this, whether it be technically or through new strategies. But as the industry makes the move towards reader revenue strategies, we’re seeing more readers employ...
Read Towards IndieWeb: POSSE and Notes by Steve Ivy (monkinetic.blog)
A common idiom is to differentiate Notes (small microblog-like posts) from Articles (longer blog posts with a title). Right now Goldfrog has a basic blog Post type, with (ID, Title, Slug, Tags, Body). I’d like to keep the posting experience as simple as possible, so I’m thinking about how to handle something that literally just has a Body (and Tags, because I parse and attach any - see? - in the content).
Read Weekly Roundup 2020 #1 by gRegor Morrill (gregorlove.com)
I saw Jamie Tanna has been posting weekly notes. I like the idea so figured I would try it out. No guarantees about consistency; I’m not treating this like a NaBloPoMo challenge. :] For the week of 2020-01-12, in no particular order: I have been working on a passwordless login system for my site s...
I’d love to do a weekly roundup too, but the problem is sticking with it on a regular basis. If I did, I don’t think I’d number them; instead I’d rely on my URL design to take the brunt of the work and let them auto-number themselves. Just keeping up with the numbering is enough to make it that much harder, though I suspect my family might wish I did a roundup like this.
Read Why We Ended Legacy Admissions at Johns Hopkins by Ron Daniels (The Atlantic)
Eliminating an unfair tradition made our university more accessible to all talented students.
I remember hearing about discussions of this, but I’m glad they’ve made an official announcement and are moving in this direction. The Atlantic is such a great venue for writing about it too!
Read Managing Content Through Canonical Links by Aaron Davis (Read Write Respond)
One of the challenges with the web can be managing content across multiple sites, one answer, create canonical links and share from there. In a conversation on Twitter discussing the archiving images and canonical URLs, Amy Burvall explained that much of her work was simply stored on Instagram, whic...
Thanks for the pointer, Aaron. I’m not sure how I’d missed this, but it’s definitely apropos.
Read Making Meetable Easier to Install by Aaron Parecki (Aaron Parecki)
I've been working towards making Meetable more useful to others by making it easier to configure and deploy. I took a few shortcuts during the initial development that let me finish it faster, primarily by offloading authentication and image resizing to external services. While that's great for me, ...
Read Harmful speech as the new porn by Jeff Jarvis (BuzzMachine)
In 1968, Lyndon Johnson appointed a National Commission on Obscenity and Pornography to investigate the supposed sexual scourge corrupting America’s youth...
I kept getting interrupted while reading this. I started around 8:30 or so… A fascinating look at the thoughts Jeff writes about here.

📑 Highlights and Annotations

For the parallels between the fight against harmful and hateful speech online today and the crusade against sexual speech 50 years ago are stunning: the paternalistic belief that the powerless masses (but never the powerful) are vulnerable to corruption and evil with mere exposure to content; the presumption of harm without evidence and data; cries calling for government to stamp out the threat; confusion about the definitions of what’s to be forbidden; arguments about who should be responsible; the belief that by censoring content other worries can also be erased.

Annotated on January 18, 2020 at 08:29AM

One of the essays comes from Charles Keating, Jr., a conservative whom Nixon added to the body after having created a vacancy by dispatching another commissioner to be ambassador to India. Keating was founder of Citizens for Decent Literature and a frequent filer of amicus curiae briefs to the Supreme Court in the Ginzberg, Mishkin, and Fanny Hill obscenity cases. Later, Keating was at the center of the 1989 savings and loan scandal — a foretelling of the 2008 financial crisis — which landed him in prison. Funny how our supposed moral guardians — Nixon or Keating, Pence or Graham — end up disgracing themselves; but I digress.

Annotated on January 18, 2020 at 08:40AM

The fear then was the corruption of the masses; the fear now is microtargeting drilling directly into the heads of a strategic few.

Annotated on January 18, 2020 at 08:42AM

McCarthy next asks: “Who selects what is to be recorded or transmitted to others, since not everything can be recorded?” But now, everything can be recorded and transmitted. That is the new fear: too much speech.

Annotated on January 18, 2020 at 08:42AM

Many of the book’s essayists defend freedom of expression over freedom from obscenity. Says Rabbi Arthur Lelyveld (father of Joseph, who would become executive editor of The New York Times): “Freedom of expression, if it is to be meaningful at all, must include freedom for ‘that which we loathe,’ for it is obvious that it is no great virtue and presents no great difficulty for one to accord freedom to what we approve or to that to which we are indifferent.” I hear too few voices today defending speech of which they disapprove.

I might take issue with this statement and possibly a piece of Jarvis’ argument here. I agree that the idea of “too much speech” is a moral panic, since humans have a hard limit on how much they can individually consume.

The issue I see is that while anyone can say almost anything, the problem arises when a handful of monopolistic players like Facebook or YouTube can use algorithms to programmatically entice people to click on and consume fringe content in mass quantities, which subtly but assuredly nudges the populace and electorate in an unnatural direction. Most of the history of human society and interaction has long tended toward a centralizing consensus in which we can manage to cohere. The large-scale effects of algorithmic companies putting a heavy hand on the scales are sure to create unintended consequences, and they’re able to do it at scales the Johnson and Nixon administrations could only have wished for.

If we look at this as an analogy to the evolution of weaponry, I might suggest we’ve just passed the era of single-shot handguns and entered the era of machine guns. What is society to do when the next evolution takes us into the era of social media atomic weapons?
Annotated on January 18, 2020 at 10:42AM

Truth is hard.

Annotated on January 18, 2020 at 10:42AM

As an American and a staunch defender of the First Amendment, I’m allergic to the notion of forbidden speech. But if government is going to forbid it, it damned well better clearly define what is forbidden or else the penumbra of prohibition will cast a shadow and chill on much more speech.

Perhaps it’s not what people are saying so much as platforms are accelerating it algorithmically? It’s one thing for someone to foment sedition, praise Hitler, or yell their religious screed on the public street corner. The problem comes when powerful interests in the form of governments, corporations, or others provide them with megaphones and tacitly force audiences to listen to it.

When Facebook or YouTube optimize for clicks keyed on social and psychological constructs using fringe content, we’re essentially saying that machines, bots, and extreme fringe elements are not only people, but that they’ve got free speech rights and can be prioritized with the reach and exposure of major national newspapers and national television in the media model of the ’80s.

I highly suspect that if real people’s social media reach were linear and unaccelerated by algorithms, we wouldn’t be in the morass we’re generally seeing on many platforms.
Annotated on January 18, 2020 at 11:08AM

“Privacy in law means various things,” he writes; “and one of the things it means is protection from intrusion.” He argues that in advertising, open performance, and public-address systems, “these may validly be regulated” to prevent porn from being thrust upon the unsuspecting and unwilling. It is an extension of broadcast regulation.
And that is something we grapple with still: What is shown to us, whether we want it shown to us, and how it gets there: by way of algorithm or editor or bot. What is our right not to see?

Privacy as freedom from is an important thing. I like this idea.
Annotated on January 18, 2020 at 11:20AM

The Twenty-Six Words that Created the Internet is Jeff Kosseff’s definitive history and analysis of the current fight over Section 230, the fight over who will be held responsible to forbid speech. In it, Kosseff explains how debate over intermediary liability, as this issue is called, stretches back to a 1950s court fight, Smith v. California, about whether an L.A. bookseller should have been responsible for knowing the content of every volume on his shelves.

For me this is probably the key idea. Facebook doesn’t need to be responsible for everything that their users post, but when they cross the line into actively and algorithmically promoting and pushing that content into their users’ feeds for active consumption, then they **do** have a responsibility for that content.

By analogy, imagine the trusted local bookstore mentioned. There are millions of books there, and users have a choice when they walk in to make their selections in some logical manner. But if the bookseller has the secret ability to consistently walk up to children and put porn into their hands, or to actively herd them into the adult section to force that exposure on them (and can do it without anyone else realizing it), then that is the problem. Society at large would think this even more reprehensible if it realized that local governments or political parties had the ability to pay the bookseller for this activity.

In case the reader isn’t following the analogy, this is exactly what some social platforms like Facebook are allowing our politicians to do. They’re taking payment from politicians to actively lie, tell untruths, and create fear in a highly targeted manner without the rest of society being able to see or hear those messages. Some of these messages are of the type that, if they were picked up on an open microphone and broadcast outside of the private group they were intended for, would have been a career-ending event.

Without that responsibility, we’re actively stifling conversation in the public sphere and actively empowering the fringes. This sort of targeted fringecasting prevents social cohesion, consensus, and compromise, and instead pulls us apart.

Perhaps the answer for Facebook is to allow them to take the political ad money for these niche ads and then not just cast to the small niche audience, but to force them to broadcast them to everyone on the platform instead? Then we could all see who our politicians really are?
Annotated on January 18, 2020 at 11:50AM

Of course, it’s even more absurd to expect Facebook or Twitter or Youtube to know and act on every word or image on their services than it was to expect bookseller Eleazer Smith to know the naughty bits in every book on his shelves.

Here’s the point! We shouldn’t expect them to know, but similarly if they don’t know, then they should not be allowed to randomly privilege some messages over others for how those messages are distributed on the platform. Why is YouTube accelerating messages about Nazis instead of videos of my ham sandwich at lunch? It’s because they’re making money on the Nazis.
Annotated on January 18, 2020 at 12:07PM

there must be other factors that got us Trump

Primarily people not really knowing how racist and horrible he really was, in addition to his inability to think clearly, logically, or linearly. He espoused a dozen or so simple aphorisms like “Build the wall,” but was absolutely unable to articulate a plan that went beyond the aphorism: how it would be implemented and funded, and what short- and long-term issues would result. He had none of those details, which many others presumed would be worked out by smart and intelligent people, rather than the “just do it” managerial style he has been shown to espouse.

Too many Republicans, particularly at the end, said, “he’s not really that bad,” and now that he’s in power and more authoritarian than they expected, they are too weak to admit their mistake.
Annotated on January 18, 2020 at 12:28PM

Axel Bruns’ dismantling of the filter bubble.

research to read
Annotated on January 18, 2020 at 12:45PM

“To affirm freedom is not to applaud that which is done under its sign,” Lelyveld writes.

Annotated on January 18, 2020 at 12:51PM

Read Responding to every tweet for one day by Matt Maldre (Spudart)
All those tweets in the world with no reply. People putting their thoughts and observations out into the world. But then left hanging. Is anybody reading? Or do most people on Twitter post tweets, and not read tweets? What if for one day, you responded to every single tweet in your stream? Ok, that might …
It is sort of depressing to see so many things with no interaction. This is one of the reasons I like “read posts” (like this one I’m making): those with the proper setup at least know that they’ve got readers, even if the interaction is relatively passive.
Read Two flipboard flips, separated by time, united by image by Matt Maldre (Spudart)
My brother uses Flipboard to share links, so I’m going back onto this platform again. (My username on Flipboard is mattmaldre) A user’s “flips” page is sorted chronologically. One of my most recent flips is from five days ago, and then the next one is from over a year ago. A little gap in …
Read Webmentions work log 20200115 by Jeremy Felt (jeremyfelt.com)
Tonight is Pullman’s first Homebrew Website Club and I’m going to use the allocated hacking time to figure out what might be misfiring in the Webmention plugin’s always approve feature. Side note: It feels weird typing “Webmention” rather than “webmention”. I think I’m going to use t...
Read Webmentions work log 20200117 by Jeremy Felt (jeremyfelt.com)
I hadn’t taken a close look at the IndieWeb comments documentation when I marked up the latest version of comments for this site last week. Today I’m going to follow some of the advice Chris had and stare closer at some prior art. My first objective is to remove all of the unnecessary classes ad...
Reading about Jeremy’s work is inspiring me to do more of my own.