Watched A bold idea to replace politicians by César Hidalgo from ted.com
César Hidalgo has a radical suggestion for fixing our broken political system: automate it! In this provocative talk, he outlines a bold idea to bypass politicians by empowering citizens to create personalized AI representatives that participate directly in democratic decisions. Explore a new way to make collective decisions and expand your understanding of democracy.

“It’s not a communication problem, it’s a cognitive bandwidth problem.”—César Hidalgo

He’s definitely right about the second part, but it’s also a communication problem, because most political speech is nuanced toward untruths, covering up facts and potential outcomes to represent the outcome the speaker wants. There’s also far too much of our leaders saying “Do as I say (and attempt to legislate), not as I do.” Examples include legislators working to actively take away things like abortion access or to condemn those who are LGBTQ while they actively do those very things themselves or for their families, or live out those lifestyles in secret.

“One of the reasons why we use Democracy so little may be because Democracy has a very bad user interface and if we improve the user interface of democracy we might be able to use it more.”—César Hidalgo

This is an interesting idea, but it definitely has many pitfalls with respect to how we know AI systems currently work. We’d definitely need to start small with simpler problems and build our way up to the more complex ones. Even then, I’m not so sure the complexity issues could ultimately be overcome. On its face, it sounds like he’s relying too much on the old “clockwork” viewpoint of physics, though I know that obviously isn’t (or couldn’t be) his personal viewpoint. At present there are far more pathways for this to become a weapon of math destruction than the utopian tool he’s envisioning.

Read Privacy Is Just the Beginning of the Debate Over Tech by Jathan Sadowski (onezero.medium.com)
Controversial ‘smart locks’ show the way that surveillance tech begins with the poor, before spreading to the rest of us

Instead, when we talk about technology, we should be thinking about power dynamics.

Great piece about ethics in technology.

👓 Thanks to Facebook, Your Cellphone Company Is Watching You More Closely Than Ever | The Intercept

Read Thanks to Facebook, your cellphone company is watching you more closely than ever by Sam Biddle (The Intercept)

AMONG THE MEGA-CORPORATIONS that surveil you, your cellphone carrier has always been one of the keenest monitors, in constant contact with the one small device you keep on you at almost every moment. A confidential Facebook document reviewed by The Intercept shows that the social network courts carriers, along with phone makers — some 100 different companies in 50 countries — by offering the use of even more surveillance data, pulled straight from your smartphone by Facebook itself.

Offered to select Facebook partners, the data includes not just technical information about Facebook members’ devices and use of Wi-Fi and cellular networks, but also their past locations, interests, and even their social groups. This data is sourced not just from the company’s main iOS and Android apps, but from Instagram and Messenger as well. The data has been used by Facebook partners to assess their standing against competitors, including customers lost to and won from them, but also for more controversial uses like racially targeted ads.

📑 Thanks to Facebook, Your Cellphone Company Is Watching You More Closely Than Ever | The Intercept

Annotated Thanks to Facebook, Your Cellphone Company Is Watching You More Closely Than Ever by Sam Biddle (The Intercept)
“It sure smells like the prescreening provisions of the FCRA,” Reidenberg told The Intercept. “From a functional point of view, what they’re doing is filtering Facebook users on creditworthiness criteria and potentially escaping the application of the FCRA.”  

📑 Thanks to Facebook, Your Cellphone Company Is Watching You More Closely Than Ever | The Intercept

Annotated Thanks to Facebook, Your Cellphone Company Is Watching You More Closely Than Ever by Sam Biddle (The Intercept)
In an initial conversation with a Facebook spokesperson, they stated that the company does “not provide creditworthiness services, nor is that a feature of Actionable Insights.” When asked if Actionable Insights facilitates the targeting of ads on the basis of creditworthiness, the spokesperson replied, “No, there isn’t an instance where this is used.” It’s difficult to reconcile this claim with the fact that Facebook’s own promotional materials tout how Actionable Insights can enable a company to do exactly this. Asked about this apparent inconsistency between what Facebook tells advertising partners and what it told The Intercept, the company declined to discuss the matter on the record,  

📑 Thanks to Facebook, Your Cellphone Company Is Watching You More Closely Than Ever | The Intercept

Annotated Thanks to Facebook, Your Cellphone Company Is Watching You More Closely Than Ever by Sam Biddle (The Intercept)
How consumers would be expected to navigate this invisible, unofficial credit-scoring process, given that they’re never informed of its existence, remains an open question.  

📑 Thanks to Facebook, Your Cellphone Company Is Watching You More Closely Than Ever | The Intercept

Annotated Thanks to Facebook, Your Cellphone Company Is Watching You More Closely Than Ever by Sam Biddle (The Intercept)
But these lookalike audiences aren’t just potential new customers — they can also be used to exclude unwanted customers in the future, creating a sort of ad targeting demographic blacklist.  

🎧 Episode 011 – Surveillance Capitalism and Digital Redlining | Media and the End of the World Podcast

Listened to Episode 011 – Surveillance Capitalism and Digital Redlining by Adam Croom and Ralph Beliveau from Media and the End of the World Podcast

We are joined by Chris Gilliard, Professor of English at Macomb Community College. His scholarship concentrates on privacy, institutional tech policy, digital redlining, and the re-inventions of discriminatory practices through data mining and algorithmic decision-making, especially as these apply to college students. He is currently developing a project that looks at how popular misunderstandings of mathematical concepts create the illusions of fairness and objectivity in student analytics, predictive policing, and hiring practices. Follow him on Twitter at @hypervisible.

Show Notes

An interesting episode on surveillance capitalism and redlining.

I’m a bit surprised to find that I’ve been blocked by Chris Gilliard (@hypervisible) on Twitter. I hope I haven’t done or said anything in particular to have offended him; more likely I’ve been put on a block list to which he subscribes? I’m just not sure. I’ll have to follow him from another account, as I’m really interested in his research, particularly as it applies to fixing these areas within the edtech space and applications using IndieWeb principles. I think this may be the first time I’ve gone to someone’s account and noticed that I’ve been blocked.

👓 The Most Measured Person in Tech Is Running the Most Chaotic Place on the Internet | New York Times

Read The Most Measured Person in Tech Is Running the Most Chaotic Place on the Internet (New York Times)
YouTube’s C.E.O. spends her days contemplating condoms and bestiality, talking advertisers off the ledge and managing a property the size of Netflix.

📑 YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant

Annotated YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant by Mark Bergen (Bloomberg)
A 2015 clip about vaccination from iHealthTube.com, a “natural health” YouTube channel, is one of the videos that now sports a small gray box.  

Does this box appear on the video itself? Apparently not: examples show it on the YouTube watch page, but nothing appears on the embedded versions.

A screengrab of what this looks like for those that don’t want to be “infected” by the algorithmic act of visiting these YouTube pages: a YouTube video on vaccinations with a gray block below it giving a link to some factual information on Wikipedia.

🎧 How President Trump’s Angry Tweets Can Ripple Across Social Media | NPR

Listened to How President Trump's Angry Tweets Can Ripple Across Social Media by Tim Mak from NPR

When Trump posts a mean tweet, how does it make its way across social media into the American consciousness? Researchers crunched the numbers to see if his negative tweets were shared more often.

This news segment references some interesting sounding research groups who are working in social media, emotional contagion, and sentiment analysis.

📑 YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant

Annotated YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant by Mark Bergen (Bloomberg)
YouTube doesn’t give an exact recipe for virality. But in the race to one billion hours, a formula emerged: Outrage equals attention.  

Talk radio has had this formula for years, and it has almost had to use it to drive any listenership as people left radio for television and other media.

I can still remember the different “loudness” level of talk between Bill O’Reilly’s primetime show on Fox News and the louder level on his radio show.

📑 YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant

Annotated YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant by Mark Bergen (Bloomberg)
When Wojcicki took over, in 2014, YouTube was a third of the way to the goal, she recalled in investor John Doerr’s 2018 book Measure What Matters. “They thought it would break the internet! But it seemed to me that such a clear and measurable objective would energize people, and I cheered them on,” Wojcicki told Doerr. “The billion hours of daily watch time gave our tech people a North Star.” By October 2016, YouTube hit its goal.  

Obviously they took the easy route. You may need to measure what matters, but getting to that goal by any means necessary or using indefensible shortcuts is the fallacy here. They could have had that North Star, but it’s the means they used by which to reach it that were wrong.

This is another great example of tech ignoring basic ethics to get to a monetary goal. (Another good one is Mark Zuckerberg’s “connecting people” mantra when what he should be doing is “connecting people for good” or “creating positive connections.”)

📑 YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant

Annotated YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant by Mark Bergen (Bloomberg)
Somewhere along the last decade, he added, YouTube prioritized chasing profits over the safety of its users. “We may have been hemorrhaging money,” he said. “But at least dogs riding skateboards never killed anyone.”