📑 Thanks to Facebook, Your Cellphone Company Is Watching You More Closely Than Ever | The Intercept

Annotated Thanks to Facebook, Your Cellphone Company Is Watching You More Closely Than Ever by Sam Biddle (The Intercept)
But these lookalike audiences aren’t just potential new customers — they can also be used to exclude unwanted customers in the future, creating a sort of ad targeting demographic blacklist.  

🎧 Episode 011 – Surveillance Capitalism and Digital Redlining | Media and the End of the World Podcast

Listened to Episode 011 – Surveillance Capitalism and Digital Redlining by Adam Croom and Ralph Beliveau from Media and the End of the World Podcast

We are joined by Chris Gilliard, Professor of English at Macomb Community College. His scholarship concentrates on privacy, institutional tech policy, digital redlining, and the re-inventions of discriminatory practices through data mining and algorithmic decision-making, especially as these apply to college students. He is currently developing a project that looks at how popular misunderstandings of mathematical concepts create the illusions of fairness and objectivity in student analytics, predictive policing, and hiring practices. Follow him on Twitter at @hypervisible.

Show Notes

An interesting episode on surveillance capitalism and redlining.

I’m a bit surprised to find that I’ve been blocked by Chris Gilliard (@hypervisible) on Twitter. I hope I haven’t done or said anything in particular to have offended him. More likely I’ve been put on a block list to which he’s subscribed? I’m just not sure. I’ll have to follow him from another account, as I’m really interested in his research, particularly as it applies to fixing these areas within the edtech space and in applications using IndieWeb principles. I think this may be the first time I’ve gone to someone’s account and noticed that I’ve been blocked.

👓 The Most Measured Person in Tech Is Running the Most Chaotic Place on the Internet | New York Times

Read The Most Measured Person in Tech Is Running the Most Chaotic Place on the Internet (New York Times)
YouTube’s C.E.O. spends her days contemplating condoms and bestiality, talking advertisers off the ledge and managing a property the size of Netflix.

📑 YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant

Annotated YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant by Mark Bergen (Bloomberg)
A 2015 clip about vaccination from iHealthTube.com, a “natural health” YouTube channel, is one of the videos that now sports a small gray box.  
Does this box appear on the video itself? Apparently not…

Examples:

But nothing on the embedded versions:

A screengrab of what this looks like for those that don’t want to be “infected” by the algorithmic act of visiting these YouTube pages:
[Screengrab: a YouTube video on vaccinations with a gray box below it linking to factual information on Wikipedia]

🎧 How President Trump’s Angry Tweets Can Ripple Across Social Media | NPR

Listened to How President Trump's Angry Tweets Can Ripple Across Social Media by Tim Mak from NPR

When Trump posts a mean tweet, how does it make its way across social media into the American consciousness? Researchers crunched the numbers to see if his negative tweets were shared more often.

This news segment references some interesting-sounding research groups working on social media, emotional contagion, and sentiment analysis.

📑 YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant

Annotated YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant by Mark Bergen (Bloomberg)
YouTube doesn’t give an exact recipe for virality. But in the race to one billion hours, a formula emerged: Outrage equals attention.  
Talk radio has used this formula for years; it has almost had to, just to drive any listenership at all as people left radio for television and other media.

I can still remember the difference in “loudness” between Bill O’Reilly’s primetime show on Fox News and his even louder radio show.

📑 YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant

Annotated YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant by Mark Bergen (Bloomberg)
When Wojcicki took over, in 2014, YouTube was a third of the way to the goal, she recalled in investor John Doerr’s 2018 book Measure What Matters. “They thought it would break the internet! But it seemed to me that such a clear and measurable objective would energize people, and I cheered them on,” Wojcicki told Doerr. “The billion hours of daily watch time gave our tech people a North Star.” By October 2016, YouTube hit its goal.
Obviously they took the easy route. You may need to measure what matters, but getting to that goal by any means necessary, or via indefensible shortcuts, is the fallacy here. They could have had that North Star; it’s the means by which they chose to reach it that were wrong.

This is another great example of tech ignoring basic ethics to reach a monetary goal. (Another good one is Mark Zuckerberg’s “connecting people” mantra, when what it should be is “connecting people for good” or “creating positive connections”.)

📑 YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant

Annotated YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant by Mark Bergen (Bloomberg)
Somewhere along the last decade, he added, YouTube prioritized chasing profits over the safety of its users. “We may have been hemorrhaging money,” he said. “But at least dogs riding skateboards never killed anyone.”  

📑 YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant

Annotated YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant by Mark Bergen (Bloomberg)
The conundrum isn’t just that videos questioning the moon landing or the efficacy of vaccines are on YouTube. The massive “library,” generated by users with little editorial oversight, is bound to have untrue nonsense. Instead, YouTube’s problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread.  
This is a great summation of the issue.

👓 Amazon Workers Are Listening to What You Tell Alexa | Bloomberg

Read Amazon Workers Are Listening to What You Tell Alexa by Matt Day, Giles Turner, and Natalia Drozdiak (Bloomberg)
A global team reviews audio clips in an effort to help the voice-activated assistant respond to commands.

🎧 The Daily: Silicon Valley’s Military Dilemma | New York Times

Listened to The Daily: Silicon Valley’s Military Dilemma from New York Times

Should Big Tech partner with the Pentagon? We examine a cautionary tale.

Some great history and questions about ethics here.

I’m surprised that, given its share of profits, Dow didn’t spin off the napalm division to some defense contractor.

Of course some tech companies are already weaponizing their own products against people. What about those ethical issues?

The more I see, read, and hear about the vagaries of social media; the constant failings and foibles of Facebook; the trolling dumpster fire that is Twitter; the ills of Instagram; the spread of dark patterns; and the unchecked rise of surveillance capitalism and weapons of math destruction, the more I think that the underlying ideas focusing on people and humanity within the IndieWeb movement are its core strength.

Perhaps we need to create a new renaissance of humanism for the 21st century? Maybe we call it digital humanism to give it some focus, but the emphasis should be completely on the people side.

Naturally there’s a lot more that they, and we all, need to do to improve our lives. Let’s band together to create better people-centric, ethical solutions.

🎧 Triangulation 380 The Age of Surveillance Capitalism | TWiT.TV

Listened to Triangulation 380 The Age of Surveillance Capitalism by Leo Laporte from TWiT.tv

Shoshana Zuboff is the author of The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. She talks with Leo Laporte about how social media is being used to influence people.

Links

Even for people who are steeped in the ideas of surveillance capitalism, ad tech, and dark patterns, there’s still a lot here to be surprised about. If you’re on social media, this should be required listening/watching.

I can’t wait to get a copy of her book.

Folks in the IndieWeb movement have begun to fix portions of the problem, but Shoshana Zuboff indicates that there are several additional levels of humane understanding that will need to be bridged to make sure those efforts aren’t in vain. We’ll likely need to do more than just own our own data; we’ll need to go a step or two further as well.

The thing I was shocked not to hear in this interview (and which may not be in the book either) is something that I think has been generally left unmentioned with respect to Facebook, elections, and election tampering (29:18). Zuboff and Laporte discuss Facebook’s experiments in influencing people to vote in several tests for which they published academic papers. Even with the rumors that Mark Zuckerberg was eyeing a potential presidential run in 2020, with his trip across America and meetings with people from all walks of life, no one floated the general idea that, as the CEO of Facebook, he might use what the company learned in those social experiments to help get himself (or even someone else) elected by sending social signals to certain communities to discourage them from voting while sending other signals to other communities to encourage turnout. The research indicates that in a very divided political climate, with the right sorts of voting data, it wouldn’t take a whole lot of work for Facebook to help effectuate a landslide victory for particular candidates or even entire political parties! And of course, because of the distributed nature of such an attack on democracy, Facebook’s black-box algorithms, and the subtlety of the experiments, it would be incredibly hard to prove that such a thing was even done.

I like her broad concept (around 43:00) that people tend to frame new situations using pre-existing experience, and that this may not always be the most useful approach for complex ideas that won’t necessarily play out the same way given potentially massive shifts in paradigms.

Also of great interest is the idea of instrumentarianism as opposed to the older idea of totalitarianism (43:49). Totalitarian leaders ruled by fear and intimidation; now big data stores can potentially create the same kinds of dynamics without the need for fear and intimidation, by more subtly influencing particular groups of people. When combined with “swarming” phenomena or Mark Granovetter’s ideas about threshold reactions in sociology, only a very small number of people may need to be influenced digitally to create drastic outcomes. I don’t recall the reference specifically, but I remember a paper on the mathematics of neighborhood formation indicating that only about 17% of people needed to be racist and move out of a neighborhood to begin creating ethnic homogeneity and drastically less diversity within a community.
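
To make that threshold dynamic concrete, here’s a minimal Python sketch of a Schelling-style segregation model, probably the best-known formalization of this kind of tipping behavior (though it may or may not be the specific paper I’m failing to recall). The grid size, empty-cell fraction, and 30% tolerance threshold below are illustrative values I’ve chosen, not numbers from any particular study.

```python
import random

# A Schelling-style threshold model on a grid. Agents of two types move
# to a random empty cell whenever too few of their neighbors share their
# type. All parameter values are illustrative, not from any cited paper.

SIZE = 30          # the grid is SIZE x SIZE cells
EMPTY = 0.10       # fraction of cells left empty so agents can move
THRESHOLD = 0.30   # an agent moves if under 30% of its neighbors match it

def new_grid():
    """Randomly fill the grid with two agent types, leaving some cells empty."""
    grid = [[None] * SIZE for _ in range(SIZE)]
    for r in range(SIZE):
        for c in range(SIZE):
            x = random.random()
            if x >= EMPTY:  # occupied; split the remainder evenly into A and B
                grid[r][c] = "A" if x < EMPTY + (1 - EMPTY) / 2 else "B"
    return grid

def like_fraction(grid, r, c):
    """Fraction of occupied neighbors sharing (r, c)'s type; 1.0 if isolated."""
    kind, like, total = grid[r][c], 0, 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if (dr or dc) and 0 <= rr < SIZE and 0 <= cc < SIZE and grid[rr][cc]:
                total += 1
                like += grid[rr][cc] == kind
    return like / total if total else 1.0

def step(grid):
    """Move each unhappy agent to a random empty cell; return the move count."""
    movers = [(r, c) for r in range(SIZE) for c in range(SIZE)
              if grid[r][c] and like_fraction(grid, r, c) < THRESHOLD]
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if not grid[r][c]]
    random.shuffle(empties)
    moved = 0
    for (r, c), (er, ec) in zip(movers, empties):
        grid[er][ec], grid[r][c] = grid[r][c], None
        moved += 1
    return moved

def average_similarity(grid):
    """Average like-neighbor share over all occupied cells."""
    scores = [like_fraction(grid, r, c)
              for r in range(SIZE) for c in range(SIZE) if grid[r][c]]
    return sum(scores) / len(scores)

grid = new_grid()
print(f"start: {average_similarity(grid):.0%} like neighbors on average")
rounds = 0
while rounds < 100 and step(grid) > 0:
    rounds += 1
print(f"after {rounds} rounds: {average_similarity(grid):.0%} like neighbors")
```

In typical runs the average share of like neighbors climbs well above the roughly 50% of a random mix, even though no individual agent demands more than 30%; mild individual preferences aggregate into stark collective homogeneity, which is exactly the kind of threshold effect that makes small, targeted digital nudges so worrying.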

Also tangentially touched on here, though not discussed directly: I can’t help but think that all of this data, combined with some useful complexity theory, might actually go a long way toward better defining (and perhaps even controlling) Adam Smith’s economic “invisible hand.”

There’s just so much to consider here that it’s going to take several revisits to the ideas and some additional research to tease this all apart.

👓 Instacart and DoorDash’s Tip Policies Are Delivering Outrage | The New York Times

Read After Uproar, Instacart Backs Off Controversial Tipping Policy (New York Times)
The delivery app’s practice of counting tips toward guaranteed minimum payments for its contract workers drew accusations of wage theft.

👓 How Math Can Be Racist: Giraffing | 0xabad1dea

Read How Math Can Be Racist: Giraffing (0xabad1dea)
Well, any computer scientist or experienced programmer knows right away that being “made of math” does not demonstrate anything about the accuracy or utility of a program. Math is a lot more of a social construct than most people think. But we don’t need to spend years taking classes in algorithms to understand how and why the types of algorithms used in artificial intelligence systems today can be tremendously biased. Here, look at these four photos. What do they have in common?