Even for people who are steeped in the ideas of surveillance capitalism, ad tech, and dark patterns, there’s still a lot here to be surprised by. If you’re on social media, this should be required listening/watching.
I can’t wait to get a copy of her book.
Folks in the IndieWeb movement have begun to fix portions of the problem, but Shoshana Zuboff indicates that there are several additional levels of humane understanding that will need to be bridged to make sure their efforts aren’t in vain. We’ll likely need to do more than just own our own data; we’ll need to go a step or two further as well.
The thing I was shocked not to hear in this interview (and which may not be in the book either) is something that I think has been generally left unmentioned with respect to Facebook, elections, and election tampering (29:18). Zuboff and Laporte discuss Facebook’s experiments in influencing people to vote, several tests for which the company published academic papers. Even with the rumors that Mark Zuckerberg was eyeing a potential presidential run in 2020, complete with his trip across America meeting people from all walks of life, no one floated the idea that, as the CEO of Facebook, he might use what the company learned in those social experiments to help get himself (or someone else) elected by sending social signals to certain communities to discourage them from voting while sending other signals to encourage different communities to vote. The research suggests that in a very divided political climate, with the right voting data, it wouldn’t take much work for Facebook to help effect a landslide victory for particular candidates or even entire political parties! And because of the distributed nature of such an attack on democracy, Facebook’s black-box algorithms, and the subtlety of the experiments, it would be incredibly hard to prove that such a thing had even been done.
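To make the scale of that worry concrete, here’s a rough back-of-the-envelope sketch. All of the numbers are my own hypothetical ones, not figures from the interview or from Facebook’s published papers; the point is only that in a close race the decisive quantity is the *difference* in turnout between groups, and a nudge too small for any individual to notice can flip the result.

```python
# Illustrative sketch only: hypothetical numbers showing how a small,
# asymmetric turnout nudge could flip a close race. Nothing here is
# drawn from Facebook's published experiments.

eligible = 1_000_000        # eligible voters in a hypothetical close district
support_a = 0.505           # 50.5% of them favor candidate A
support_b = 1 - support_a   # 49.5% favor candidate B
turnout = 0.60              # baseline turnout for both groups

def margin_for_a(turnout_a, turnout_b):
    """Vote margin for candidate A given each group's turnout rate."""
    votes_a = eligible * support_a * turnout_a
    votes_b = eligible * support_b * turnout_b
    return votes_a - votes_b

print(f"Baseline margin for A: {margin_for_a(turnout, turnout):+,.0f} votes")

# Suppose a platform quietly nudges turnout down 2 points among A's
# supporters and up 2 points among B's supporters.
nudge = 0.02
print(f"Nudged margin for A:   {margin_for_a(turnout - nudge, turnout + nudge):+,.0f} votes")
```

With these made-up numbers, A’s 6,000-vote win becomes a 14,000-vote loss, and no single voter ever saw anything more sinister than a slightly different feed.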
I like her broad point (around 43:00) that people tend to frame new situations using pre-existing experience, and that this may not always be the most useful thing to do for complex ideas that don’t or won’t necessarily play out the same way given potentially massive paradigm shifts.
Also of great interest is the idea of instrumentarianism as opposed to the older idea of totalitarianism (43:49). Totalitarian leaders ruled by fear and intimidation; now big data stores can potentially create the same kinds of dynamics without the fear and intimidation, by more subtly influencing particular groups of people. When combined with “swarming” phenomena or Mark Granovetter’s threshold models of collective behavior, only a very small number of people may need to be influenced digitally to create drastic outcomes. I don’t recall the specific reference, but I remember a paper on the mathematics of neighborhood segregation showing that only about 17% of people needed to be racist and move out of a neighborhood for it to begin sliding toward ethnic homogeneity and drastically less diversity within the community.
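As a toy illustration of that threshold dynamic (my own sketch, not code from the interview or from Granovetter’s work): give each person a personal threshold for joining a behavior, “digitally influence” a small seed group to participate unconditionally, and watch whether a cascade takes off. The threshold distribution below is invented purely for demonstration.

```python
import random

# Toy Granovetter-style threshold model (illustrative sketch only).
# Each person joins a behavior once the fraction of the population
# already participating meets their personal threshold.

def final_participation(seed_fraction, n=10_000, rng=None):
    rng = rng or random.Random(42)
    # Hypothetical thresholds: everyone needs to see at least 3% participation
    # before joining, plus an exponentially distributed extra amount (mean 5%).
    thresholds = [0.03 + rng.expovariate(1 / 0.05) for _ in range(n)]
    # The "digitally influenced" seed group participates unconditionally.
    participating = [i < seed_fraction * n for i in range(n)]

    changed = True
    while changed:
        changed = False
        share = sum(participating) / n
        for i, t in enumerate(thresholds):
            if not participating[i] and share >= t:
                participating[i] = True
                changed = True
    return sum(participating) / n

for seed in (0.00, 0.01, 0.04):
    print(f"seeded {seed:.0%} of people -> {final_participation(seed):.0%} end up participating")
```

With these invented numbers, seeding 1% of the population does essentially nothing, while seeding 4% tips nearly everyone, which is the shape of Granovetter’s argument: the outcome hinges on where the gaps in the threshold distribution sit, not on how many people you influence directly.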
Though only tangentially touched on here and not discussed directly, I can’t help but think that all of this data, combined with some useful complexity theory, might go a long way toward better defining (and perhaps even actually controlling) Adam Smith’s economic “invisible hand.”
There’s just so much to consider here that it’s going to take several revisits to the ideas and some additional research to tease this all apart.