The result is that these new knowledge territories become the subject of political conflict. The first conflict is over the distribution of knowledge: “Who knows?” The second is about authority: “Who decides who knows?” The third is about power: “Who decides who decides who knows?” ❧
This is an important point! And nothing puts a finer point on it than Shoshana Zuboff’s recent book on surveillance capitalism.
Greetings, people of the future!
This piece has gotten a lot of attention over the years. I have heard a lot of people saying that they had been "inspired" by it. I fear that what they meant was that they were inspired by the one pull-quote that people tend to quote from it, and ignored the rest. So if someone has linked you to this page, or if you've googled that pull-quote and ended up here, let me give you some context. I wrote this in 2005, which was more than a year before Facebook was open to the general public.
The world was different then.
When I hear people say that they were "inspired" by this, I fear that the result of such inspiration was most likely to cause them to participate in the construction of the Public-Private Surveillance Partnership. These people told themselves that they were building tools to "bring people together" when in fact what they were doing was constructing and enabling the information-broker business models used by companies like Facebook and Equifax, where people are not the customers but rather are the raw materials whose personal details are the product.
I was talking about decentralization and empowerment of the individual. They went and built the exact opposite.
It's not a great feeling to think that someone may have read your words and then gone on to construct the dystopian hellscape that we're now living in, where Twitter is the prime enabler of actual Nazis and Facebook's greatest accomplishment has been to put a racist rapist in the White House.
If all the people who claimed to have been "inspired" by this piece hadn't been, and had just kept writing middleware for banks or whatever, the world might have been a slightly better place.
I wish I had never published this. - jwz, 24-Nov-2017
The more I see, read, and hear about the vagaries of social media (the constant failings and foibles of Facebook, the trolling dumpster fire that is Twitter, the ills of Instagram), the spread of dark patterns, and the unchecked rise of surveillance capitalism and weapons of math destruction, the more I think that the underlying ideas of the IndieWeb movement, focusing on people and humanity, are its core strength.
Perhaps we need to create a new renaissance of humanism for the 21st century? Maybe we could call it "digital humanism" to sharpen the focus, but the emphasis should be entirely on the people side.
Naturally there’s a lot more that they, and we all, need to do to improve our lives. Let’s band together to create better people-centric, ethical solutions.
The Bundeskartellamt has imposed on Facebook far-reaching restrictions in the processing of user data.
According to Facebook's terms and conditions, users have so far only been able to use the social network under the precondition that Facebook can also collect user data outside of the Facebook website, on the internet or in smartphone apps, and assign these data to the user’s Facebook account. All data collected on the Facebook website, by Facebook-owned services such as WhatsApp and Instagram, and on third party websites can be combined and assigned to the Facebook user account.
The authority’s decision covers different data sources:
(i) Facebook-owned services like WhatsApp and Instagram can continue to collect data. However, assigning the data to Facebook user accounts will only be possible subject to the users’ voluntary consent. Where consent is not given, the data must remain with the respective service and cannot be processed in combination with Facebook data.
(ii) Collecting data from third party websites and assigning them to a Facebook user account will also only be possible if users give their voluntary consent.
Shoshana Zuboff's interdisciplinary breadth and depth enable her to come to grips with the social, political, business, and technological meaning of the changes taking place in our time. We are at a critical juncture in the confrontation between the vast power of giant high-tech companies and government, the hidden economic logic of surveillance capitalism, and the propaganda of machine supremacy that threaten to shape and control human life. Will the brazen new methods of social engineering and behavior modification threaten individual autonomy and democratic rights and introduce extreme new forms of social inequality? Or will the promise of the digital age be one of individual empowerment and democratization?
The Age of Surveillance Capitalism is neither a hand-wringing narrative of danger and decline nor a digital fairy tale. Rather, it offers a deeply reasoned and evocative examination of the contests over the next chapter of capitalism that will decide the meaning of information civilization in the twenty-first century. The stark issue at hand is whether we will be the masters of information and machines or their slaves.
Can’t wait to get this…
At first blush, I’ll note that the cover looks a lot like that of Piketty’s Capital in the Twenty-First Century. Certainly an interesting framing by the publisher.
Shoshana Zuboff is the author of The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. She talks with Leo Laporte about how social media is being used to influence people.
Even for people who are steeped in the ideas of surveillance capitalism, ad tech, and dark patterns, there’s still a lot here to be surprised by. If you’re on social media, this should be required listening/watching.
I can’t wait to get a copy of her book.
Folks in the IndieWeb movement have begun to fix portions of the problem, but Shoshana Zuboff indicates that there are several additional gaps in humane understanding that will need to be bridged to make sure their efforts aren’t in vain. We’ll likely need to do more than just own our own data; we’ll need to go a step or two further as well.
The thing I was shocked not to hear in this interview (and which may not be in the book either) is something that I think has been generally left unmentioned with respect to Facebook, elections, and election tampering (29:18). Zuboff and Laporte discuss Facebook’s experiments in influencing people to vote in several tests for which they published academic papers. Even with the rumors that Mark Zuckerberg was eyeing a potential presidential run in 2020, with his trip across America meeting people of all walks of life, no one floated the idea that as the CEO of Facebook, he might use what was learned in those social experiments to help get himself (or even someone else) elected by sending social signals to certain communities to discourage them from voting while sending other signals to other communities to encourage them to vote. The research indicates that in a very divided political climate, with the right sorts of voting data, it wouldn’t take much work for Facebook to help effectuate a landslide victory for particular candidates or even entire political parties! And because of the distributed nature of such an attack on democracy, Facebook’s black box algorithms, and the subtlety of the experiments, it would be incredibly hard to prove that such a thing was even done.
I like her broad concept (around 43:00) that people tend to frame new situations using pre-existing experience, which may not always be the most useful approach for complex ideas that won’t necessarily play out the same way given potentially massive paradigm shifts.
Also of great interest is the idea of instrumentarianism as opposed to the older idea of totalitarianism. (43:49) Totalitarian leaders ruled by fear and intimidation; now big data stores can potentially create the same dynamics, but without the need for fear and intimidation, by more subtly influencing particular groups of people. When combined with the ideas behind "swarming" phenomena or Mark Granovetter’s ideas of threshold reactions in psychology, only a very small number of people may need to be influenced digitally to create drastic outcomes. I don’t recall the reference specifically, but I remember a paper on the mathematics of ethnic neighborhoods showing that only about 17% of people needed to be racist and move out of a neighborhood for it to begin tipping toward ethnic homogeneity and drastically less diversity within a community.
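Granovetter's threshold idea can be made concrete with a small simulation. The sketch below is my own illustration, not from the interview or the book: each person has a threshold, the number of others who must already be acting before they join in, and the classic example shows how changing a single person's threshold flips the outcome from a full cascade to almost nothing.

```python
def cascade_size(thresholds):
    """Return how many people end up acting.

    Each entry in `thresholds` is one person's threshold: they act once
    the number of people already acting meets or exceeds it. We iterate
    to a fixed point, letting each round of joiners trigger the next.
    """
    acting = 0
    while True:
        # Everyone whose threshold is met by the current count joins in.
        new_acting = sum(1 for t in thresholds if t <= acting)
        if new_acting == acting:
            return acting
        acting = new_acting

# Granovetter's classic example: thresholds 0, 1, 2, ..., 99.
# Person 0 acts unconditionally, which triggers person 1, and so on,
# until all 100 people are acting.
uniform = list(range(100))
print(cascade_size(uniform))  # 100: a full cascade

# Change ONE person's threshold (1 -> 2) and the chain breaks at once:
# no one but the instigator ever acts.
perturbed = [0, 2] + list(range(2, 100))
print(cascade_size(perturbed))  # 1
```

The point of the toy model is exactly the one made above: drastic, population-wide outcomes can hinge on nudging a handful of marginal individuals, which is precisely the kind of lever a platform with fine-grained behavioral data could pull.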
Also tangentially touched on here, but not discussed directly: I can’t help but think that all of this data, combined with some useful complexity theory, might go a long way toward better defining (and perhaps even controlling) Adam Smith’s economic “invisible hand.”
There’s just so much to consider here that it’s going to take several revisits to the ideas and some additional research to tease this all apart.
How hotel chains became the new frontier in the surveillance state.
The gist of the idea here is interesting, but consider the surveillance state it creates and the stupid amount of money it sucks up that could be better spent elsewhere. Where is the humanity in creating our society? Why create such fear in thousands of people for so little in return? There’s so much more to say about this, but I just don’t have the energy.
Opinion: The 2009 vs. 2019 profile picture trend may or may not have been a data collection ruse to train Facebook's facial recognition algorithm. But we can't afford to blithely play along.
T-Mobile, Sprint, and AT&T are selling access to their customers’ location data, and that data is ending up in the hands of bounty hunters and others not authorized to possess it, letting them track most phones in the country.
Governments owning citizens’ data directly?? Why not have the government empower citizens to own their own data?
Surveillance capitalism turns a profit by making people more comfortable with discrimination
Facebook’s use of “ethnic affinity” as a proxy for race is a prime example. The platform’s interface does not offer users a way to self-identify according to race, but advertisers can nonetheless target people based on Facebook’s ascription of an “affinity” along racial lines. In other words, race is deployed as an externally assigned category for purposes of commercial exploitation and social control, not as part of self-generated identity for reasons of personal expression. The ability to define one’s self and tell one’s own stories is central to being human and how one relates to others; platforms’ ascribing identity through data undermines both. ❧
October 15, 2018 at 09:34PM