I imagine that the first part of this project will focus on how it got to be this way, what got missed or ignored in some of the early warnings about what was happening online and how those warnings were swamped by the hype depicting the Internet as a space of radical democratization. ❧
I love the brewing idea here. We definitely need this.
Some broad initial bibliography off the top of my head:
We are joined by Chris Gilliard, Professor of English at Macomb Community College. His scholarship concentrates on privacy, institutional tech policy, digital redlining, and the re-inventions of discriminatory practices through data mining and algorithmic decision-making, especially as these apply to college students. He is currently developing a project that looks at how popular misunderstandings of mathematical concepts create the illusions of fairness and objectivity in student analytics, predictive policing, and hiring practices. Follow him on Twitter at @hypervisible.
An interesting episode on surveillance capitalism and redlining.
I’m a bit surprised to find that I’ve been blocked by Chris Gilliard (@hypervisible) on Twitter. I hope I haven’t done or said anything in particular to offend him. More likely I’ve been put on a block list to which he’s subscribed?? I’m just not sure. I’ll have to follow him from another account, as I’m really interested in his research, particularly as it applies to fixing these problems within the edtech space and in applications using IndieWeb principles. I think this may be the first time I’ve gone to someone’s account and noticed that I’ve been blocked.
The conundrum isn’t just that videos questioning the moon landing or the efficacy of vaccines are on YouTube. The massive “library,” generated by users with little editorial oversight, is bound to have untrue nonsense. Instead, YouTube’s problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread. ❧
Kate Bowles gave a great keynote at the Open Education Resources 2019 (OER19) conference in Galway last night. In it she shows how politicians, economists, and even universities themselves measure their growth at the level of imports and exports, even cynically comparing it to mining when describing the movement of their educational resources and students.
A slide from Kate Bowles’ OER19 keynote:
“What a chilling thing to say about young people crossing the world to learn.” –Kate Bowles (in response to the slide immediately above)
The fact that businesses, governments, and even universities themselves would take such an ugly standpoint on teaching and learning is painful. It reminds me that one of the things the open IndieWeb movement gets right is that it is people-centric first and foremost. If you can take care of people at the most basic level, then hopefully what gets built upon that base (while still watching it carefully) will be much more ethical.
The IndieWeb is a people-focused alternative to the “corporate web”.
As a result of this people-centric vision, I’m seeing far fewer of the ills, unintended consequences, and poor emergent behaviors caused by the drive toward surveillance capitalism within the giant social media silos like Facebook, Twitter, and Instagram.
I’m reminded of part of the thesis that Cesar Hidalgo presents in Why Information Grows: The Evolution of Order from Atoms to Economies: the idea of the personbyte, what it looks like at a group level and then at a corporate level, and I wonder how it might grow to the next level above that. Without ultimately focusing on the person at the bottom of the pyramid, however, we may be ethically losing sight of where we’re going and why. We may even be building an edifice that is far more likely to crumble, with even worse unintended consequences.
Bookmarked The ultimate guide to DuckDuckGo - BrettTerpstra.com (BrettTerpstra.com)
If you don’t already have the scoop, it’s the search engine that can serve as a complete replacement for Google (and Bing and whatever else you like), except it respects your privacy and security. And while Google does some cool tricks, DuckDuckGo does some even better ones.
I switched over to DuckDuckGo for searches a few months ago. There’s a lot of stuff here I didn’t know about, especially “bangs,” which look really useful.
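If I understand them correctly, a bang is just a shortcut prefix that sends a query straight to another site’s own search. So, for example, typing “!w IndieWeb” into the DuckDuckGo search box searches Wikipedia directly, and “!yt OER19” does the same for YouTube; there’s a long list of these on DuckDuckGo’s site.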
An exercise I gave my students helps illustrate the risks to privacy in our everyday, offline lives.
I saw some people on Twitter say that this was a terrible assignment and that they could accomplish the same goal without being so creepy, but naturally they neglected to give any details about how they would improve on it.
I used to be an artist, then I became a poet; then a writer. Now when asked, I simply refer to myself as a word processor. — Kenneth Goldsmith
It’s a striking headline, and the Guardian…
Today, Facebook is encouraging its legions of users to declare civic enthusiasm to their friends, with a prominent "I'm A Voter" button at the top of the newsfeed. Large-scale, experimental research shows that simply clicking the button, and sharing your voting intention, could do more to increase …
Shoshana Zuboff’s new book is a chilling exposé of the business model that underpins the digital world. Observer tech columnist John Naughton explains the importance of Zuboff’s work and asks the author 10 key questions.
If you can’t read Zuboff’s new book in full, this article/interview may convince you that you should anyway. It may be one of the most important things you read all year.
For example, the idea of “data ownership” is often championed as a solution. But what is the point of owning data that should not exist in the first place? All that does is further institutionalise and legitimate data capture. It’s like negotiating how many hours a day a seven-year-old should be allowed to work, rather than contesting the fundamental legitimacy of child labour. Data ownership also fails to reckon with the realities of behavioural surplus. Surveillance capitalists extract predictive value from the exclamation points in your post, not merely the content of what you write, or from how you walk and not merely where you walk. Users might get “ownership” of the data that they give to surveillance capitalists in the first place, but they will not get ownership of the surplus or the predictions gleaned from it – not without new legal concepts built on an understanding of these operations. ❧
It is no longer enough to automate information flows about us; the goal now is to automate us. These processes are meticulously designed to produce ignorance by circumventing individual awareness and thus eliminate any possibility of self-determination. As one data scientist explained to me, “We can engineer the context around a particular behaviour and force change that way… We are learning how to write the music, and then we let the music make them dance.” ❧
We saw the experimental development of this new “means of behavioural modification” in Facebook’s contagion experiments and the Google-incubated augmented reality game Pokémon Go. ❧