Dynamic range in social media and shovels versus excavators

A developer at today’s Homebrew Website Club mentioned that they didn’t want to have a website built on a particular language because they weren’t familiar with the language and felt uncomfortable trusting their data to it. What if something goes wrong? What if it breaks? How easy will it be to export and move their data over?

Compare this with the average social media user who doesn’t know any code. In their world, they’re making a choice, likely predicated upon social pressures, to post their data, content, and identity on one or more corporately controlled silos. Because of its ease of use, the platform is abstracted away from them even further than it is from the developer, making the level of trust they’re placing in it even less apparent. What is the platform doing with their data? How is what they’re seeing in their feed being manipulated and controlled?

The problems both people face are roughly equivalent; they differ only in dynamic range. The non-programmer is at an even greater disadvantage, however, because the silos move faster and can take advantage of and manipulate them more seamlessly, while the programmer at least has the potential to learn the unfamiliar language and dig themselves out. This difference is also one of dynamic range: the developer may only need a simple shovel to dig themselves out, whereas the non-coder will need a massive excavator, which may be unavailable and still requires an operator who knows how to use it.

Featured image: excavator flickr photo by mbecher shared under a Creative Commons (BY-NC-ND) license

Read Ed-Tech Agitprop by Audrey Watters (Hack Education)

agitprop poster

This talk was delivered at OEB 2019 in Berlin. Or part of it was. I only had 20 minutes to speak, and what I wrote here is a bit more than what I could fit in that time-slot.

I've been thinking a lot lately about this storytelling that we speakers do -- it's part of what I call the "ed-tech imaginary." This includes the stories we invent to explain the necessity of technology, the promises of technology; the stories we use to describe how we got here and where we are headed. And despite all the talk about our being "data-driven," about the rigors of "learning sciences" and the like, much of the ed-tech imaginary is quite fanciful. Wizard of Oz pay-no-attention-to-the-man-behind-the-curtain kinds of stuff.

An important message pointing out that many (particularly corporations) in the EdTech space are operating on fear rather than facts. Some simple fact-checking will verify that veritas vos liberabit.

I’ve been working on a thesis lately built around some simple ideas about memory that make me think we should be looking backwards instead of forward. Part of the trouble is that, as a society, we’ve long forgotten basic knowledge that even indigenous peoples had (and in some cases still have). Somehow there’s more benefit and value to some in that information imbalance than in our retaining and using these teaching and knowledge techniques. We definitely need to bring them back.

Agitprop is a portmanteau — a combination of “agitation” and “propaganda,” the shortened name of the Soviet Department for Agitation and Propaganda which was responsible for explaining communist ideology and convincing the people to support the party. This agitprop took a number of forms — posters, press, radio, film, social networks — all in the service of spreading the message of the revolution, in the service of shaping public beliefs, in the service of directing the country towards a particular future.

Might be fun to mix up some agitprop art for various modern things. Perhaps for social media, so as to frame the IndieWeb as the good side?

Although agitprop is often associated with the Soviet control and dissemination of information, there emerged in the 1920s a strong tradition of agitprop art and theatre — not just in the USSR. One of its best known proponents was my favorite playwright, Bertolt Brecht. Once upon a time, before I turned my attention to education technology, I was working on a PhD in Comparative Literature that drew on Brecht’s Verfremdungseffekt, on the Russian Formalists’ concept of ostranenie — “defamiliarization.” Take the familiar and make it unfamiliar. A radical act, or so these artists and activists believed, that would destabilize what has become naturalized, normalized, taken for some deep “truth.” Something to shake us out of our complacency.

Now, none of these stories is indisputably true. At best — at best — they are unverifiable. We do not know what the future holds; we can build predictive models, sure, but that’s not what these are. Rather, these stories get told to steer the future in a certain direction, to steer dollars in a certain direction. (Alan Kay once said “the best way to predict the future is to build it,” but I think, more accurately, “the best way to predict the future is to issue a press release,” “the best way to predict the future is to invent statistics in your keynote.”) These stories might “work” for some people. They can be dropped into a narrative to heighten the urgency that institutions simply must adapt to a changing world — agitation propaganda.
Many of these stories contain numbers, and that makes them appear as though they’re based on research, on data. But these numbers are often cited without any sources. There’s often no indication of where the data might have come from. These are numerical fantasies about the future.
Another word: “robots are coming for your jobs” is one side of the coin; “immigrants are coming for your jobs” is the other. That is, it is the same coin. It’s a coin often used to marshal fear and hatred, to make us feel insecure and threatened. It’s the coin used in a sleight of hand to distract us from the profit-driven practices of capitalism. It’s a coin used to divide us so we cannot solve our pressing global problems for all of us, together.