Read a post by David Shanske (david.shanske.com)
I just pushed the first set of improvements to Parse This to support JSON-LD. Parse This takes an incoming URL and converts it to mf2 or jf2. It is used by Post Kinds and by Yarns Microsub to handle this. So, assuming the default arguments are set, the parser will, for a URL that is not a feed...
Also, installed and looks good!
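
Out of curiosity, here's a rough sketch of what that URL-to-mf2 conversion looks like outside of WordPress, using the Python mf2py library (my own hypothetical example, not Parse This itself, which lives in PHP):

import json
import mf2py

# Parse any URL that exposes microformats2 markup; the result is a dict with
# "items" (h-entry, h-card, etc.) and "rels" (rel=me, rel=webmention, and so on).
parsed = mf2py.parse(url="https://david.shanske.com/")
print(json.dumps(parsed["items"][:1], indent=2))
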
Replied to Idea: a script to find Flickr photos being used online by Matt Maldre (Matt Maldre)
Flickr is a great place to find photos to use. Many photographers assign their photos with a Creative Commons license, so anyone can use the …

Clicking through to the photo, there is no mention of this image appearing on this important announcement. Perhaps the author privately contacted the photographer about using his image. Since Ken Doctor is so incredible with his media experience (I'm being serious), I'm fairly certain someone from his team would have contacted the photographer to give him a heads up.

I’m sure I’ve said it before, but I maintain that if the source of the article and the target both supported the Webmention spec, then when a piece used an image (or really any other type of media, including text) with a link, the original source (any website, or Flickr in this case) would get a notification and could show—if they chose—the use of that media so that others in the future could see how popular (or not) these types of media are.
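
To make the mechanics concrete, here's a minimal sketch of sending that kind of notification per the W3C Webmention spec (the URLs are hypothetical, and a real client would also check the HTML for a rel="webmention" link, not just the HTTP Link header):

import requests

SOURCE = "https://example.com/article-using-the-photo"  # hypothetical page citing the media
TARGET = "https://www.flickr.com/photos/someone/12345"  # hypothetical page being cited

# Discover the target's Webmention endpoint from its Link header.
resp = requests.get(TARGET)
endpoint = resp.links.get("webmention", {}).get("url")

if endpoint:
    # Notify the target that SOURCE links to it; the receiver then verifies
    # the link and can choose to display the mention.
    requests.post(endpoint, data={"source": SOURCE, "target": TARGET})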

Has anyone in the IndieWeb community got examples of this type of attribution showing on media on their own websites? Perhaps Jeremy Keith or Kevin Marks, who are photographers and longtime Flickr users?

Incidentally, I’ve also mentioned using this notification method in the past as a means of decentralizing the journal publishing industry as part of a peer-review, citation, and preprint server setup. It could also be used as part of a citation workflow in the sense of Maria Popova and Tina Roth Eisenberg‘s Curator’s Code[1] setup, which could also benefit greatly now with Webmention support.
Annotated on March 09, 2020 at 12:18PM

Read The 100 Worst Ed-Tech Debacles of the Decade by Audrey Watters (Hack Education)

For the past ten years, I have written a lengthy year-end series, documenting some of the dominant narratives and trends in education technology. I think it is worthwhile, as the decade draws to a close, to review those stories and to see how much (or how little) things have changed. You can read the series here: 2010, 2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019.

I thought for a good long while about how best to summarize this decade, and inspired by the folks at The Verge, who published a list of “The 84 biggest flops, fails, and dead dreams of the decade in tech,” I decided to do something similar: chronicle for you a decade of ed-tech failures and fuck-ups and flawed ideas.

I started reading this over the holidays when Audrey released it. It took me four sittings to make it all the way through (in great part because it’s so depressing). I’ve finally picked it back up today to wallow through the last twenty on the list. 

I’m hoping that at least a few people pick up the thread she’s always trying to show us and figure out a better way forward. The information theorist in me says that every student has only so much bandwidth and there’s an analogy to the Shannon Limit for how much information one can cram into a person. I’ve been a fan of Cesar Hidalgo’s idea of a personbyte (a word for the limit of information one can put into a person) and the fact that people need to collaborate to produce things bigger and greater than themselves. What is it going to take to get everyone else to understand?

If anything, the only way I suspect we’ll be able to better teach and have students retain information is to use some of the most ancient memory techniques from indigenous cultures rather than technologizing our way out of the perceived problem.

Annotations/Highlights

(only a small portion since Hack Education doesn’t fit into my usual workflow)

In his review of Nick Srnicek’s book Platform Capitalism, John Hermann writes,

Platforms are, in a sense, capitalism distilled to its essence. They are proudly experimental and maximally consequential, prone to creating externalities and especially disinclined to address or even acknowledge what happens beyond their rising walls. And accordingly, platforms are the underlying trend that ties together popular narratives about technology and the economy in general. Platforms provide the substructure for the “gig economy” and the “sharing economy”; they’re the economic engine of social media; they’re the architecture of the “attention economy” and the inspiration for claims about the “end of ownership.”

Annotated on March 08, 2020 at 02:35PM

It isn’t just the use of student data to fuel Google’s business that’s a problem; it’s the use of teachers as marketers and testers. “It’s a private company very creatively using public resources — in this instance, teachers’ time and expertise — to build new markets at low cost,” UCLA professor Patricia Burch told The New York Times in 2017 as part of its lengthy investigation into “How Google Took Over the Classroom.”

Annotated on March 08, 2020 at 03:04PM

Read Intentional Internet by Desirée García (Miscelanea)

So this year I took a systems design approach to the problem. Why wasn’t I doing the things I wanted to do? At first glance, it looked like a simple matter of spending too much time online.

I was, but a lot of my goals, like re-building my website or blogging again, were dependent on the Internet. So instead, I started observing the situations I would find myself in when I self-sabotaged at home and at work, and they all had too much content. Quality content by most people’s standards, but trivial, nonetheless. They were all click-holes. From there I looked at the interaction models and behaviors encouraged by those sites or platforms, and decided to experiment with removing all triggers, digital or otherwise. The idea was that by “controlling” for these variables, perhaps I would see some sort of change at the end that would give me the space to do the things I really wanted to do and get shit done.

And so The Year of Intentional Internet began.

After reading just a few posts by Desirée García, I’d like to nominate her to give a keynote at the upcoming IndieWeb Summit in June. I totally want to hear her give a talk on The Year of Intentional Internet.

Read Letting Go of the Old Web by Desirée García (Miscelanea)
Last year I wrote a thing on Automattic’s design blog about something I keep noodling on, which ultimately boils down to what makes us creative. What gets us to build a website? What the hell is a website today?
I don’t mean “Us” the web community, or tech industry. I mean “Us” as Humankind.

Silly me, I didn’t manage to keep a reference for where I found this article in the first place.

But it is important and has some interesting philosophical questions for the IndieWeb and, for lack of a better framing, future generations of the IndieWeb.

While I have the sort of love and excitement for the web that she talks about, I wonder if others will have it too?

The other side of me says that one of the great benefits of what the IndieWeb is doing is breaking all of the larger and more complicated pieces of a website down into smaller and simpler component parts. This allows a broader range of people to see and understand them and then potentially remix them into tools that will not only work for them on a day-to-day basis, but also let them create new and exciting things. I feel like we’re getting closer to this sort of utopia, but even as I see the pieces getting simpler, I also see large projects like WordPress becoming even more difficult and complex to navigate. There is a broadening divide between the general public and the professional web developer, and not as many people like me who know just enough of both to be dangerous, creative, and yet still productive.

I hope we can continue to break things down to make them easier for everyone to not only use, but to create new and inspiring things.

Read Opinion | I Helped Fact-Check the 1619 Project. The Times Ignored Me. by Leslie M. Harris (POLITICO)
The paper's series on slavery made avoidable mistakes. But the attacks from its critics are much more dangerous.

Beginning in the last quarter of the 20th century, historians like Gary Nash, Ira Berlin and Alfred Young built on the earlier work of Carter G. Woodson, Benjamin Quarles, John Hope Franklin and others, writing histories of the Colonial and Revolutionary eras that included African Americans, slavery and race. A standout from this time is Edmund Morgan’s American Slavery, American Freedom, which addresses explicitly how the intertwined histories of Native American, African American and English residents of Virginia are foundational to understanding the ideas of freedom we still struggle with today. 

These could be interesting to read.
Annotated on March 07, 2020 at 09:12PM


Scholars like Annette Gordon-Reed and Woody Holton have given us a deeper understanding of the ways in which leaders like Thomas Jefferson committed to new ideas of freedom even as they continued to be deeply committed to slavery. 

I’ve not seen any research that relates the Renaissance idea of the Great Chain of Being to this new era of supposed freedom. In some sense I see the richest elite whites trying to maintain their own place in a larger hierarchy rather than acting on stronger beliefs in equality and hard work.
Annotated on March 07, 2020 at 09:22PM

Read James Clyburn wasn't responsible for the Biden surge. So what was? by Kevin Drum (Mother Jones)
The green dot is February 29, the date of the South Carolina primary. Biden had gained a point or two before that, but he only really started to skyrocket shortly after the primary. So it’s safe to say that his big victory in South Carolina was the proximate cause of his early March takeoff. But what was responsible for Biden’s South Carolina win in the first place?

Biden went up a lot more, but he was taking votes away from Steyer, Warren, Buttigieg, and Klobuchar.
For now, the debate still seems the most likely cause. That’s a little unusual, since conventional wisdom says that debates don’t move public opinion much, but maybe this was an exception. 

Perhaps it was more the fact that the electorate just didn’t know or trust the Steyer, Buttigieg, and Klobuchar centrist crowd enough to vote for them with time running out. As a result, the old tried-and-true guy you know pulled in all the people. He didn’t really need an inciting incident other than the looming election. Following the debate, without anything else on which to base a decision, the choice was a *fait accompli*.
Annotated on March 07, 2020 at 08:57PM

Read A Harvard sociologist explains why we confide in strangers by Jenny Anderson (Quartz)
Who did you last trust with some really personal information? 

Small says there are three reasons we might avoid those closest to us when we are grappling with problems about our health, relationships, work, or kids. 

Annotated on March 07, 2020 at 08:39PM

The first is that our closest relationships are our most complex ones. 

Annotated on March 07, 2020 at 08:39PM

The second reason is that when we are dealing with something difficult, we commonly prefer to confide in people who have been through what we are going through rather than those who know us, seeking “cognitive empathy” over guaranteed warmth or closeness. 

Annotated on March 07, 2020 at 08:40PM

The third reason is that in our moment of vulnerability, our need to talk is greater than our need to self-protect. 

Annotated on March 07, 2020 at 08:41PM

Adam Smith, writing in 1790, said we can only expect real sympathy from real friends, not from mere acquaintances. More recently, in 1973, Stanford sociologist Mark Granovetter established as a bedrock of social network analysis the idea that we rely on “strong” ties (our inner circle) for support and weak ties (our acquaintances) for information. 

Annotated on March 07, 2020 at 08:43PM

Read Organic farming is on the rise in the U.S. by Kristen Bialik (Pew Research Center)
There were more than 14,000 certified organic farms in the United States in 2016, a 56% increase from 2011.

Still, organic farming makes up a small share of U.S. farmland overall. There were 5 million certified organic acres of farmland in 2016, representing less than 1% of the 911 million acres of total farmland nationwide. Some states, however, had relatively large shares of organic farmland. Vermont’s 134,000 certified organic acres accounted for 11% of its total 1.25 million farm acres. California, Maine and New York followed in largest shares of organic acreage – in each, certified organic acres made up 4% of total farmland. 

Annotated on March 07, 2020 at 12:09PM


Certified organic food, according to the Agriculture Department’s definition, must be produced without the use of conventional pesticides, petroleum- or sewage-based fertilizers, herbicides, genetic engineering, antibiotics, growth hormones or irradiation. Certified organic farms must also adhere to certain animal health and welfare standards, not treat land with any prohibited substances for at least three years prior to harvest, and reach a certain threshold for gross annual organic sales. U.S. organic farms that are not certified organic are not included in this analysis. 

Annotated on March 07, 2020 at 12:15PM

Read The fight to preserve a 44,000-year-old painting by Krithika Varagur (1843)
One of the world’s oldest artworks has been discovered inside a working Indonesian mine. It survived this long – Krithika Varagur ventures to Sulawesi to find out if it has a future

This painting was discovered in the Bulu Sipong cave on Sulawesi in 2016 and recent analysis has shown that it is the “oldest pictorial record of storytelling” and the “earliest figurative artwork in the world”, and is at least 43,900 years old. (The oldest known drawing in the world, a 73,000-year-old abstract scribble, was found in South Africa in 2018.)

Annotated on March 06, 2020 at 10:25PM

Read My Repo, My House, My Rules by Eran Hammer (hueniverse.com)
GitHub provides an invaluable hosting service. Like all hosting platforms, any interaction between the content owner — the maintainer — and their community — the users — is owned exclusively by the owner. If you visit my repositories on GitHub, you are visiting my property, hosted generously by GitHub. It is not public space.
I wonder if the IndieWeb community’s reframing of hosting things on one’s own site will prevent this sort of rudeness in the future, or if the social construct will fall down under the influence of spammers and trolls.