Recap of Los Angeles Area Homebrew Website Club February 21, 2018

After a relatively quiet writing hour, during which I worked a bit on acquisition posts, people began arriving just before the 6:30 pm official start time.

I kicked off the meeting with a quick overview of the IndieWeb’s concepts and principles for newcomers. As a mini case study, I talked a bit about some of my work and conversations earlier in the day about adding acquisition posts to my website and the way I’m approaching the problem.

Asher Silberman was glad to be back at a meeting. He has recently been focusing more on content than functionality.

Micah Cambre showed off a gorgeous development version of the new theme he’s building for his site: a super clean, pared-down theme built on the Sage framework for WordPress. He’s hoping to finish it shortly so he can relaunch his personal site at http://asuh.com. He spent some time talking about the process of using David Shanske’s IndieWebified version of the Twenty Sixteen theme as a template for adding microformats and functionality to the Sage setup.

Richard Hopp, a gen2/gen3 user who is completely new to the community and interested in learning, has a personal domain at http://www.ricahardhopp.com/ on which he’s installed WordPress. He’s currently considering whether he’d like to begin blogging soon and what other functionality he’d like to have on his site. He’s relatively new to Facebook, having only joined about six months ago. On the professional side, he does some government-related work and has some large collections of documents he’s researching; he’s considering how best to put them on the web for ease of search and use.

I wrapped up the demo portion with a quick showing of how I leveraged the power of the Post Kinds Plugin to facetiously add chicken posts to my site as a prelude to doing a tad more work to begin adding explicit follow posts.

We took a short break to take a photo of the group.

At the end of the evening we talked over a handful of broad ideas including user interface, webactions, and Twitter interactions.

We wrapped things up with a demo of how I use the URL Forwarder app on Android to post to my website via mobile. We then used some of this documentation to help Asher fix his previously broken browser bookmarklets so they’d hopefully work better with the Post Kinds Plugin. I also spent a few minutes creating a similar bookmarklet to more easily add follow posts to my website, since I hadn’t done so after adding the post type last week.
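For those wanting to try something similar, a bookmarklet of this sort is mostly just URL construction. Note that the `/wp-admin/post-new.php` path and the `kind`/`kindurl` query parameters below are my assumptions about how the Post Kinds Plugin prefills a new post; check the plugin’s documentation against your own install.

```javascript
// Sketch of a "follow" bookmarklet for a WordPress site running the
// Post Kinds Plugin. The admin path and query parameters are assumptions;
// adjust them to match your own setup.
function followBookmarkletUrl(site, targetUrl) {
  return site + '/wp-admin/post-new.php' +
         '?kind=follow' +
         '&kindurl=' + encodeURIComponent(targetUrl);
}

// Saved as a bookmarklet, the same idea becomes a one-liner:
// javascript:window.open('https://example.com/wp-admin/post-new.php?kind=follow&kindurl='+encodeURIComponent(location.href));
```

Clicking such a bookmarklet while viewing someone’s site then opens a new post editor with the kind and target URL already filled in.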

Syndicated copies to:

Henninger Trailhead / Eaton Canyon Wild Fire Coverage in Altadena, CA

A small brush fire broke out today just a few blocks from the house.

A helicopter buzzing a palm tree

At roughly 4:09 pm I noticed an incredibly low-flying Los Angeles County Sheriff’s Office water-drop helicopter buzz our neighborhood, narrowly missing the neighbors’ 50 ft palm tree. There had been helicopter noise for about 10 minutes prior, so this got my immediate attention. I went outside to see copious amounts of white smoke coming from the neighborhood just about 2 blocks north of the house.

I put on my shoes to see where the fire was originating and walked up the street.

View North from Harding Avenue and Berendo as I walked toward the fire

A satellite map of the exact location of the blaze.

Arrival at the Scene

I walked up to the rough scene (or as close as I could get given the situation and the smoke) at about 4:22. Traffic is being stopped on Altadena Drive from roughly Canyon Close Dr. up past Roosevelt Ave.

Firefighters are connecting hoses at Canyon Close Drive and running them up the street. I suspect this is to potentially defend the homes on the top side of the street because the fire and certainly the smoke are close enough to warrant it.

Water drops in progress

From the time I walked up to the scene until about ten minutes later, there were about 4 or 5 water drops by LA County Sheriff’s helicopters.

Scene on the street

Ground Troops Arrive

Water drops seem to have stopped for the moment, and groups of firefighters are arriving to descend into Eaton Canyon to finish off the blaze. By rough count there were about 50-60 firefighters down in the canyon and roughly another 30 or so firefighters and other first responders up on the street providing support.

Clean up time

It’s now 4:43 pm, and water drops have stopped for the past 10 minutes or so, roughly 45 minutes after the firefight started. Here are some pictures from the vantage point just above the location of the fire, a few feet from the canyon edge. Based on my guesstimate, the fire covered approximately 2-4 acres, primarily dry scrub brush and several trees in the middle of the arroyo.

Reporting live from the scene

With things beginning to look like they’re winding down, and with a clearer view of the scene now that the smoke has abated significantly I take a moment to do a quick video recap for the viewers at home.

Heading home

Things appeared to be under control by about 5 pm, so I headed home, stopping along the way to ask the local police whether the street was likely to remain closed through rush hour.

Once home, I tossed the coordinates of the fire into Google Maps and discovered that the center of the fire was 2,426 feet from the house (roughly 2 blocks away). It was easy to get exact coordinates given the size of the trees in the fire zone and the specificity of the images in Google’s satellite view. We definitely dodged one today, particularly given the dryness of the last year and the high winds we’ve seen all afternoon.

It also dawns on me that I took a hike through this exact portion of Eaton Canyon yesterday morning. My checkin at the time captures a photo across the canyon almost 30 hours before the incident. I’ll try to get another sometime this week to provide a direct comparison.


Facebook is Censoring My Notes

I don’t post “notes” to Facebook often, but I’d noticed a few weeks ago that several pieces I’d published like this a while back were apparently unpublished by the platform. I hadn’t seen or heard anything from Facebook about them being unpublished or having issues, so I didn’t realize the problem until I randomly stumbled back across my notes page.

There was a piece of UI indicating that I could contest the decision and republish them, so I clicked on it. Apparently this puts the notes into some sort of limbo “review” process, but it’s been a few weeks now with no response about either of them. They’re still both sitting unseen in my dashboard with sad notes above them saying:

We’re reviewing this post against our Community Standards.

There is no real indication if they’ll ever come back online. Currently my only option is to delete them. There’s also no indication, clear or otherwise, of which community standard they may have violated.

I can’t imagine how either of the posts may have run afoul of their community standards, or why “notes” in particular seem to be more prone to this sort of censorship in comparison with typical status updates. I’m curious if others have had this same experience?

We’re reviewing these posts against our Community Standards.

This is just another excellent example of why one shouldn’t trust third parties over which you have no control to publish your content on the web. Fortunately I’ve got my own website with the original versions of these posts [1][2] that are freely readable. If you’ve experienced this or other pernicious problems in social media, I recommend you take a look at the helpful IndieWeb community which has some excellent ideas and lots of help for re-exerting control over your online presence.

Notes Functionality

Notes on Facebook were an early, circa-2009 attempt by Facebook to host more blog-like content. They included a rather clean posting interface, reminiscent of Medium’s, that also allowed one to include images and even hyperlinks in pages.

The note post type has long since fallen by the wayside, and I rarely, if ever, come across people using it in the wild anymore, despite the fact that it’s a richer experience than traditional status updates. I suspect the Facebook black-box algorithm doesn’t encourage its use. I might posit that it’s discouraged because, unlike most Facebook functionality, hyperlinks in notes on desktop browsers physically take one out of the Facebook experience and into new windows!

The majority of notes about me are spammy chain mail posts like “25 Random Things About Me”, which also helpfully included written instructions for how to actually use notes.

25 Random Things About Me

Rules: Once you’ve been tagged, you are supposed to write a note with 25 random things, facts, habits, or goals about you. At the end, choose 25 people to be tagged. You have to tag the person who tagged you. If I tagged you, it’s because I want to know more about you.

(To do this, go to “notes” under tabs on your profile page, paste these instructions in the body of the note, type your 25 random things, tag 25 people (in the right hand corner of the app) then click publish.)

Most of my published notes were experiments in syndicating my content from my own blog to Facebook (via POSSE). At the time, the engagement didn’t seem much different than posting raw text as status updates, so I abandoned it. Perhaps I’ll try again with this post to see what happens? I did rather like the ability to actually have links to content and other resources in my posts there.


Brief Review of The Atlantic Interview Podcast

An awesome policy-focused, interview-based podcast from one of the premier news outlets of our day.

I’ve now listened to a dozen of the opening episodes of The Atlantic Interview and am enamored. It’s officially ensconced at the top of my regular rotation.

The weekly show, hosted by Jeffrey Goldberg, The Atlantic’s editor in chief, features him doing a relatively in-depth interview of a single guest for about thirty minutes.

I almost look at this podcast as a far better version of some of the “Sunday shows”: here the guest isn’t so heavily guarded for fear of saying something impolitic, nor lost in a sea of voices on a larger panel where they just can’t develop longer, coherent thoughts or theses.

To some extent, this podcast is starting to fill a hole in my daily schedule that was created by the disappearance of The Charlie Rose show late last year. The sad part is that, at only once a week, I’m going to wish I had a lot more when I’m done binge-listening to the short backlog I’ve got. On The Atlantic Interview I appreciate that the “thing guests may be selling” (book, article, show, film, etc.) takes a pointed back seat to the broader topic(s) at hand.

Much of the strength of what I’ve heard thus far stems from interviews with people who are slightly off the beaten path, but with serious messages and interesting viewpoints. They’ve all been journalistically solid and almost always provide me with news, viewpoints, and subtle information I didn’t have before. Another strength is that the show can give guests more time and depth than they might receive on other traditional shows. The guests so far have been very smart, cogent, and interesting. Their selection has been well balanced for gender, topic, and general variety within the space the show occupies. The show has generally impeccable audio and production values.

While initial guests seem to have an air of familiarity with the host as the result of closer (disclosed) interpersonal connections, I suspect that even when the list of immediate friends in his Rolodex runs dry, the show will easily have enough value and gravitas to successfully run on long beyond this.

One of my favorite parts of these podcasts is the somewhat snarky bumpers that Goldberg puts onto the end encouraging people to give reviews and subscribe. I kind of wish he’d let loose a bit more and inject some of this kind of snark into the interviews too. If nothing else, he’s at least having fun with a part of the show that would otherwise be typically painful to trudge through.

Suggestions

I’d love to hear more about education policy, health care, public heath, internet, and foreign policy. A few guest ideas I’d love to hear in this format: Tressie McMillan Cottom, Mike Morrell, Susan J. Fowler, César A. Hidalgo, Tantek Çelik, Ellen J. MacKenzie, and Ezekiel Emanuel. Continuing in the vein of interviewing the interviewers, which I find terrifically fascinating, I’d love to see Judy Woodruff, Fareed Zakaria, W. Kamau Bell, Trevor Noah, and John Dickerson in the future. These aside, I suspect that anyone that Mssr. Goldberg finds intriguing, I’m sure I will as well.

Additional Technical Commentary

I really wish their podcast had individual web pages for each episode so I could more easily email, share, or target individual episodes for people. It would also be nice if the main page actually had .mp3 versions of the audio embedded in it to make episodes easier to bookmark and share through services like Huffduffer.com. I really don’t know why podcasters insist on using third-party podcasting services to hide their .mp3 files from the outside world. It’s literally their most important product! Stop it! I find the practice as irksome as newspapers using Facebook as their primary means of distribution, and just as in that case, they’ll regret it in the long run.

While Megaphone.fm is a nice hosting platform for the show, I’m not sure why a publication the size and scope of The Atlantic isn’t simply self-hosting its own content using its own URLs.

The content for the show is still a bit scatter-brained. The main page on The Atlantic has the best and most comprehensive meta-descriptions of episodes, while the Megaphone page has some nice individual episode artwork that The Atlantic doesn’t have or present. This is sure to cause uneven experiences for people depending on how they choose to subscribe.

I appreciate that some of the early episodes went to the trouble to have full transcripts and some additional snippet content and images. I miss these transcripts. I do know that doing this can be painful and expensive, though perhaps services like Gretta.com might have some technology to help. If they want to go crazy, it would be cool to see Audiogram functionality, which they could use instead of relying on Megaphone or some other platform.


Homebrew Website Club Meetup on February 21, 2018

Are you building your own website? Indie reader? Personal publishing web app? Or some other digital magic-cloud proxy? If so, come on by and join a gathering of people with likeminded interests. Bring your friends who want to start a personal web site. Exchange information, swap ideas, talk shop, help work on a project…

Everyone of every level is welcome to participate! Don’t have a domain yet? Come along and someone can help you get started and provide resources for creating the site you’ve always wanted.

Homebrew Website Club Meetup – Los Angeles Area

Time:  to

Location: Pasadena Central Library, 285 East Walnut Street (at Garfield), 4th floor Conference Room, Pasadena, CA

  • Parking at the library is at a premium, so please park on the street or use one of these nearby lots: Parking details
  • 5:30 – 6:30 pm (Optional) Quiet writing hour at your favorite location within the library for those interested. Use this time to work on your project or do some writing before the meeting.
  • 6:30 – 8:00 pm Meetup in 4th floor Conference Room

More Details

Join a community of like-minded people building and improving their personal websites. Invite friends that want a personal site.

  • Work with others to help motivate yourself to create the site you’ve always wanted to have.
  • Ask questions about things you may be stuck on–don’t let stumbling blocks get in the way of having the site you’d like to have.
  • Finish that website feature or blog post you’ve been working on
  • Burn down that old website and build something from scratch
  • Share what you’ve gotten working
  • Demos of recent breakthroughs

Skill levels: Beginner, Intermediate, Advanced

Any questions? Need help? Need more information? Ask in chat: http://indiewebcamp.com/irc/today#bottom

RSVP

Our space within the library is somewhat limited, so please RSVP prior to attending so we can ensure that we accommodate as many people as possible.

Add your optional RSVP in the comments below; by adding your indie RSVP via webmention to this post; or by RSVPing yes to one of the syndicated posts below:
Indieweb.org event: https://indieweb.org/events/2018-02-21-homebrew-website-club#Los_Angeles_Area
Facebook: https://www.facebook.com/events/1841008432590283/
Meetup.com: https://www.meetup.com/IndieWeb-Homebrew-Website-Club-Los-Angeles/events/247817484/
Twitter: https://twitter.com/ChrisAldrich/status/963911115271938048


A better way to subscribe to or follow sites on the open web

Just as I was getting sick last week, Colin Walker wrote “There has to be a better way to subscribe to sites.” He’s definitely hit the nail on the head. The process is currently painful and disorganized; it also relies on technology that’s almost two decades old and is difficult for newcomers at best.

I’ve always posited that one of the reasons the social media silos have been so successful is that they’ve built some fantastic readers. Sure, their UI is cleaner and dead simple, but to a great extent 95% of their product is an evolved feed reader, while the other 5% is a simple posting interface that makes it easy to interact. By comparison, most CMSes are almost completely about the posting interface and spend very little time, if any, worrying about providing a reading experience.

The IndieWeb has been making some serious strides on making cross-site interactions easier with the Webmention and Micropub protocols, but the holy grail is still out there: allowing people to have an integrated feed reader built into their website (or alternately a standalone feed reader that’s tightly integrated with their site via Micropub or other means).

For those watching the space with as much interest as I have, there are a couple of interesting tools in the space and a few on the immediate horizon that are sure to make the process a whole lot easier and create a new renaissance in the open web.

SubToMe: a Universal Subscribe Button

First, for a relatively simple one-size-fits-all subscribe button, I recommend people take a look at SubToMe, which touts itself as a “Universal Follow button” because it “makes it easy for people to follow web sites, because browsers don’t do it.” The button is fairly straightforward and has an awful lot of flexibility built in. In the simplest sense, it has some solid feed detection, so it finds available feeds on a web page and then presents a handful of recommended major readers to the user. With two clicks, one can pretty quickly and almost immediately subscribe to almost any feed in their reader of choice.
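The feed-detection step can be sketched in a few lines: scan a page’s markup for `<link rel="alternate">` elements that advertise a feed. This is a simplified, regex-based illustration rather than SubToMe’s actual code; a real implementation would use a proper HTML parser, handle more feed types, and resolve relative URLs against the page.

```javascript
// Simplified sketch of feed autodiscovery: collect the href of every
// <link rel="alternate"> tag whose type advertises an RSS or Atom feed.
// A real tool would parse the HTML properly instead of using regexes.
function discoverFeeds(html) {
  const feeds = [];
  const linkTags = html.match(/<link\b[^>]*>/gi) || [];
  for (const tag of linkTags) {
    if (!/rel=["']alternate["']/i.test(tag)) continue;
    if (!/type=["']application\/(rss|atom)\+xml["']/i.test(tag)) continue;
    const href = tag.match(/href=["']([^"']+)["']/i);
    if (href) feeds.push(href[1]);
  }
  return feeds;
}
```

Once the feeds are found, all that remains is handing the chosen one to the user’s preferred reader.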

For publishers, one can quickly install a simple button on their site. They can further provide a list of specific feeds they want to advertise, and they can even recommend a particular feed reader if they choose.

For consumers, the service provides a simple browser bookmarklet so that if a site doesn’t have a button, they can click a subscribe button in their browser. Then click on a provider. Done. One can also choose a preferred provider to shorten the process.

Almost all the major feed readers are supported out of the box and the process of adding new ones is relatively simple.

Microsub

Since last June there’s been a quietly growing new web spec called Microsub that will assuredly shake up the subscription and reader spaces. In short, it provides a standardized way for clients to consume and interact with feeds collected by a server.

While it gets pretty deep pretty quickly, the spec is meant to help decouple some of the heavy architecture of building a feed reader. In some way it’s analogous to the separation of content and display that HTML and CSS allows, but applied to the mechanics of feed readers and how readers display their content.

There are already a few interesting projects by the names of Together and Indigenous that are taking advantage of the architecture.
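To give a flavor of the spec, a Microsub client fetches a channel’s entries by sending a GET request to the user’s Microsub endpoint with `action` and `channel` query parameters (plus an OAuth bearer token). The endpoint URL and channel name below are made up for illustration; consult the draft spec for the full set of actions.

```javascript
// Sketch of building a Microsub timeline request. The endpoint and
// channel uid are hypothetical; a real client discovers the endpoint
// from the user's homepage and authenticates via IndieAuth.
function timelineRequestUrl(endpoint, channel, after) {
  const params = new URLSearchParams({ action: 'timeline', channel });
  if (after) params.set('after', after); // optional paging cursor
  return endpoint + '?' + params.toString();
}

// e.g. fetch(timelineRequestUrl('https://aperture.example/microsub', 'home'),
//            { headers: { Authorization: 'Bearer …' } })
```

The server does all the feed fetching and polling; the client only ever speaks this one small query language, which is exactly the decoupling described above.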

I can’t wait to see how it all dovetails together to make a more integrated reading and posting interface, as well as the potential it has for individual CMSes to leverage the idea and include integrated interfaces in their products. I look forward to the day when my own personal website is compatible with Microsub, so that I can use any Microsub client to read my timeline and follow people.

I’m also sure that decoupling the display of posts from fetching remote feeds will make it easier to build reader clients in general. I hope this has a Cambrian explosion type of effect on the state of the art of feed readers.

I’d recommend those interested in a high-level discussion have a listen to the following three short episodes of Aaron Parecki’s Percolator microcast.

Episode 3: Following

Episode 10: Microsub for Readers

Episode 17: It’s 2018!

Featured photo credit: Flock of sheep flickr photo by Jo@net shared under a Creative Commons (BY) license


Happy Fornicalia

Celebrating the Ancient Roman religious festival in honor of the goddess Fornax

As we coast toward the nones of February, whence we’ll commence the celebration of the Fornacalia (by all accounts an Ancient Roman religious festival held in honor of the goddess Fornax, a divine personification of the oven, and related to the proper baking of bread), I thought it only appropriate to call some attention to what should be an international holiday for bakers.

While shamefully few, if any(?), now celebrate the Fornacalia, I’ve always looked at the word as a portmanteau of a festival along the lines of a bacchanalia for bread with tinges of seeming Latin cognates fornicati, fornicatus, fornicata, and fornicatae or the Greek equivalent porneia (πορνεία). Knead these all together and you’ve got the makings of a modern day besotted festival of bread immorality. And really, who wouldn’t want to celebrate such a thing?!

I’ll celebrate myself by doing some baking, listening to the bread related episodes of Eat This Podcast, while reading and looking at bread porn on Fornacalia.com. Special thanks to curio maximus Jeremy Cherfas for providing entertainment for the festival!

How will you celebrate?


Featured photo Bread is a flickr photo by Jeremy Keith aka adactio shared under a Creative Commons (BY) license.


Fragmentions for Better Highlighting and Direct References on the Web

Fragmentions

Ages ago I added support on my website for fragmentions.

Wait… What is that?

Fragmention is a portmanteau of fragment and mention (or even Webmention). In more technical terms, it’s a simple way of creating a URL that not only points to a particular page on the internet, but also targets a specific sub-section of that page, whether that’s a photo, a paragraph, a few words, or even specific HTML elements like <div> or <span>. In short, it’s like a permalink to content within a web page instead of just the page itself.

A Fragmention Example

Picture of a hipster-esque looking Lego toy superimposed with the words: I'm not looking for a "hipster-web", but a new and demonstrably better web.
29/1.2014 – Larry the Barista by julochka is licensed under CC BY-NC
Feature image for the post “Co-claiming and Gathering Together – Developing Read Write Collect” by Aaron Davis. Photo also available on Flickr.

Back in December Aaron Davis had made a quote card for one of his posts that included a quote from one of my posts. While I don’t think he pinged (or webmentioned) it within his own post, I ran across it in his Twitter feed and he cross-posted it to his Flickr account where he credited where the underlying photo and quote came from along with their relevant URLs.

Fragmentions could have not only let him link to the source page of the quote, it would have let him directly target the section or the paragraph where the quote originated or–even more directly–the actual line of the quote.

Here’s the fragmention URL that would have allowed him to do that: http://boffosocko.com/2017/10/27/reply-to-laying-the-standards-for-a-blogging-renaissance-by-aaron-davis/#I%E2%80%99m%20not%20looking

Go ahead and click on it (or the photo) to see the fragmention in action.

What’s happening?

Let’s compare the two URLs:
1. http://boffosocko.com/2017/10/27/reply-to-laying-the-standards-for-a-blogging-renaissance-by-aaron-davis/
2. http://boffosocko.com/2017/10/27/reply-to-laying-the-standards-for-a-blogging-renaissance-by-aaron-davis/#I%E2%80%99m%20not%20looking

They both obviously point to the same specific page, and their beginnings are identical. The second one has a # followed by the words “I’m not looking” with some code for blank spaces and an apostrophe. Clicking on the fragmention URL will take you to the root page which then triggers a snippet of JavaScript on my site that causes the closest container with the text following the hash to be highlighted in a bright yellow color. The browser also automatically scrolls down to the location of the highlight.

Note: rather than the numbers and percent symbols, one can often use a “+” to stand in for white space, like so: http://boffosocko.com/2017/10/27/reply-to-laying-the-standards-for-a-blogging-renaissance-by-aaron-davis/#not+looking+for+just. This makes the URL a bit more human readable. You’ll also notice I took out the encoding for the apostrophe by omitting the word “I’m” and adding another word or two, but I still get the same highlight result.
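Both halves of that round trip are small enough to sketch, assuming standard percent-encoding: building a fragmention URL from a snippet of text, and recovering the search text from an incoming URL’s hash, which is the job my site’s JavaScript does before hunting for the matching element to highlight.

```javascript
// Build a fragmention URL from a page URL and a snippet of its text.
function toFragmention(pageUrl, text) {
  return pageUrl + '#' + encodeURIComponent(text);
}

// Recover the search text from a fragmention URL.
function fromFragmention(url) {
  const hash = url.split('#')[1] || '';
  // "+" is commonly allowed as an alternate encoding for a space
  return decodeURIComponent(hash.replace(/\+/g, ' '));
}
```

Note how encodeURIComponent turns the curly apostrophe in “I’m” into the %E2%80%99 sequence seen in the example URLs above.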

This can be a very useful thing, particularly on pages with huge amounts of text. I use it quite often in my own posts to direct people to particular sub-parts of my website to better highlight the pieces I think they’ll find useful.

It can be even more useful for academics and researchers who want to highlight or even bookmark specific passages of text online. Those with experience on the Medium.com platform will also notice how useful highlighting can be, but having a specific permalink structure for it goes a step further.

I will note, however, that rarely, if ever, has anyone besides me used this functionality on my site. Why? We’ll look at that in just a moment.

Extending fragmentions for easier usability

Recently, as a result of multiple conversations with Aaron Davis (on and between our websites via webmention, with syndication to Twitter), I’ve been thinking more about notes, highlights, and annotations on the web. He wrote a post discussing “Page Bookmarks,” an interesting way of manually adding anchors to web pages in order to target specific portions of them. These make it easy for a reader to click a link on a page and jump directly to a specific portion of it. Sadly, they are very painful to create and use, both for a site owner and even more so for the outside public, which has absolutely no control over them whatsoever.

His post reminded me immediately of fragmentions. It also reminded me that there was a second bit of user interface related to fragmentions that I’d always meant to also add to my site, but somehow never got around to connecting: a “fragmentioner” to make it more obvious that you could use fragmentions on my site.

In short, how could a user know that my website even supports fragmentions? How could I make it easier for them to create a fragmention from my site to share with others? Fortunately for me, our IndieWeb friend Kartik Prabhu had already wired up the details for his own personal website and released the code along with some pointers for others interested in setting it up themselves. It’s freely available on GitHub and includes some reasonable details for installation.

So with a small bit of tweaking and one or two refinements, I got the code up and running and voilà! I now have a natural UI for highlighting things.

How?

When a user naturally selects a portion of my page with their mouse (the way they might if they were going to cut and paste the text), a simple interface pops up with instructions to click it for a link. Kartik’s JavaScript automatically converts the highlight into the proper format and changes the page’s URL to include the appropriate fragmention for that snippet of the page. A cut and paste then allows the reader to put that highlighted piece’s URL anywhere they like.

text highlighted in a browser with a small chain icon and text which says "Click for link to text"
Highlighting text pulls up some simple user interface for creating a fragmention to the highlighted text.
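A heavily simplified sketch of that interaction, with Kartik’s clickable popup left out: a pure helper builds the fragmention URL for whatever text is selected, and a mouseup listener (shown in comments, since it only runs in a browser) updates the address bar with it.

```javascript
// Build a fragmention URL for the currently selected text, or return the
// bare page URL when nothing meaningful is selected.
function fragmentionUrlForSelection(pageUrl, selectedText) {
  const trimmed = selectedText.trim();
  return trimmed ? pageUrl + '#' + encodeURIComponent(trimmed) : pageUrl;
}

// Browser wiring (not run here):
// document.addEventListener('mouseup', () => {
//   const text = String(window.getSelection());
//   if (text.trim()) {
//     history.replaceState(null, '',
//       fragmentionUrlForSelection(location.origin + location.pathname, text));
//   }
// });
```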

The future

What else would be nice?

I can’t help but think that it would be fantastic if the WordPress Fragmention plugin added the UI piece for highlight and sharing text via an automatically generated link.

Perhaps in the future one could allow a highlight-and-click interaction to grab not only the link but also a copy of the highlighted text itself. I’ve seen this behavior on some very socially savvy news websites. This would make the common practice of cutting and pasting content much easier while also cleverly including a reference link.

The tough part of this functionality is that it’s only available on websites that specifically enable it. While not too difficult to add, it would be far nicer to have native browser support for both fragmention creation and use. That would mean I wouldn’t need any JavaScript on my website to do the scrolling and highlighting or to generate the custom URLs. How nice would it be if this were an open web standard supported by major browsers without the need for work at the website level?

Medium-like highlighting and comments suddenly become a little easier for websites to support. With some additional code, it’s only a hop, skip, and a jump to dovetail this fragmention functionality with the W3C Webmention spec to allow inline marginalia on posts. One can create a fragmention targeting text on a website and write a reply to it. With some UI built out, sending a webmention to the site would let it pick up the comment and display it as a marginal note at that particular spot, instead of as a traditional comment below the post, where it might otherwise lose the context of being associated with the related point in the main text. In fact, our friend Kartik Prabhu has done just this on his website. Here’s an example of it in his post announcing the feature.

Example of inline marginalia on Kartik Prabhu’s website “Parallel Transport”.

You’ll notice that small quotation bubbles appear at various points in the text indicating marginalia. By clicking on them, the bubble turns green and the page expands to show the comment at that location. One could easily imagine CSS that allows the marginalia to actually display in the margin of the page for wider screens.

How could you imagine using fragmentions? What would you do with them? Feel free to add your thoughts below or on your own site and send me a webmention.


A Digital Food Diary on My Own Website

Food and Drink on my own website

I’ve been wanting to do it for a while, but I’ve finally started making eat and drink posts. The display isn’t exactly what I want yet, but it’s getting there. For myself and those reading, I’ll try to continue tweaking the templates, but with the start of the new year I wanted to at least start capturing the basic data. Most of the heavy lifting will be done by David Shanske’s excellent Post Kinds Plugin.

I’m hoping that, much like the dieting advice about getting and using a clean plate for every single thing you eat, consciously posting will help me to subconsciously eat better too. I’ve already begun to notice some of the subtle effects, and not just for composing better photographs of my food.

I probably won’t post everything publicly after some time because, really, who really wants to see all this (perhaps aside from others interested in doing the same thing themselves)? Eventually it’ll probably devolve into only the more fabulous looking restaurant meals and specialty cocktails while I’m out.

🍴 Ham sandwich with muenster on rye, strawberries, oatmeal and raisin cookie with Coca-Cola Zero Sugar

Since the ham sandwich post is so vaunted and maligned in the social media space, and since I can now properly support it, I’ve already made my obligatory first personal ham sandwich post.

Previous Food related posts on Silos

Back in the day, I’d used services like Eat.ly and Foodspotting. The former was bought out by the latter, and development and customer acquisition seem to have died altogether. These did a reasonable job of melding eating and checkin post types, but the genre seems to have died out for lack of interest and/or development. Since some of what they did was interesting and useful to me, I’m recreating portions of it on my own site.

Courtesy of David’s Simple Location Plugin, I’ll also be able to add location data to my eating-related posts, making them checkins in a sense, much like some of the functionality of these older silos.

I did like some of the health related and calorie data that Eat.ly made possible, and might consider adding some of that into my site in the future as well. I’ll have to take a look at services like WeightWatchers that I would expect might add that type of functionality as well. This also reminds me that Leo Laporte has a wi-fi scale that Tweets out his weight every time he stands on it. That sounds like useful quantified self data, though I don’t think I’d go so far as to post it publicly on my site (or syndicate it) in the future.

Feeds for these posts

I can’t imagine that anyone but potential stalkers would care, but for posterity, here are the feeds associated with these posts:
Eat: http://www.boffosocko.com/kind/eat/feed/
Drink: http://www.boffosocko.com/kind/drink/feed/
Eat & Drink (combined): http://www.boffosocko.com/feed/?kind=eat,drink

If you’re subscribed to my full feed and don’t want these in it, it’s possible to filter these posts from your stream; just drop me a line and I can help you subscribe to only the content you desire. Those subscribed to the “Food” category needn’t worry, as I don’t expect to be clogging that category up with these posts.


Adding Simple Twitter Response Buttons to WordPress Posts

Simple Twitter UI buttons on an IndieWeb-enabled site can allow Twitter to become part of your commenting system.

Back at IndieWebCamp Austin, I became enamored of adding additional methods of interacting with my website, particularly for those who weren’t already on the IndieWeb train. I’d seen these types of interactions already on Tantek Çelik’s site in the past, so naturally I figured I would start there.

Web Actions

Some basic searching revealed that in IndieWeb parlance, these types of functionalities are known as web actions. While they’re often added to make it easier for one site with the proper infrastructure to interact with another, they’re also designed for social web silos (like Twitter, Facebook, et al.) to support this type of interaction simply as well.

As a small-scale experiment, I thought I would begin manually and add a simple interface to allow Twitter users (who may not yet have their own websites to respond from) to quickly and easily reply to, repost, or like posts on my site. A little bit of reading on the wiki and Twitter’s developer site allowed me to leverage something into existence pretty quickly.

Interestingly, although there are many plugins that help users share a blog post to Twitter, I couldn’t find a WordPress plugin that offers these other interactions at all. I suspect this may be because the other half of the interaction, bringing the replies back to one’s site, isn’t commonly known yet.

Example

I was able to write a post, syndicate it to Twitter, upload the button images, and then inject the Twitter post ID (939650287622434816 in this example) for my syndicated copy into my post like so:

<span class="syn-text">Respond via Twitter:
<ul class="relsyn">
<li><a href="https://twitter.com/intent/tweet?in_reply_to=939650287622434816" target=""><img src="/reply-icon-16.png" alt="" width="16" height="11" /> Reply</a></li>
<li><a href="https://twitter.com/intent/retweet?tweet_id=939650287622434816" target=""><img src="/retweet-icon-16.png" alt="" width="16" height="10" /> Repost</a></li>
<li><a href="https://twitter.com/intent/favorite?tweet_id=939650287622434816" target=""><img src="/like-icon-16.png" alt="" width="16" height="16" /> Like</a></li>
</ul><script type="text/javascript" async src="https://platform.twitter.com/widgets.js"></script></span>

And voila! My new post now had some simple buttons that allow users a simple one click interaction with a popup window to reply to, repost, or like my post.
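For anyone wanting to automate the manual step, generating the three links from a tweet ID is straightforward. A rough sketch (the helper name is my own invention, and the markup omits the icons for brevity):

```javascript
// Build the three Twitter Web Intent links for a syndicated tweet's ID.
// The intent URLs mirror the hand-written markup above.
function twitterResponseLinks(tweetId) {
  const intents = [
    { label: 'Reply',  url: `https://twitter.com/intent/tweet?in_reply_to=${tweetId}` },
    { label: 'Repost', url: `https://twitter.com/intent/retweet?tweet_id=${tweetId}` },
    { label: 'Like',   url: `https://twitter.com/intent/favorite?tweet_id=${tweetId}` },
  ];
  return intents
    .map(i => `<li><a href="${i.url}">${i.label}</a></li>`)
    .join('\n');
}
```

The resulting list items can be dropped into the same `<ul class="relsyn">` wrapper shown above.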

Displaying responses

Naturally, through the “magic” of Brid.gy, I’m able to collect these responses via backfeed with the Webmention protocol using the Webmention Plugin for WordPress back to my own website. In simpler and less technical terms: if you use one of these buttons, your interaction with my website as posted to Twitter comes back to live on my website. Thus users can write a comment or reply on Twitter and it will display in my comments section just as if they had written it directly in my comments box. Likes and reposts are sent to my site and are displayed relatively naturally as facepiles under the comment headings “Likes” and “Reposts”.

I’ll do another manual example with this particular post, so feel free to use the buttons at the bottom of this post to make your response via Twitter if you wish.

Future Improvements

Taking some of this code and abstracting it into a plugin for others to use would be a nice feature. Doing so would also make it a candidate for the larger IndieWeb suite of WordPress plugins. Perhaps it could even be added into the codebase of a pre-existing plugin. David Shanske’s Syndication Links plugin or the Bridgy Publish plugin might make sense, as they already handle part of the publishing half of the cycle by publishing to Twitter and/or importing the Tweet ID back into one’s WordPress site for display. Either could do a simple check for the existence of a syndicated Tweet, extract the Twitter ID, and add the buttons to the interface appropriately.

It would be interesting to add full mark up to make <indie-action> functionality possible for a broader class of web actions, particularly if it could be integrated directly into WordPress in a more interesting manner to work with the Post Kinds Plugin or the IndieWeb PressThis type of bookmarklet functionality.
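For reference, the web actions markup documented on the IndieWeb wiki wraps a fallback link inside an `<indie-action>` element, roughly like this (the URLs here are illustrative):

```html
<indie-action do="reply" with="https://example.com/my-post/">
  <a href="https://twitter.com/intent/tweet?in_reply_to=939650287622434816">Reply</a>
</indie-action>
```

A browser extension or script that understands `<indie-action>` can route the reply to the reader’s own site, while everyone else simply gets the plain Twitter intent link as a fallback.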

Instead of having these types of interactions injected at the bottom of the post, it may make more sense to have it display in the comment block instead.

I suspect that Facebook, Instagram, and others also enable similar functionality, so adding the ability to use them the same way would be awesome. Even more so in the case of RSVPs to events, since Brid.gy handles those relatively well between Facebook and WordPress sites. (See this example.)

Try it yourself

Go ahead and use the buttons below to interact with this post via Twitter.


Virtual Homebrew Website Club Meetup on December 27, 2017

Work on your #newwwyear Resolution to have a better website for 2018

I was going to take the week off for the holidays, but seeing a group of people rally around the hashtag #newwwyear to get excited about building and updating their personal websites has inspired me to host an online Homebrew Website Club meetup during the holidays.

This is a virtual/online HWC meetup for website builders who either can’t make a regular in-person meeting in their city or don’t yet have critical mass to host one in their area. Everyone of every level is welcome to participate remotely! Don’t have a domain yet? Come along and someone can help you get started and provide resources for creating the site you’d like to have.

Virtual Homebrew Website Club Meetup

Time: 20:30 to 21:30 ET (17:30 to 18:30 PT)
Location: Online via Google Hangouts (a link will be posted here or on syndicated copies of this post prior to the meetup: https://hangouts.google.com/call/8XGkwtEjxg2oH07pyOD_AAEE)

More Details

Join a community of like-minded people building and improving their personal websites. Invite friends that want a personal site.

  • Work on your #newwwyear Resolution or IndieWeb Resolutions for 2018
  • Work with others to help motivate yourself to create the site you’ve always wanted to have.
  • Ask questions about things you may be stuck on–don’t let stumbling blocks get in the way of having the site you’d like to have.
  • Finish that website feature or blog post you’ve been working on
  • Burn down that old website and build something from scratch
  • Share what you’ve gotten working
  • Demos of recent breakthroughs

A link to the virtual meetup on Google Hangouts will be posted on the day of the event. Check back before the quiet writing hours/meeting to get the link.

Optional quiet writing hour: 19:30–20:30 ET (16:30–17:30 PT). Use this time to work on your project (or get some help) before the meeting.
Meetup: 20:30–21:30 ET (17:30–18:30 PT)

Skill levels: Beginner, Intermediate, Advanced

Keep in mind that there is often a European virtual meetup if those times work better for your schedule.

Any questions? Need help? Need more information? Ask in chat: http://indiewebcamp.com/irc/today#bottom

RSVP

Add your optional RSVP in the comments below; by adding your indie RSVP via webmention to this post; or by RSVPing yes to one of the syndicated posts below:
Indieweb.org event: https://indieweb.org/events/2017-12-27-homebrew-website-club#Virtual_Americas
Facebook.com: https://www.facebook.com/events/801364280044020/
Meetup.com: https://www.meetup.com/IndieWeb-Homebrew-Website-Club-Los-Angeles/events/246071606/

 


Accessibility on the Web

I certainly don’t go out of my way to follow the topic of accessibility, though I do think about it occasionally. It’s apparently bubbling up more frequently as something in dire need of attention, both on the web and in real life.

I ran across three different pleas in less than the span of an hour, so it’s something I’ll commend to everyone’s attention. Rachel’s tweet has some nice linked resources. I’ll have to take a closer look at what I can do to better support these ideas myself.

I’m glad that WordPress.org has a feature filter checkbox for “accessibility ready” on their themes page, but they should begin using that flag to filter out the themes which aren’t and simply not show them. It would be nice to be able to sort plugins by that type of flag as well, or to push plugins to support accessibility under threat of being de-listed.

I highly recommend these two additional articles I saw that touch upon two different areas:

Excluded from Confoo Speaker Dinner: What Happened and How it Made Me Feel by Nicolas Steenhout

Spooled Twitter Thread: OK Third-Party WordPress, We Need To Have A Come-to-Jesus Meeting About Your Accessibility Flare by Amanda Rush


Threaded Replies and Comments with Webmentions in WordPress

Introduction to what one would consider basic web communication

A few days ago I had written a post on my website and a colleague had written a reply on his own website. Because we were both using the W3C Webmention specification on our websites, my site received the notification of his response and displayed it in the comments section of my website. (This in and of itself is really magic enough–cross website @mentions!)

To reply back to him I previously would have written a separate second post on my site in turn to reply to his, thereby fragmenting the conversation across multiple posts and making it harder to follow the conversation. (This is somewhat similar to what Medium.com does with their commenting system as each reply/comment is its own standalone page.)

Instead, I’ve now been able to configure my website to allow me to write a reply directly to a response within my comments section admin UI (or even in the comments section of the original page itself), publish it, and have the comment be sent to his reply and display it there. Two copies for the price of one!

From the comments list in my Admin UI, I can write a reply and it not only lives on my site but it can now be sent as a comment to the site that made the original comment! As an example, here’s my first one and the resultant copy on the site I was replying to.

This means that now, WordPress-based websites (at least self-hosted versions running the WordPress.org code) can easily and simply allow multiple parties to write posts on their own sites and participate in multi-sided conversations back and forth while all parties maintain copies of all sides of the conversation on their own websites in a way that maintains all of the context. As a result, if one site should be shut down or disappear, the remaining websites will still have a fully archived copy of the entire conversation thread. (Let’s hear it for the resilience of the web!)

What is happening?

This functionality is seemingly so simple that one is left wondering:

  • “Why wasn’t this baked into WordPress (and the rest of the web) from the start?”
  • “Why wasn’t this built after the rise of Twitter, Facebook, or other websites which do this as a basic function?”
  • “How can I get it tout de suite?!” (aka gimme, gimme, gimme, and right now!!!)

While seemingly simple, the technical hurdles weren’t trivial, because there had previously never been a universal protocol for the web to allow it. (The Webmention spec now makes it possible.) Sites like Facebook, Twitter, and others enable it because they’ve got highly closed and highly customized environments that make it a simpler problem to solve. In fact, even old-school web-based bulletin boards allowed this!

But even within social media one will immediately notice that you can’t use your Facebook account to reply to a Twitter account. And why not?! (While the web would be far better if one website or page could talk to another, these sites don’t for the simple economic reason that they want you using only their site and not others, and not enabling this functionality keeps you locked into what they’re selling.)

I’ll detail the basic set up below, but thought that it would be highly illustrative to have a diagram of what’s physically happening in case the description above seems a bit confusing to picture properly. I’ll depict two websites, each in their own column and color-coded so that content from site A is one color while content from site B is another color.

A diagram of where comments live when sent via webmention.
Each site composes and owns its own content and sends the replies to the other site.

It really seems nearly incomprehensible to me that this hasn’t been built into the core functionality of the web since at least the beginning of the blogosphere. Yet here we are, and somehow I’m demonstrating how to do this from one WordPress site to another via the open web in 2017. To me this is the entire difference between a true Internet and just using someone else’s intranet.
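At the protocol level, the notification in the diagram above is just a form-encoded POST of two URLs to the target site’s advertised endpoint. A minimal sketch of the sending half (the plugins described below handle all of this for you; the endpoint URL in the comment is illustrative):

```javascript
// A webmention is a POST of source (the URL of my reply) and target (the URL of
// the post I'm replying to) to the target site's webmention endpoint.
function webmentionBody(source, target) {
  return new URLSearchParams({ source, target }).toString();
}

// Hypothetical usage once the endpoint has been discovered:
// await fetch('https://example.com/wp-json/webmention/1.0/endpoint', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
//   body: webmentionBody('https://mysite.example/my-reply/',
//                        'https://example.com/original-post/'),
// });
```

The receiving site then fetches the source URL, verifies that it really links to the target, and displays the reply.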

Implementation

Prerequisites

While this general functionality is doable on any website, I’ll stick to enabling it specifically on WordPress, a content management system that is powering roughly 30% of all websites on the internet. You’ll naturally need your own self-hosted WordPress-based website with a few custom plugins and a modern semantic-based theme. (Those interested in setting it up on other platforms are more than welcome to explore the resources of the IndieWeb wiki and their chat which has a wealth of resources.)

Plugins

As a minimum set you’ll want to have the following plugins enabled and configured:

  • Webmention
  • Semantic Linkbacks

Other instructions and help for setting these up and configuring them can be found on the IndieWeb wiki, though not all of the steps there are necessarily required for this functionality.

Themes

Ideally this all should function regardless of the theme you have chosen, but WordPress only provides the most basic support for microformats version 1 and doesn’t support the more modern version 2 out of the box. As a result, the display of comments from site to site may be a bit wonky depending on how supportive your particular theme is of the microformats standards. As you can see, I’m using a relatively standard version of the TwentySixteen theme without a lot of customization and getting some reasonable results. If you have a choice, I’d recommend a theme with solid semantic markup; the IndieWeb wiki maintains a list of WordPress themes with microformats support.

Plugin

The final plugin that enables sending comments from one comment section to another is the WordPress Webmention for Comments plugin. As it is still somewhat experimental and is not available in the WordPress repository, you’ll need to download it from GitHub and activate it. That’s it! There aren’t any settings or anything else to configure.

Use

With the plugin installed, you should now be able to send comments and replies to replies directly within your comments admin UI (or directly within the comments section of individual pages, though that requires additional clicks to get there and you don’t have the benefit of the admin editor).

There is one current caveat however. For the plugin to actually send the webmention properly, it will need to have a URL in your reply that includes the microformats u-in-reply-to class. Currently you’ll need to add this manually until the plugin can parse and target the fragmentions for the comments on its own. I hope this functionality can be added to the plugin to make the experience seamless in the future.

So what does this u-in-reply-to part actually look like? Here’s an example of the one I used to send my reply:

<a class="u-in-reply-to" href="https://islandinthenet.com/manually-adding-microfomats-markup/">Khürt</a>

The class tells the receiving site that the webmention is a reply and to display it as such and the URL is necessary for your webmention plugin to know where to send the notification. You’d simply need to change the URL and the word (or words) that appear between the anchor tags.

If you want to have a hidden link and still send a webmention you could potentially add your link to a zero width space as well. This would look like the following:

<a class="u-in-reply-to" href="http://www.example.com">&#8203;</a>

Based on my experiments, using a <link> via HTML will work, but it will send it as a plain webmention to the site and it won’t show up natively as a reply.

Sadly, a plain text reply doesn’t work (yet), but hopefully some simple changes could be made to force it to use the common fragmentions pattern that WordPress uses for replies.

Interestingly, this capability has been around for a while; it just hasn’t been well documented or described. I hope that those with WordPress sites that already support Webmention will now have a better idea of what this plugin is doing and how it works.

Future

Eventually one might expect that all the bugs in the system will get worked out and the sub-plugin for sending comment webmentions will be rolled up into the main Webmention plugin, which incidentally already handles fragmentions.

Caveats

In addition to the notes above, I will say that this is still technically experimental code not running on many websites, so its functionality may not be exact or perfect in actual use, though in experimenting with it I have found it to be very stable. I would recommend checking that replies actually post to the receiving site, which incidentally must be able to accept webmentions. If the receiving website doesn’t have webmention support, you’ll need to manually cut and paste the content there (and likely opt to receive notification of replies via email, so you can stay apprised of future replies).

You can check the receiving site’s webmention support in most browsers by right clicking and viewing the page’s source. Within the source one should see code in the <head> section of the page which indicates there is a webmention endpoint. Here is an example of the code typically injected into WordPress websites that you’d be looking for:

<link rel="webmention" href="http://example.com/wp-json/webmention/1.0/endpoint" />
<link rel="http://webmention.org/" href="http://example.com/wp-json/webmention/1.0/endpoint" />

Also keep in mind that some users moderate their comments, so that even though your mention was sent, they may need to approve it prior to it displaying on the page.

If you do notice problems, issues, or quirks, please file an issue with as full a description as you can of what you did and what resulted, so that it can be troubleshot and made to work not only for you, but hopefully better for everyone else.

Give it a try

So you’ve implemented everything above? Go ahead and write a reply on your own WordPress website and send me a webmention! I’ll do my best to reply directly to you so you can send another reply to make sure you’ve got things working properly.

Once you’re set, go forward and continue helping to make the web a better place.

Special Thanks

I wanted to take a moment to give special thanks to Aaron Parecki, Matthias Pfefferle, and David Shanske who have done most of the Herculean work to get this and related functionality working. And thanks also to all who make up the IndieWeb community that are pushing the boundaries of what the web is and what it can accomplish. And finally, thanks to Khürt Williams who became the unwitting guinea pig for my first attempt at this. Thank you all!



An update to read posts for physical books

Inspired by gRegor Morrill’s IndieWebCamp Austin project, I went back and took a look at some of my read posts, and particularly for books.

For online material, I use the Post Kinds Plugin which does a good job of adding h-cite and p-read-of (experimental) microformats classes to the data for the things I’ve read.

Because Post Kinds doesn’t (yet?) support percentage finished or number of pages read, I generally do read posts for books by hand as notes with the relevant data. So I decided to add some better markup to my book-specific read posts and added microformats classes of h-cite, u-url, u-read-of, p-name, p-author, h-card and dt-published. I’m far from an expert on microformats, but hopefully the way I’m nesting them makes sense to parsers in the future. (Suggestions for improvement are more than welcome.)

I like gRegor’s idea of p-read-status for things he’s posting and will have to see how I can pull that off for posts in the future (or suggest it as an addition to Post Kinds). Presently I’m just adding a “want to read” tag, but that could be improved to better match the functionality I appreciate in silos like Goodreads. I’ll also have to load up gRegor’s recent modifications to Quill and test them out on my site as well. I know David Shanske has expressed interest in better aligning Quill and other Micropub clients to post to WordPress with Post Kinds in mind.

Here’s an example of the mark up of a recent read post:

Read pages 381-461 to finish reading <span class="h-cite"><cite><a class="u-url u-read-of p-name" href="http://amzn.to/2zXnQDC" target="_blank" rel="noopener">Origin: A Novel</a></cite> by <span class="p-author h-card"><a class="p-name u-url" href="http://danbrown.com/">Dan Brown</a></span><time class="dt-published" datetime="2017-10-103 00:00:00"></time></span>

It’s also made me begin to feel itchy about some of my past quote posts; I may revisit them to add the appropriate h-cite and related markup as well. (Or at least fix it moving forward.)

Incidentally, my real camp project was some heavy editing work on “The Book.” More on that later…


Mismanaged road closures on 210 Freeway for the Creek Fire (and others)

I’ll note at the outset that there are larger, potentially more pressing problems relating to the current fires in Southern California, and I have every hope that they’re mitigated as quickly and smoothly as possible, particularly for the large numbers of displaced residents. But I also know that this is not our “first rodeo”, and therefore there should have been better planning and a better coordinated response from state and local officials.

Apparently in a fit of poor thinking, the California Highway Patrol and the fine folks at CalTrans District 7 have closed almost all of the eastbound and westbound exits on the 210 Freeway from roughly Glendale to past Sylmar. This includes exits for areas that aren’t under immediate threat and that, based on reports I’ve seen, aren’t expected to be threatened.

While I understand that they’re evacuating much of the proximal area for the Creek Fire and public safety, they’re not only causing undue burden on people moving around or through the area, but also adding stress to the resources needed to abate the issue. In particular, while it may be advisable to close several on/off ramps nearest the fire, it is neither smart nor helpful to have all of them closed for miles and miles in all directions, particularly those closures at the furthest ends.

Because the westbound Pennsylvania and Lowell freeway ramps were (unnecessarily) closed this morning on the westbound 210, I and thousands of others, including countless parents taking their children to one of the several dozen schools in northwest Glendale, were forced to spend an additional hour or more driving on the 210 through the worst of the smoke out past Sylmar, only to need to turn around and drive back through the heavy smoke to return to our original destinations. After almost a day of issues, there is still no signage on the 210 Freeway indicating any closures. Any one of the approximately 20 CalTrans vans I saw blocking exits this morning could have been better used to pull a trailer with closure signage.

I get the need to evacuate the area and close roads, but why not close them at the surface street level? This would allow travelers to turn around and reroute instead of being unnecessarily forced to spend one or more hours in both heavy traffic and heavy smoke. If there aren’t enough resources to do this at every exit, why not at least one or two of them to alleviate the additional and unnecessary back and forth?

I noticed at least four accidents–which I’m sure is at least 3 standard deviations from the average–on this stretch of freeway, which I hope were small fender benders. I would posit that these were all caused as a result of (frustrated and distracted) people simply trying to exit and turn around. This stresses the EMS system further by requiring the additional response of police, ambulance, fire and other first responders. I saw at least one firetruck at such a scene this morning, which I’m sure could have been better deployed against low containment numbers in highly populated areas being threatened by fire.

I saw people attempting to go the wrong way down on-ramps simply to access surface streets to turn around. I saw dozens of cars (far more than usual) pulled over on the side of the road attempting to figure out the predicament. At least one driver in a similar situation this morning was forced to cope with running out of gas as a result of the lack of communication. I stopped at at least two exit ramps in an attempt to get information from CHP officers, none of whom had any information about where or how to turn around. They literally knew nothing except that they could not let me pass at that point. (To me this is painfully inept communication at a time when communication could be saving lives, and multiple hours after these issues should have been anticipated.)

If they’re going to pull the public safety card, local and state government should simply close the entire 210 Freeway from the 2 North to past Sylmar. If they can’t do this, they should do local street closures to allow constituents to exit the freeway, turn around, and find alternate routes back and around, instead of being stuck (due to the complete lack of signage) and put further in harm’s way.

Additionally, if CalTrans hasn’t figured it out yet, there’s a very frequently used traffic app called Waze that can be quickly edited to indicate road closures, which would drastically help mitigate traffic issues in and around the area. Because Google owns Waze and shares its data, Google Maps, another popular navigation application, would further mitigate the traffic and ancillary public safety issues. I don’t think any of the closures I saw this morning were marked on either platform. (Nota bene to Waze/Google Maps: in high-traffic areas like Southern California, I’m surprised that your systems don’t intuit major closures automatically given the amounts of data you’re receiving.)

I hope that from an executive standpoint state and local systems will have their resources better deployed for this evening’s commute. I can’t help but note that these aren’t the first large fires in the Southern California area, so I’m shocked that the response isn’t better managed. Better managing small-seeming distal issues like these would allow the resources spent remedying them to be redeployed to the more pressing proximal issues.

If they can’t manage to fix these issues in the near term, I hope they’ll at least file them into their future emergency plans for what are sure to be future incidents.
