Charlotte Allen recently wrote up a method for using Micropub and webhooks to PESOS (Publish Elsewhere, Syndicate to Own Site) content to a WordPress website.
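For anyone wanting to tinker with a similar set-up, the glue between the two pieces can be quite small. Here is a minimal sketch (not Charlotte's actual code) of turning a hypothetical silo webhook payload into a Micropub create request; the webhook field names (`text`, `created_at`, `url`) are invented for illustration, while the `h=entry`, `content`, `published`, and `syndication` properties come from the Micropub spec.

```python
from urllib.parse import urlencode

def micropub_body(webhook_payload):
    """Map a (hypothetical) silo webhook payload into a form-encoded
    Micropub create request body, suitable for PESOS into WordPress.
    The payload keys are assumptions; the Micropub property names
    come from the W3C Micropub Recommendation."""
    return urlencode({
        "h": "entry",                           # Micropub: create an h-entry
        "content": webhook_payload["text"],     # post body from the silo
        "published": webhook_payload["created_at"],
        "syndication": webhook_payload["url"],  # link back to the silo copy
    })

body = micropub_body({
    "text": "Hello from the silo",
    "created_at": "2020-02-09T10:28:00-08:00",
    "url": "https://example.com/status/1",
})
```

The encoded body would then be POSTed to the site's Micropub endpoint with an `Authorization: Bearer` access token.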
Great write-up, Jamie. Some interesting things to think about and lots of useful examples.
I suspect that for most personal websites the idea of fair use will give people enough protection for reply contexts. Of course it will depend on their jurisdiction as fair use can vary by country or potentially even within countries in terms of how it is applied.
I would almost have to think that, barring particular legislation and precedent, people and companies who explicitly provide Open Graph Protocol or similar metadata on their websites are explicitly granting a license to use that content, as the only use for that data on most systems is to provide it for creating contexts on services like Facebook, Twitter, etc. Facebook likely created OGP as a proprietary format to give itself broad legal protection for just such use cases, though I suspect they parse pages and take titles or other snippets when OGP doesn’t exist. Naturally some large systems like WordPress may push OGP into code without the sites’ owners being aware of what they’re potentially giving away, so the area is really murky at best. It would be beneficial to consult an attorney to see what their best advice might be, or whether there are precedents with respect to these areas.
For future reference, here is the relevant section for Fair Use from Title 17 of the Copyright Law of the United States:
107. Limitations on exclusive rights: Fair use
Notwithstanding the provisions of sections 106 and 106A, the fair use of a copyrighted work, including such use by reproduction in copies or phonorecords or by any other means specified by that section, for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research, is not an infringement of copyright. In determining whether the use made of a work in any particular case is a fair use the factors to be considered shall include—
(1) the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes;
(2) the nature of the copyrighted work;
(3) the amount and substantiality of the portion used in relation to the copyrighted work as a whole; and
(4) the effect of the use upon the potential market for or value of the copyrighted work.
The fact that a work is unpublished shall not itself bar a finding of fair use if such finding is made upon consideration of all the above factors.
Podcasting or bronze casting, crafting of your story should be rough draft of your glory. Yet the paradox of technology led to padlocks on the soapbox; once artistic, holistic, now undescriptive and stripped of synergistic differences in search of prescriptive low-friction design instead of the grind//...
Upon the efficient consumption and summarizing of news from around the world.
Facebook is informative in the same way that thumb sucking is nourishing. ❧
Annotated on February 09, 2020 at 10:28AM
Upon the efficient consumption and summarizing of news from around the world.
Remember when we thought the internet would provide us timely, pertinent information from around the world?
How do we find internet information in a timely fashion?
I have been told to do this through Twitter or Facebook, but, seriously… no. Those are systems designed to waste time with stupid distractions in order to benefit someone else. Facebook is informative in the same way that thumb sucking is nourishing. Telling me to use someone’s social website to gain information is like telling me to play poker machines to fix my financial troubles. Stop that. ❧
Annotated on February 09, 2020 at 10:40AM
After two years of not using Facebook - I logged in today to check out this new tool that shows what Facebook has been tracking. Despite this, they have seen my every move on the Internet - this is creepier than I had imagined. https://t.co/BsqpjT3qid — Miguel de Icaza (@migueldeicaza) January 29, 2020
Facebook's long-awaited Off-Facebook Activity tool started rolling out today. While it's not a perfect measure, and we still need stronger data privacy laws, this tool is a good step toward greater transparency and user control regarding third-party tracking. We hope other companies...
The world's greatest expert on canned TV laugh tracks helps Dr Laurie Santos demonstrate how the emotions of those around us can make us feel happier or more sad. If happiness is so contagious... can we use them to bring joy to ourselves and our loved ones?
“Jeff’s research showed that participants pick up other people’s emotions through text— in, say, a quick email note or an online comment— just as easily as they do in face-to-face real-world interaction.”
Hancock, J. T., Landrigan, C., & Silver, C. (2007, April). Expressing emotion in text-based communication. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 929-932). ACM.
The social media giant allowed Jeff to run an experiment to figure out the emotional impact of Facebook posts.
Kramer, A. D., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111(24), 8788-8790.
The original link has some additional references and research, but I’ve excerpted some small portions of the ethically questionable research Facebook allowed on emotional contagion several years back.
In 1968, Lyndon Johnson appointed a National Commission on Obscenity and Pornography to investigate the supposed sexual scourge corrupting America’s youth...
📑 Highlights and Annotations
For the parallels between the fight against harmful and hateful speech online today and the crusade against sexual speech 50 years ago are stunning: the paternalistic belief that the powerless masses (but never the powerful) are vulnerable to corruption and evil with mere exposure to content; the presumption of harm without evidence and data; cries calling for government to stamp out the threat; confusion about the definitions of what’s to be forbidden; arguments about who should be responsible; the belief that by censoring content other worries can also be erased. ❧
Annotated on January 18, 2020 at 08:29AM
One of the essays comes from Charles Keating, Jr., a conservative whom Nixon added to the body after having created a vacancy by dispatching another commissioner to be ambassador to India. Keating was founder of Citizens for Decent Literature and a frequent filer of amicus curiae briefs to the Supreme Court in the Ginzberg, Mishkin, and Fanny Hill obscenity cases. Later, Keating was at the center of the 1989 savings and loan scandal — a foretelling of the 2008 financial crisis — which landed him in prison. Funny how our supposed moral guardians — Nixon or Keating, Pence or Graham — end up disgracing themselves; but I digress. ❧
Annotated on January 18, 2020 at 08:40AM
The fear then was the corruption of the masses; the fear now is microtargeting drilling directly into the heads of a strategic few. ❧
Annotated on January 18, 2020 at 08:42AM
McCarthy next asks: “Who selects what is to be recorded or transmitted to others, since not everything can be recorded?” But now, everything can be recorded and transmitted. That is the new fear: too much speech. ❧
Annotated on January 18, 2020 at 08:42AM
Many of the book’s essayists defend freedom of expression over freedom from obscenity. Says Rabbi Arthur Lelyveld (father of Joseph, who would become executive editor of The New York Times): “Freedom of expression, if it is to be meaningful at all, must include freedom for ‘that which we loathe,’ for it is obvious that it is no great virtue and presents no great difficulty for one to accord freedom to what we approve or to that to which we are indifferent.” I hear too few voices today defending speech of which they disapprove. ❧
I might take issue with this statement and possibly with a piece of Jarvis’ argument here. I agree that the idea of “too much speech” is a moral panic, because humans have a hard limit on how much they can individually consume.
The issue I see is that while anyone can say almost anything, the problem arises when a handful of monopolistic players like Facebook or YouTube can use algorithms to programmatically entice people to click on and consume fringe content in mass quantities, which subtly but assuredly nudges the populace and electorate in an unnatural direction. Most of the history of human society and interaction has tended toward a centralizing consensus in which we can manage to cohere. The large-scale effects of algorithm-driven companies putting a heavy hand on the scales are sure to create unintended consequences, and they’re able to do it at scales the Johnson and Nixon administrations could only wish they had access to.
If we look at this as an analogy to the evolution of weaponry, I might suggest we’ve just passed beyond the era of single-shot handguns and into the era of machine guns. What is society to do when the next evolution arrives: the era of social media atomic weapons?
Annotated on January 18, 2020 at 10:42AM
Truth is hard. ❧
Annotated on January 18, 2020 at 10:42AM
As an American and a staunch defender of the First Amendment, I’m allergic to the notion of forbidden speech. But if government is going to forbid it, it damned well better clearly define what is forbidden or else the penumbra of prohibition will cast a shadow and chill on much more speech. ❧
Perhaps it’s not what people are saying so much as platforms are accelerating it algorithmically? It’s one thing for someone to foment sedition, praise Hitler, or yell their religious screed on the public street corner. The problem comes when powerful interests in the form of governments, corporations, or others provide them with megaphones and tacitly force audiences to listen to it.
When Facebook or YouTube optimize for clicks keyed on social and psychological constructs using fringe content, we’re essentially saying that machines, bots, and extreme fringe elements are not only people, but that they have free speech rights, and that they can be prioritized with the reach and exposure of major national newspapers and national television in the media model of the 1980s.
I highly suspect that if real people’s social media reach were linear and unaccelerated by algorithms we wouldn’t be in the morass we’re generally seeing on many platforms.
Annotated on January 18, 2020 at 11:08AM
“Privacy in law means various things,” he writes; “and one of the things it means is protection from intrusion.” He argues that in advertising, open performance, and public-address systems, “these may validly be regulated” to prevent porn from being thrust upon the unsuspecting and unwilling. It is an extension of broadcast regulation.
And that is something we grapple with still: What is shown to us, whether we want it shown to us, and how it gets there: by way of algorithm or editor or bot. What is our right not to see? ❧
Privacy as freedom from is an important thing. I like this idea.
Annotated on January 18, 2020 at 11:20AM
The Twenty-Six Words that Created the Internet is Jeff Kosseff’s definitive history and analysis of the current fight over Section 230, the fight over who will be held responsible to forbid speech. In it, Kosseff explains how debate over intermediary liability, as this issue is called, stretches back to a 1950s court fight, Smith v. California, about whether an L.A. bookseller should have been responsible for knowing the content of every volume on his shelves. ❧
For me this is probably the key idea. Facebook doesn’t need to be responsible for everything that their users post, but when they cross the line into actively, algorithmically promoting and pushing that content into their users’ feeds for active consumption, then they **do** have a responsibility for that content.
Imagine, by analogy, the trusted local bookstore mentioned above. If there are millions of books there, the user has the choice, when they walk in, to make their selection in some logical manner. But if the bookseller has the secret ability to consistently walk up to children and put porn into their hands, or to actively herd them into the adult sections to force that exposure on them (and can do it without anyone else realizing it), then that is the problem. Society at large would think this even more reprehensible if it realized that local governments or political parties could pay the bookseller to do this.
In case the reader isn’t following the analogy, this is exactly what some social platforms like Facebook are allowing our politicians to do. They’re taking payment from politicians to actively lie, tell untruths, and create fear in a highly targeted manner without the rest of society being able to see or hear those messages. Some of these messages are of the sort that, had they been picked up on an open microphone and broadcast outside the private group for which they were intended, would have been career-ending events.
Without this, we’re actively stifling conversation in the public sphere and actively empowering the fringes. This sort of active, targeted fringecasting prevents social cohesion, consensus, and compromise, and instead pulls us apart.
Perhaps the answer is to allow Facebook to take the political ad money for these niche ads, but then force them to broadcast the ads to everyone on the platform rather than only to the small niche audience? Then we could all see who our politicians really are.
Annotated on January 18, 2020 at 11:50AM
Of course, it’s even more absurd to expect Facebook or Twitter or Youtube to know and act on every word or image on their services than it was to expect bookseller Eleazer Smith to know the naughty bits in every book on his shelves. ❧
Here’s the point! We shouldn’t expect them to know, but similarly if they don’t know, then they should not be allowed to randomly privilege some messages over others for how those messages are distributed on the platform. Why is YouTube accelerating messages about Nazis instead of videos of my ham sandwich at lunch? It’s because they’re making money on the Nazis.
Annotated on January 18, 2020 at 12:07PM
there must be other factors that got us Trump ❧
Primarily people not really knowing how racist and horrible he really was, in addition to his inability to think clearly, logically, or linearly. He espoused a dozen or so simple aphorisms like “Build the wall,” but was absolutely unable to articulate a plan that went beyond the aphorism. How would it be implemented and funded, and what short- and long-term issues would result? He had none of those things that many others presumed would be worked out as details by smart and intelligent people, rather than the “just do it” managerial style he has been shown to espouse.
Too many Republicans, particularly at the end, said, “he’s not really that bad,” and now that he’s in power and more authoritarian than they expected, they are too weak to admit their mistake.
Annotated on January 18, 2020 at 12:28PM
Axel Bruns’ dismantling of the filter bubble. ❧
research to read
Annotated on January 18, 2020 at 12:45PM
“To affirm freedom is not to applaud that which is done under its sign,” Lelyveld writes. ❧
Annotated on January 18, 2020 at 12:51PM
You can download your Facebook data archive and look through all the various things that Facebook has collected about you. One of the items is a list of all the Facebook Pages you’ve liked and Facebook Groups you have joined. I have 2,320 in my list. I’m a little obsessive. For the record, here’s my …
Altering the internet's economic and digital infrastructure to promote free speech
Meanwhile, politicians from the two major political parties have been hammering these companies, albeit for completely different reasons. Some have been complaining about how these platforms have potentially allowed for foreign interference in our elections.3 3. A Conversation with Mark Warner: Russia, Facebook and the Trump Campaign, Radio IQ|WVTF Music (Apr. 6, 2018), https://www.wvtf.org/post/conversation-mark-warner-russia-facebook-and-trump-campaign#stream/0 (statement of Sen. Mark Warner (D-Va.): “I first called out Facebook and some of the social media platforms in December of 2016. For the first six months, the companies just kind of blew off these allegations, but these proved to be true; that Russia used their social media platforms with fake accounts to spread false information, they paid for political advertising on their platforms. Facebook says those tactics are no longer allowed—that they’ve kicked this firm off their site, but I think they’ve got a lot of explaining to do.”). Others have complained about how they’ve been used to spread disinformation and propaganda.4 4. Nicholas Confessore & Matthew Rosenberg, Facebook Fallout Ruptures Democrats’ Longtime Alliance with Silicon Valley, N.Y. Times (Nov. 17, 2018), https://www.nytimes.com/2018/11/17/technology/facebook-democrats-congress.html (referencing statement by Sen. Jon Tester (D-Mont.): “Mr. Tester, the departing chief of the Senate Democrats’ campaign arm, looked at social media companies like Facebook and saw propaganda platforms that could cost his party the 2018 elections, according to two congressional aides. If Russian agents mounted a disinformation campaign like the one that had just helped elect Mr. Trump, he told Mr. Schumer, ‘we will lose every seat.’”). Some have charged that the platforms are just too powerful.5 5. Julia Carrie Wong, #Breaking Up Big Tech: Elizabeth Warren Says Facebook Just Proved Her Point, The Guardian (Mar. 11, 2019), https://www.theguardian.com/us-news/2019/mar/11/elizabeth-warren-facebook-ads-break-up-big-tech (statement of Sen. Elizabeth Warren (D-Mass.)) (“Curious why I think FB has too much power? Let’s start with their ability to shut down a debate over whether FB has too much power. Thanks for restoring my posts. But I want a social media marketplace that isn’t dominated by a single censor. #BreakUpBigTech.”). Others have called attention to inappropriate account and content takedowns,6 6. Jessica Guynn, Ted Cruz Threatens to Regulate Facebook, Google and Twitter Over Charges of Anti-Conservative Bias, USA Today (Apr. 10, 2019), https://www.usatoday.com/story/news/2019/04/10/ted-cruz-threatens-regulate-facebook-twitter-over-alleged-bias/3423095002/ (statement of Sen. Ted Cruz (R-Tex.)) (“What makes the threat of political censorship so problematic is the lack of transparency, the invisibility, the ability for a handful of giant tech companies to decide if a particular speaker is disfavored.”). while some have argued that the attempts to moderate discriminate against certain political viewpoints. ❧
Most of these problems fall under the subheading of what results when social media platforms algorithmically push or accelerate content. An individual with an extreme view can publish a piece of vile or disruptive content, and because it’s inflammatory the silos promote it, which provides even more eyeballs, and the acceleration becomes a positive feedback loop. As a result the social silo benefits from engagement for advertising purposes, but the community and the commons are irreparably harmed.
If this one piece were removed, then the commons would be much healthier, fringe ideas and abuse that are abhorrent to most would be removed, and the broader democratic views of the “masses” (good or bad) would prevail. Without the algorithmic push of fringe ideas, that sort of content would be marginalized in the same way we want our inane content like this morning’s coffee or today’s lunch marginalized.
To analogize it, we’ve provided social media machine guns to the most vile and fringe members of our society and the social platforms are helping them drag the rest of us down.
If all ideas and content were given the same linear, non-promoted presentation, we would all be much better off, and we wouldn’t need as much human curation.
Annotated on December 11, 2019 at 11:13AM
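The feedback loop described in that annotation is easy to see in a toy model. This is purely illustrative (every number is invented, and no real platform's ranker works this simply): with the boost turned off, exposure stays linear; any positive boost compounds a small initial edge round after round.

```python
# Illustrative only: invented numbers, not a model of any real platform.
def run_feed(posts, rounds, boost):
    """posts: {name: base_appeal}. Each round, a post's share of
    attention is proportional to its score; with boost > 0 the ranker
    feeds prior clicks back into the score (the positive feedback
    loop), while boost = 0 leaves exposure linear and unchanged."""
    score = dict(posts)  # copy so the input is untouched
    for _ in range(rounds):
        total = sum(score.values())
        for name in score:
            clicks = score[name] / total           # share of attention
            score[name] += boost * clicks * score[name]
    return score

posts = {"inflammatory": 1.2, "ordinary": 1.0}     # slight initial edge
linear = run_feed(posts, rounds=10, boost=0.0)
boosted = run_feed(posts, rounds=10, boost=1.0)
```

With `boost=0.0` the inflammatory post keeps its original 1.2:1 edge forever; with any positive boost, its share of attention grows every round, which is the amplification dynamic being argued against above.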
That approach: build protocols, not platforms. ❧
I can now see why @jack made his Twitter announcement this morning. If he opens up and can use that openness to suck up more data, then Twitter’s play could potentially be doing big-data and higher-end algorithmic work on even larger sets of data to drive eyeballs.
I’ll have to think on how one would “capture” a market this way, but Twitter could be reasonably poised to pivot in this direction if they’re really game for going all-in on the idea.
It’s reasonably obvious that Twitter has dramatically slowed its growth and isn’t competing with some of its erstwhile peers. Thus they need to figure out how to turn a relatively large ship without losing value.
Annotated on December 11, 2019 at 11:20AM
It would allow end users to determine their own tolerances for different types of speech but make it much easier for most people to avoid the most problematic speech, without silencing anyone entirely or having the platforms themselves make the decisions about who is allowed to speak.❧
But platforms **are** making **huge** decisions about who is allowed to speak. While they’re generally allowing everyone to have a voice, they’re also very subtly privileging many voices over others. While they’re providing space for even the least among us to have a voice, they’re making far too many of the worst and most powerful among us algorithmically louder.
It’s not broadly obvious, but their algorithms are plainly handing massive megaphones to people who society broadly thinks shouldn’t have a voice at all. These megaphones come in the algorithmic amplification of fringe ideas which accelerate them into the broader public discourse toward the aim of these platforms getting more engagement and therefore more eyeballs for their advertising and surveillance capitalism ends.
The issue we ought to be looking at is the dynamic range between people and the messages they’re able to send through social platforms.
We could also analogize this to the voting situation in the United States. When we disadvantage the poor, disabled, differently abled, or marginalized people from voting while simultaneously giving the uber-rich outsized influence because of what they’re able to buy, we’re imposing the same sorts of problems. Social media is just able to do this at an even larger scale and magnify the effects to make their harms more obvious.
If I follow 5,000 people on social media and one of them is a racist-policy-supporting, white nationalist president, those messages will get drowned out because I can only consume so much content. But when the algorithm consistently pushes that content to the top of my feed and attention, it is only going to accelerate it and create more harm. If I get a linear presentation of the content, then I’d have to actively search that content out for it to cause me that sort of harm.
Annotated on December 11, 2019 at 11:39AM
Moving back to a focus on protocols over platforms can solve many of these problems. ❧
This may also only be the case if large corporations are forced to open up and support those protocols. If my independent website can’t interact freely and openly with something like Twitter on a level playing field, then it really does no good.
Annotated on December 11, 2019 at 11:42AM
And other recent developments suggest that doing so could overcome many of the earlier pitfalls of protocol-based systems, potentially creating the best of all worlds: useful internet services, with competition driving innovation, not controlled solely by giant corporations, but financially sustainable, providing end users with more control over their own data and privacy—and providing mis- and disinformation far fewer opportunities to wreak havoc. ❧
Some of the issue with this then becomes: “Who exactly creates these standards?” We already have issues with mega-corporations like Google wielding outsized influence in their ability to create new standards like Schema.org or AMP.
Who is to say they don’t tacitly design their standards to directly (and only) benefit themselves?
Annotated on December 11, 2019 at 11:47AM
Millions tuned into impeachment hearings this week — the first two of five already scheduled. On this week’s show, why shifts in public opinion may not necessarily sway the GOP. Plus, what we can learn from the predatory tactics that enriched Bill Gates.
1. Nicole Hemmer [@pastpunditry], author of Messengers of the Right: Conservative Media and the Transformation of American Politics, on the false premise underlying hope for President Trump's removal. Listen.
3. Former Labor Secretary Robert Reich [@RBReich] and Goliath author Matt Stoller [@matthewstoller] on how billionaires like Bill Gates use their power and wealth to force their vision on society. Listen.
It’s like the drug dealer who says you can get bribed or you can get a bullet. […] What you always see with monopolists who control an important platform: they use control of that platform to take control of markets that have to live on that platform.
— Matt Stoller, a Fellow at the Open Markets Institute, in On the Media: Designed to Intimidate [November 15, 2019]
Previously, Stoller was a Senior Policy Advisor and Budget Analyst to the Senate Budget Committee and also worked in the US House of Representatives on financial services policy, including Dodd-Frank, the Federal Reserve, and the foreclosure crisis.
Facebook values you at around $158.
Facebook profits off of its 1.4 billion daily users in a big way: According to its most recent filings with the Securities and Exchange Commission, the average revenue per user in 2017 was $20.21 ($6.18 in the fourth quarter alone). Users in the U.S. and Canada were worth even more because of how big the markets are.
—Money.com in March 2018
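As a back-of-the-envelope check on the figures quoted above (with one stated assumption: treating the 1.4 billion daily users as the revenue base, though Facebook actually reports ARPU against monthly actives):

```python
# Figures quoted above from Facebook's SEC filings via Money.com.
arpu_2017 = 20.21    # average revenue per user, full year 2017
arpu_q4 = 6.18       # Q4 2017 alone
daily_users = 1.4e9  # daily users (assumed here as the revenue base)

implied_annual_revenue = arpu_2017 * daily_users
q4_share = arpu_q4 / arpu_2017

print(f"${implied_annual_revenue / 1e9:.1f}B")  # → $28.3B on this assumption
print(f"{q4_share:.0%}")                        # → 31% of the year's ARPU in Q4
```

The rough total is in the right ballpark for Facebook's reported 2017 revenue, and the Q4 share shows how heavily the year's ad money skews toward the holiday quarter.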
Now that you know what you and your data are worth, why not invest in yourself instead?
For about $5 a month or $60 a year, you can pay for an account on micro.blog and have a full suite of IndieWeb tools at your disposal. It’s simple, beautiful, but most importantly it gives you control of your own data and an open and independent presence on the entire web instead of a poor simulacrum of it walled away from everyone else. Of course there are other options available as well, just ask how you can get started.
There’s a growing movement on the left and right for prison reform. On this week’s On the Media, a deep dive into the strange bedfellows coalition working to close prisons down. Also, in speeches, testimony, and leaked audio, Mark Zuckerberg has been trying to make a case for free expression — and for Facebook. Plus, what the TV show COPS reveals about our fascination with punishment.
2. David Dagan [@DavidDagan], post-doctoral political science scholar at George Washington University; Mark Holden, senior vice president of Koch Industries; and Brittany Williams, activist with No New Jails in New York City, on the closing down of prisons and jails.
Remarks by Sacha Baron Cohen, Recipient of ADL's International Leadership Award
Thank you, Jonathan, for your very kind words. Thank you, ADL, for this recognition and your work in fighting racism, hate and bigotry. And to be clear, when I say “racism, hate and bigotry” I’m not referring to the names of Stephen Miller’s Labradoodles.
The meeting took place during Zuckerberg’s most recent visit to Washington, where he testified before Congress about Facebook’s new cryptocurrency Libra.
Last weekend, my friend Virginia shared her latest blog post on Facebook, about excellent Ada Lovelace Day posters for women in STEM (Science Technology Engineering Math). Go ahead, download them! …