🔖 Journalism, Online Comments, and the Future of Public Discourse by Marie K. Shanahan | Routledge

Bookmarked Journalism, Online Comments, and the Future of Public Discourse by Marie K. Shanahan (Routledge)

Comments on digital news stories and on social media play an increasingly important role in public discourse as more citizens communicate through online networks. The reasons for eliminating comments on news stories are plentiful. Off-topic posts and toxic commentary have been shown to undermine legitimate news reporting. Yet the proliferation of digital communication technology has revolutionized the setting for democratic participation. The digital exchange of ideas and opinions is now a vital component of the democratic landscape. Marie K. Shanahan's book argues that public digital discourse is a crucial component of modern democracy―one that journalists must stop treating with indifference or detachment―and calls for news organizations to use journalistic rigor and better design to add value to citizens’ comments above the social layer. Through original interviews, anecdotes, field observations and summaries of research literature, Shanahan explains the obstacles of digital discourse as well as its promises for journalists in the digital age.

Book cover of Journalism, Online Comments, and the Future of Public Discourse

hat tip: Newsrooms take the comments sections back from platforms

Predictions for Journalism 2019 | Nieman Journalism Lab

Bookmarked Nieman Lab's Predictions for Journalism 2019 (Nieman Lab)
Each year, we ask some of the smartest people in journalism and digital media what they think is coming in the next 12 months. Here’s what they had to say.

I already see some pieces I want to read.

🔖 Highly: Highlight to share.

Bookmarked Highly (Highly)
Highlight the web to share the important parts.

Signing up for yet-another-silo. This one has some slick-looking UI and lots of social and sharing integrations. Their shares to Twitter look interesting, but I really wish there were equally good ways to share to my own website. Sadly, unlike Hypothes.is, it doesn’t have any annotation functionality. I also didn’t find my Twitter colleagues like Jon Udell, Nate Angell, or Jeremy Dean on the service through its Twitter integration.

After a cursory look, I’m worried about what their funding and monetization plans are and where my data will end up in just a few years. While it’s certainly pretty, I far prefer the functionality (and community) that Hypothes.is offers, so I’m not moving any time soon. It’s definitely worth a look, though, for some of its UI features, interactions, and future functionality.

🔖 General Systems Theory: Beginning With Wholes by Barbara G. Hanson

Bookmarked General Systems Theory: Beginning With Wholes by Barbara G. Hanson (Taylor & Francis, 1st edition)

hat tip: Human Current episode 25

🔖 Farm to Taber Podcast

Bookmarked Farm to Taber Podcast (SoundCloud)
Farm to Taber is a show about the inner guts of the food system, and what it takes to make it work sustainably. Wherever that takes us—science, history, tech, culture, policy, marketing, psychology, design, and more—Farm to Taber goes there.

🔖 JSON-LD And You – Google Slides | Aram Zucker-Scharff

Bookmarked JSON-LD And You: A Guide to Structured Metadata for Journalism by Aram Zucker-Scharff (docs.google.com)

A presentation on Google Docs.

Hi, I’m Aram Zucker-Scharff and now that we’re settled in, I’ll take a minute to introduce myself. I’m the Director of Ad Tech Engineering at The Washington Post, where I work with teams across the organization to help the Post make money and, through our Arc platform, help other publications make money as well. But I’ve taken a long road to this point: I started off as a journalist, then became an editor, a social media manager, a product manager, a freelance strategy consultant and developer, and lastly a full stack developer. I even spent some time being very bad at selling ads.

Aram Zucker-Scharff is about as sharp as it gets when it comes to journalism, adtech, and technology. I do wish he’d spent some additional time on Microformats (or even the v2 implementation), as they’re still broadly supported and much less likely to get the flavor-of-the-month treatment that JSON-LD and schema.org are currently enjoying.
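For a concrete sense of what the two approaches look like, here is a minimal sketch of my own (not drawn from Aram's slides) that expresses the same hypothetical article both as schema.org JSON-LD and as a Microformats2 h-entry; the headline, author, and URL are invented for illustration.

```python
import json

# A hypothetical article -- title, author, and URL are made up for illustration.
article = {
    "headline": "Example Headline",
    "author": "Jane Reporter",
    "url": "https://example.com/2018/example-headline",
    "datePublished": "2018-12-01",
}

# schema.org JSON-LD: the same facts duplicated into a machine-only
# <script type="application/ld+json"> block in the page head.
json_ld = json.dumps({
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": article["headline"],
    "author": {"@type": "Person", "name": article["author"]},
    "url": article["url"],
    "datePublished": article["datePublished"],
}, indent=2)

# Microformats2: the same data expressed as class names on the visible HTML itself.
h_entry = f"""
<article class="h-entry">
  <h1 class="p-name"><a class="u-url" href="{article['url']}">{article['headline']}</a></h1>
  <p>by <span class="p-author h-card">{article['author']}</span>
     on <time class="dt-published" datetime="{article['datePublished']}">{article['datePublished']}</time></p>
</article>
"""

print(json_ld)
print(h_entry)
```

The design difference is part of why I'm partial to Microformats here: the h-entry reuses the markup readers already see, while the JSON-LD block repeats the same facts in a separate side channel that can silently drift out of sync with the visible page.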

I dug around a bit and didn’t see any video from this session.

🔖 The influence of collaboration networks on programming language acquisition by Sanjay Guruprasad | MIT

Bookmarked The influence of collaboration networks on programming language acquisition by Sanjay Guruprasad (Massachusetts Institute of Technology)

Many behaviors spread through social contact. However, different behaviors seem to require different degrees of social reinforcement to spread within a network. Some behaviors spread via simple contagion, where a single contact with an "activated node" is sufficient for transmission, while others require complex contagion, with reinforcement from multiple nodes to adopt the behavior. But why do some behaviors require more social reinforcement to spread than others? Here we hypothesize that learning more difficult behaviors requires more social reinforcement. We test this hypothesis by analyzing the programming language adoption of hundreds of thousands of programmers on the social coding platform Github. We show that adopting more difficult programming languages requires more reinforcement from the collaboration network. This research sheds light on the role of collaboration networks in programming language acquisition.

[Downloadable .pdf]

Thesis: S.M., Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2018. Cataloged from PDF version of thesis. Includes bibliographical references (pages 26-28).

Advisor: César Hidalgo.

URI: http://hdl.handle.net/1721.1/119085

I ran across this paper via the Human Current interview with César Hidalgo. In general, they studied GitHub as a learning community and the social support people get from their friends on the platform as they work on learning new programming languages.

I think there might be some interesting takeaways for people looking at collective learning and online pedagogies, as well as for communities like the IndieWeb which are trying not only to build new technologies, but to get them into others’ hands by teaching and disseminating some generally tough technical knowledge. (In this respect, the referenced Human Current podcast episode may be a worthwhile overview.)
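As a toy way to see the simple-versus-complex contagion distinction the thesis tests, here is an illustrative sketch of a generic threshold model (not Guruprasad's actual analysis): a node adopts a behavior only once at least a threshold number of its neighbors have adopted it, and a threshold of one recovers simple contagion.

```python
# Toy threshold-contagion sketch (illustrative only, not the thesis's model):
# a node adopts once at least `threshold` of its neighbors have adopted.
def spread(adjacency, seeds, threshold):
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, neighbors in adjacency.items():
            if node not in adopted:
                if sum(1 for n in neighbors if n in adopted) >= threshold:
                    adopted.add(node)
                    changed = True
    return adopted

# A small hand-made collaboration graph.
graph = {
    "a": ["b", "c"],
    "b": ["a", "c", "d"],
    "c": ["a", "b", "d"],
    "d": ["b", "c", "e"],
    "e": ["d"],
}

print(spread(graph, seeds={"a"}, threshold=1))       # simple contagion: reaches everyone
print(spread(graph, seeds={"a", "b"}, threshold=2))  # complex contagion: stalls at weakly connected "e"
```

With a threshold of one, a single seed reaches the whole toy graph; with a threshold of two, even two seeds stall at the weakly connected node, which is the intuition behind harder-to-learn languages needing more reinforcement from the collaboration network.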

🔖 The New Testament: A Historical Introduction to the Early Christian Writings by Bart D. Ehrman

Bookmarked The New Testament: A Historical Introduction to the Early Christian Writings by Bart D. Ehrman (Oxford University Press, 6th edition)

Featuring vibrant full color throughout, the sixth edition of Bart D. Ehrman's highly successful introduction approaches the New Testament from a consistently historical and comparative perspective, emphasizing the rich diversity of the earliest Christian literature. Distinctive to this study is its unique focus on the historical, literary, and religious milieux of the Greco-Roman world, including early Judaism. As part of its historical orientation, the book also discusses other Christian writings that were roughly contemporary with the New Testament, such as the Gospel of Thomas, the Apocalypse of Peter, and the letters of Ignatius.

Book cover of The New Testament: A Historical Introduction to the Early Christian Writings by Bart D. Ehrman

An interesting-looking textbook from Ehrman.

This is a recommended text for Dale Martin’s course Introduction to the New Testament History and Literature.

🔖 Introduction to Renormalization | Simon DeDeo | Complexity Explorer

Bookmarked Introduction to Renormalization by Simon DeDeo (Complexity Explorer)

What does a JPEG have to do with economics and quantum gravity? All of them are about what happens when you simplify world-descriptions. A JPEG compresses an image by throwing out fine structure in ways a casual glance won't detect. Economists produce theories of human behavior that gloss over the details of individual psychology. Meanwhile, even our most sophisticated physics experiments can't show us the most fundamental building-blocks of matter, and so our theories have to make do with descriptions that blur out the smallest scales. The study of how theories change as we move to more or less detailed descriptions is known as renormalization. 

This tutorial provides a modern introduction to renormalization from a complex systems point of view. Simon DeDeo will take students from basic concepts in information theory and image processing to some of the most important concepts in complexity, including emergence, coarse-graining, and effective theories. Only basic comfort with the use of probabilities is required for the majority of the material; some more advanced modules rely on more sophisticated algebra and basic calculus, but can be skipped. Solution sets include Python and Mathematica code to give more advanced learners hands-on experience with both mathematics and applications to data.

We'll introduce, in an elementary fashion, explicit examples of model-building including Markov Chains and Cellular Automata. We'll cover some new ideas for the description of complex systems including the Krohn-Rhodes theorem and State-Space Compression. And we'll show the connections between classic problems in physics, including the Ising model and plasma physics, and cutting-edge questions in machine learning and artificial intelligence.
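To make the coarse-graining idea concrete, here is a small sketch of my own (not code from the tutorial) that applies a majority-rule block-spin step to a grid of ±1 spins, the classic renormalization move for the Ising model: each 2×2 block is replaced by the sign of its summed spins, throwing away fine structure while keeping the large-scale pattern.

```python
# Majority-rule block-spin coarse-graining of a grid of +1/-1 spins
# (an illustrative sketch, not code from the Complexity Explorer tutorial).
import random

def coarse_grain(spins, block=2):
    """Replace each block x block patch with the sign of its summed spins."""
    size = len(spins) // block
    coarse = []
    for i in range(size):
        row = []
        for j in range(size):
            total = sum(
                spins[block * i + di][block * j + dj]
                for di in range(block)
                for dj in range(block)
            )
            # Majority rule; ties broken at random so the output stays +/-1.
            row.append(1 if total > 0 else -1 if total < 0 else random.choice([1, -1]))
        coarse.append(row)
    return coarse

random.seed(0)
lattice = [[random.choice([1, -1]) for _ in range(8)] for _ in range(8)]
coarser = coarse_grain(lattice)   # the 8x8 configuration described at 4x4
print(coarser)
print(coarse_grain(coarser))      # iterate the step: flow toward ever coarser descriptions
```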

🔖 Worldmapper | rediscover the world as you’ve never seen it before

Bookmarked Worldmapper | rediscover the world as you've never seen it before (Worldmapper)
Mapping our place in the world: The atlas for the 21st century. Worldmapper is a collection of world maps where countries are resized according to a broad range of global issues. Our cartograms are unique visualisations that show the world as you've never seen it before. Explore them all!

🔖 Networks by Mark Newman

Bookmarked Networks by Mark Newman (Oxford University Press, 2nd edition)

The study of networks, including computer networks, social networks, and biological networks, has attracted enormous interest in the last few years. The rise of the Internet and the wide availability of inexpensive computers have made it possible to gather and analyze network data on an unprecedented scale, and the development of new theoretical tools has allowed us to extract knowledge from networks of many different kinds. The study of networks is broadly interdisciplinary and central developments have occurred in many fields, including mathematics, physics, computer and information sciences, biology, and the social sciences. This book brings together the most important breakthroughs in each of these fields and presents them in a coherent fashion, highlighting the strong interconnections between work in different areas.

Topics covered include the measurement of networks; methods for analyzing network data, including methods developed in physics, statistics, and sociology; fundamentals of graph theory; computer algorithms; mathematical models of networks, including random graph models and generative models; and theories of dynamical processes taking place on networks.
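As a tiny taste of the random graph models and network measurements the book covers, here is a sketch of my own (not from the text) that draws an Erdős–Rényi G(n, p) graph and tallies its degree distribution.

```python
# Sketch: an Erdos-Renyi G(n, p) random graph and its degree distribution
# (illustrative only; the book develops these models far more carefully).
import random
from collections import Counter
from itertools import combinations

def erdos_renyi(n, p, seed=42):
    """Each of the n*(n-1)/2 possible edges appears independently with probability p."""
    rng = random.Random(seed)
    return [(i, j) for i, j in combinations(range(n), 2) if rng.random() < p]

def degree_distribution(n, edges):
    degree = Counter()
    for i, j in edges:
        degree[i] += 1
        degree[j] += 1
    return Counter(degree[v] for v in range(n))  # maps degree -> number of nodes

edges = erdos_renyi(n=100, p=0.05)
print(degree_distribution(100, edges))  # roughly Poisson with mean ~ (n - 1) * p, here about 5
```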

book cover of Networks by Mark Newman

🔖 The Deep Learning Revolution by Terrence J. Sejnowski | MIT Press

Bookmarked The Deep Learning Revolution by Terrence J. Sejnowski (MIT Press)

How deep learning―from Google Translate to driverless cars to personal cognitive assistants―is changing our lives and transforming every sector of the economy.

The deep learning revolution has brought us driverless cars, the greatly improved Google Translate, fluent conversations with Siri and Alexa, and enormous profits from automated trading on the New York Stock Exchange. Deep learning networks can play poker better than professional poker players and defeat a world champion at Go. In this book, Terry Sejnowski explains how deep learning went from being an arcane academic field to a disruptive technology in the information economy.

Sejnowski played an important role in the founding of deep learning, as one of a small group of researchers in the 1980s who challenged the prevailing logic-and-symbol based version of AI. The new version of AI Sejnowski and others developed, which became deep learning, is fueled instead by data. Deep networks learn from data in the same way that babies experience the world, starting with fresh eyes and gradually acquiring the skills needed to navigate novel environments. Learning algorithms extract information from raw data; information can be used to create knowledge; knowledge underlies understanding; understanding leads to wisdom. Someday a driverless car will know the road better than you do and drive with more skill; a deep learning network will diagnose your illness; a personal cognitive assistant will augment your puny human brain. It took nature many millions of years to evolve human intelligence; AI is on a trajectory measured in decades. Sejnowski prepares us for a deep learning future.

The Deep Learning Revolution by Terrence J. Sejnowski book cover