Thank you to @RyersonResearch and especially @joyceemsmith for inviting me to talk about my research today. I had a great time talking IndieWeb, and specifically, Bridgy. Jan 30, 2019 Lunch and Learn at Ryerson Journalism Research Centre I presented a study I’ve been working on about Bridgy, i...
Gives me hope in old age that I have a German Shepherd and not a Chihuahua (or slime mold)…
The Kardashian Index is a measure of the discrepancy between an academic's social media profile and publication record based on the direct comparison of numbers of citations and Twitter followers.
The Kardashian Index (K-index) can be calculated as follows:
K-index = F(a) / F(c)
F(a) is the actual number of Twitter followers of academic X. F(c) is the number of followers academic X should have given their citation count C; based on a trend identified in the original paper, it is calculated as:

F(c) = 43.3 × C^0.32
The author of the index says that "a high K-index is a warning to the community that researcher X may have built their public profile on shaky foundations, while a very low K-index suggests that a scientist is being undervalued. ... those people whose K-index is greater than 5 can be considered 'Science Kardashians'."
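The two formulas above are easy to combine into a small calculator. A minimal sketch in Python, using a hypothetical researcher's numbers (the follower and citation counts below are invented for illustration):

```python
def expected_followers(citations: float) -> float:
    """F(c): the follower count predicted from citations,
    per the trend fitted in the original paper: F(c) = 43.3 * C^0.32."""
    return 43.3 * citations ** 0.32

def kardashian_index(followers: float, citations: float) -> float:
    """K-index = F(a) / F(c): actual followers over expected followers."""
    return followers / expected_followers(citations)

# Hypothetical academic: 5,000 followers, 1,000 citations.
k = kardashian_index(5000, 1000)
print(round(k, 2))  # well above the 'Science Kardashian' threshold of 5
```

A K-index near 1 means the follower count tracks the citation record; values above 5 fall in the "Science Kardashian" zone, and values well below 1 suggest an undervalued scientist.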
In today's world, scientists in many disciplines and a growing number of journalists live and breathe data. There are many thousands of data repositories on the web, providing access to millions of datasets; and local and national governments around the world publish their data as well. To enable easy access to this data, we launched Dataset Search, so that scientists, data journalists, data geeks, or anyone else can find the data required for their work and their stories, or simply to satisfy their intellectual curiosity.
Maps & spatial analysis:
One-dot one-person map for the entire United States
Introduction to geo-scripting in R & Python
Awesome blog with cool maps and the code behind them by James C…
From the archives: Peer review is a scientific institution; here's its purpose.
Recent PhD graduate Lucy A. Taylor shares the advice she and her colleagues wish they had received.
New Data & Society report recommends editorial “better practices” for reporting on online bigots and manipulators; interviews journalists on accidental amplification of extreme agendas
This report draws on in-depth interviews by scholar Whitney Phillips to showcase how news media was hijacked from 2016 to 2018 to amplify the messages of hate groups.
Offering extremely candid comments from mainstream journalists, the report provides a snapshot of an industry caught between the pressure to deliver page views, the impulse to cover manipulators and “trolls,” and the disgust (expressed in interviewees’ own words) at accidentally propagating extremist ideology.
After reviewing common methods of “information laundering” of radical and racist messages through the press, Phillips uses journalists’ own words to propose a set of editorial “better practices” intended to reduce manipulation and harm.
As social and digital media are leveraged to reconfigure the information landscape, Phillips argues that this new domain requires journalists to take what they know about abuses of power and media manipulation in traditional information ecosystems, and apply and adapt that knowledge to networked actors, such as white nationalist networks online.
This work is the first practitioner-focused report from Data & Society’s Media Manipulation Initiative, which examines how groups use the participatory culture of the internet to turn the strengths of a free society into vulnerabilities.
Abstract: The News Study research report presents findings about how a sample of U.S. college students gather information and engage with news in the digital age. Results are included from an online survey of 5,844 respondents and telephone interviews with 37 participants from 11 U.S. colleges and universities selected for their regional, demographic, and red/blue state diversity. A computational analysis was conducted using Twitter data associated with the survey respondents and a Twitter panel of 135,891 college-age people. Six recommendations are included for educators, journalists, and librarians working to make students effective news consumers. To explore the implications of this study’s findings, concise commentaries from leading thinkers in education, libraries, media research, and journalism are included.
telephone interviews with 37 participants ❧
I have to wonder at telephone samples of this age group given the propensity of youth to not communicate via voice phone.
October 22, 2018 at 08:15PM
Major Findings (2:35 minutes) ❧
I’m quite taken with the variety of means this study is using to communicate its findings. There are blogposts, tweets/social posts, a website, executive summaries, the full paper, and even a short video! I wish more studies went to these lengths.
October 22, 2018 at 08:19PM
In the early 1950s, the university's hospital stole cells from Lacks, who has been called the "mother of modern medicine."
PlumX Metrics provide insights into the ways people interact with individual pieces of research output (articles, conference proceedings, book chapters, and many more) in the online environment. Examples include when research is mentioned in the news or tweeted about. Collectively known as PlumX Metrics, these metrics are divided into five categories to help make sense of the huge amounts of data involved and to enable analysis by comparing like with like.
PlumX gathers and brings together appropriate research metrics for all types of scholarly research output.
We categorize metrics into 5 separate categories: Usage, Captures, Mentions, Social Media, and Citations.
Retraction Watch readers may be familiar with the story of a paper about gender differences by two mathematicians. Last month, in Weekend Reads, we highlighted an account of that story, which appea…
This statement addresses some unfounded allegations about my personal involvement with the publishing of Ted Hill's preprint "An evolutionary theory for the variability hypothesis" (and the earlier version of this paper co-authored with Sergei Tabachnikov). As a number of erroneous statements have been made, I think it's important to state formally what transpired and my beliefs overall about academic freedom and integrity.

I first saw the publicly-available paper of Hill and Tabachnikov on 9/6/17, listed to appear in The Mathematical Intelligencer. While the original link has been taken down, the version of the paper that was publicly available on the arxiv at that time is here.

I sent an email, on 9/7/17, to the Editor-in-Chief of The Mathematical Intelligencer, about the paper of Hill and Tabachnikov. In it, I criticized the scientific merits of the paper and the decision to accept it for publication, but I never made the suggestion that the decision to publish it be reversed. Instead, I suggested that the journal publish a rebuttal article by experts in the field to accompany it. One day later, on 9/8/17, the editor wrote to me that she had decided not to publish the paper.

I had no involvement in any editorial decisions concerning Hill's revised version of this paper in The New York Journal of Mathematics. Any indications or commentary otherwise are completely unfounded.

I would like to make clear my own views on academic freedom and the integrity of the editorial process. I believe that discussion of scientific merits of research should never be stifled. This is consistent with my original suggestion to bring in outside experts to rebut the Hill-Tabachnikov paper. Invoking purely mathematical arguments to explain scientific phenomena without serious engagement with science and data is an offense against both mathematics and science.
In the highly controversial area of human intelligence, the ‘Greater Male Variability Hypothesis’ (GMVH) asserts that there are more idiots and more geniuses among men than among women. Darwin’s research on evolution in the nineteenth century found that, although there are many exceptions for ...
If a formally refereed and published paper can later be erased from the scientific record and replaced by a completely different article, without any discussion with the author or any announcement in the journal, what will this mean for the future of electronic journals?
This is a very concerning issue and a good reason why people should also practice samizdat and place multiple copies online in various repositories.
How the World Academy of Science, Engineering and Technology became a multimillion dollar organization promoting bullshit science through fake conferences and journals.