Category Archives: Data


Opportunities for critical thinking about data journalism

The 1988 compilation of essays Selected Issues in Logic and Communication included one by Ralph Johnson called “Poll-ution: Coping with Surveys and Polls.”

Johnson wrote about critical thinking questions that help you decide whether to accept the conclusions in news coverage of polling data:

Polls are often reported and are increasingly significant in political life. We need to know how to assess reports of polls. Crucial information such as the nature of the sample and the precise question asked is often not reported. No matter how accurate sampling techniques are, a poll cannot provide valuable information if its question is misleading or loaded.*
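Johnson’s point that accurate sampling alone is not enough is worth pairing with the arithmetic of sampling itself. As an illustrative aside (not from Johnson’s essay), the standard margin-of-error formula for a polled proportion shows why even a well-run poll carries built-in uncertainty:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate margin of error for a polled proportion.

    n: sample size
    p: observed proportion (0.5 is the worst case, giving the widest margin)
    z: z-score for the confidence level (1.96 corresponds to ~95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A typical national poll of ~1,000 respondents:
moe = margin_of_error(1000)
print(f"±{moe * 100:.1f} percentage points")  # prints "±3.1 percentage points"
```

Quadrupling the sample size only halves the margin, which is why most polls settle around a thousand respondents. And, as Johnson stresses, none of this arithmetic helps if the question itself is loaded.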

I liked Jacob Harris’s recent essay in Source because it provided a similar collection of entry points for critical thinking questions about data journalism:

Data journalism does not fall perfect from the sky. It’s painstakingly built. … in my own experience it generally involves the following steps:

  • Acquisition
  • Cleaning
  • Loading
  • Verification
  • Analysis
  • Cherry-picking
  • Presentation
  • Maintenance

The fun of data journalism is that each of these steps can introduce errors that can affect the final story.
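Harris’s point that each step can introduce errors is easy to demonstrate in miniature. The sketch below (hypothetical data and code, not from Harris’s essay) shows a “cleaning” step that silently discards a record it cannot parse, quietly biasing everything downstream:

```python
# Hypothetical vote totals scraped as strings; "n/a" means missing, not zero.
raw_rows = [
    {"county": "Adams", "votes": "1,204"},
    {"county": "Burke", "votes": "987"},
    {"county": "Cass",  "votes": "n/a"},
]

def clean(rows):
    """Parse vote counts, stripping thousands separators."""
    cleaned = []
    for row in rows:
        try:
            votes = int(row["votes"].replace(",", ""))
        except ValueError:
            continue  # the bug: unparseable rows vanish without a trace
        cleaned.append({"county": row["county"], "votes": votes})
    return cleaned

cleaned = clean(raw_rows)
total = sum(row["votes"] for row in cleaned)
print(len(cleaned), total)  # prints "2 2191" -- Cass county is simply gone
```

A safer version would log or count the rejected rows, so the verification step has a chance of catching the loss before analysis and presentation.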

Read “The Times Regrets the Programmer Error” at Source.

* When I added Johnson’s essay to my bibliography manager, I included this quote in the “Abstract” field. But I did not mark a page number and no longer have access to the book. My search for this essay in Google Books returns the beginning of the quote, so I feel pretty safe in assuming it is Johnson’s, but there is a chance I am wrong.

Photo by Anthony Catalano.

The argumentative power of data and narrative in journalism

Nick Diakopoulos, summarizing discussion at the Computation + Journalism Symposium:

This brings us back to the raw cultural difference of the value of “theory” or “model” (i.e. understanding the central tendency and abstraction of data) versus the “anecdote” or “outlier” that is so important to journalists feeling they’ve got a good story to tell. We may be just at the beginning of understanding the benefits and tradeoffs of the narrative-dominant frame versus the analytic-dominant frame, but it’s certain that the cultural dilemma of how news communication is approached underscores a central challenge in integrating computation and journalism.

Read “Finding tools vs. making tools: Discovering common ground between computer science and journalism” at Nieman Lab.

Science journalism in context: What would the arguments look like?

A study published recently in PLoS ONE concluded that science journalism fails to contextualize, over time, initial findings in medical studies. “Initial observations are often refuted or attenuated by subsequent studies,” the researchers write. But the subsequent studies receive less coverage than the initial findings. Whatever coverage subsequent studies do receive rarely references their relationship to the initial findings.

The connection to argument in journalism here takes the form of an “ought” question: “What ought journalists argue when they cover medical research?”

The researchers here contend that journalists ought to include a claim like, “this study [supports / refutes / questions / etc.] previous studies that…” Of course, links to previous coverage would be appropriate here, too.

Further down the line, we can also judge the quality of the argument offered in the contextualization itself. What support is given for the claim that the new study supports, refutes, or questions study X?

h/t: The Economist

How do journalists argue using documents and data?

Professor C.W. Anderson recently presented his latest research project at a colloquium at UNC. The session was recorded (and is embedded below), and after watching it, I think the project is quite interesting from the argument-in-journalism perspective.

Anderson’s project, as I understand it, is to place historical context around the way journalists use and have used documents to report.

Scholars of journalism have long examined how interviews and first-person witnessing of events become the building blocks of news stories, and how journalists’ use of interviews and witnessing has changed over the decades.

According to Anderson, for any historical period, we can ask three questions about these journalistic processes: how journalists collect interviews (or witness events, or gather data); how they analyze them; and how the results are presented to readers and users.

Now, Anderson wants to study the use of documents and data journalism in a similar way: How are documents and data collected, analyzed, and the results eventually presented to readers and users?

The question of presentation is key. When journalists present (or, historically, presented) an argument that rests on documents or data journalism as a pillar, what do they do with those materials to make a persuasive case?

With knowledge in hand of how journalists use data and documents to argue conclusions, using the analytical tools of argumentation, informal logic, and critical thinking is the easy next step. How strong are those arguments? Are they stronger or weaker than non-data-driven arguments? What is missing from them? And so on.