‘The flat earth society has members all around the globe’ –
How to right the wrongs of science reporting in the post-truth era?
Howard Hudson,
United Nations University
Draft science reporting brief
25 June 2018
Combined target audiences
i) Researchers, ii) Communications officers, and iii) Journalists and editors working on UN issues, particularly global health (#SDG3)
Guiding theme#1: On doubt and democracy
‘Our freedom to doubt was born out of a struggle against authority in the early days of science. It was a very deep and strong struggle: permit us to question — to doubt — to not be sure… This is the philosophy that guided the men who made the democracy that we live under.
The idea that no one really knew how to run a government led to the idea that we should arrange a system by which new ideas could be developed, tried out, and tossed out if necessary, with more new ideas brought in — a trial and error system.’
Richard Phillips Feynman, Nobel Laureate in Physics (1918-1988)
Guiding theme#2: On science and separation
‘Although we humans cut nature up in different ways, and we have different courses in different departments, such compartmentalization is really artificial. The separation of fields is merely a human convenience, and an unnatural thing. Nature is not interested in our separations, and many of the interesting phenomena bridge the gaps between fields.’
Richard Phillips Feynman, Nobel Laureate in Physics (1918-1988)
—
Overview
Public trust appears to be won through a combination of expertise, integrity and benevolence. This reflects classical models of persuasion and helps explain why doctors are the most trusted experts, according to a number of recent surveys worldwide. The basic recommendations are:
- Scientific research must be reported in a clear, credible and inclusive way – otherwise it risks being ignored or even manipulated;
- It is a fine line, however: being inclusive enough to engage mass audiences, while avoiding the pitfalls of sensationalism and identity politics;
- On health issues, reporting may win greater public trust by focusing more on practical scientific applications than on the contested causes (regardless of the consensus).
The overall aim of this brief and this workshop is to improve the impact of ‘our’ work as researchers, communications staff and journalists. This will be part of a broader evidence-based approach, involving closer inter-sectoral partnerships.
Introduction
Trust is a short but important word for most people working in science, the media and government. Trust speaks to “the veracity, integrity, or other virtues of someone or something” and is a basic concept spanning generations. Yet, in 2018, public trust in core institutions is on the verge of crisis – a crisis of credibility.
First, there is a fundamental questioning of scientific ‘truths’. Who should define them, what methods should be used, what is the consensus on any given issue? Second, there is a crisis in the reporting of science. In short, are journalists still able to hold the powerful to account in this new media landscape?
Headlines like ‘The flat earth society has members all around the globe’ are of course satirical. But some reporting, and even policymaking, on climate change and other global issues has become equally farcical. What seems to be driving this trend is the politicisation of science. In this context, are voters and politicians simply irrational? Do they base their judgements more on cultural identity than on scientific literacy?
Public opinion landscapes: A patchwork of (mis)trust?
Even before the latest presidential elections, US citizens were “least confident in the news media, business leaders, and elected officials to act in the best interests of the public,” said a report from the Pew Research Center, October 2016. The survey of 1,500 US adults found just 27% had confidence in elected officials and barely 38% had confidence in the news media. Scientists, on the other hand, were trusted by 76% of respondents; a figure that rose to 84% for medical scientists. This raises the questions: are the medical sciences easier for laypeople to understand? Are they better explained by experts? Or are they simply less ‘political’ than issues such as healthcare policy and climate change?
In the UK, a 2016 Ipsos-MORI survey of 1,000 adults found trust in politicians at the bottom of the scale: just 21%, below journalists at 25%. The authors struck a note of caution, however: “from this long-running survey we can see that public trust has been an issue for [UK] politicians for at least the past 33 years”. At the other end of the ranking, scientists and doctors were trusted by 79% and 89% of respondents respectively – showing even higher levels of trust than in the USA.
Research commentary on communicating science
“In our digital age, (scientific) information is accessible at all times,” writes Dr. Friederike Hendriks at the University of Münster in Germany.
“The Internet, smartphones, and tablet computers enable a large public to access a vast amount of information in just a few seconds. In 2014, more than 60% of Internet users usually turned to the Internet to get information on scientific topics, and the percentage is even higher among young users… Such developments pose two challenges for laypeople. First, they have to decide who is a trustworthy expert without themselves possessing much topic-specific knowledge… Second, there is an even greater risk of being misinformed online than in the traditional news media. Because virtually anyone may publish online without gatekeepers such as publishers or editors, it is up to the recipient to assess online sources for trustworthiness and information on their credibility.”
Dr. Hendriks addresses these challenges in her 2015 study, in which she coins the term ‘epistemic trustworthiness’. This refers to how and why the general public decides to “trust in – and defer to – an expert in order to solve a scientific informational problem that is beyond their understanding.” Her ‘inventory’ results suggest that public trust is based not only on expertise but also on integrity and benevolence. (This echoes the three means of persuasion – logos, ethos and pathos – defined by the Greek philosopher Aristotle.)
To test this hypothesis further: why, for example, are doctors the most trusted of experts, as shown in major surveys from the UK and USA? The Hippocratic Oath, advances in modern medicine, and high success rates all help explain why doctors are generally trusted by the public. But it is also because doctors are experts at explaining complicated procedures to laypeople: people facing life-and-death situations, people overwhelmed by the latest high-tech treatments.
How is this relevant to the emerging crisis in science reporting? Because many scientists are failing to get their messages across. According to the surveys above, most people trust scientists; they want to believe. Yet they either cannot understand them or are misled by fake news, ‘crooked’ politicians and ‘gamed’ feeds. Much of this relates to the language of research reports, which may be full of ‘elitist jargon’ – i.e., disconnected from most people’s everyday lives. It follows, therefore, that if research is not clearly communicated to the public in an inclusive way, it risks being ignored or manipulated.
This chimes with other recent findings. A 2014 survey by the Pew Research Center found that: “Despite broadly similar views about the overall place of science in America, citizens and scientists often see science-related issues through different sets of eyes…”
The science of science communication: Solving by doing?
“Never have human societies known so much about mitigating the dangers they face but agreed so little about what they collectively know,” says Prof. Dan M. Kahan of Yale University in the USA. “The most popular explanation for the science communication paradox can be called the ‘public irrationality thesis’ or PIT.” Yet, the evidence does not bear this out.
“Members of the public who score highest in one or another measure of science comprehension, studies show, are no more concerned about global warming than those who score the lowest. The same pattern, moreover, characterizes multiple other contested risks, such as the ones posed by nuclear power, fracking, and private possession of firearms.”
“Another plausible conjecture — another hypothesis about the science communication paradox — is the ‘cultural cognition thesis’ (CCT).”
“Like sports fans motivated to see the officiating replay as supporting their team, the subjects selectively credited or discredited the evidence we showed them — the position of a highly qualified scientist — in a manner supportive of their group’s position… Under PIT, one should expect individuals who are high in science comprehension to use their knowledge and reasoning proficiency to form risk perceptions supported by the best available scientific evidence…”
Tests, however, substantiated CCT. In other words, “individuals who score highest on tests of one or another reasoning disposition opportunistically use that disposition to search out evidence supportive of their cultural predispositions and explain away the rest.”
On contested issues like vaccine safety and climate change, people make up their minds based on group identity – regardless of scientific literacy. In fact, the higher people score on measures of science comprehension, the more likely they are to show confirmation bias. Therefore, any blunt attempt at dissuasion, anything less than a paradigm shift in public trust, is likely to fail. Essentially, says Kahan: “science communication professionals must protect citizens from having to choose between knowing what’s known by science and being who they are as members of diverse cultural communities.” We are, ultimately, social animals.
How then, can ‘we’ as researchers, communications staff and journalists make a difference? According to Kahan, we can achieve more by focusing on practical scientific applications.
Recommendations: Defined on the ground
This is a fitting point at which to turn to the recommendations. First, in the current political and media climate, scientists need to leave their ‘ivory towers’ and communicate their work, like doctors, to a broad and mainly non-specialist audience – to prove not only their expertise but also their integrity and benevolence.
This means using clear, compelling and inclusive language in all public-facing work. This is best ensured by working in tandem with communications staff and journalists – not as a final flourish but throughout the research cycle. In parallel, communications staff and journalists must ‘take back control’ of the new media landscape to ensure the broadest possible diffusion of these messages.