Five rules for evidence communication

Avoid unwarranted certainty, neat narratives and partisan presentation; strive to inform, not persuade.

“Be persuasive”, “be engaging”, “tell stories with your science”.

Most researchers have heard such exhortations many times, and for good reason. Such rhetorical devices often help to land a message, whether that message is designed to sell a product or win a grant. These are the traditional techniques of communication, applied to science.

This approach often works, but it comes with danger.

There are myriad examples from the current pandemic of which we might ask: have experts always been explicit in acknowledging unknowns? Complexity? Conflicts of interest? Inconvenient data? And, importantly, their own values? Rather than re-examine those cases, we offer ideas to encourage reflection, based on our own research.

Our small, interdisciplinary group at the University of Cambridge, UK, collects empirical data on issues such as how to communicate uncertainty, how audiences decide what evidence to trust, and how narratives affect people’s decision-making. Our aim is to design communications that do not lead people to a particular decision, but help them to understand what is known about a topic and to make up their own minds on the basis of that evidence. In our view, it is important to be clear about motivations, present data fully and clearly, and share sources.

These observations fit with the literature, which identifies expertise, honesty and good intentions as the key to trustworthiness [1]. Researchers need to demonstrate all three: we cannot expect to be trusted on the basis of expertise alone.

The field of evidence communication has been growing over several decades, mainly stemming from researchers in medical communication, but there is still much we don’t know about its effects, or about best practice. If one is not trying to change belief or behaviour, it is hard even to know how to measure success. As with all engagement efforts, the effects of a message are moderated greatly by non-verbal cues and by the relationship between communicator and audience. But these challenges are precisely why we think it important to consider alternative approaches (see ‘Quick tips for sharing evidence’).


Quick tips for sharing evidence

The aim is to ‘inform but not persuade’ and, as the philosopher of trust Onora O’Neill says, “to be accessible, comprehensible, usable and assessable”.

• Address all the questions and concerns of the target audience.

• Anticipate misunderstandings; pre-emptively debunk or explain them.

• Don’t cherry-pick findings.

• Present potential benefits and possible harms in the same way so that they can be compared fairly.

• Avoid the biases inherent in any presentation format (for example, use both ‘positive’ and ‘negative’ framing together).

• Use numbers alone, or both words and numbers.

• Demonstrate ‘unapologetic uncertainty’: be open about a range of possible outcomes.

• When you don’t know, say so; say what you are going to do to find out, and by when.

• Highlight the quality and relevance of the underlying evidence (for example, describe the data set).

• Use a carefully designed layout in a clear order, and include sources.