Issues Magazine

Risk Communication

By Craig Cormick

Scientists and regulators tend to talk a lot about how to better communicate risk, and the science of risk assessment, to the public, but they talk much less about how to better understand what the public thinks about risk, and why they think what they do. Yet understanding risk perceptions is vital to better risk communication.

What are the biggest risks facing humanity today? Global climate change? Terrorism? The spread of new technologies like genetically modified foods and nanotechnology? Invasion by seven-foot shape-shifting aliens?

Your answer will vary, of course, depending on your perspective and world view. And that’s a key point in better understanding risk communication.

We need to better understand the differences between perceptions of risk as seen by scientists and perceptions of risk as seen by the public.

The scientific formula for understanding risk is: risk = likelihood × impact.

However, the public perception of risk tends to be expressed as: risk = hazard × outrage.
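To see how differently these two formulas can rate the same thing, here is a minimal sketch in Python. The numbers are purely illustrative assumptions of mine, not drawn from any study:

# A minimal sketch contrasting the two formulas. All numbers are
# illustrative assumptions, not measured values.

def expert_risk(likelihood, impact):
    # Scientific formula: risk = likelihood × impact
    return likelihood * impact

def public_risk(hazard, outrage):
    # Public perception: risk = hazard × outrage
    return hazard * outrage

# A hypothetical technology: experts judge harm very unlikely, so the
# technical risk is low, but it is new and involuntary, so outrage is high.
print(expert_risk(likelihood=0.001, impact=100))  # 0.1 -> low risk
print(public_risk(hazard=0.1, outrage=50))        # 5.0 -> high perceived risk

The point of the sketch is simply that the two equations take different inputs: no amount of refining the likelihood term will move a perception that is being driven by the outrage term.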

Understanding the difference between these two formulas, and the difficulty of shifting one perception towards the other, is crucial to understanding the difficulty of risk communication. To that end, here are five key lessons I’ve derived from more than 10 years of working with public perceptions of the risks of new technologies:

1. When information is complex, people make decisions based on their values and beliefs rather than on facts and logic.

2. People seek affirmation of their attitudes (or beliefs) – no matter how fringe – and will reject any information or facts that are counter to their attitudes (or beliefs).

3. Attitudes that were not formed by facts and logic are not influenced by facts or logic.

4. Public concerns about the risk of contentious science or technologies are almost never about the science – and scientific information therefore does little to influence those concerns.

5. People most trust those whose values mirror their own.

According to David Ropeik, a leading consultant on risk communications and former Director of Communications for the Center for Risk Analysis at the Harvard School of Public Health: “The perception of risk – the way we interpret information in order to survive – is a subjective, affective mix of facts and feelings, intellect and instinct, reason and gut reaction”.

When Feelings Get Confused for Facts
When feelings get confused for facts, though, risk communication gets difficult. If you doubt that, consider the last argument you might have had, or witnessed, between advocates and opponents of genetically modified foods or infant vaccinations.

Vincent Covello, Director of the Center for Risk Communication in the United States, and one of the global gurus on risk communication, has said that when people are stressed, their perceptions and decisions are influenced by a wide range of factors, with technical facts often being the least important (worth less than 5%).

They also tend to:

• have difficulty hearing, understanding and remembering information;

• be distrustful of others; and

• focus more on negative than positive information.

You can even test yourself on this. Think of a strongly held opinion you have about the risk of something. It might be wind farms, nuclear power, nanotechnology, genetically modified foods or climate change.

The first question to ask yourself is: do you know your belief to be true, or do you feel it is true? If you answered feel, ask yourself whether your opinion is supported by the majority of society. Then ask yourself whether your opinion is also supported by the majority of scientific evidence.

Now you really have to be honest with yourself and ask whether or not you only search for, and find, evidence that supports your opinion. And then the big one: would you ever change your mind if provided with evidence that challenged your belief?

Most of us will be pretty sure that we make good decisions based on evidence, but the fact is that we often make decisions based on what we feel to be true, and then find ways to convince ourselves that it is known to be true. This is largely due to the way our brains are wired: to seek shortcuts and to muddy analytical thinking with emotional thinking.

How We Think about Risks
Consider the following points about how we tend to think about risks, and how they might have influenced your answers.

Although our intuition has served us well for tens of thousands of years, for example by keeping us safely inside caves rather than venturing out into the dark, dangerous night, it can be quite unsuited to evaluating the modern high-tech world in which we now live. And when we are time-poor, overwhelmed with data, uncertain, or driven by fear or emotion, we tend to assess information by intuition, or mental shortcuts, using values rather than logic.

Also, most people, when faced with an issue related to science and technology, adopt an initial position of support or opposition based on a variety of beliefs and predisposed values rather than on scientific facts. Such values can include a belief in the sacredness of life or nature, or a belief that humans have a right to dominate nature. And we tend to form and hold opinions to protect values that are important to us, and to rate evidence that supports our opinions more highly than evidence that challenges them.

Jenny McCarthy, celebrity leader of the anti-vaccination movement in the United States, says that she bases her rejection of vaccines on her intuition. Likewise, many alternative therapists advocate that people should put their trust in their own intuition to justify their choices.

A large number of psychological studies show that we respond to scientific evidence in ways that justify our pre-existing beliefs, selecting information that supports our beliefs and dismissing information that doesn’t. This is called confirmation bias, and is in part due to the way we are wired psychologically. But it leads us to distortions of perception, inaccurate judgments or illogical interpretations that we fail to recognise as such.

A contentious risk issue that illustrates this is childhood vaccination and the belief that it is linked to autism and other nasty side-effects. According to the US Centers for Disease Control and Prevention, one in five Americans believes that vaccines can cause autism, and two in five Americans have either delayed or refused vaccines for their child. According to the Australian General Practice Network, vaccination rates have been dropping in Australia over the past seven years, and now only 83% of 4-year-olds are covered. This is below the 90% rate needed to assure community-wide disease protection, making outbreaks of fatal but preventable diseases more likely.
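As a back-of-the-envelope check on those figures (a sketch using only the numbers quoted above):

# Coverage figures quoted above: 83% of 4-year-olds vaccinated,
# against the 90% needed for community-wide protection.
coverage = 0.83
threshold = 0.90

susceptible = 1 - coverage        # fraction of children unprotected
shortfall = threshold - coverage  # gap to the community-protection rate

print(f"Susceptible: {susceptible:.0%}, shortfall: {shortfall:.0%}")
# Susceptible: 17%, shortfall: 7%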

In some areas, usually pockets with high numbers of alternative-lifestyle supporters such as south-east Queensland, the northern rivers of New South Wales, the Adelaide Hills and the south-west of Western Australia, vaccination rates are as low as 70%.

Notably, anti-vaccination fears have not diminished since the original study linking vaccinations with autism, by Dr Andrew Wakefield, a consultant at the Royal Free Hospital School of Medicine in England, was discredited and retracted by the esteemed medical journal The Lancet, and Wakefield was struck off the medical register. This demonstrates how strongly non-scientific beliefs can become ingrained in us, and how difficult they can be to influence with scientific facts.

Information Flows in a Web 2.0 Era
In the age of the internet, information flows are entirely different from what we were used to just a decade or so ago. The promise that the internet would provide us with a wealth of information to make us smarter and better was akin to the early hope that television would make us more educated and teach us many languages. Instead, we are better at watching people dance and sing and cook on TV, and on the internet we are swamped with irrelevant data or simply look for data that supports an existing belief.

The internet is not fully to blame, though; it is just a channel for information. But the sheer amount of data available, of highly variable quality and with no distinction drawn between research, news and opinion, has changed the relationship between information and attitude formation.

Where once we might have started with the germ of a wacky idea and sought to check its validity with experts such as teachers, or even by reading an encyclopaedia, we can now find a community of people somewhere in the world with similar wacky ideas that have never been tested by an expert. And through ongoing affirmation and reinforcement, wacky ideas become values or beliefs that are difficult to change.

Cathy Fraser’s 2003 PhD study at the Australian National University into vaccinating and non-vaccinating parents found that only 1.6% of parents who chose to vaccinate used the internet for more information, while 36.2% of non-immunising parents did. All had access to the standard health department publications on vaccinations.

Brian Zikmund-Fisher, Assistant Professor of Health Behaviour and Health Education at the University of Michigan’s School of Public Health, has said that the era of web 2.0 has moved our discussions away from facts and figures towards narratives and stories. The web 2.0 world, he says, is all about experience, and can be seen as a “triumph of experiential and emotional learnings over cognitive rational thought”.

The Fear Factor
Franklin D. Roosevelt said: “The only thing we have to fear is fear itself”. If only.

When it comes to the fear factor, even the physical layout of our brains is working against us. Studies have shown that sensory inputs are sorted by the thalamus and sent on to different parts of the brain for a response. But the amygdala, which is the “Danger, Will Robinson!” part of the brain, is located closer to the thalamus than almost any other part of the brain, and gets to respond to messages first. Its response is certainly a lot quicker than one coming back from the prefrontal cortex, which is responsible for our higher-order thinking and decision-making.

Ropeik says that in analysing risk perception based on brain functions:

Both the physical architecture and biochemistry of the brain ensure that emotion and instinct have the upper hand over reason and rationality. Nothing could make this point more clearly than to state that the conscious awareness of fear is merely ... what your brain, and body, have already been up to in the name of self-protection. Before you know you are afraid, you are. The inescapable truth is that, when it comes to risk, we are hardwired to feel first and think second.

Another take on fear comes from Frank Furedi, Professor of Sociology at the University of Kent and author of The Precautionary Principle and the Crisis of Causality: we are losing our capacity to deal with the unknown because we increasingly believe we are powerless to deal with the perils confronting us.

He says that one of the many consequences of this is a growth in risk-related policies that are increasingly based on feelings and intuition rather than on evidence and facts.

When we compare statistics on risky behaviours and their impact on loss of life with our perceptions of the risks of those behaviours, there can be quite large differences. For instance, Professor Bernard Cohen of the University of Pittsburgh has developed a measure of loss of life expectancy, estimating the days of life lost to different behaviours. He found that smoking cigarettes equated to a loss of over 2000 days from a male’s life expectancy; being 20% overweight led to a loss of about 1000 days; while radiation from nuclear power led to less than one day’s loss of life.

Conversely, a study of the perceptions of risk by Paul Slovic of the University of Oregon showed that smoking was perceived as a relatively low risk while nuclear radiation was seen as a high risk.
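Putting the two side by side makes the mismatch plain. Here is a sketch using only the figures cited above; the perception labels paraphrase Slovic’s findings, and no perception figure was given for being overweight:

# Cohen's estimated days of life expectancy lost vs Slovic's
# perceived-risk findings, as cited in the text above.
behaviours = [
    ("smoking cigarettes",           2000, "relatively low"),
    ("being 20% overweight",         1000, "not reported"),
    ("radiation from nuclear power",    1, "high"),  # actually <1 day
]

for name, days_lost, perceived in behaviours:
    print(f"{name:30s} ~{days_lost:>4} days lost; perceived risk: {perceived}")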

Risk Versus Benefit
Covello has tabulated several of the factor pairs that govern risk perceptions, with the first factor in each pair raising perceived risk and the second lowering it:

High Risk vs Low Risk
Lack of choice vs Having choice
Rare vs Common
Fatal vs Not fatal
Not known to science vs Known to science
Not controllable vs Controllable
New vs Old
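One way to make the table concrete is a toy checklist score. This is purely my own illustration of how the factor pairs pull in one direction or the other, not a scoring method of Covello’s:

# Toy illustration only: count how many factors from the high-risk
# column of Covello's table apply to a given technology.
HIGH_RISK_FACTORS = {
    "lack of choice", "rare", "fatal",
    "not known to science", "not controllable", "new",
}

def perceived_risk_score(applicable_factors):
    # Each high-risk-side factor that applies pushes perceived risk up.
    return len(HIGH_RISK_FACTORS & set(applicable_factors))

# e.g. a new, poorly understood technology people cannot opt out of:
print(perceived_risk_score({"new", "not known to science", "lack of choice"}))  # 3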

We know that risks can be offset by benefits, and people will accept risks if they perceive the benefits to be greater (mobile phone use, for example), but perceptions of risks and benefits don’t always align as neatly as the high-risk and low-risk pairs in Covello’s table. Risks can be vague, but benefits need to be specific. Risks are easily believed, and are often perceived to be higher than they actually are, while benefits need credible evidence to be believed. Risks are accepted without acknowledgement of benefits, but benefits are most readily accepted when risks are acknowledged.

Trust
Added to all this is the issue of trust. It might be surprising to learn that we largely assess trust in the first 30 seconds or so of listening to a commentator, and that the biggest determinant of trust is whether we perceive the person to be listening and caring. Much less weight is given to whether we perceive them to be an expert in what they are talking about.

So, in a battle for trust between an expert who appears distant and aloof and quotes lots of facts and figures, and a community representative with little knowledge but lots of empathy who talks about feelings, the community representative will easily win the greater trust.

Covello has summed up this predicament as: “People don’t care what you know, they want to know that you care!”

It is a lesson that anybody working in risk communication needs to take to heart, along with the numerous studies into how the public perceives risks. For risk communication to be effective, it needs to align with the public’s ways of thinking, rather than expecting the public to align their thinking with that of the experts.