‘Perception of risk’ by Paul Slovic, Part 3: So how do we communicate risk then?

The last part in my discussion of ‘Perception of Risk’, by Paul Slovic

At the core of why people disagree about risk is the difference between how experts and the lay public understand it. Experts most often boil risk down (too far) to metrics like ‘expected annual mortality’: figures that allow easy comparison and seem, to the well-informed expert, to communicate perfectly how risky an activity is. This is in direct contrast with the way the lay public comes to understand risk. Communicating risk based solely on facts ignores the context in which a lay person receives those facts, the inputs and preconceived ideas that person already holds, and the emotional understanding already in place.

At the end of the paper, Slovic addresses the question that hangs over the whole discussion: after taking all this information on board, is it actually possible to communicate risk effectively? One method that is put forward (based on the transmission, lecture style of science communication) is to frame the risk level in terms the lay public can understand.

One really good example of this type of communication comes from Dr. Derek Muller on the Veritasium channel on YouTube:

This is a really digestible example of using qualitative risk comparisons to demonstrate relative danger. But Slovic argues this is not enough: in agreement with the current trend in science communication, he calls for ‘two-way’ communication instead of one-way. As enjoyable as the video is, it also highlights how the theme that runs through most Veritasium videos can be applied to other topics. In those videos, Dr. Muller tries to break down people’s preconceived ideas about how physics works, and to show that human intuition is often not enough to understand complex systems, because they don’t operate in a way our relatively simple brains can grasp. I think the same applies to communication: human society has grown to the point where our ancestral ability to socialise and gossip doesn’t cut it when it comes to predicting how large groups of people will react to and understand certain information.

To really get the message of risk communication across, a two-way dialogue needs to be started: a dialogue that takes into account the facts, but also the lived experience of the lay public. Interestingly, at the end of the paper Slovic states that the lay public is actually better than the experts at understanding the broader consequences of these sorts of events; as such, it seems to me that the onus is on the experts to defend their positions to the public, rather than hand down wisdom from above.

Say something controversial.

Matt

‘Perception of risk’ by Paul Slovic, Part 1: How do people process risk?

I was recently put on to an important research paper by Universal Genes, “Perception of Risk” by Paul Slovic. This paper is a review of the literature (at the time) surrounding the public perception of risk, and how the public comes to its opinion on how risky a technology or event is. Because so much is covered by this paper, I’m going to do a couple of blog posts on it, just so I can do it justice.

This first post is about some of the big take-home messages of the paper, about how the public perceives risk; the next couple of posts will be more about specific aspects of the issue.

An important part of understanding the perception of risk is how people come to accept it. When asked, people tend to rate the risks of most activities as too high, but if a technology or activity is seen as ‘highly beneficial’ then its risks are tolerated. Some early research indicated that the voluntary nature of a risk is one of the most important factors in how risky people perceive it to be: people were found to accept risks from activities they see as voluntary roughly 1,000 times greater than risks imposed on them involuntarily.

To understand better how people perceive different technologies, and how this affects the level of risk perceived, a study was conducted in which risks were compared on two scales: how unknown the technology was, and how much dread it inspired. The ‘unknown’ factor is based on how delayed the adverse effects of an event or technology could be (which no doubt fuels the arguments against GMOs, since there is always the question of ‘how do we know it won’t affect us in however many years?’). The ‘dread’ factor came from how likely people thought the technology was to result in a catastrophic mishap, lack of control, potential fatalities, and the inequitable distribution of risk. In the groups of people surveyed, the ‘dread’ factor was the most important in a person’s perception of the risk involved.

Technologies like nuclear power scored highly on both scales. When the technologies that were rated were analysed for how much regulation applied to them, there was a clear relationship with how high they scored on both scales: the higher the dread or uncertainty, the more likely the technology was to be highly regulated (or for there to be loud calls for more regulation; see nuclear power and GMOs).
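To make the two-factor idea concrete, here is a minimal sketch of that dread/unknown space. The scores below are invented for illustration (they are not Slovic’s actual data), but ranking technologies by their combined score shows the pattern described above: the higher a technology sits on both scales, the more regulatory pressure it tends to attract.

```python
# Illustrative sketch of the psychometric two-factor space.
# NOTE: all scores are hypothetical, chosen only to illustrate the idea.
technologies = {
    "nuclear power": {"dread": 0.9, "unknown": 0.8},
    "GMOs": {"dread": 0.6, "unknown": 0.9},
    "X-rays": {"dread": 0.3, "unknown": 0.5},
    "bicycles": {"dread": 0.2, "unknown": 0.1},
}

# Rank by combined score: higher implies more perceived risk and,
# per the pattern in the paper, louder calls for regulation.
ranked = sorted(
    technologies.items(),
    key=lambda item: item[1]["dread"] + item[1]["unknown"],
    reverse=True,
)

for name, scores in ranked:
    print(f"{name}: dread={scores['dread']}, unknown={scores['unknown']}")
```

With these made-up numbers, nuclear power lands at the top of the ranking and bicycles at the bottom, mirroring the regulation pattern the study found.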

Fear of the unknown seems to play a big role here, and Slovic goes into a lot of detail about the factors that influence risk perception. I’ll cover those in the next post.

Say something controversial.

Matt