‘Perception of risk’ by Paul Slovic, Part 3: So how do we communicate risk then?

The last part of my discussion of ‘Perception of Risk’ by Paul Slovic

At the core of why people don’t agree on risk is the difference between how experts and the lay public understand it. To the experts, risk is most often boiled down (too far) to metrics like ‘expected annual mortality’: figures that allow easy comparison and seem (to the well-informed experts) to communicate perfectly how risky an activity is. This is in direct contrast with the way the lay public understands, and comes to understand, risk. Trying to communicate risk based solely on facts ignores the context in which a lay person receives those facts, the inputs and preconceived ideas that person already carries, and the emotional understanding already in place.

At the end of the paper, Slovic attempts to address the looming question that hangs over the whole piece: after taking all this information on board, is it actually possible to communicate risk effectively? One method that is put forward (based on the transmission, lecture style of science communication) is to put the risk level in terms that the lay public can understand.

One really good example of this type of communication comes from Dr. Derek Muller on the Veritasium channel on YouTube:

This is a really digestible example of using qualitative risk to demonstrate relative danger. But Slovic says that this is not enough: what needs to happen (agreeing with the current trend in science communication) is ‘two-way’ communication instead of one-way. As enjoyable as the video is, it also highlights how the theme that runs through most of the videos on Veritasium can be applied to other topics. In his videos, Dr. Muller attempts to break down people’s preconceived ideas about how physics works, and to help people understand that human intuition is often not enough for complex topics, because complex systems don’t operate in a way our relatively simple brains can grasp. I think the same applies to communication: human society has grown to the point where our ancestral ability to socialise and gossip doesn’t cut it when it comes to understanding how large groups of people will react to and interpret certain information.

To really get the message of risk communication across, a two-way dialogue needs to be started: one that takes into account the facts, but also the lived experience of the lay public. Interestingly, at the end of the paper Slovic states that the lay public is actually better than the experts at understanding the broader consequences of these sorts of events, and as such, it seems to me that the onus is on the experts to defend themselves to the public, rather than hand down wisdom from above.

Say something controversial.

Matt

‘Perception of risk’ by Paul Slovic, Part 2: What factors influence risk perception?

Back to Slovic

There are several factors that have been identified as having the biggest impacts on the perception of risk: control, familiarity, catastrophic potential, and level of knowledge available to non-experts. Nuclear power, unfortunately, ticks most of these boxes for a majority of the lay public, and this makes it extremely difficult to communicate the actual risks.

It seems to me that these influencing factors can also work in the other direction. Take ‘familiarity’, for example. When the atomic age was first under way, there was probably a reasonably good ‘popular science’ understanding of the way that nuclear power worked, and this familiarity probably helped encourage the growth of the nuclear industry — with support rallied by events like the ‘Atoms for Peace’ speech — and its acceptance in the wider public sphere (aside from a vocal minority). But familiarity has since come back in another form: familiarity with the disaster narrative. When people think of nuclear power now, they unerringly refer to events like Chernobyl. The disaster has become the familiar, which in turn seems to push the perceived risk of nuclear events higher and higher.

Another factor that comes into the perception of risk is what Slovic calls the ‘signal potential’ of an incident. This refers to the capacity of a single incident to imply further, more far-reaching impacts. The example he used was the difference between a trainwreck and a nuclear disaster like what happened at Three Mile Island. Hundreds of people may be killed in the trainwreck but, because it is a supposedly ‘closed’ event, there will be little alarm in the wider public. Compare this with Three Mile Island: no one was killed as a result of that accident, and the wider consequences were relatively minor from a human health standpoint (not to belittle the suffering of those who were affected). And yet the public reaction was wildly over the top, and from this reaction came the majority of the costs of Three Mile Island. The huge public backlash caused massive increases in regulation and in the requirements for building nuclear reactors (which massively increases costs and makes nuclear less competitive), and it also brought massive financial and social losses, both to the companies involved and to the communities around Three Mile Island, which suffered under the weight of the negative image of the area.

So is it actually possible to communicate risk effectively? I’ll talk about that in the next post…

Say something controversial.

Matt

‘Perception of risk’ by Paul Slovic, Part 1: How do people process risk?

I was recently put on to an important research paper by Universal Genes, “Perception of Risk” by Paul Slovic. This paper is a review of the literature (at the time) surrounding the public perception of risk, and how the public comes to its opinion on how risky a technology or event is. Because so much is covered by this paper, I’m going to do a couple of blog posts on it, just so I can do it justice.

This first post is more about some of the big take-home messages of the paper, about how the public perceives risk; the next couple of posts will be more about specific aspects of the issue.

An important part of understanding the perception of risk is how people come to accept it. Society, when asked, tends to rate the current level of risk for most activities as too high. But if the technology or activity is seen as being ‘highly beneficial’, then the risks are tolerated. Some early research indicated that the voluntary nature of a risk is one of the most important factors in how risky people perceive it to be: people were found to accept risks roughly 1,000 times greater when they saw the activity as voluntary.

To better understand how people perceive different technologies, and how this affects the level of risk perceived, a study was conducted in which risks were compared on two scales: how unknown the technology was, and how much dread it inspired. The ‘unknown’ factor is based on how delayed the adverse effects of an event or technology could be (which no doubt fuels the arguments against GMOs, as there is always the question of ‘how do we know it won’t affect us in however many years?’). The ‘dread’ factor came from how likely people thought the technology was to result in a catastrophic mishap, a lack of control, potential fatalities, and an inequitable distribution of risk. In the groups of people interviewed, the ‘dread’ factor was the most important in a person’s perception of the risk involved.

Technologies like nuclear power scored highly on both scales. When the technologies that were rated were analysed for how much regulation applied to them, there was a clear relationship with how high they scored on both scales: the higher the dread or uncertainty, the more likely the technology was to be highly regulated (or for there to be loud calls for more regulation, see nuclear power and GMOs).
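
To make that two-factor ‘risk space’ a little more concrete, here is a minimal sketch that plots a handful of hazards on the dread and unknown axes. The axis names follow Slovic’s framework, but the hazards listed and every score in the snippet are hypothetical values chosen purely for illustration, not data from the paper; the point is simply that hazards landing in the upper-right corner (high dread, high unknown) are the ones that attract the loudest calls for regulation.

```python
# Minimal sketch of a dread/unknown "risk space".
# NOTE: the hazards and their scores below are hypothetical illustration
# values, not figures from Slovic's paper.
import matplotlib.pyplot as plt

# Hypothetical (dread, unknown) scores on arbitrary -1..1 scales.
hazards = {
    "Nuclear power": (0.9, 0.8),
    "GMOs": (0.4, 0.9),
    "Motor vehicles": (0.3, -0.6),
    "Bicycles": (-0.7, -0.8),
    "Pesticides": (0.5, 0.5),
}

fig, ax = plt.subplots()
for name, (dread, unknown) in hazards.items():
    ax.scatter(dread, unknown)
    ax.annotate(name, (dread, unknown),
                textcoords="offset points", xytext=(5, 5))

# Cross-hairs through the origin to mark the four quadrants.
ax.axhline(0, linewidth=0.5, color="grey")
ax.axvline(0, linewidth=0.5, color="grey")
ax.set_xlabel("Factor 1: dread")
ax.set_ylabel("Factor 2: unknown")
ax.set_title("Hypothetical hazards in a dread/unknown risk space")
plt.show()
```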

Fear of the unknown seems to be playing a big role here, and Slovic goes into a lot of detail about the factors that influence risk perception. I’ll cover that in the next post.

Say something controversial.

Matt