During pre-production research for our Risk Factor documentary, we noticed that for some big issues like... say... global warming, a lot of people don't see risk the same way.
At first we thought... if the public were better educated, then everyone would agree (and agree with us). We even took some of this rich data and put it into a useful risk assessment tool called Risk Navigator. But then we met Dan Kahan.
As head of a research lab at Yale University, Dan looks for reasons why people hold divergent views on risk. He has organized scores of surveys, asking thousands of people questions designed to determine their personal values and reveal their opinions on the biggest risk factors of the day.
In survey after survey, Dan discovered that there are smart and educated people on both sides of every contentious issue. So if neither side has a monopoly on smart (or stupid), where do our divergent perceptions of risk come from?
Dan has a theory for that: Cultural Cognition. This theory contends that our ideas (cognition) are determined by the beliefs of the cultural group we’re in. To (over)simplify, we tend to hold the beliefs and identity of our tribe.
But does that apply to everyone? Aren't well-educated people more likely to break free from groupthink and come to similar fact-based conclusions? Dan paints a different picture:
"You might think that people who don’t know a lot about science, they’re going with [their group]. Whereas the people who know a lot, they’re going with the evidence. And if that were true, then the people who are the highest in science literacy, they’d be converging. They wouldn’t be nearly as polarized as everybody else. In fact, they’re more polarized."
"What happens is [educated] people are using their reason to extract from the available data the significance that will support the position consistent with the one that dominates in their group and explain away the rest. So the problem isn't stupid people."
Then what is the problem, in Dan's view?
"It’s that smart people have the misfortune of living in a science communication environment that’s become polluted with certain kinds of toxins. These toxins are associations, these symbolic associations between positions and having certain kinds of identities."
Dan's conclusion seems sound, but it is also depressing. It suggests that we're prisoners of our tribe's worldview, and that independent thinking is in short supply. But in a recent interview for sciencemag.org, Dan offered a ray of hope, based on his latest research. It comes down to two words: science curiosity.
"Science curiosity is a desire to seek out and consume scientific information just for the pleasure of doing so. ... In our research, we've seen that with greater curiosity, people are more willing to take new information into account when forming their opinions about the world. ... In our study we observed this trend by chance when we weren't looking for it. And that’s why we should study more – asking the question "If science curious individuals get information that’s contrary to their predispositions, are they more open to it?"
If you want to hear more of Dan Kahan’s ideas you can watch one of his entertaining lectures on YouTube. It’s called Culture, rationality and beliefs about risk.