Don’t Shoot the Science Messenger

by Bill Chameides | January 27th, 2010
posted by Erica Rowell (Editor)


How do we perceive calls and fouls in a sports match? What about our opinions on environmental issues? Researchers point to group ties and “cultural cognition” as key influencers in both.

Why are we so polarized when it comes to issues like the safety of vaccines or risks of environmental pollution?

Consider the following scenario. Your favorite college basketball team (e.g., Duke) is playing its greatest nemesis (e.g., North Carolina), and you are watching the game with a friend who just happens to be rooting for the opposition. The game is down to the wire when your team’s star drives to the basket and collides with another player. The referee blows the whistle, and while it’s completely obvious to you the other team fouled, your friend is convinced your team charged.

Why is it that each of you watching the same event arrived at opposing views about what happened? Psychologists say it has to do with “group ties” — it’s not that either of you is disingenuous but rather that your perceptions are influenced by emotional allegiances.

The Role of Cultural Cognition

Building on a 1954 study of a scenario much like the hypothetical one above, law professors Dan Kahan and Donald Braman have conducted research in this area. In 2006 they argued in an article in the Yale Law & Policy Review that a similar process can explain the intense polarization around environmental and health issues in the American public. Last week in an opinion piece in the journal Nature (subscription required), Kahan summarized their theory, defining “cultural cognition” as:

“the influence of group values — ones relating to equality and authority, individualism and community — on risk perceptions and related beliefs.”

Just as group ties predispose people to interpret sporting events in favor of their team, ties to people who share cultural values predispose individuals to interpret information they receive (such as scientific evidence) in ways that favor their group’s worldview. Kahan writes: “People endorse whichever position reinforces their connection to others with whom they share important commitments.”

According to the research, the American public generally splits into two broad worldviews:

  • Individualists and hierarchical types, who favor personal initiative and respect for authority: People who subscribe to this view, according to the research, tend to be suspicious of science pointing to environmental risks because accepting such risks would lead to restrictions on personal and economic freedom.
  • Egalitarians and communitarians, who favor goals that benefit community and limit disparities: People with this worldview, the research suggests, are typically suspicious of business and industry (perhaps because they lead to an unequal distribution of wealth) and thus tend to believe such societal forces are causing problems and so must be restricted.

Kahan claims that the differences in these two worldviews “explain disagreements in environmental-risk perceptions.”

Kahan and colleagues’ research has found that people’s highly polarized positions on a panoply of familiar environmental and health issues (nuclear power, genetically modified organisms, nanotechnology, climate change, vaccinations) could be explained by this simple bifurcation “more completely than differences in gender, race, income, education level, political ideology, personal or any other individual characteristic.”

Rethinking the Messenger’s Role

Cultural cognition research also highlights the critical role of the messenger in shaping each group’s response to scientific information.

When a group of individualists was given scientific information on an environmental issue by someone perceived to be an egalitarian, they tended to dismiss it. When they received the same information from someone perceived to be of their group, they tended to accept it. And vice versa for the egalitarian group.

Such findings should be big news for big news organizations and their science coverage. Disseminating scientific information via spokespeople associated with a given group (like the inconveniently Democratic Gore) could be disastrous to the underlying message. Such an approach, Kahan argues, “encourages citizens to experience scientific debates as contests between warring cultural factions — and to pick sides accordingly.”

So what’s a society that needs to agree on how to deal with environmental and health threats supposed to do?

The researchers suggest that experts from both sides of the cultural divide present sound scientific information to the public and that, rather than focusing on trying to convince people of specific conclusions on specific issues, we should “create an environment for the public’s open-minded, unbiased consideration of the best available scientific information.”

I find the research to be fascinating, but I am disappointed with the recommended solutions.

Getting information from experts representing both cultural worldviews is a great idea, but that won’t prevent the public from hearing from members of their group who sound a different message. So to me the fix seems kind of simplistic and apple pie. But I could be wrong and simply reacting culturally. And by the way, those refs must have been blind at the Duke-NC State game last week. No way Duke would have lost if it wasn’t for those terrible calls.

filed under: faculty, politics, science



  1. Bill Chameides
    Feb 4, 2010

    Jim, I am sure that happens.

  2. Jim
    Jan 29, 2010

    The refs made all the right calls, State just played a better game! Ha! 🙂 The problem is finding enough leaders on each side to present the science accurately — not just experts, but the leaders. But good luck with that. My feeling is that charismatic leaders tend to be swayed by their group affiliations and emotions. Have you ever read the book “Mistakes Were Made (But Not by Me)”? Once people voice their stance on an issue, they tend to dig in deeper and deeper despite the evidence.

©2015 Nicholas School of the Environment at Duke University | Box 90328 | Durham, NC 27708