We likely can’t (and maybe even shouldn’t) purge people’s values from public scientific discourse the way geoengineering might remove CO2 from Earth’s atmosphere. But perhaps we can limit the heat our values bring to the discussion. As suggested by Nyhan (the first point below) and Kahan (the second and third), public science communicators could try the following.
- Invoke values most people can identify with.
- Diversify debate so as to ‘cancel out’ the polarizing influence of opposing values.
- Affirm the target audience’s values using information relevant to the issue.
If successful, these strategies will help open minds to the scientific evidence, wherever it leads. Kahan writes:
[T]he goal of these techniques is not to induce public acceptance of any particular conclusion, but rather to create an environment for the public’s open-minded, unbiased consideration of the best available scientific information.
Kahan seems to envision something like the following: assuage fears that the evidence conflicts with a person’s defining values, and thus prevent (or limit) motivated reasoning. The mind opens, and logical reasoning then weighs the evidence.
This picture looks incomplete. Removing the threat that a person’s values are misguided doesn’t remove the threat that her beliefs about the evidence are wrong (nor, of course, is it intended to). According to Nyhan and Kahan, motivated reasoning can strongly commit us to false beliefs in debates that we take personally, even if only unconsciously. Granted, we’ll likely be less attached to those beliefs if revising them doesn’t appear to require repudiating “individualism,” “communitarianism,” or whichever values describe who we are ideologically. But I worry that admitting and forgiving error (our own and others’) remains a central challenge.
Disputes over what is true or false are often just as much about who is right or wrong: “Me or you? Us or them?” Concerns about being right or wrong about the facts bring us right back to concerns about belonging to this or that ideological “tribe.” Suppose motivated reasoning tends to lead one group into error on a certain issue, while an opposing group sees the facts clearly. Now suppose that communication strategies that control or correct motivated reasoning help the misguided group see that their values aren’t threatened. Their minds become more open to the truth.
In this hypothetical scenario, the values of both groups prove compatible with the facts. Yet the two sets of values will remain in opposition to one another. (The opposition may not be binary, but there will likely be significant separation on Kahan’s worldview scales.) Consequently, the group who erred may remain reluctant to admit defeat on the facts, as it could be seen as a battle lost in a broader ideological war. Indeed, the group who didn’t err may treat such an admission as an opening to blame, shame, and defame their opponent.
Perhaps I’m being too pessimistic. The studies discussed above suggest that even on embattled issues, people can change their minds about the facts, at least when they don’t have to change their values. My hope then is this: if one side admits the other is right about the facts on an issue that requires neither side to abandon its values, the side who was right all along will see their opponent’s admission as an opportunity to join forces, not jockey for an ideological victory.
Open minds must be open to revision. Conceding error and accepting correction, however, can require courage. But it won’t require quite as much when our opponents have the grace and humility to concede that we’re all vulnerable to error. After all, motivated reasoning appears to be a psychological pitfall with which we can all identify.