
Against Consensus

The more you agree with people on your side of political debates, the more likely you are to be wrong about the facts.


One annoying effect of extreme political polarization is that people on different sides can't agree about facts. Americans' opinions tend to coincide with those of others on their own end of the political spectrum on matters such as whether there was a cover-up about Benghazi, whether the IRS illegally targeted right-wing groups, whether Obamacare is hurting the economy, and the extent to which fracking presents an environmental threat, among many other factual questions. Certainly, Democratic politicians, activists, and spokespersons very often agree about things like that, and so do Republicans.

But though we have come to expect this, we ought to regard it as discrediting to people on both sides, as strongly suggesting that they are not committed to saying what's true. Relentless political partisans—Reince Priebus or Debbie Wasserman Schultz, Charles Krauthammer or Michael Tomasky, Rachel Maddow or Sean Hannity, Hillary Clinton or Marco Rubio—should at this point be regarded as having next to no credibility on factual questions.

Commitments with regard to values—to justice, or equality, or liberty—define your position on the political spectrum, or even what we might call your political identity. Such things may be among your most intense beliefs, and motivate many actions. But with regard to most factual questions, such things are just irrelevant.

Philosophers traditionally draw a distinction between normative and factual claims. We might distinguish them by noting that different sorts of reasons count for or against assertions about facts and those about values. Whether it would be a good thing for it to rain on Saturday in Fresno, or whether it would help us make progress if it does, is irrelevant to the question of whether it actually will rain Saturday in Fresno: That depends on whether drops of water actually fall out of the sky.

In sincerely trying to find the truth about factual claims, it is often necessary to decide whom to believe. To do that, it helps to figure out who is advocating something on the basis of evidence. The people most likely to be sensitive to evidence—and therefore most worth listening to—often disagree with the consensus of the people around them on factual matters. We ought provisionally to regard people who frequently act as dissidents, heretics, and pariahs in their own political group as more committed to speaking the truth than people who usually or always agree with the consensus. A person who diverges from the consensus of the people with whom she agrees politically may have other credibility problems. But she does not start off with this one.

In order to build an argument for this, let's conjure some imaginary situations. First, imagine that we are psychologists, and for whatever reason we are conducting a study of the belief systems of people with green eyes. It turns out that they all believe that cutting taxes increases revenue. We'd be stunned by the arbitrary unanimity. But now imagine further that it turns out that green-eyed folks unanimously share many other beliefs: that there is a highest prime number, for example, that there is extraterrestrial intelligent life, that it will rain next Saturday, and that there was no cover-up about Benghazi.

One thing we could certainly conclude is that, whatever they might say about themselves, many green-eyed people are not basing those beliefs on the evidence. In each of these cases the evidence is split (or, for the highest prime, settled against the belief), so we'd expect disagreement among people with green eyes, or at least a split among them similar to the split in the population as a whole. Whether you have green eyes is completely irrelevant to rationally assessing the effects of tax cuts on revenue. Something is making the green-eyed people believe these things, we must suppose, but it is not a matter of having reasons. Maybe it's genetic, or maybe having green eyes is associated with some sort of neurological glitch.

Now imagine that all the leftists in Fresno think it will rain on Saturday and all the rightists think it will not. We'd find that as arbitrary as in the case of the green-eyed people and tax cuts. Whether you are on the left or right has no connection to having reliable information on whether it will rain next Saturday in Fresno.

And whether you are on the left or on the right is no more relevant to the question of whether there was a cover-up about Benghazi than it is about whether it will rain. Really, it’s not. Actual evidence here would concern such things as who knew what when or who communicated what to whom. Whether you lean forward or back, whether you think we need more equality or more liberty, more welfare programs or freer markets: These have nothing to do with what happened after Benghazi, or what the actual effects of fracking are, or whether restrictions on gun ownership reduce violence. If people on the same side with regard to political values agree about such controversial factual questions, it is very likely that they are generating these beliefs in a rationally arbitrary way.

I'll try to make the point clear with a mathematical proof, or at any rate some back-of-the-envelope calculations.

Suppose that a good assessment of the evidence that it will rain on Saturday in Fresno makes the probability 50 percent either way. If everyone's belief were responsive to the evidence, and since right-wingers and left-wingers have roughly the same access to that evidence, we'd expect a 50/50 split within each group among people forming an opinion. In such a situation, we could provisionally estimate the likelihood that their beliefs are based on evidence by how far each group's internal split deviates from 50/50. In a case where all the right-wingers think it will rain and all the left-wingers think it will not, or vice versa, we should infer that there's at least a 0.5 probability (on a 0-to-1 scale), with regard to any person in either group, that he believes what he believes because of factors other than the evidence, or that his belief is rationally arbitrary.

Starting off with your credibility cut in half is a serious handicap, but that's just the first step in our calculation of the credibility index.* If the initial probability that a person's position on a single factual issue is responsive to the evidence is at most 0.5, then the odds multiply as the issues do. That is, if your position on whether it will rain agrees with that of most people on your side of the political spectrum (or, for that matter, with your friends, or your fellow shoppers at Whole Foods, or people who have the same eye color as you do), and so does your belief about Benghazi, then provisionally we should infer that the chance that your beliefs about the two issues together are sensitive to the evidence is at most 0.25. Add, for example, agreement on the effects of fracking and you're at 0.125. At a conjunction of 10 shared factual beliefs, the probability that the conjunction of the beliefs is sensitive to evidence is about 0.001 (0.5 raised to the 10th power is roughly 0.00098). If we are assessing a person's credibility by whether the evidence has anything to do with how that person forms beliefs, we very quickly get to the point at which that is vanishingly unlikely.
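
To make the compounding concrete, here is a minimal sketch in Python; the function name and the flat 0.5-per-issue assumption are my illustration, not part of the article's apparatus.

    # Probability that a run of shared beliefs is evidence-sensitive,
    # assuming the claims are evidentially independent and the evidence
    # on each is split 50/50, so each agreement contributes a 0.5 factor.
    def compounded_credibility(shared_beliefs, per_issue=0.5):
        return per_issue ** shared_beliefs

    for n in (1, 2, 3, 10):
        print(n, compounded_credibility(n))
    # 1 -> 0.5, 2 -> 0.25, 3 -> 0.125, 10 -> roughly 0.00098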

Now all of this proof happens in a world of stipulations: unanimous groups approaching claims where the evidence is perfectly split. Made-up worlds are the only places you can mathematically “prove” something like this, but the phenomenon of real people with extremely low credibility indices is all too real. You are perhaps thinking that your own side has the preponderance of evidence in each case. Of course you are, if you have a side—which is a good reason not to. But assign almost whatever probabilities you like, and after a run of cases of agreement on factual matters with people on your side politically and no disagreements, you've got essentially no credibility.

Finding the truth is not our only goal. Perhaps our capacity for belief is given to us by evolution in order to help us achieve unity and solidarity with one another, so as to be able to act in concert. We often believe for social reasons, to express our membership in a family, a community, a nation, or a movement. Political partisans and others who are very susceptible to peer pressure want to believe the truth, sort of, but not as much as they want to belong, and perhaps they would not, on reflection, change that. But certainly, no one ought to believe anything such a person says on the grounds that they said it; if that is how they decide what to believe, they've repudiated any attempt to say what's true. We often know precisely what a political partisan will say about some controversial factual matter before they open their mouths and before the evidence comes in. It is a mistake to regard someone like that as a credible source of information.

In trying sincerely to find the truth on a factual question, a good first move, in a case where one has not yet formed an opinion or where the evidence is equivocal, is to critically examine, and provisionally regard as irrational, any opinion that is a consensus among people like you. The most credible people—the people who are responsive to evidence—often disagree with the consensus of the people around them. I think we sense this as they speak, and on the other side, we often mistrust politicians or spokespeople for interest groups. It's not that they are lying in some particular case, exactly; it's that they long ago lost any commitment to reality.

When we live in rival unanimous systems of facts, we generate rival unrealities, dueling hallucinations. Perhaps that's how we ought to think of Red and Blue America: not as geographical or ideological regions, but as rival fictional universes, as though there's a war between Middle Earth and Narnia. People create a consensual illusion and confirm it to one another. This cannot in the long run be anything but a continual source of ever-more-unanimous error within groups and ever-more-extreme polarization between them.

If it is impossible to achieve social unity without a consensus of factual belief, then we are in a position of having to choose between unity and truth. But of course various forms of unity do not presuppose unanimity; one can love someone with whom one disagrees politically, for example. We should encourage configurations of people, from political movements to middle-school cliques to all of us together, to be open to heretics, dissenters, and eccentrics. We want groups that do not treat dissidents as dolts or monsters, or even as obviously wrong. There could even be groups—there have been groups—in which dissenters are valued rather than dismissed, ostracized, or executed. That's a very good idea, because dissenters are likelier to be right.


* A belief that is generated at least partly in response to an assessment of the evidence is an evidence-sensitive belief. Beliefs that arise entirely independently of evidence or entirely from causes other than an assessment of the evidence are evidence-arbitrary beliefs.

An evidence-random group with regard to some particular factual matter is one in which membership does not typically give special access to, or present special barriers to, evidence weighing on that matter. For example, one's gender does not give any special access or present special barriers to evidence that there are tiny frogs living at high altitude in Peru. But it might well be relevant with regard to abortion, for example.

Two claims are evidentially independent when, roughly, the evidence for or against them does not intertwine, when distinct bodies of evidence count for or against them. (“It'll rain Saturday in Fresno” and “There was a cover-up on Benghazi” are evidentially independent.)

In general, to assess credibility on a factual claim, estimate the probability of the claim given the available evidence, then examine how the distribution of agreement within a given evidence-random group deviates from that figure. So if the probability that climate change will cause catastrophic sea-level rises is 0.67 given the available evidence, the members of a unanimous evidence-random group start with an initial index of 0.67. For non-unanimous groups, divide by the consensus fraction. So, if the group is at 75 percent consensus, divide 0.67 by 0.75 to get a credibility index of about 0.893.
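
As a sketch only (the function name is mine, and the cap at 1.0 is my assumption for cases where the division would otherwise exceed a probability):

    # Single-claim credibility index: the evidence-given probability of
    # the claim, divided by the fraction of the group sharing the belief
    # (1.0 means unanimous). The cap at 1.0 is an editorial assumption.
    def initial_index(evidence_prob, consensus_fraction=1.0):
        return min(evidence_prob / consensus_fraction, 1.0)

    print(initial_index(0.67))        # unanimous group: 0.67
    print(initial_index(0.67, 0.75))  # 75 percent consensus: ~0.893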

With regard to evidentially independent claims, each instance of agreement within any of one's evidence-random groups multiplies down the probability that the conjunction of any member's beliefs is sensitive to the evidence. So, multiply such cases together, canceling one factor for each case of heterodoxy, to find the credibility index.

For example, starting from the 0.893 above, multiply in Benghazi with the evidence at 60/40 to get about 0.536, then the IRS at 40/60 to get about 0.214, and so on. (The numbers will be higher given non-unanimous groups.)
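
Chaining those steps, a minimal sketch (the names are mine; per the paragraph above, each case of heterodoxy would cancel one factor, that is, drop one pair from the list):

    from functools import reduce

    # Multiply single-claim indices across evidentially independent
    # claims on which the person sides with the group consensus.
    def credibility_index(claims):
        return reduce(
            lambda acc, pc: acc * min(pc[0] / pc[1], 1.0), claims, 1.0)

    climate  = (0.67, 0.75)  # evidence 0.67, 75 percent consensus
    benghazi = (0.60, 1.00)  # evidence split 60/40, unanimous group
    irs      = (0.40, 1.00)  # evidence split 40/60, unanimous group

    print(credibility_index([climate]))                # ~0.893
    print(credibility_index([climate, benghazi]))      # ~0.536
    print(credibility_index([climate, benghazi, irs])) # ~0.214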

Sadly, a thousand clarifications, refinements, and detailed responses to objections are necessary.
