
The Psychology of Doubling Down

Why some people go harder when faced with evidence they’re wrong

You’ve seen it before: A friend, acquaintance, coworker or random high school classmate posts on social media about chemtrails, dubious science on global warming or a side-eyed questioning of whether the pay gap is real. Commenters rush in to challenge the faulty thinking, but instead of examining what’s being said and rejiggering their worldview, the original poster doubles down, pivoting to any other argument that shores up the original point. What’s behind the double-down, and why is it so hard to resist?

President Trump provides some of the most clear-cut recent examples. He recently doubled down on North Korea, insisting that his threat to send “fire and fury” their way was not too harsh; if anything, it wasn’t harsh enough. As The New York Times noted:

“Frankly, the people who were questioning that statement, was it too tough? Maybe it wasn’t tough enough,” he told reporters at his golf club in Bedminster, N.J. “They’ve been doing this to our country for a long time, for many years, and it’s about time that somebody stuck up for the people of this country and for the people of other countries. So if anything, maybe that statement wasn’t tough enough.”

Trump did it again with Charlottesville, refusing to condemn neo-Nazis by saying there was blame to be issued on both sides. When called out on the false equivalency, he doubled down again, insisting that the alt-right and “alt-left” are simply two sides of the same violent coin. As the Los Angeles Times notes:

At his news conference, Trump made a glib and utterly unpersuasive argument that tearing down a statue of Lee would put the U.S. on a slippery slope to … something. “This week it is Robert E. Lee, and this week Stonewall Jackson,” Trump said. “Is it George Washington next? You have to ask yourself, where does it stop?”

Naturally, he’s done it many other times.

Most of us know the term double-down from blackjack. You’re dealt two cards, and you have the option of doubling your bet in exchange for taking exactly one more card, potentially doubling your profit. Maybe you go bust, but maybe you win big.

“It is considered the ‘money’ move in basic blackjack, a way to make twice as much profit with one flick of the wrist,” Matt Villano writes in a guide to when to use the move in cards. “Dealers and pit bosses refer to it as ‘reaching deep.’ For the rest of us, it’s known as ‘doubling down.’ And, to be honest, most of us do it way too often.”

We do it too much in life, too. And in real life, the application is slightly different. In gambling, it’s a term for a calculated risk, one that typically signals enormous confidence in winning. In real life, the confidence lies in the conviction that you’re somehow above the fray of facts, paired with just enough stubbornness to die on that hill. It requires a steadfast refusal to admit there’s any possibility that you’re wrong, followed by wild scrambling to save face.

In other contexts, people use the term to simply mean make more effort or do more, as in “double down” to help Haiti, or “double down” on women’s issues. There is also, it’s worth noting, a KFC Double Down sandwich—the bread is replaced by two pieces of fried chicken.

But most of us nowadays use double down to indicate stubbornly clinging to a notion in the face of evidence to the contrary. And while the doubling-downer feels smug and confident, to the observer, it often looks like an obvious hot-air pivot by someone too insecure to consider that they might be wrong. While we should expect politicians to do it (after all, their livelihood depends on appearing to have the answers), anyone is capable of doubling down — journalists, partners, friends, scientists and colleagues.

Especially men? There are no statistics to indicate that men are more likely than women to double down on a bad argument. But Georgetown linguistics professor Deborah Tannen told The Atlantic that, at least when it comes to differences in how the sexes argue, men are more likely to see arguing as a contest, whereas women are more likely to see it as an exchange of information. The result may be that men are motivated to do whatever it takes to “win” an argument, which could include coming up with anything to keep looking right, facts be damned—or at least heavily manipulated.

Either way, what’s it all about? On some level, it’s a chicken-and-egg problem. Often our worldview comes first, and the facts to support it later, if at all. If that worldview is an emotional one (and when isn’t it?), a deeply entrenched way of seeing things based on any number of factors — gender, race, socioeconomics, sexual orientation, regionalism, religion — then we’ll be hard-pressed to give it up no matter what the evidence says. And that evidence can be cherry-picked from anywhere if we find ourselves in need of a few “facts” to back it up.

“Creationists, for example, dispute the evidence for evolution in fossils and DNA because they are concerned about secular forces encroaching on religious faith,” Michael Shermer wrote at Scientific American in a piece on how to convince someone they’re wrong when facts fail. “Anti-vaxxers distrust big pharma and think that money corrupts medicine, which leads them to believe that vaccines cause autism despite the inconvenient truth that the one and only study claiming such a link was retracted and its lead author accused of fraud.”

In such instances — and Shermer cites more, like 9/11 truthers and climate change deniers — people presented with evidence that they are in fact wrong will instead focus on the doubt-introducing minutiae to keep their position afloat. Even when that “evidence” is debunked, it makes no difference whatsoever.

You can see it in nearly any social justice argument, too. If you believe that men and women are naturally different, and that men are naturally better at, say, programming, then you will double down on that view, cherry-picking the science at hand to justify a point of view that says women shouldn’t, for instance, work at Google.

Shermer says two psychological concepts are at work here: cognitive dissonance and the backfire effect. The former is the mental discomfort of holding two conflicting ideas at once. Shermer writes:

In the classic 1956 book When Prophecy Fails, psychologist Leon Festinger and his co-authors described what happened to a UFO cult when the mother ship failed to arrive at the appointed time. Instead of admitting error, “members of the group sought frantically to convince the world of their beliefs,” and they made “a series of desperate attempts to erase their rankling dissonance by making prediction after prediction in the hope that one would come true.”

The backfire effect, Shermer explains, is the counterintuitive phenomenon in which being corrected actually leaves some people more convinced of their original faulty belief. Many people, for instance, continued to believe that Iraq had weapons of mass destruction even after the Bush Administration — which started the rumor — admitted it did not. And even scientists make this mistake.

All of this is a fancy way of saying that we believe whatever we think is right, and we are not above a little embroidering of reality to fit that belief. It’s easy to think of such folly as the province of the less intelligent, but that’s not necessarily true.

In some instances, higher intelligence can lead to faulty thinking. A recent study found that people with higher cognitive abilities are more likely to stereotype because they’re good at pattern recognition. “Stereotypes are generalizations about the traits of social groups that are applied to individual members of those groups,” the study authors note, in a report at Science Daily. “To make such generalizations, people must first detect a pattern among members of a particular group and then categorize an individual as belonging to that group. Because pattern detection is a core component of human intelligence, people with superior cognitive abilities may be equipped to efficiently learn and use stereotypes about social groups.”

However—and this is a big however—when those same people are presented with new information that contradicts those stereotypes, they’re better at realizing the error of their ways and shedding the “narrative script” they may have invented, whether it’s true or not.

This means there’s hope, at least for some of us, if only we can get good, reliable information about our worldview. That seems harder than ever in an age of fake news, strong media bias and relentless spin. And of course, it leaves people who are unwilling to seek out a variety of sources, or to hear that they’re wrong, forever out in the cold. But don’t tell them that; they’ll just double down.