Beware the Boardroom Double-Down

Roger Sollenberger

Editor’s Note: You are going to love this article. This article is the best. I have no doubt. I’m totally sure of it. I promise you. If I’m wrong, I’ll give you back the time you took to read it. Oh, what’s that? I can’t give people time? True. But I won’t need to, because you’re going to love this article.


Consider where the term “double-down” comes from: gambling, and blackjack specifically. The double-down is a calculated risk. With your first two cards showing, you can double your bet in exchange for exactly one more card. Get the right card and you cash in; get the wrong one and you bust.

Earlier this spring, professional gambler and unlikely celebrity James Holzhauer applied the double-down strategy with stunning success during a record-setting Jeopardy! run. He bet big on "Daily Doubles," wagering on his high chance of knowing the correct answer. It was the most effective way to maximize his winnings, but whenever he hadn't built a runaway lead, it left him vulnerable. For Holzhauer, the strategy literally paid off.
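Holzhauer's big-bet logic can be reduced to a quick expected-value calculation. Here's a hypothetical sketch – the probabilities are illustrative, not his actual numbers:

```python
def expected_gain(wager, p_correct):
    """Expected change in score for a single wager:
    win the wager with probability p_correct, lose it otherwise."""
    return wager * p_correct - wager * (1 - p_correct)

# If you're better than a coin flip (p > 0.5), the bigger bet
# always carries the higher expected value...
assert expected_gain(10000, 0.9) > expected_gain(1000, 0.9)

# ...but when that confidence is misplaced, the big bet loses
# money on average -- the "bust" side of the double-down.
assert expected_gain(10000, 0.4) < 0
```

The math captures both halves of the gamble: when your confidence is justified, maximizing the wager maximizes the expected payoff; when it isn't, the same move maximizes the expected loss.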

In the real-er world, “double-down” has become shorthand for irrational, self-interested stubbornness: defending yourself in the face of all facts – or an overwhelming number of critical, impartial voices – that say you’re wrong. Unfortunately, one of the lessons leaching into our cultural waters is that if enough people choose to double down with you, you probably won't face real consequences. (Editor’s note: Everyone take a deep breath and think of someone you know who might do this now and again. Annnnnnd, exhale.)

The double-down has been described as sort of a "chicken-and-egg problem." Which comes first, our beliefs or the supporting facts? Much of the time our beliefs come first, and the supporting facts come later. Sometimes we cherry-pick those facts to justify our commitment to a foregone conclusion. Sometimes we might not bother with facts at all. As our personal commitment to a belief escalates, so do the stakes of being wrong, or being perceived as wrong, and our egos and self-interest play an increasingly large role.

The problem is our brains really, really like that.

Ripples in Every Realm

Like Holzhauer, the double-down has been on a winning streak. The last few years have seen a stunning rise of double-down culture in politics in particular. The calculation seems something like, "If you never admit you’re wrong, you always win." And though a certain president might be the go-to embodiment of double-down, it's not a partisan tactic. For the most high-profile counter-example, remember how Virginia's top three Democrats responded to the scandals that rocked the state earlier this year.

Although political double-downs have been fairly prominent, we can't isolate the double-down to any single realm – like politics, business, sports, family life or gambling. It's a psychological phenomenon already well documented in cognitive dissonance theory research.

The recent spread of high-profile double-downs does demand, however, that business leaders pay close attention to choices, processes and those who influence both. Social trends can slip easily past our work culture bouncers, and if enough people go along with a decision-maker who doubles down in the face of all facts to the contrary, it's difficult – if not impossible – to check. The double-down isn't always negative, but it can close doors, cauterize growth, stigmatize risk, and blind us to reason. The responsibility for stopping it, then, isn't only on the individual. It's also on the enabling culture or group.

Let’s consider the decades of science on overconfidence, confirmation bias and cognitive dissonance to get a better grasp of what's going on with double-downs, why they matter and what to guard against.

Cognitive Dissonance

Cognitive dissonance is the unpleasant feeling you get when you're forced to reconcile two contradictory facts, beliefs or behaviors at the same time. It's a well-studied field in social psychology, as well as in management research. One of the cornerstone studies of cognitive dissonance theory (CDT) found that "if a person is induced to do or say something which is contrary to his private opinion, there will be a tendency for him to change his opinion so as to bring it into correspondence with what he has done or said. The larger the pressure used to elicit the overt behavior ... the weaker will be the ... tendency."

In other words, we not only like to believe we're right, we also like it when other people think we're right. We don't, however, like it when we're compelled to contradict our convictions. The more pressure we feel to do so, the more we stick to our guns. In CDT terms, we can frame the double-down as an escalation of commitment.

How do we react to cognitive dissonance? We might simply ignore the inconvenient stuff (dissonant cognitions); we might add consonant cognitions (i.e., supportive facts) in an effort to stack the deck in our favor (spreading the alternatives); or we might dismiss the dissonant cognitions as unimportant after all.

What are the signs leaders should look for to know when cognitive dissonance might arise?

Responsibility for choice. When we feel personal responsibility for a decision, we feel more pressure to reduce the discrepancies in our cognitive dissonance. As pointed out in one study, when someone presents us with evidence that we made a poor choice, our dissonance increases. We scramble to resolve it, often by doubling down and escalating our commitment to our original position. Think of the disbelief John Hammond displays in Jurassic Park when first confronted with the claim that his engineered, supposedly all-female dinosaurs can in fact reproduce.

Self-justification. CDT tells us it's psychologically and emotionally easier to self-justify than it is to revisit our decision, admit fault and change our position. Recent studies in management research have illustrated how this tendency develops in decision-makers (see also here and here), finding that decision-makers frequently react adversely to negative information because it challenges their inner belief that they are rational actors. Ironically, that belief is itself irrational: To go back to the Jurassic Park example (Editor’s note: Yes!), we can contrast the irrational reaction from Hammond – who prides himself on scientific rationality – with that of Jeff Goldblum's Dr. Ian Malcolm, a chaos theory specialist who sets his ego aside and immediately accepts the facts: "Life finds a way."

Selective information processing. Closely related to the above, CDT studies have also predicted that when someone has a preferred choice in mind, but hasn't yet made a decision, they'll tend to cherry-pick information. Unsurprisingly, cherry-picking (i.e., “selective information processing”) coincides with escalation of commitment.

The corporate world provides familiar examples. Business leaders often experience cognitive dissonance when new competitors or new models emerge. When the Next Big Thing comes along, it not only challenges an existing successful business model, it challenges a veteran leader's personal experience running that business.

Consider Blockbuster, which in 2000 passed on the chance to buy Netflix for $50 million, and whose later CEO, Jim Keyes, said in 2008: “Neither Redbox nor Netflix are even on the radar screen in terms of competition.” Sure, they weren't on his radar as competitive entities, but their business models told a different story. Blockbuster, of course, went bankrupt clinging to an antiquated model in the face of the overwhelming strength of the on-demand market, and Keyes' dismissal is now the stuff of legend.

Overconfidence

Double-downs in business, from product decisions to hiring and investments, are often a function of overconfidence. If you truly believe that you're right, you can't give in, and in fact you often have to push your position even harder. In some circumstances, of course, this can be positive, but this could also mask an overconfidence bias.

According to one study, overconfidence bias is more prevalent among entrepreneurs than the general population. This isn't exactly surprising. Eager people – especially young, eager ones – who are confident enough to pour limited resources into a new idea might not be well tuned to their own personal limits. They might perceive that their decisions carry less risk, or, if they have limited experience, they might not see some risks at all. The Dunning-Kruger effect famously illustrates this issue. To take it a step further, if someone has an unrealistic view of their own abilities, they might be more predisposed to combine their overconfidence with anchoring, or "going with your gut." (Editor’s note: See also, Illusory Superiority.)

Again, you can win big this way, or you can go bust. Nothing encourages a double-downer like positive reinforcement, and when you develop such overconfidence, you might double down on doubling down. What's more, others might be more predisposed to go along with you.

Confirmation Bias

Confirmation bias – the tendency to search out, interpret and prefer information that supports a belief (Editor’s note: read all about it!) – is dangerous to business for several reasons. Simply put, it can inform bad decisions.

In hiring, for instance, studies show how the presentation of candidates affects not only how recruiters view them, but also how others in the company view them. We see this in the tendency to favor candidates who come on the recommendation of a friend or trusted associate. We also see it at the company level, when a candidate or new hire is presented as having already earned the confidence of others in the company. An overriding faith in ourselves, or in others we trust, can lead us to make the wrong hires again and again, looking anywhere except into the mirror for explanations of our high turnover rate.

Or with a product launch: A business leader might tell his company he's come up with a killer idea, and direct a team to conduct surveys, research the competitive landscape, hold focus groups and so on. He might not pay any attention to the real data, however: All that market research is just a way to confirm what he already believes. Worse, he might not even be aware he's doing it. I've personally experienced a version of this, and it nearly tanked my company.

How to Fight the Double-Down

There are a few strategies, but the overriding theme is familiar: Get egos out of the picture. When the sense of self is threatened, dissonance creeps in.

Consider having someone play the role of devil's advocate. Task someone with arguing against prevailing positions. Not only would such a role be helpful in early stages of the decision-making process but also later, when attitudes – and egos – have begun to harden. This also helps create a business culture that normalizes healthy criticism and admitting fault.

Put another way: Don't take yourself so seriously. A company, for better or worse, is a team effort. You've chosen your team for a reason. No single person knows how to do it all. The rise of the casual corporate culture popularized and then caricatured by foosball-loving internet startups helped encourage employees and leaders alike to relax, which, in many instances, led to more democratic office environments in which dissent wasn't emotional and disagreement wasn't stigmatized. In such a cooperative environment, the double-down is a poor culture fit. (Of course, that also runs a risk: If you don't take yourself seriously enough, you might double down on that.)

The double-down can also work inversely, as a sort of uniting force. For instance, when a supervisor breaches a code of conduct, teams experience dissonance between allegiance to the boss and disapproval of the boss's behavior. Frequently, this actually unites teams and can spur difficult but necessary changes. In other words, our motives to reduce dissonance don't always lead to negative and irrational double-downs; they can just as easily be positive.

The social contract depends on accepting that sometimes – a lot of the time – we're wrong. Doing so helps us build trust and offers rewards such as forgiveness, honor, flexibility, community, and personal and corporate growth. People feel more comfortable around a person who listens and is open to change, and they’ll be more motivated to work with and for that kind of person. It’s in our personal, professional and organizational self-interest to watch out for the double-down. The responsibility, therefore, is on the culture or group that enables an irrationally stubborn decision-maker to double down. Ultimately, it comes down to all of us.
