
Know Your Nuggets: Dunning-Kruger

Bob Sullivan
Writer

Editor’s Note: I heard someone mention the Dunning-Kruger Effect at a conference the other day, so I’m pretty sure I’m an expert now. But I know not everyone is as super smart and ready for political office as I am – sorry, y’all – so I figured we’d do one of our Know Your Nuggets on it and maybe you’d learn something. I mean, I haven’t read it, but I don’t need to, ‘cuz I’m super smart. Dunning-Kruger 2020!


It can sure feel smart to mention the Dunning-Kruger Effect at a cocktail party. It's fun to talk about and it often gives the speaker a kick of feeling superior. There's usually a giggle or two that accompanies mention of the "people are too stupid to know they are stupid" cognitive bias. Be careful, though – ironically, many people overestimate how well they understand it.

Named after social psychologists David Dunning and Justin Kruger, the concept is based on their original study published in 1999. Their experiments were simple: participants were asked to rate their own skills in areas like grammar, then they were tested. Poor performers wildly overestimated their abilities. Those who scored in the 12th percentile – at the very bottom of the class – had estimated they'd land in the 62nd percentile, squarely above average. Anyone who's ever corrected someone's grammar on a social media post probably knows the feeling.

The idea seems pretty straightforward: People don't know what they don't know. The less they know, the less likely they are to seek out new information that might change their views. But there's something more subtle about the concept:

“The first rule of the Dunning-Kruger club is you don’t know you’re a member of the Dunning-Kruger club,” Dunning told Vox in a recent interview. “People miss that.”


Beware the notion that Dunning-Kruger is about “stupid people.” The 1999 research project found that people who scored as high as the 80th percentile still overestimated their abilities. While the effect is more dramatic at the bottom of the test scores, four out of five people didn’t know as much as they thought they did. Odds are, you overestimate yourself, too. We all have a little Dunning-Kruger in us.

Since that initial study, the results have been replicated around the world; meanwhile, Dunning and Kruger have honed the concept. The latest research shows that learning something about a topic can actually make things worse. People often erroneously think they've become very knowledgeable about something after spending just a short amount of time researching it. Doctors who see patients who have just spent 10 minutes reading about a disease on the Internet struggle with this problem daily.

Complete neophytes seem to be aware of how little they know, but the Dunning-Kruger Effect spikes hard when people take on a new subject.


In a 2018 study, Dunning and co-author Carmen Sanchez called this the "beginner's bubble." In an experiment that asked subjects to identify made-up "zombie diseases," participants "rapidly surged to overconfidence" after just a few learning experiences. The authors then replicated their findings in the real world: they found that young adults who enter the financial world quickly learn to overestimate their financial literacy skills, before slowly realizing how little they know about money.

"When it comes to overconfident judgment, a little learning does appear to be a dangerous thing," the authors write. "Although beginners start with humble self-perceptions, with just a little experience, their confidence races ahead of their actual performance."

That's a bad combination in the age of the instant expert. This is bad news for Internet commenters, many of whom comfortably pontificate about everything from global warming to cancer-causing foods.

There are additional patterns to be aware of when discussing Dunning-Kruger. The effect appears to have a gender dimension: plenty of research shows men are more likely than women to overestimate their abilities. Here's one example: in a study of college physiology students, the average male student had a 61 percent chance of thinking he was smarter than a classmate, while the average female student had only a 33 percent chance.

Money might be a factor, too. People from higher social classes tend to be more overconfident than those from lower ones. In a paper titled “The Social Advantage of Miscalibrated Individuals,” the authors found that business owners and job applicants who came from means were more likely to be overconfident.

“Overconfidence, in turn, made them appear more competent and more likely to attain social rank,” the authors note, lamenting that this characteristic of human nature can help perpetuate socioeconomic inequality.

But perhaps the least well-understood part of the "stupid people don't know they are stupid" bias is that it also includes an equally pernicious mirror image – smart people don't know they are smart. Back to Dunning and Kruger’s initial work: they found that performers in the top 20 percent underestimated how good they were. Because they underrate themselves, experts tend to believe that the overconfident among them are actually their equals, or even smarter. That really opens the door for overconfident idiots.


So how to deal with Dunning-Kruger in the office? First, it's best not to call it the "stupid" bias.

One thing Kruger suggests is to be comfortable with saying, “I don’t know.” In study after study, he says, it’s remarkable how few subjects opt for “I don’t know” when given the choice. Neil Postman also makes this point in the book “Amusing Ourselves to Death” – how often do you see an expert on television give “I don’t know” as an answer?

We all have blind spots. The more willing we are to acknowledge them, the less likely we are to fall for overconfidence.

On the other hand, it’s important to recognize that there may very well be smart people in your office who are shouted down by the overconfident. Actively solicit their opinions. Don’t assume it’s a bad sign that an employee sounds hesitant or cautious when making a suggestion; it might mean they are a “superforecaster,” someone who wisely treats their own opinions as potentially wrong.

But also, most critically, be humble. Accept that, statistically, you are probably wrong several times each day. Test your choices. Examine data after the fact, and leave the door open for course correction. Don't fall for the trap of thinking you can become an instant expert on a topic, that you can read a brief or a book and command some subject matter in a day or two.


The real lesson from Dunning-Kruger is that the learning curve doesn’t follow anything like the shape we imagine – a slow steady climb up a staircase. Instead, it’s more like falling off a cliff and climbing back up before you can ascend to the top of a mountain.
