THE CURSE OF OVERCONFIDENCE

In this podcast and Q&A, David Dunning, co-discoverer of the Dunning-Kruger effect, investigates the misinformation gap built into our brains: We don’t know what we don’t know.

By Corey S. Powell

The Dunning-Kruger effect describes a disturbing cognitive bias that afflicts us all. People with limited expertise in an area tend to overestimate how much they know—and we all have gaps in our expertise.

That disconnect may explain why some patients turn to “Dr. Google” to make at-home diagnoses of complex medical problems, as well as the missteps we all make from time to time, from fixing the plumbing to representing ourselves in a court of law.

Over the years, the Dunning-Kruger effect has gone from a scientific hypothesis to a popular meme, pulled out in shouting matches across social media. In the hierarchy of insults, there are few more powerful than invoking the idea that your opponents are so stupid that they don’t even know how stupid they are. It’s just one step short of calling the other side a bunch of Nazis, aka Godwin’s Law—the traditional way that flame wars end. 

David Dunning, now a social psychologist at the University of Michigan, and Justin Kruger, now at NYU, proposed their namesake effect in a famous 1999 paper.



In a series of surveys combined with tests, they found that students from Cornell who scored in the bottom quartile estimated that they had scored in the third quartile, and identified related forms of unearned confidence. Since then, Dunning has extended his investigations into the mechanisms of trust and belief. OpenMind co-editor Corey S. Powell spoke with Dunning about his ubiquitous effect and how it colors self-knowledge for us all. (This conversation has been edited for length and clarity.)


LISTEN TO THE PODCAST

OpenMind · Expertise

Corey S. Powell speaks with David Dunning on the nature of expertise.

READ THE INTERVIEW

The Dunning-Kruger effect is a term that gets thrown around a lot in arguments, especially online. People use it to say that their opponents don’t know what they are talking about. What’s it like to have your research turn into a pop-culture meme?

It’s strange because public notoriety has nothing to do with scientific or academic notoriety. I feel about it like Jack White of The White Stripes feels about that riff from “Seven Nation Army.”

It’s everywhere, around the globe. Jack White is tickled pink that something that he wrote passed into folk music. I feel the same way, but I wish people wouldn’t use it as an invective, because it is really about being reflective about yourself and knowing that there might be things you don’t know. It’s not about judging other people.

The common misconception is that the Dunning-Kruger effect means “stupid people don’t know they’re stupid.” Can you explain the true significance of your research?

The Dunning-Kruger result is a little complicated because it’s actually many results. The one that is a meme is this idea: On any particular topic, people who are not experts lack the very expertise they need in order to know just how much expertise they lack.

The Dunning-Kruger effect visits all of us sooner or later in our pockets of incompetence. They’re invisible to us because to know that you don’t know something, you need to know something. It’s not about general stupidity. It’s about each and every one of us, sooner or later.

You can be incredibly intelligent in one area and completely not have expertise in another area. We all know very smart people who don’t recognize deficits in their sense of humor or their social skills, or people who know a lot about art but may not know much about medicine. We each have an array of expertise, and we each have an array of places we shouldn’t be stepping into, thinking we know just as much as the experts.

My philosopher friend and I call that “epistemic trespassing,” because you’re trespassing into the area of an expert. We saw this a lot during the pandemic. There was a law professor who knew a little evolutionary biology and a little math. He came up with a model of how many cases of COVID-19 there would be in the United States, and his answer was 500, maybe 5,000. He had trespassed into the realm of epidemiology, and he didn’t know what he didn’t know.

Does being aware of the Dunning-Kruger effect help you avoid it, or does it make you even more vulnerable?

One common question I get asked is: What about you? What are your Dunning-Kruger spots? My response is, if Justin and I are right about the Dunning-Kruger effect, I’m the last person to know the areas where I’m incompetent. I’m sure I have colleagues and friends who’d be very willing to fill you in. But life is very good at revealing them. I think it was Vernon Law, the baseball pitcher, who said that life is the cruelest teacher because it gives you the test before it provides the lesson.

I’m willing to listen to the lesson after the test. And I’m resigned to the observation that you become a master at a science once you realize that you are always going to be a beginner. There are always going to be new challenges to face. I’m going to have to improve, to change what I do, to learn what mistakes I’m prone to.

When I look back at my papers, including the 1999 paper that caused us to have this interview, there are things I wish I could have done differently. I welcome that feeling. I’m never going to be a finished product, despite my advanced age. That’s the philosophy I’ve settled on.

There have been several recent articles criticizing the 1999 Dunning-Kruger study, questioning both its methods and its conclusions. How do you respond?

I’m glad you brought it up because you should know there’s a critique. That’s part of science. The critique is that the Dunning-Kruger effect is a statistical artifact known as regression to the mean. People who are poor performers on a test can only overestimate themselves. Those who are high performers can only underestimate themselves, so it’s a measurement error, an artifact.

We talk about that issue in the original article. We did a nine-study series investigating regression to the mean. Other people have done studies that call the artifact into question. The critique tends to focus on the first two studies of a four-study paper in 1999. I can’t dismiss the irony of people not taking into account the 25 years of research that have happened since.

Another area of your research is decision-making. What have you learned about the ways that we process information and then choose what to do?

One theme we look at is what psychologists call motivated reasoning; lay people refer to it as self-deception, wishful thinking or rationalization. The list of creative ways people have for reaching conclusions they wish to reach and dismissing conclusions they find threatening is amazing.

We’ve shown that it goes down even to the level of visual perception: You literally see what you want to see. If we show you wonderful chocolate truffles, they appear physically closer to you than if we take those same chocolate truffles and form them into the shape of dog poop. The more you delve into it, the more you realize the brain is interpreting what’s going on all the time.

When people are looking for advice, they often seek out authority figures who they think share their values, or who believe the things they already believe to be true. Is there a way to break out of that pattern?

Robert Heinlein, the famous science fiction author, said that it’s difficult to learn from someone who always agrees with you. You have to find dissenting voices. The best expert is plural. Look at consensus. Look at multiple experts. Check out a variety of people, and watch out for just favoring the ones who already agree with you.

This gets to another related area—how we decide who to trust. Have you come up with any helpful answers?

If you’re a rational self-interested being, you should assume that other people are rational, self-interested beings who are going to exploit you. (In a purely rational system) you shouldn’t trust other people, because they’re not going to reciprocate. And yet, everybody does it. That’s a good thing because trust allows us to have something called civilization.

It’s a mystery why we do it. Trust is especially a mystery to economists who believe in the rational-actor model. We took on that mystery by doing experiments where people could trust another person with their money—a complete stranger. The experiment is anonymous.

If they trust the other person, they can get money back at a profit. But the other person can also decide to keep all the money. So, the question is, do you give your money to a person you’ve never met and never will meet? Maybe you’ll get some money back at a profit, or maybe you’ll lose all your money.

According to a standard economic analysis, no one should give their money. But a majority of people gave their money to a complete stranger. They trusted the other person, even though they thought the odds were likely that they were never going to get the money back.

OK, I’ll bite. Why do we trust strangers, even when it’s not the rational thing to do?

It took us 10 years, but we finally were able to document to our satisfaction what was going on. We all live in a world of norms—certain principles that we live by with other people, even if they’re complete strangers.

One particularly salient norm is so well learned that we forget we even know it: We do not insult other people. We have to give the money because if we don’t give the money, we’re calling them untrustworthy. We’re insulting them.

And we want to avoid that, even if we don’t know who they are and we’ll never meet them. Our research suggests that a lot of our decisions, including purely economic ones, are driven by social and emotional concerns.

Well, people are often weird about money. Does irrational trust extend to other aspects of human behavior?

It’s true for how we judge another person’s knowledge too. Another of the games we play shows that if someone tells us something, we’ve been taught to assume it’s true. That’s what makes us gullible. We’re in an information age when people are concerned about the public being gullible to false information.

But imagine if we believed that everything other people told us was false. Civilization would break down! So we’re built to be gullible. It is part of the rules that make civilization possible.

I’m amazed to hear you say these things since we’re bombarded with stories about how angry and suspicious the public has become. It sounds like you’re saying the opposite: We have so much kindness and politeness that we struggle to overcome it. Is that right?

Well, that’s true. But I think what’s interesting about the internet and social media is that it takes us out of the setting where we learned all these politeness rules. Right here, you and I are having a conversation. We’re in a relationship. Twitter is not that.

On Twitter, I proclaim something by posting, and you come along a few hours later and you proclaim. We’re not interacting, we’re proclaiming asynchronously. The kindness rules and the politeness rules are not in play.

My anthropologist friends remind me that every time a new communication technology comes around, such as the telegraph or telephone, there is a breakdown in social norms. Whatever politeness rules have been built up don’t yet apply to the new platform.

We’re in the middle of that right now. I think what’s happening with social media is that we haven’t developed the politeness rules that we have for face-to-face interaction.

How do we strike the right balance: maintain enough trust for a functioning civilization but keep our gullibility in check so we don’t fall for every loud, crazy idea on the internet?

It’s an interesting question, and the answer is: I don’t know. It’s likely that scientists won’t figure this out. Users will figure this out. Norms will arise bottom-up. For example, on Facebook, a norm had to arise that when couples break up, the person who was broken up with gets to announce it. That norm didn’t arise from a proclamation on high. This is one of the big questions of the future. I just hope there are answers.

Is there any way to know when we’ve trespassed beyond our area of expertise in our own lives? And is there anything we can do about this type of blindness once we are aware?

There are two types of interventions you can do. Something that you can do for other people is give feedback, although not all of us are skilled at giving feedback. One of the best interventions was done by two chemistry professors who taught these huge introductory chemistry classes.

They had students complete weekly practice quizzes before they took the actual tests, so they found out what they didn’t know. Then the professors—and this is key—added a second component.

They had the students sit down and plan out what they were going to do about the deficits they identified. Now they knew: You’re missing this, what are you going to do about it?

So is it just up to us to slog through all the issues surrounding self-deception, motivated reasoning and trust?

A lot of (helpful coping mechanisms) are already incorporated into professions. In law, you have another side, the opposing attorney, who is going to tell you how you’re wrong. Doctors are trained to think about alternative diagnoses.

The scientific method is aimed at disproving hypotheses, not proving hypotheses. In fact, you never use the term prove, because you’re never certain. There are a lot of people in these professions out there, using these techniques every day.

That’s an interesting point. Could we create social and professional institutions that do a better job of reining in our Dunning-Kruger blindspots and our gullibility?

That’s something I’m looking at right now. Being overconfident is only human. I could argue that it is inevitable. We’re always going to choose the course of action that we think is the most reasonable, so naturally we’re going to have some confidence in it. You’d have to evolve institutions that go: Wait a minute, stop and think. Those institutions would have to carry some wisdom that we can thread into our own lives.

This conversation has left me unexpectedly optimistic. The real Dunning-Kruger effect seems a lot less damning toward human nature than the cartoon version that people talk about online.

Yeah, it’s not a question of stupidity. Our ignorance is an everyday companion that we will all carry for the rest of our lives. And ignorance can be a bit of a trickster, darting around the corner, so we never actually get to see what it looks like. I teach a course on self-judgment, and in the first few weeks, I usually dwell on how meager self-insight is.

The Greeks said to “know thyself,” which turns out to be an almost impossible task. I warn my students not to get depressed about this. People worry, “Oh no, there are all these things I don’t know.” Well, that was true beforehand! The only difference is that you now know a few more things than you did before. Maybe a few of those nuggets can be helpful in the future. So don’t get depressed. Be optimistic.



This Q&A is part of a series of OpenMind essays, podcasts and videos supported by a generous grant from the Pulitzer Center‘s Truth Decay initiative.

This story originally appeared on OpenMind, a digital magazine tackling science controversies and deceptions.


We are proud to report that SyndicatedNews.NET is now republishing OpenMind Magazine at https://SNN.BZ under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) license.