The Dunning-Kruger Effect Explained with Confidence
Much of this post quotes directly from a 2014 article by David Dunning, not all of which is credited in the text.
What is It?
The Dunning-Kruger Effect states that incompetent people cannot recognize just how incompetent they are, and are therefore overly confident about their abilities. Experts, by way of contrast, underestimate their ability (Justin Kruger and David Dunning, 1999).
Logic itself almost demands this lack of self-insight: For poor performers to recognize their ineptitude would require them to possess the very expertise they lack. Although what we know is often perceptible to us, even the broad outlines of what we don’t know are all too often completely invisible. To a great degree, we fail to recognize the frequency and scope of our ignorance. On the other hand, experts assume that everyone knows what they know, and therefore they have less confidence in their knowledge being particularly special.
The Effect is Widespread
Although the meme is sometimes quoted as “stupid people don’t know how stupid they are”, the DK Effect affects us all. Anyone who doesn’t know much about any given set of cognitive, technical, or social skills tends to grossly overestimate their prowess and performance – whether it’s grammar, emotional intelligence, logical reasoning, firearm care and safety, debating, or financial knowledge. College students who hand in exams that will earn them Ds and Fs tend to think their efforts will be worthy of far higher grades; low-performing chess players, bridge players, and medical students, and elderly people applying for a renewed driver’s license, similarly overestimate their competence by a long shot.
A Range of Cognitive Biases
The Dunning-Kruger Effect and its related set of cognitive biases are not newly observed phenomena, of course; they have been described throughout history. There is a range of cognitive biases around ‘illusory superiority’, of which the Dunning-Kruger Effect is but one.
There are many quotes describing the Effect, or similar cognitive biases:
“Confidence is what you have before you understand the problem.” – Woody Allen
“For every complex problem there is an answer that is clear, simple, and wrong” – H. L. Mencken
“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” – Josh Billings
“You don’t know what you don’t know” – Scott Adams (Dilbert)
Huh? … [Complete Ignorance is Bliss]
Dunning makes an observation in a 2014 article that I believe goes well beyond the Dunning-Kruger Effect. He states that “…in many cases, [complete] incompetence does not leave people disoriented, perplexed, or cautious. Instead, the incompetent are often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge. [When looking at the results of a simple physics test] …people who got none of the items right often expressed confidence that matched that of the top performers. When looking only at the confidence of people getting 100 percent versus zero percent right, it was often impossible to tell who was in which group.”
I Once Was Blind, But Now I See
It turns out that driver’s education courses, particularly those aimed at handling emergency maneuvers, tend to increase, rather than decrease, accident rates. They do so because training people to handle, say, snow and ice leaves them with the lasting impression that they’re permanent experts on the subject. One solution is to not try to teach them in the first place!
Hmm-m-m. There’s More to This Than I Thought.
Much incorrect self-assessment of competence derives from the person’s ignorance of a given activity’s standards of performance. Dunning and Kruger’s earlier research indicates that training in a task, such as solving a logic puzzle, increases people’s ability to accurately evaluate how good they are at it, but further studies have shown that after a little instruction, people tend to be more confident that they are right – but unfortunately, not a lot less wrong! Dunning goes on to observe: “…people are more confident, even when they are wrong, after instruction. Tellingly, the only response that uniformly went down after instruction was ‘I don’t know.’ What education often does appear to do is imbue us with confidence in the errors we retain.”
Oh Man, I’m Never Going to Understand This
As knowledge in the area increases, people are more able to understand just how much they don’t understand – and in fact often over-compensate in their lack of confidence. This effect persists, so that even experts are not 100% confident in what they know.
Expose Logical Fallacies (Socratic Method)
Dunning says: “In the classroom, some of the best techniques for disarming misconceptions are essentially variations on the Socratic method. To eliminate the most common misbeliefs, the instructor can open a lesson with them—and then show students the explanatory gaps those misbeliefs leave yawning or the implausible conclusions they lead to. For example, an instructor might start a discussion of evolution by laying out the purpose-driven evolutionary fallacy, prompting the class to question it. (How do species just magically know what advantages they should develop to confer to their offspring? How do they manage to decide to work as a group?) Such an approach can make the correct theory more memorable when it’s unveiled, and can prompt general improvements in analytical skills.”
Replace the Misbelief with a Truth
Telling people that Barack Obama is not a Muslim fails to change many people’s minds, because they frequently remember everything that was said—except for the crucial qualifier “not.” Rather, to successfully eradicate a misbelief requires not only removing the misbelief, but filling the void left behind (“Obama was baptized in 1988 as a member of the United Church of Christ”). If repeating the misbelief is absolutely necessary, researchers have found it helps to provide clear and repeated warnings that the misbelief is false. I repeat, false.
Precede Lessons With a Self-Affirmation Exercise
Being wrong prompts a defensive response. To counter that effect, some success has been had in building up self-esteem before challenging a belief. This is particularly important where that belief forms part of the person’s self-identity. In a study in which pro-choice college students negotiated over what federal abortion policy should look like, participants made more concessions to restrictions on abortion after writing self-affirmative essays.
Force Critical Thinking with a Devil’s Advocate
Behavioral scientists often recommend that small groups appoint someone to serve as a devil’s advocate—a person whose job is to question and criticize the group’s logic. While this approach can prolong group discussions, irritate the group, and be uncomfortable, the decisions that groups ultimately reach are usually more accurate and more solidly grounded than they otherwise would be.
It appears that the Dunning-Kruger Effect is, appropriately, subject to the Dunning-Kruger Effect itself. A simple formulation such as ‘stupid people don’t know how stupid they are, and smart people think everyone else must be smart too’ hides layers of cognitive bias complexity. I started my research for this article with the “I Once Was Blind, But Now I See” confidence that most folk are probably in. I’m now somewhere between “Hmm-m-m. There’s More to This Than I Thought” and “Oh Man, I’m Never Going to Understand This”. How about you?