(C) Daily Kos
This story was originally published by Daily Kos and is unaltered.



Cognitive Bias Bootcamp: The Dunning-Kruger Effect [1]


Date: 2022-10-10

Hello again, fellow travelers! It’s been a while since I’ve added a new entry to one of my “Bootcamp” series (lots of reasons why that I won’t go into right now, but holy crap, has it really been since July???), but as I’ve found the time this weekend, I wanted to put together one on a cognitive bias that gets a lot of bandwidth on the ’net and is pretty widely known: the Dunning-Kruger effect!

So, what is the Dunning-Kruger effect?

It’s a cognitive bias whereby people overestimate their knowledge or competence in a given area. The phenomenon was first defined by researchers David Dunning and Justin Kruger in 1999 in their paper “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments.”

In their study, for example, Dunning & Kruger evaluated people with assessments of logic, grammar, and sense of humor, and found that participants who scored in the 12th percentile rated themselves, on average, as being in the 62nd percentile.

The basic concept is that in order to successfully evaluate one’s own skill or knowledge in a subject, a person needs a certain level of competence in that area. In other words, they need to know enough to know how little they really know. To quote Dunning & Kruger themselves, “Those with limited knowledge in a domain suffer a dual burden: Not only do they reach mistaken conclusions and make regrettable errors, but their incompetence robs them of the ability to realize it.”

In 2018, a follow-up study by Anson, “Partisanship, political knowledge, and the Dunning‐Kruger effect” (I haven’t provided a link because it’s paywalled, but the abstract is easily found if you’ve a mind to find it), found that “individuals with moderately low political expertise rate themselves as increasingly politically knowledgeable when partisan identities are made salient. This below-average group is also likely to rely on partisan source cues to evaluate the political knowledge of peers.” In other words, people with low levels of political knowledge not only overrated their knowledge, but the effect was more pronounced when the politics involved were partisan in nature. This group was also more likely to get what knowledge they had from partisan sources. This held up regardless of whether the subjects’ politics were on the left or the right of the political spectrum.

The graph I’ve put at the top of this diary illustrates how the knowledge curve seems to go. Initially, the subject picks up a little knowledge or skill in something, and with just that little bit under their belt, they soon reach a point where they think they know much more than they really do.

But then, if they continue to gain skill/knowledge, they quickly get past that and realize just how little they actually know. Sometimes this self-evaluation more or less falls off a cliff into the “Valley of Despair” labeled on the graph, where the subject, on realizing they really aren’t nearly as knowledgeable/competent as they thought, becomes frustrated and discouraged. In other words, they realize that knowledge/skill doesn’t come cheap and requires a lot of work, and they may despair of their ability to get there.

After that, knowledge/skill level and self-evaluation of same move upward in a much more connected and consistent way. Someone who’s pretty competent in an area generally knows they have that degree of competence but also knows they haven’t reached true expertise, for example, and true experts usually recognize that they are among the top of their field — though even they often realize that there is always a higher level to attain.

But...is it real? The Controversy over Dunning-Kruger

Some people argue that the Dunning-Kruger effect is not real and can be explained away as a mere statistical artifact. At least two papers, for example, have claimed that the Dunning-Kruger pattern can be reproduced using random data, and that experts and novices alike tend to both over- and underestimate their abilities; experts just do so within a smaller range.
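To see what the “random data” critique is getting at, here is a toy simulation (my own illustration, not taken from any of the papers in question): generate “actual skill” and “self-rating” as two completely independent random variables, convert both to percentile ranks, and then bin people by their actual-skill quartile. Because self-ratings are pure noise, every quartile’s average self-rating lands near the 50th percentile, so the bottom quartile appears to “overestimate” and the top quartile to “underestimate,” reproducing the shape of the classic Dunning-Kruger plot from data with no real effect in it at all.

```python
import random

random.seed(42)

n = 10_000
# Two independent uniform variables: neither has any relationship to the other.
actual = [random.random() for _ in range(n)]
selfrate = [random.random() for _ in range(n)]

def percentile_ranks(xs):
    """Convert raw scores to percentile ranks on a 0-100 scale."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    for r, i in enumerate(order):
        ranks[i] = 100.0 * r / (len(xs) - 1)
    return ranks

actual_pct = percentile_ranks(actual)
self_pct = percentile_ranks(selfrate)

# Bin by actual-skill quartile and average the self-rating in each bin.
bins = {q: [] for q in range(4)}
for a, s in zip(actual_pct, self_pct):
    bins[min(int(a // 25), 3)].append(s)

for q in range(4):
    avg = sum(bins[q]) / len(bins[q])
    print(f"actual-skill quartile {q + 1}: mean self-rating {avg:.1f}th percentile")
```

Every quartile’s mean self-rating comes out near 50, so the gap between self-rating and actual skill is large and positive only for the bottom quartile, purely by construction. Dunning’s counterargument, discussed below, is that real studies use methods that can distinguish this artifact from a genuine effect.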

Another argument is that Dunning-Kruger is just a combination of the “better than average” cognitive bias, whereby people in general tend to be overly optimistic about their abilities compared to their peer group, and regression to the mean.

David Dunning himself has responded to these critics, noting for example that the “statistical artifact” crowd are (though he doesn’t specifically use the term) cherry-picking, basing their case on a narrow selection from a couple decades of studies while ignoring the data that doesn’t support their desired conclusion. Dunning also noted that other methods have been used to evaluate the Dunning-Kruger effect that avoid the pitfalls of regression.

The set of folks who think the Dunning-Kruger effect is not real is relatively small, and the general consensus of research over the nearly quarter of a century since the initial study is that the effect is real, though its specific impacts can be debated.

Consider, for example, a 2018 study of 1,300 Americans polled about vaccines. Roughly a third of them claimed to be as knowledgeable as, or more knowledgeable than, doctors and medical researchers about the causes of autism. When actually tested on their knowledge, that group scored the lowest of those surveyed. That seems to be far, far more than a simple “statistical artifact” could explain away.

Something to consider going forward, though: falling victim to Dunning-Kruger is NOT an indicator of low intelligence. Someone is not stupid just because they overestimate their tennis ability (related side note: oh, my, have you seen how many guys think they can score against Serena Williams?) or their knowledge of the Roman Empire. Someone can, indeed, be a brilliant and acknowledged expert in one field and yet fall victim to Dunning-Kruger in another where they are not (see, for example, Nobel Prize winner Linus Pauling and his vitamin C obsession). So don’t assume that someone falling victim to Dunning-Kruger is a fool or stupid. And, indeed, it is possible that you or I may fall victim to it ourselves if we are not careful — which is why it’s so important to be aware of these cognitive biases so we can do our best to self-evaluate.

Logical Fallacies Bootcamp:

The Strawman

The Slippery Slope

Begging the Question

Poisoning the Well

No True Scotsman!

Ad Hominem

False Dilemma

Non Sequitur

Red Herring

Gamblers Fallacy

Bandwagon Fallacy

Appeal to Fear

The Fallacy Fallacy

Appeal to Personal Incredulity

Appeal to Authority

Special Pleading

Texas Sharpshooter

Post Hoc

Appeal to Nature

Furtive Fallacy

Alphabet Soup

Cognitive Bias Bootcamp:

Bystander Effect

Curse of Knowledge

Barnum Effect

Declinism

In-Group Bias

Hindsight Bias

Survivor Bias

Rhyme-as-Reason Effect

Apophenia (& Paradoleia)

[END]
---
[1] Url: https://www.dailykos.com/stories/2022/10/10/2111202/-Cognitive-Bias-Bootcamp-The-Dunning-Kruger-Effect

Published and (C) by Daily Kos
Content appears here under this condition or license: Site content may be used for any purpose without permission unless otherwise specified.
