Recent research has shed light on people’s inability to detect false or misleading information, revealing what researchers affectionately dub the “bullshit blind spot.”
In a pair of studies, researchers found that people who were the worst at detecting misinformation often vastly overestimated their detection skills, believing they were better than the average person. Conversely, those who excelled at detecting BS tended to underestimate their abilities, believing they were slightly worse than the average person.
The findings, published in the journal Thinking & Reasoning, offer valuable insights into why seemingly intelligent people sometimes fall prey to false information and the psychological factors at play.
Lead study author Dr. Shane Littrell, a postdoctoral research associate at the University of Miami, said he and co-author Dr. Jonathan Fugelsang, a professor of psychology at the University of Waterloo, wanted to better understand if there are common characteristics among people who fall for misinformation and “why smart people believe dumb things.”
“My co-authors and I recently published a study examining whether people who spread misinformation are also more likely to fall for it – that is, whether one can ‘bullshit a bullshitter,’” Dr. Littrell told PsyPost.
“One of the main implications of that work suggests that people who intentionally spread misinformation in some situations can also unintentionally spread it without realizing it in other situations,” Dr. Littrell added. “To me, this seemed to suggest that some people who knowingly spread bullshit are unaware of the fact that they often fall for it themselves, possibly because they think they’re better at detecting it than everyone else.”
Researchers conducted two studies involving 412 participants to examine the connections between BS detection, overconfidence, and cognitive processes in evaluating misleading information.
In the first study, participants rated a series of statements as profound or nonsensical. Their ability to correctly classify actual profound statements and identify generated nonsense statements determined their BS detection score. Participants also estimated their own performance and the performance of others, providing valuable confidence metrics.
Half of the statements examined by participants were quotes from famous public figures, generally considered profound. The other half of the statements were randomly generated by a computer program to sound profound but were inherently meaningless and nonsensical.
The researchers also sought to determine whether individuals prone to falling for BS rely more on fast, intuitive thinking or slower, reflective thinking.
To explore this, researchers conducted a second study examining individual differences in participants’ perceptions of their thinking processes and whether they could detect nonsensical information immediately or only upon further reflection.
By comparing participants’ subjective ratings of their thinking process with objective measures of their evaluation speed, the study found a positive correlation, suggesting that both intuitive and reflective thinking are involved in detecting and falling for BS.
Dr. Littrell said the main finding from the study was that individuals most susceptible to falling for BS not only overestimated their detection abilities but also believed they were superior to the average person in this regard.
“This is kind of a double-whammy in terms of bullshit susceptibility that we call the ‘bullshit blind spot.’ The other interesting finding was that the people who are best at detecting BS are actually under-confident in their detection skills and think they’re worse at it than the average person (i.e., they have a bullshit ‘blindsight’),” said Dr. Littrell.
According to Dr. Littrell, the “bullshit blind spot” is more problematic than “bullshit blindsight” since it involves individuals who are both poor at detecting BS and overly confident in their abilities.
“I think the most important thing to take away from our findings is that everyone would be better off practicing more intellectual humility and skepticism,” said Dr. Littrell.
“This is tough for most people because we all like to believe that we’re smart and in control of what we think and believe and that we aren’t easily fooled. Unfortunately, many people who believe this are quite wrong.”
Researchers noted that their recent findings seem to support past studies showing that less intelligent people tend to produce bullshit more frequently but less convincingly, while people with higher intelligence produce it less often but at higher quality and with greater persuasiveness.
“It may be the case that bigger bullshitters are not necessarily better bullshitters,” the study authors wrote. “Bringing these related lines of research together seems a logical ‘next step’ for future bullshitting research to take.”
Researchers say the results ultimately serve as a reminder that everyone is susceptible to being fooled by misinformation.
Dr. Littrell highlighted the need for intellectual humility in daily life, enabling individuals to recognize their cognitive vulnerabilities and become more discerning about the information they encounter.
“The phrase, ‘What if I’m wrong?’ can be an incredibly liberating and protective mantra to live by,” said Dr. Littrell.
[Editor’s Note: An earlier version of this article paraphrased Dr. Littrell as saying “the most striking finding from the study…”; this was changed to “the main finding from the study…” to better reflect the study authors’ sentiments.]
Tim McMillan is a retired law enforcement executive, investigative reporter and co-founder of The Debrief. His writing typically focuses on defense, national security, the Intelligence Community and topics related to psychology. You can follow Tim on Twitter: @LtTimMcMillan. Tim can be reached by email: tim@thedebrief.org or through encrypted email: LtTimMcMillan@protonmail.com