I get ideas everyday of things to write about: tips, links to articles, strings of thematically similar posts on Facebook. I admit, I’m kind of a slow writer, and I often start a piece, get distracted and start notes on another topic, and wind up having multiple drafts I come back to as my mood suits me until I feel they are good enough to publish.

However, I also get discouraged during this process because somehow I feel like either there will never be an end to the topics that need to be covered, or that I won’t get through to the people who don’t accept expert evidence or reasoned arguments — and trust me, they’re more common than you think.

I’ve become very interested in the psychology of believing things without evidence, particularly as it relates to medicine and holistic healing. An interesting article in the November/December 2014 issue of Skeptical Inquirer titled “Remembrance of Apocalypse Past” looked at the psychological mechanisms employed by believers in the 2012 Mayan apocalypse, and how those mechanisms allowed them to maintain their belief even after the apocalyptic date passed and nothing happened.

According to the authors, four factors come into play when debunked beliefs persist: cognitive dissonance, subclinical dissociative tendencies, gestalt processing, and conceptual availability. As a psychological layperson, I find these four factors a bit difficult to wrap my brain around, but I want to explore how I see each of them, as described in the article, in my everyday musings in the world of pseudoscientific health and wellness.

Cognitive Dissonance. This happens when inconsistencies between ideology and fact create discomfort for the person holding the ideology. Usually, the more invested a person is in a belief — financially, emotionally, or as part of an attitude or lifestyle — the higher the value they will place on it, and the harder they will fight conflicting information.

This rejection of contrary evidence is especially strong among those who have latched onto natural medicine, because the appeal of simple, safe, panacea cures is so much greater than the complex, imperfect, and often limiting reality of science-based treatments. Pointing out the flaws or hypocrisies of the belief in question creates so much discord that the person will cope by finding any mental escape back into a place of harmony. They will attempt to rationalize how the two conflicting bits of information can coexist, and consequently reinforce the faulty belief. This is how conspiracy theories are born.

Dissociative Tendencies. The authors define this type of dissociation differently from how it is used in psychiatry. Subclinical dissociative tendencies are described as a fairly normal habit of perceiving the world in a way slightly altered from reality. One common form of this is assuming supernatural explanations for mundane abnormalities. If the lights begin flickering suddenly, a dissociative tendency would make the person experiencing it more likely to think they saw a ghost, when the more plausible explanation is an electrical interruption.

With credulous treatments, a person puts too much trust in their personal experience rather than objective data. As Dr. Steven Novella, a famed neurologist and skeptic puts it, “When someone looks at me and earnestly says, ‘I know what I saw,’ I am fond of saying, ‘No you don’t. You have a distorted and constructed memory of a distorted and constructed perception, both of which are subservient to whatever narrative your brain is operating under.'”

While that may sound arrogant, the research backing that statement, particularly with regard to eyewitness testimony, is hard to argue with. The same is true for people who have self-experimented with some natural remedy. They may preface their testimony with “I was skeptical at first…” but they will swear the results they experienced were not only real, but directly caused by the remedy they used and no other variable. This is why anecdotal evidence is not reliable at all.

Gestalt Processing. A type of human information processing, this factor seems to work alongside dissociative tendencies. The theory holds that gestalt processing sits at the opposite end of the scale from feature-intensive processing: very little critical thought is given to a concept, and the concept is accepted as a whole. This tends to happen more with ideas that are not very familiar to us.

For example, while we all eat food and most of us can describe the basic Magic School Bus version of how digestion works, very few of us are really educated on all the biochemical reactions that take place at microscopic levels throughout our various organs and systems. So when somebody proposes that X food is bad because of Y ingredient, which is also used in Z industrial manufacturing process, it is easy to credulously entertain a conspiracy theory of corporate poisoning in the name of profit. That is a big leap to make: it fails to explain precisely how said ingredient allegedly causes health problems (at what dose, for how long, at what exposure rate, etc.), not to mention why a business or government would have any motive to kill its customers/citizens (including their own families). Yet it is gestalt processing that allows people to believe that sort of information.

Similarly, when word spreads that a plant with a long history of use in religious ceremony and folk medicine has shown positive preliminary findings against very specific types of cancer cells, it is those dissociative personalities, aided by gestalt thinking, that allow the concept to jump to Miracle Plant Cures Cancer! As the exciting news spreads through social media, news outlets, office gossip, dinner table banter, and finally legislative campaigns, we get the final factor contributing to pseudoscientific belief: conceptual availability.

Conceptual Availability. This, to me, is arguably one of the most important ways a simple belief becomes unquestionable dogma. The more attention a claim receives, the more likely those with reduced feature-intensive processing are to be influenced by it. Take for example the vaccine-autism link myth. Though it has long been debunked, decimated in the most thorough way, it is still talked about so often that many parents think there must be something to it. And while the availability heuristic could probably work both ways, the difference is that scientific beliefs actually have evidence to back them, whereas pseudoscience has only gut feelings.

The influences of conceptual availability, alongside uncritical gestalt processing, which occurs most often in people with dissociative tendencies who have a lot to lose if they’re wrong, are, according to the authors of the Skeptical Inquirer piece, central to the continued belief in irrational and pseudoscientific ideologies. Again, I’m not a psychologist, but from all my experience — ranging from being a former believer in certain areas of quackery, to engaging with current steadfast believers, to researching and debunking credulous claims — I can easily accept that this is true. However, there’s one last factor I would add to this list: Dunning-Kruger.

The Dunning-Kruger Effect. Named after the Cornell University researchers who first described this cognitive bias, the Dunning-Kruger effect is basically a miscalibration of people’s understanding of their place among others. In other words, the unskilled demonstrate illusory superiority, overestimating their knowledge of a skill or trade, while the highly skilled tend to underestimate their competence relative to others. It’s a classic dichotomy of arrogance and humility.

I suspect that the Dunning-Kruger effect works into the psychology of believers just as much as the four factors mentioned above. When you consider the major personalities and leaders of the natural movement, almost none of them have advanced degrees in the areas they preach about. When their amateur-sourced opinions are confronted with professional criticism or educated debate, they would rather cry defamation and persecution, deflect with broad ad hominem attacks and unsubstantiated accusations, and censor opposing information and questions that could lead to the dissonance they so readily avoid (after all, they usually have major financial and emotional commitments to protect).

What irony that they and their followers, whose entire lives are formed around a logical fallacy (the appeal to nature), should declare superiority over those whose life’s work is devoted to systematically understanding the natural world and presenting unbiased information in hopes of advancing the collective knowledge of humankind.

The question then becomes, what can be done to counter these influences? The authors of the Skeptical Inquirer article attempt a response to this by saying, “The answer is obvious: We need better education in science, in feature-intensive consideration of facts, and in the ability to analyze paranormal claims in terms of their specific details. For human beings, the world consists of a blend of objective reality and our subjective interpretations of that reality; it is that subjective interpretation that is the most subject to the salutary influence of education.”

I don’t know how I feel about this, though. To me, it sounds like a very vague and half-assed conclusion, as if they were reaching their word count limit and needed to wrap it up quickly. Of course education and critical thinking need to be bolstered, especially when it comes to scientific literacy, but if it were that simple to influence the subjective interpretation of objective reality, we would not still be combating pseudoscience today, when scientific advancements have produced such a wealth of information that we know we know.

With little expertise on this, I’m not certain in what ways we can change how a person, and by extension a culture, thinks. Just broadcasting the facts can’t change someone’s mind if facts were not part of the belief’s original makeup. I’m learning more and more that it’s not what you say, but how you say it. Is there even a single technique that works? I suspect that is an existential question I will continue to grapple with as long as I write this blog.

Sharps, Matthew J., Schuyler W. Liao, and Megan R. Herrera. “Remembrance of Apocalypse Past: The Psychology of True Believers When Nothing Happens.” Skeptical Inquirer, Vol. 38, No. 6, November/December 2014.