Cutting down the dissonance: the psychology of gullibility

Christina Valhouli

Two years ago, 14-year-old Nathan Zohner, a student at Eagle Rock Junior High in Idaho Falls, announced on the Internet that he had circulated a petition demanding strict control of a chemical known as dihydrogen monoxide. This substance, he wrote, causes excessive sweating and vomiting, can be lethal if accidentally inhaled, contributes to erosion, and has been found in the tumors of cancer patients. The student asked 50 people whether they supported the ban. Forty-three said yes, six were undecided, and only one knew that dihydrogen monoxide was... water.

While supporting a ban on H2O seems more foolish than dangerous, this anecdote shows how readily people embrace some kinds of ideas without subjecting them to critical scrutiny. The human propensity to accept ideas at face value--no matter how illogical--is the fertile soil in which pseudoscience grows. Beliefs in UFOs, astrology, extrasensory perception, palm reading, crystal therapy, or guardian angels do not meet scientific criteria for rational plausibility, such as experimental reproducibility or Karl Popper's idea of falsifiability. They generally rely on anecdotes rather than hard evidence, even as they borrow scientific-sounding terms and rationales; all such concepts can safely be described as pseudoscience. Why do people embrace irrational belief systems even after repeated disconfirmation by scientists?

It is easy to dismiss these ideas as amusing and eccentric, but in some situations they pose concrete dangers to individuals; occasionally they even affect society. Former First Lady Nancy Reagan revealed in her autobiography that she employed an astrologer for seven years to choose dates for important meetings; more recently, Hillary Rodham Clinton admitted to having imaginary conversations with Eleanor Roosevelt on the advice of New Age guru Jean Houston.1 These public figures are hardly alone in seeking answers from the stars and soothsayers; the persistence and popularity of such beliefs reflect the many perceived benefits of pseudoscience. Psychologists agree that all belief systems--astrology, Objectivism, religion--ease anxiety about the human condition and provide the illusion of security, predictability, control, and hope in an otherwise chaotic world.

Scott Lilienfeld, assistant professor of psychology at Emory University and consulting editor at the Skeptical Inquirer, identifies two major catalysts for the prevalence of pseudoscientific beliefs: the information explosion (often a misinformation explosion) and the low level of scientific literacy in the general population. He cites poll data indicating that only 7 percent of the population can answer basic scientific questions like "What is DNA?" or "What is a molecule?" And when science cannot provide answers, or when people refuse to accept a scientific explanation (such as when fertility treatments don't work), pseudoscience often provides highly individualized explanations. "People believe in things like astrology because it works for them better than anything else," says Herbert Gans, the Robert S. Lynd Professor of Sociology at Columbia. "Your own system is the most efficient one, whether it's a guardian angel, a rabbit's foot, or a God watching over you. And if it doesn't work, there's always an excuse for it."

Another reason people find pseudoscience plausible is the cognitive tendency to "see" relationships that don't exist. "We have an adaptive reflex to make sense of the world, and there is a strong motivation to do this," says Lilienfeld. "We need this ability, because the world is such a complex and chaotic place, but sometimes it can backfire." This outgrowth of our normal capacity for pattern recognition accounts for the "face on Mars" (a group of rocks that allegedly resembles a face) or the belief that a full moon causes an increase in the crime rate. When people believe in something strongly--whether it is an image on Mars or a causal interpretation of a chronological association--they are unlikely to let it go, even if it has been repeatedly discounted.

In some cases, contradictory evidence can even strengthen the belief. As Leon Festinger and colleagues discussed in When Prophecy Fails,2 holding two contradictory beliefs leads to cognitive dissonance, a state few minds find tolerable. A believer may then selectively reinterpret data, reinforcing one of the beliefs regardless of the strength of the contradictory case. Festinger and his colleagues infiltrated a doomsday cult whose members were convinced the earth would be destroyed by a great flood; when the appointed date passed and the flood never came, the cult attributed the planet's survival to the power of their prayers. "When people can't reconcile scientific data with their own beliefs, they minimize one of them--science--and escape into mysticism, which is more reliable to them," says Dr. Jeffrey Schaler, adjunct professor of psychology at American University.

Belief systems tend to respond to challenges according to this pattern, says Lilienfeld. When researching a cherished belief or coming across information about it, a person may process the data as if wearing blinders, registering only the affirming information. The malleability of memory compounds this effect. "Once you have a belief, the way you look at evidence changes," says Tory Higgins, chair of the psychology department at Columbia, whose research specialty is mechanisms of cognition. "When you search your memory, you are more likely to retrieve information that will support it and avoid exposure to information that will disconfirm it. If you fail to avoid it, you attack the validity and credibility of the source, or categorize it as an exception."

Dr. Robert Glick, head of the Columbia Center for Psychoanalytic Training and Research, calls belief systems "societal pain relievers." "People will recruit anything from their environment that will ensure and protect their safety," he says. "It gives you a sense that you're not alone, and helps ease feelings of being powerless." Power--whether an increase in a person's perceived power or an abdication of it--is a major component of pseudoscience, and Glick explains people's relations to power in Freudian terms. He describes belief systems as a metaphoric representation of our parents, providing a release from authority and responsibility. "People have a built-in predilection to wish for assistance and support. This is an extension of childhood, when there were always people around us who controlled our lives. Beliefs like astrology and even religion are a projection that there are forces in the heavens that are like your parents."

While it may be fun to read horoscopes in the newspaper, can real harm come from believing strongly in pseudoscience? Lilienfeld advises citizens to consider how pseudosciences pose concrete threats by weakening critical thinking and minimizing a person's sense of control and responsibility. For individuals, this phenomenon can translate into thousands of dollars wasted on quack remedies--not to mention the medical danger to patients who forgo more reliable treatments. The risks extend to the societal level. "We need to be able to sift through the information overload we're presented with each day and make sound judgments on everything from advertising to voting for politicians," Lilienfeld says.

Gans offers a more forgiving point of view. "If someone believes strongly in something like guardian angels, and they're not in a mental hospital, and we haven't provided a better answer, why not?" says Gans. "But if you're just sitting inside your house all day and say, 'Well, my guardian angel is going to take care of everything,' then that's bad." And perhaps not too far from supporting a ban on dihydrogen monoxide.


1. Reagan, Nancy, with William Novak. My Turn: The Memoirs of Nancy Reagan (NY: Random House, 1989). The Clinton anecdote is from Woodward, Bob. The Choice: How Clinton Won (NY: Simon & Schuster, 1996); Clinton also acknowledged it herself in her syndicated column of June 4, 1996.

2. Festinger, Leon; Riecken, Henry W.; Schachter, Stanley. When Prophecy Fails: A Social and Psychological Study of a Modern Group that Predicted the Destruction of the World (NY: Harper & Row, 1964).


Related links...

  • James K. Glassman, "Dihydrogen Monoxide: Unrecognized Killer," reprinted from Washington Post at Stephen Milloy's junkscience.com

  • ESP information, Psych Infobank, David Myers, Hope College Dept. of Psychology

  • Skeptic's Dictionary: extrasensory perception

  • Mysterium ("A Wonderful Internet Resource For All Things Spiritual, Supernatural, and Sublime")

  • Nicholas Dykes, "A Tangled Web of Guesses: A Critical Assessment of the Philosophy of Karl Popper," Libertarian Alliance Philosophical Notes

  • Jack Germond, "Hillary's Guru: An indicator of the quality of American political debate," Baltimore Sun


  • CHRISTINA VALHOULI is a New York-based freelance writer and a recent graduate of Columbia's Graduate School of Journalism. Her work has appeared in Salon, New York, Odyssey, the Hellenic Chronicle, and other publications.
