Posted 02 May 2017
We have often wondered why individuals believe the crazy claims made for CAM (complementary and alternative medicine). In spite of strong evidence refuting those claims, many continue to believe the nonsense. This article, published in Quartz, tries to make sense of this.
Read the article at Quartz
In the event the article is not available at Quartz, it is reproduced here.
Millions of people refuse to recognize man-made climate change. Americans spend billions on homeopathy. Around 12 million people believe that lizards are secretly ruling the world. The world is filled with bad, baseless, factually inaccurate ideas that refuse to die.
Philosopher Russell Blackford, a lecturer at the University of Newcastle in Australia, tweeted about this phenomenon earlier this month:
If you’ve ever found yourself unable to halt someone else’s idiotic plans once they were already in motion, you’re not alone. Whether you’re a politician trying to make congress see sense or simply a manager trying to halt an atrocious team-building plan, there’s simply no foolproof way to kill a terrible idea.
Blackford blames the momentum behind bad ideas on cascade effects. Yes, individuals are prone to making poor decisions for emotional or biased reasons (known as “cognitive heuristics”) and this irrationality is part of the problem. But there’s also a broader sociological issue, in that others’ opinions carry a huge amount of weight in influencing our views. A cultural consensus—even without proper evidence—can form pretty quickly.
If one person convinces a second, says Blackford, then a third person will be far more likely to agree with the majority view. This effect exponentially increases with each person who agrees with the others. “We soon have a sociological effect whereby everyone knows that, say, a certain movie is very good or very bad, even though everyone might have ‘known’ the exact opposite if only a few early voices had been different,” says Blackford.
The cascade effect can help explain why great movies such as The Wizard of Oz or Heathers can flop at the box office, while terrible movies such as The Hangover Part III rake in millions. It can also steer equally talented people into wildly different levels of success—because one or two influential people vouching for an employee carries a lot of weight.
This means views can be culturally “obvious” even when there’s no objective evidence
“Once a view is popular with the general public, or just within your own ‘tribe,’ it takes a lot of courage even to question it to yourself,” says Blackford. For example, just 50 years ago, homosexuality was banned in many western countries. “It would have been a very brave person to put their hand up and say, ‘There’s nothing wrong with being gay.’”
And so public opinion often stays in place even after private beliefs slowly shift. However, “if there’s a shock to the equilibrium it can collapse suddenly because the appearance of support for it is an illusion,” says Blackford. The collapse of communist rule in the Soviet bloc, when fervent belief in communist ideals and leaders faltered from 1989 onward, and the dramatic decline of Christian belief in Europe in the 1960s, are two examples of public opinion suddenly shattering.
But Blackford points out that studying the fall of popular consensus to determine how a bad idea got killed is a difficult endeavor—in part because once a new majority opinion is in place, many will pretend they never held the original view in the first place.
Don’t make a bad idea even more powerful
Generally, Blackford says it’s best to try to reason against a bad idea and point out its logical flaws, but this is only likely to have an impact if you’re an influential figure within the tribe. In some cases, though, reasoning can feed support for the bad idea—for example, if famed evolutionary biologist Richard Dawkins were to argue with a creationist, he’d essentially be giving a platform to unscientific views, thus lending them validity.
As Oxford researcher Brian Earp has noted, it’s far easier to put forward ill-informed and nonsensical views than it is to systematically refute them, meaning that even the most logical rebuttal can fail to puncture a bad idea.
It turns out there’s a phrase to describe this phenomenon: “Gish Gallop.” It’s named for Duane Gish, an American creationist who was famous in the 1980s and 1990s for asking evolutionary scientists to debate with him. He would proceed to put forward a multitude of errors, and ignore all reason. As Earp describes it, “It’s a lose-lose situation. Ignore you, and you win by default. Engage you, and you win like the pig in the proverb who enjoys hanging out in the mud.”
So, how should we respond to the insufferably long life of bad ideas? Well, Blackford points out that we could start by being more skeptical of our own ideas, even those that are widely held and seem obvious. But when it comes to refuting widely held bad ideas, it’s tough to halt a tide of popular nonsense.