Ok, let’s take a break from corporate manipulation for a second and go back to first principles…

One question I get a fair bit of email about is why manipulation works in the first place. Not corporate manipulations, mind you, which are relatively easy to explain (a lot of people putting a lot of time into figuring out how to subtly influence small, individually inconsequential choices), but larger-scale political manipulations. How did a Playboy playmate convince a significant portion of the population that vaccines cause autism? Why do so many people believe in 9/11 conspiracies? Why do some people still believe that there were WMDs in Iraq (a subject that deserves a full future post of its own)?

Do you know where the phrase “sour grapes” comes from? Aesop, the Greek fable writer, wrote many of our best-known fables (the tortoise and the hare, anyone?), and one of them dealt with a fox that couldn’t reach some grapes. Eventually, the fox convinced himself that the grapes would be sour anyhow, and walked away. Aesop concludes with “People who speak poorly of things that they cannot attain would do well to reflect on this.”

Translation: if we can’t have something, we tend to disparage it. This is one example of cognitive dissonance, which is a well-studied theory of how people learn and form opinions. It turns out that humans are remarkably good at justifying themselves – we will question everything before we question whether we could be wrong.

Brendan Nyhan published a very interesting paper (PDF here) in which he reports that, when it comes to political opinions, facts don’t seem to matter that much: he showed several groups that held incorrect beliefs real information that contradicted those beliefs, and then watched as they completely failed to change their opinions based on the new information. In fact, perversely, some actually became more convinced of their wrong perception after seeing the real facts.

Think about that for a second.

Not only did showing these guys information that corrected their opinions NOT change their minds, it actually made a subset of them feel MORE STRONGLY about their wrong opinions!

We’ve been aware of this type of issue before. In his book “When Prophecy Fails”, Leon Festinger infiltrated a cult whose members had predicted the end of the world on a certain date. When the date came and went, they didn’t just pack up and say “oh well, we screwed up. Ha ha, sucks being us for selling our houses and valuables! Boy, did we get taken!”. No, they became more intense believers, made some excuses, set a new date, and basically did everything but admit that they were wrong (read the book, it’s a good read). Festinger showed that it took three distinct failures, three ‘disconfirmations’, before most (but still not all) of the group decided that the world was not going to end.

So we’re all inherently wired to ignore facts when they don’t fit our preconceived notions. If someone believes that President Obama was not born in the US, for example, simply seeing his birth certificate, or having the Hawaii Department of Health certify it, or anything else, really, is not going to change their opinion.

This is pretty sad, but there’s hope, right? I mean, three disconfirmations are not all that hard to come by in the age of Google, right? Surely, over time, people who hold wrong opinions will get exposed to enough data to correct them, right?

No.

Well, actually, if anything, it gets worse.

See, in a world where anyone can set up a web page, and where you have a dozen news channels competing for a limited supply of ‘news’, it’s relatively easy to find ways to substantiate your opinion.

Let’s take some examples, using Google as our proxy for availability:

Vaccines cause autism: 205,000 pages

World will end in 2012: 737,000 pages

Holocaust did not happen: 343,000 pages

30% of Americans believe in astrology. Many Americans still oppose fluoride in water for health reasons, even though the original anti-fluoridation movement was started by the John Birch Society on libertarian, not medical, grounds. 6% of Americans (i.e. nearly 20 million people!!) believe that the moon landing was staged on a sound stage by NASA.

And there are thousands of web pages proving each of these propositions.

And web pages are reliable, right?

In other words, however absurd your belief, in a world where everyone owns a printing press, you will find hundreds, if not thousands of sources that confirm your bias. And each one makes it less likely that you will see enough discordant (accurate) information to shift your bias.

The media does not make it any easier, either. Some outlets strive for objectivity by presenting clashing perspectives on any issue, which simply tends to reinforce the opinions already held by either side (remember Jon Stewart’s indictment of Crossfire? Google it. It’s 10 minutes of pure bliss). Others (Fox News) have a built-in bias to cater to a certain group, so their stories and perspectives tend to simply deepen that bias. Very few promote a fact-based, analytical view of the world.

So how does all this fit with manipulation?

Well, for one thing, it explains why you can manipulate large groups of people on fairly significant issues. If people are biased to ignore facts that contradict their opinions, and they can quite easily find confirmation through the web or the media that, in fact, they are right, then fighting manipulation and misconception becomes very difficult.

Facts won’t do it. Research won’t do it (thanks, proliferation of web pages!). Experience will, but many issues are beyond the reach of personal experience. And this is why we still have otherwise rational, nice, smart people believing all sorts of absurd things.

Manipulation works because good manipulators choose their audience. The best manipulators focus on confirming things that their audience already believes (talk show hosts are a weak version of this, as are radio call-in show hosts). Facts will not deter this audience, since facts are secondary to opinion for most of us anyhow. Stephen Colbert put it best on his show with one word – truthiness – the truth that we feel, rather than the truth that is objectively true.

Once you find a receptive audience, your work as a manipulator is almost done. Much of the work of political strategists today is to understand how to segment a message: to divide the population into subsets, each with its own biases and beliefs, and to craft a message for each that caters to those beliefs. And the more confusing, complex, or technical an issue, the easier it is to obfuscate the truth and the harder it is for facts to change opinions (climate change, anyone?).

So let me leave you with another of Aesop’s fables – the North Wind and the Sun. The Sun and the Wind made a bet to see who was stronger; they decided to see who could remove a traveler’s coat faster. The Wind went first and blew and blew, but could not remove the coat – the traveler just bundled himself up more and more as the wind blew stronger. When the Wind gave up, the Sun started to shine, and the traveler, heating up, eventually took off his coat. The moral: persuasion is much better than brute force. And so it goes with manipulation – hard facts will fail where persuasion and understanding of your audience will win.