What Makes People Believe Ideas?


  Testing an idea

A lie can get halfway around the world before the truth can even get its boots on.

[attributed to Mark Twain]

    We all have powerful forces that influence what we believe. They're called family, personal experience, faith, and in some cases special mentors and friends. Anyone who tries to persuade people reckons with these forces—the cumulative result of a lifetime of relationships and personal learning.

    And yet, there are stories that make us believe almost against our better judgment. Origin stories do that. Dan Heath has written about the origin of eBay. A story circulated that its founder created the company so his fiancée could buy and collect Pez dispensers more easily. It wasn't true, yet it was quoted widely.

    Another story came from the creators of YouTube, who used to claim that the idea for the business came after a dinner party in 2005, where two of the company's masterminds, Chad Hurley and Steve Chen, shot some video, tried to post it online, and found out just how hard that was back then. Says Heath:

Now, that is, at a minimum, an exaggerated tale. In fact, there's a third founder of YouTube who claims the dinner party never happened. And Steve Chen later admitted in TIME Magazine that the dinner party was embellished to provide a better founding myth.

And I do want to say that while it feels like a little bit of a letdown to realize that this dinner party story is not the whole truth, I feel like it's a little bit unfair for us to expect more of them than the creation of YouTube. I mean, here's this incredible site, and in some sense, that's not enough for us. We want YouTube to have emerged from some kind of everyday experience. It's like it's not enough to have the value of their work. We also want there to be a really compelling story that started it.

    We become attached to the details of a story because they confer mythical qualities on it. Dan and Chip Heath say the use of vivid details is one way to build internal credibility into the idea itself. The details make the claim more believable.

    Made to Stick has other examples of borrowing credibility—from adding details to make our case, to trusting the recommendations of people we want to be like without vetting their honesty and trustworthiness, to our powerlessness when confronting statistics.

    Statistics are a sticking point because we forget that they're useful for illustrating a relationship. They're good input for making up our mind on an issue, but generally not as helpful when they're just numbers:

When we use statistics, the less we rely on the actual numbers the better. The numbers inform us about the underlying relationship, but there are better ways to illustrate the underlying relationship than the numbers themselves.

Where's the logic?

    Using compelling details is helpful, as long as the details are genuine and support a coherent argument. There are circumstances when we abandon logic in favor of detailed statements, supposing that specific conditions are more probable than a single general one. A well-known example comes from the work of Amos Tversky and Daniel Kahneman:

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Which is more probable?

  1. Linda is a bank teller.
  2. Linda is a bank teller and is active in the feminist movement.

    When asked, a majority of people choose the second option. But the probability of two events occurring together (in conjunction) is always less than or equal to the probability of either one occurring alone. Tversky and Kahneman said we get it wrong because we use a rule of thumb, or heuristic, to pick—the second option seems more representative of Linda.
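
The conjunction rule is easy to verify with a quick simulation: however the traits are assigned, anyone counted in the conjunction has already been counted in the single category. A minimal sketch in Python, with made-up trait probabilities for illustration only:

```python
import random

random.seed(42)

N = 100_000
tellers = 0
tellers_and_feminists = 0

for _ in range(N):
    # Illustrative, invented trait probabilities; the rule holds for any values.
    is_teller = random.random() < 0.05
    is_feminist = random.random() < 0.60
    if is_teller:
        tellers += 1
        if is_feminist:
            tellers_and_feminists += 1

# The conjunction can never out-count the single category.
print(tellers, tellers_and_feminists)
```

Whatever probabilities you plug in, the second count is bounded by the first, which is exactly what the "bank teller and feminist" option cannot beat.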

In Thinking, Fast and Slow, Kahneman writes about the response to a version of the question:

“We were convinced that statistically sophisticated respondents would do better, so we administered the same questionnaire to doctoral students in the decision-science program of the Stanford Graduate School of Business, all of whom had taken several advanced courses in probability, statistics, and decision theory. We were surprised again: 85% of these respondents also ranked 'feminist bank teller' as more likely than 'bank teller'.”

    Yet if we slow down and think about it, we realize that there must be more individuals named Linda who are bank tellers than individuals named Linda who are both bank tellers and activists. German psychologist Gerd Gigerenzer studies how we make decisions. He has a hypothesis about why this happens:

“If you read the description, nothing in it suggests that she might be a banker. So, when you ask what is more probable, ‘bank teller’ or ‘bank teller and an activist’, many people say, ‘hmm, maybe the second’. And Kahneman says this is wrong because a single instance of Linda being a bank teller can never be less probable than the conjunction of Linda being a bank teller and a feminist. He then asks that to be accepted as proof of human irrationality.

But, it’s far from that, because it implies that people should treat the term ‘what is more probable’ in terms of probability theory. If you look in the Oxford English Dictionary, you will see probability has quite different meanings, and they are all legitimate. So, they ask, ‘where is the evidence that Linda is a bank teller’, and since there is none, they go to the other option.”

    To test his hypothesis, Gigerenzer framed the question differently. Instead of asking what is more probable, he asked:

There are 100 persons who fit the description above (that is, Linda’s). How many of them are:

1. Bank tellers? __ of 100
2. Bank tellers and active in the feminist movement? __ of 100
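
The frequency framing works because counts expose the nested-set structure that the word "probable" hides: every feminist bank teller is already one of the bank tellers. A small sketch with hypothetical numbers (nothing here comes from Gigerenzer's data):

```python
# 100 hypothetical people fitting Linda's description.
lindas = set(range(100))
bank_tellers = {i for i in lindas if i < 5}    # say, 5 of 100
feminists = {i for i in lindas if i % 2 == 0}  # say, 50 of 100

# The conjunction is an intersection: it can only shrink the teller set.
feminist_tellers = bank_tellers & feminists

print(len(bank_tellers), len(feminist_tellers))  # prints: 5 3
```

Asked for counts out of 100, respondents can see that the second blank cannot exceed the first, which is the subset relation the original wording obscured.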

How is it rational?

    Gigerenzer says we fail the test because it's based on logical rationality, which is not as useful in the real world. Instead, we need ecological rationality—the kind of thinking that helps us get what we want in an environment that’s uncertain and dynamic. This means exercising our instincts, using simple but robust rules of thumb. [h/t FF]

    We want to model our behavior in ways that help us achieve our purpose, which means using rules of thumb rather than constantly watching for biases. In Simple Rules, Donald Sull and Kathleen Eisenhardt say that “rather than expending conscious cognitive effort,” we “adopt universal heuristics that are cognitively easy, like representativeness (Pick what is usual) and availability (Pick what first comes to mind).”

    The caveat is that availability bias may get us in trouble, because our intuition is flawed. Dan Heath and Chip Heath use predictions as an example:

Which of the following events kills more people: homicide or suicide? Floods or tuberculosis? Tornadoes or asthma? Take a second to think of the answers.

You might have thought that homicide, floods, and tornadoes are more common. People generally do. But in the United States there are 50 percent more deaths from suicide than from homicide, nine times more deaths from tuberculosis than floods, and eight times more deaths from asthma than from tornadoes.

    We predict badly because of the availability of news (and fears) about certain events—we overestimate their likelihood because they're ready for recall. Emotional resonance is also part of why we better remember disastrous events at the hands of nature or someone else. We're wired to keep ourselves safe.

    Overconfidence gets us in trouble when we underestimate the kind of problem we're facing. Yet entrepreneurs need a healthy dose of confidence to push through early misses and into new territory. Where is that line? When we want to innovate, we may explore a course with a lower probability of success but less competition. In statistics this behavior is called probability matching.
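
Probability matching is usually contrasted with maximizing: matchers pick each option in proportion to how often it pays off, instead of always picking the best one. A simulation sketch, with payoff rates invented for illustration, shows why matching earns less in a stable environment:

```python
import random

random.seed(0)

# Two options with invented payoff rates: A wins 70% of the time, B 30%.
payoff = {"A": 0.7, "B": 0.3}
TRIALS = 10_000

def play(choose):
    """Run TRIALS rounds with the given choice strategy; return the win rate."""
    wins = sum(random.random() < payoff[choose()] for _ in range(TRIALS))
    return wins / TRIALS

# Probability matching: choose A 70% of the time, B 30% of the time.
matching = play(lambda: "A" if random.random() < 0.7 else "B")
# Maximizing: always choose the better option.
maximizing = play(lambda: "A")

print(f"matching ~ {matching:.2f}, maximizing ~ {maximizing:.2f}")
```

With these rates, matching wins roughly 0.7 × 0.7 + 0.3 × 0.3 = 58 percent of rounds versus about 70 percent for maximizing. The text's point survives the gap: in an uncertain, changing environment, the exploration that matching provides can still pay off.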

    Given that common biases may trick us, how do we go about believing ideas—ours and those put forth by others? Testing our own hypothesis as Gigerenzer did in the example above is one way to figure out if details are aiding our decision-making or leading us astray. Workable simple rules—a handful of guidelines we tailor to ourselves and our task to balance concrete steps with flexibility—can be useful when under pressure.

    Like Tversky and Kahneman, Gigerenzer thinks about people and human behavior. He says that in addition to looking at the positive side of heuristics, or simple rules, his approach is broader:

“Our group is interdisciplinary and so we have knowledge in mathematics, statistics, economics, biology—it’s important as well—philosophy. So sometimes we have a more integrated view of what is rational.

Laymen, non-experts may believe philosophers and mathematicians have the last say on defining what is rational. But, that is not true. It’s not true that there is just one meaning of probability and one meaning of logic. Especially if you consider the whole of human knowledge across human disciplines.

Tversky and Kahneman's side is less sensitive to that because their foundations come more from experimental psychology, and from that part of mathematics that actually believes the problem of defining rationality is solved. That colours their methods, interpretations.”

    Establishing simple rules using credentials we can test helps us try out an idea for ourselves and see if it works. We try before we buy, we validate the idea with honest and trustworthy sources, we seek a strong data point at the intersection of several disciplines. Plus, we can now run model competitions on how heuristics and optimization models perform in the world.

    It takes work, but it saves aggravation by sparing us a wrong decision: believing in an idea with a steep and painful downside. Unless, of course, the narrative fallacy leads us to blind belief. Dan Heath tells one more origin story:

Christopher Columbus, as we all know, wanted to prove that he could reach India by sailing west. But no one believed his crazy theory that the Earth was round. And, in fact, his own sailors en route were terrified that they were about to fall over the edge of the Earth, and they almost mutinied.

So there's a guy named James Loewen, a professor at the University of Vermont, who has pointed out that virtually every element of this story is false. That, in fact, we still don't really know where Christopher Columbus was going. There's a lot of disagreement among historians; even Columbus' best-known biographer isn't totally sure where he was headed. And furthermore, there was no element of is-the-Earth-round-or-flat here. Most people at that time already knew that the Earth was round. The evidence was there for them to see. They noticed that, when another ship is receding into the horizon, the hull disappears first, and then the mast, which implies that there's some kind of curvature in play.

And again, here's a guy who crossed an ocean and became one of the first Europeans to set foot on a new continent, and yet, we want more from this guy. We want him to be having hand-to-hand combat with his crew en route. We just crave the drama. We crave the obstacles.