“What would I eliminate if I had a magic wand? Overconfidence,” says Daniel Kahneman in a recent interview with The Guardian. The author of the much-celebrated best seller Thinking, Fast and Slow and winner of the Nobel Prize in economics:
... is downbeat about the capacity of his brand of psychology to effect change in the world. I imagine he would simply argue he’s a realist about human nature. And, indeed, studies showing that “skilled” analysts are hopeless at predicting the price of shares have yet to translate into mass sackings or even reduced bonuses on Wall Street or in the City. The same goes for evidence that the influence of a high-quality CEO on the performance of a company is barely greater than chance.
There are, however, other ways to put his insights to good use. For example, to ward off the Halo Effect and consider more points of view, meeting participants should write down their ideas about the issue at hand before the meeting starts, when no one has spoken yet.
In Willful Blindness: Why We Ignore the Obvious at Our Peril, Margaret Heffernan examines the reasons why we do that:
“We may think that being blind makes us safer, when in fact it leaves us crippled, vulnerable, and powerless. But when we confront facts and fears, we achieve real power and unleash our capacity for change.”
Almost anyone can learn to
“see better, not just because our brain changes but because we do. As all wisdom does, seeing starts with simple questions: What could I know, should I know, that I don't know? Just what am I missing here?”
We tend to see what we expect to see. Culture drives what we want to see, individually and as businesses. Other factors influence our willful blindness:
- Social Proof — for example, she references Serpico and the Kitty Genovese murder
- Authority Misinfluence — as in “the boss said do it”; in some cultures the boss also demands that solutions be brought to the table along with problems
- Stress Influence — sleep deprivation and tight deadlines reduce mental capacity
- Contrast Misreaction — tiny changes mislead us
In the interview, Kahneman also talks about the concept of adversarial collaboration:
In the early 2000s Kahneman sought out a leading opponent of his view that so-called expert judgments were frequently flawed. Gary Klein’s research focused on the ability of professionals such as firefighters to make intuitive but highly skilled judgments in difficult circumstances. “We spent five or six years trying to figure out the boundary, where he’s right, where I am right. And that was a very satisfying experience. We wrote a paper entitled ‘A Failure to Disagree’.”
Heffernan says “good disagreement is central to progress.” Her TED talk reviews why certain human thought patterns — like conflict avoidance and selective blindness — lead organizations and managers astray. She tells the story of Alice Stewart, a woman doctor at Oxford in the 1950s.
Stewart's interest was in the emerging field of epidemiology, the study of patterns in disease. To make a mark, she knew she needed to find a hard problem and solve it — she chose the rising incidence of childhood cancers:
… when her carbon copied questionnaire started to come back, one thing and one thing only jumped out with the statistical clarity of a kind that most scientists can only dream of. By a rate of two to one, the children who had died had had mothers who had been X-rayed when pregnant. Now that finding flew in the face of conventional wisdom. Conventional wisdom held that everything was safe up to a point, a threshold.
It flew in the face of conventional wisdom, which was huge enthusiasm for the cool new technology of that age, which was the X-ray machine. And it flew in the face of doctors' idea of themselves, which was as people who helped patients, they didn't harm them.
Alice Stewart rushed to publish her preliminary findings in The Lancet in 1956. She also had a sense of urgency on completing her research, before the cases disappeared:
… she need not have hurried. It was fully 25 years before the British and American medical establishments abandoned the practice of X-raying pregnant women. The data was out there, it was open, it was freely available, but nobody wanted to know. A child a week was dying, but nothing changed. Openness alone can't drive change.
How did she figure out she was right?
she had a fantastic model for thinking. She worked with a statistician named George Kneale, and George was pretty much everything that Alice wasn't. Alice was very outgoing and sociable, George was a recluse. Alice was very warm, very empathetic with her patients. George preferred numbers to people.
But he said this fantastic thing about their working relationship. He said, “My job is to prove Dr. Stewart wrong.” He actively sought disconfirmation. Different ways of looking at her models, at her statistics, different ways of crunching the data in order to disprove her.
He saw his job as creating conflict around her theories. Because it was only by not being able to prove that she was wrong, that George could give Alice the confidence she needed to know that she was right.
Collaborators who are not echo chambers of each other make greater progress when they think together. This kind of conflict requires
1. finding people who are very different from ourselves (see also: how do you select your team?) — since we generally prefer people like us, we need to make an effort to step outside our comfort zone, show we care, and
… seek out people with different backgrounds, different disciplines, different ways of thinking and different experience, and find ways to engage with them. That requires a lot of patience and a lot of energy.
2. being prepared to change our minds — while this is difficult for individuals, it's even more difficult, if not rare, in organizations
… the biggest problems we face, many of the biggest disasters that we've experienced, mostly haven't come from individuals, they've come from organizations, some of them bigger than countries, many of them capable of affecting hundreds, thousands, even millions of lives.
How do organizations think? Well, for the most part, they don't. And that isn't because they don't want to, it's really because they can't. And they can't because the people inside of them are too afraid of conflict.
We are likely similar to the executives Heffernan polled: afraid of raising issues — “fully 85 percent of them acknowledged that they had issues or concerns at work that they were afraid to raise.”
The solution is to see conflict as thinking. Most organizations cannot think together; they cannot use their talent's collective brain power, the power of conversation, and the act of working together to experiment and test different points of view.
How do we get there? Either by letting our curiosity win and asking simple questions — like the child in the Emperor's New Clothes, an example from a few days ago of how explicit language creates mutual knowledge — or by balancing our fear of silence against our fear of speaking.
We need to learn to defend our thesis. Open networks and open information are wonderful to have, however we need to make use of them:
The fact is that most of the biggest catastrophes that we've witnessed rarely come from information that is secret or hidden. It comes from information that is freely available and out there, but that we are willfully blind to, because we can't handle, don't want to handle, the conflict that it provokes. But when we dare to break that silence, or when we dare to see, and we create conflict, we enable ourselves and the people around us to do our very best thinking.
Watch the full talk below.
For a deeper dive into how to use conversation as a tool to elicit information and engage in honest and effective talk, I recommend Dialogue: The Art of Thinking Together by William Isaacs.