Our Attention Filter Has Ten Instinct-Shaped Holes in It. Learning What They Are Helps Us Deal with Risk.

Framing effect attention filter hole
    We manage to ignore all the noise, including much potentially useful information and factual data, yet we pay attention to stories and snippets that trigger our dramatic instincts. That's because we get an emotional high from them.

    Our filter has ten holes in it. Hans Rosling named them in Factfulness. Fear is the most used among them — the others are gap, negativity, straight line, size, generalization, destiny, single perspective, blame, and urgency. Fear is hard-wired into our brains: we fear missing out (FOMO), and we have a strong fear of failure, of making a mistake.

    We should learn to distinguish something that is “frightening” from something that is “dangerous”. One is perceived risk; the other is real risk.

Why trends are useful

    Trends help us create a framework around variables we track over the long term. In Megatrends, John Naisbitt says that each new technology, to be successful, must be coupled with a compensatory human response. Thirty years later, we can identify the trends because they have endured in our culture.

    The three main categories of fear are fear of physical harm, fear of captivity — as in loss of control or loss of freedom — and fear of contamination. They distort our worldview and lead to horror of things we don't fully understand or deal with in everyday life, like infection and poison.

   When stories combine two fears, like that of a plane crash and kidnapping — tapping into fear of harm and captivity — we're pulled in. Then we get programmed by hearing it over and over again seemingly everywhere. This is a distraction that keeps us from learning the deeper data behind the story and taking action, if appropriate.

    Context helps us allocate resources as needed, including emotional energy. “Now is the time to understand more, so that we may fear less,” said Marie Curie. Yes, there are still terrible and painful things happening to people, and we should help and empathize.

    Yet the predictive story points to a larger trend that is cause for optimism. That is the story we want to let in our attention filter.

Mimicry and unknown unknowns

    The majority of the stories we get in the media are the local and familiar kind. They merge facts with anecdotes to share things of potential human interest that connect with belief. They trigger us by appealing to our dramatic instincts.

    Sensational headlines draw attention, but they distort reality and keep us from setting apart what we know from what we don't know. Multiply that by the avalanche of similar articles chasing attention, many containing false data, and pretty soon it becomes hard to tell a mimic from the real thing.

    Batesian mimicry is a form of mimicry in which a harmless species has evolved to imitate the warning signals of a harmful species, directed at a predator of them both. For example, we should avoid the venomous Texas Coral Snake, while the Mexican Milk Snake and other similar-looking snakes are harmless.

    Beyond the things we already know, there are things we don't yet know but should know about. Once we're aware of them, we can learn those things because they're knowable. We can learn to tell the snakes apart, for example. Known knowns and known unknowns are both instances of a degree of certainty: we know what we know, and we know what we have yet to learn.

Unknown unknowns

    There are also things we know we don't know, gaps we have identified, and other things we don't know we don't know. The diagram from the Project Management Institute calls out the nature of risk — uncertainty creates the need for understanding risk.
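The four categories behind this diagram can be sketched as a simple lookup on two questions: are we aware of a gap, and do we actually understand the thing? This is an illustrative sketch, not the Project Management Institute's own formulation; the function and category names are assumptions.

```python
# A minimal sketch of the awareness/understanding matrix behind the
# known/unknown categories (names are illustrative, not from PMI).
def categorize(aware: bool, understood: bool) -> str:
    """Classify a piece of information by whether we are aware of it
    and whether we actually understand it."""
    if aware and understood:
        return "known known"        # facts: certainty we can act on
    if aware and not understood:
        return "known unknown"      # identified risk: we know the gap exists
    if not aware and understood:
        return "unknown known"      # tacit knowledge we don't realize we have
    return "unknown unknown"        # unidentified risk: the real danger zone
```

The "unknown unknown" branch is the one the article warns about: no question we know to ask will surface it, which is why uncertainty, not just danger, drives risk.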

You can't manage what you don't understand

    Recently, I had a meeting with a group of executives who were looking to evaluate a number of approaches to deal with a situation that involved risk. They had created a brief that called both for preparation and optimization.

    After some conversation, I realized that they already had some kind of expectation of what the approach would need to look like. Yet, by their own admission, they did not fully recognize that the brief and the expectation were not on the same page. There was also domain-knowledge uncertainty.

    Their attention was focused on short-term tactics, yet the request was for a strategic long-term approach. When the focus becomes not the topic but the expectation, the result is a disconnect. The hole (or more than one) in our attention filter leads us to confuse perception with reality.

    In Rosling's definition, risk equals actual danger times our degree of exposure. It's very hard to evaluate something when we have a hole in our knowledge and don't know we do. Said another way, risk is the possibility of suffering loss or harm, not the loss itself. 
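Rosling's definition lends itself to a small numeric sketch. The scores and scenarios below are made up for illustration; only the formula (risk = actual danger × degree of exposure) comes from the text.

```python
def risk(danger: float, exposure: float) -> float:
    """Rosling's definition: risk = actual danger * degree of exposure.
    Both inputs are illustrative 0-to-1 scores; the scales are assumptions."""
    return danger * exposure

# A frightening but rarely encountered event can carry less risk than a
# mundane, frequently encountered one (numbers are invented):
plane_crash = risk(danger=0.9, exposure=0.01)  # dramatic, low exposure
car_commute = risk(danger=0.3, exposure=0.8)   # mundane, daily exposure
```

The point of the multiplication is that zero exposure means zero risk no matter how frightening the danger feels, which is exactly the frightening-versus-dangerous distinction made earlier.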

    When we learn to manage the sources of exposure to risk, including our own reactions (for example, fear) and experience with uncertain situations, we can begin to benefit from the opportunity brought about by change and creative problem solving.

    To manage risk appropriately, we should remember that if the nature of an occurrence is certain, it is more like a fact or knowledge. If it is uncertain (a probability less than 1, for example), the impact can be uncertain as well.
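That fact-versus-risk boundary can be expressed as a standard expected-impact calculation. This is a common risk-management convention rather than something the article specifies; the function name and units are assumptions.

```python
def expected_impact(probability: float, impact: float) -> float:
    """When an occurrence is certain (probability == 1) it is a fact;
    below 1 it is a risk, and its expected impact scales accordingly.
    The unit of `impact` is whatever you measure loss in (an assumption)."""
    if not 0.0 <= probability <= 1.0:
        raise ValueError("probability must be between 0 and 1")
    return probability * impact

certain_cost = expected_impact(1.0, 10_000)  # a fact: plan for it directly
risky_cost = expected_impact(0.2, 10_000)    # a risk: weight it by likelihood
```

A probability of exactly 1 collapses the risk into a fact, matching the sentence above: certainty is knowledge, anything less is risk.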

    Evidence and data are useful in decision-making; they also keep us from stressing out and feeling helpless.