How the Three Most Common Ideas on Trust are Flawed


Attacks on encrypted phones or encrypted email have become more frequent. Since systems are more connected than ever, you will likely feel the impact. If not today, at some point in the near future.

You ordered an item online and received something else, despite the email confirmation. When you send the store a photo of the packing slip and receipt, you get no response. You follow up and get nowhere.

The question of trust is evergreen. Technology has made its role more obvious in everyday life. Connection and integration make things easier, take away friction. But there are implications. How nations get along impacts your grocery shopping, for example. 

Can you trust a person you meet online? Digital body language provides limited cues. Ironically, it seems people do less due diligence on things shared via social media than they do on professionals with plenty of credentials and real-life results.

Thought leaders say: you must build trust. Consultants repeat that companies and institutions should focus on trust. But how do you build trust given that three of the most common ideas on trust are flawed?


First, the claim and the evidence

There's a claim many people believe about the great decline of trust. Therefore the aim is to have more, which means the task is to rebuild trust. But is this the right thing to suggest?

Given how data-driven we've become as a society, it's surprising how little we look at evidence. If you take an evidence-based approach on the claim, what's your evidence? In a brief talk about trust, Baroness Onora O'Neill suggests opinion polls.

That's what you get: opinions. So the question is, based on what? If you examine the evidence from opinion polls, you find that “the people who were mistrusted 20 years ago, principally journalists and politicians, are still mistrusted.” We still trust the same people we trusted 20 years ago, too—judges and nurses.

There's also an issue of method. You don't ask generic questions to infer deeper attitudes. Because if someone asked you if you trust your elementary school teacher, or grocer, you'd probably want to know, “to do what?” You might even say that you trust some people, but not others.

So the evidence points to how we place trust in a differentiated way.

You may trust a friend to keep the conversation going, but not to keep a secret. Why drop that intelligence when we think of “trust” in the abstract? It's not a good idea to eliminate “the good judgement that goes with placing trust.”

In my grocery shopping example, I had purchased from that store for two years. My evidence told me I could trust they would send me the correct Italian groceries each order. Their customer service responded to my emails about website issues promptly.

Then, there was a mistake on an order. Mistakes happen. And everything had been going well on quality and dependability. But it's what they did after the mistake that was decisive. To understand why, let's look at the second point Baroness O'Neill makes: the aim.


Then, the aim

The aim is usually to have more trust. As if trust were something you can accumulate and put in the bank. This is not useful. Because we differentiate between people who are trustworthy, and people who are not.

More indiscriminate trust is not an intelligent aim. Intelligently placed trust is a proper aim. It's the same with mistakes. They do happen, but you can learn to make them more intelligent. There's failure, and then there's intelligent failure.

You can learn who (and what) is trustworthy based on three things:

  1. competence — ability and experience with a defined set of skills, and adequacy in using them
  2. honesty — integrity, truthfulness, straightforwardness with a dose of sincerity
  3. reliability — how consistently you can depend on someone/something

Since nobody and nothing is perfect, there are tradeoffs.

Is someone competent in something that is relevant and honest, but not reliable? Then you have a hard time assigning more work. Is something reliable, but then demonstrates incompetence? This is harder, because it depends on the degree, domain, and actions to bridge the gap.
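The three factors and the earlier “trust to do what?” point can be sketched as a toy model. Everything here (the class, the threshold, the scores) is my illustrative assumption, not something from the article; what it encodes is that trust is placed per domain, and that weakness in one factor is hard to offset with strength in another:

```python
from dataclasses import dataclass

# Toy model (illustration only): trust is placed per domain
# ("to do what?"), not as a single global score.
@dataclass
class Assessment:
    competence: float   # skill in this specific domain, 0..1
    honesty: float      # truthfulness and integrity, 0..1
    reliability: float  # consistency across past interactions, 0..1

def trustworthy(a: Assessment, threshold: float = 0.6) -> bool:
    # Echoing Buffett's point: the factors don't compensate for one
    # another, so we judge by the weakest of the three.
    return min(a.competence, a.honesty, a.reliability) >= threshold

# The same friend, assessed in two different domains.
friend = {
    "keep the conversation going": Assessment(0.9, 0.8, 0.9),
    "keep a secret": Assessment(0.9, 0.8, 0.3),
}

assert trustworthy(friend["keep the conversation going"])
assert not trustworthy(friend["keep a secret"])
```

Taking the minimum rather than an average is the design choice that matters: an average would let high competence paper over low honesty, which is exactly the tradeoff the article argues against.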

Honesty is a big one. Warren Buffett says, “Honesty is a very expensive gift. Don't expect it from cheap people.” When he hires people, he looks for three things: “The first is personal integrity, the second is intelligence and the third is a high energy level. But if you don't have the first, the other two will kill you.”

Trust is the response to trustworthiness.

Institutions and organizations have tried to put in place systems of accountability that make it easier to judge trustworthiness, but these systems end up distracting people from their difficult tasks with box-ticking.

O'Neill says it's hard to judge whether people are trustworthy, and even to communicate that we ourselves are. But there are things you can do that work well as proxies.


Third, the task

The task is also backwards, because while we can work on our trustworthiness, in the end it's other people who give trust; we cannot rebuild it ourselves. We can only create the conditions through relevant competence, honesty, and reliability.

In the case of the grocery store, they could have responded to my inquiry and evidence. Maybe it was a mistake. Their policy on food orders might rule out a replacement shipment, but it doesn't rule out an apology. Admitting that something went wrong doesn't mean it was a deliberate act.

In fact, when someone admits they screwed up, they just reveal their humanity. You can provide evidence of trustworthiness through vulnerability. Money back guarantees and easy returns are ways to demonstrate good will as a company. They are “I owe you” gestures.

Promises show you have confidence in what you're saying and doing. Consider each promise a feedback loop. Each promise you keep allows you to make more promises. It goes from small to big things. You likely start with something simple, like a project, then build from there.
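The feedback loop above can be made concrete with a small sketch. The class, the starting capacity, and the penalty sizes are my assumptions for illustration; the mechanism it shows is the article's: each kept promise earns room to make bigger ones, and broken promises cost more than kept ones earn.

```python
# Toy sketch (illustration only): promise-keeping as a feedback loop.
class PromiseLedger:
    def __init__(self):
        self.capacity = 1   # start small: one simple project
        self.open = 0       # promises currently outstanding

    def make(self) -> bool:
        if self.open >= self.capacity:
            return False    # people stop extending credit
        self.open += 1
        return True

    def keep(self):
        self.open -= 1
        self.capacity += 1  # each kept promise earns room for more

    def break_(self):
        self.open -= 1
        # people keep score: broken promises weigh more than kept ones
        self.capacity = max(1, self.capacity - 2)

ledger = PromiseLedger()
assert ledger.make()         # the first, small promise
ledger.keep()
assert ledger.capacity == 2  # trust compounds from small to big
```

The asymmetry between `keep` (+1) and `break_` (-2) reflects the scorekeeping point that follows: people remember failures longer than successes.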

But if people start noticing you don't do what you say… you become less trustworthy. Here's something I wish more people thought about: social media creates collective pressure to up the ante. It also blurs the line between a promise you talked about or mentioned, and one you kept.

Yet humans are incredibly good at keeping score. It's an evolutionary optimization. People will remember, even when you have forgotten. Also, watch the excuses. There's always a good reason to do something else, or not do something. If it's really compelling, then it's worth talking about proactively.

A task and a promise are orders you take from others or give to yourself. And there's an accounting of them. Giving people evidence is a concrete way to handle it, simpler than talking about rebuilding trust.

As for the role of technology: platforms centralize everything.


The technology shift

Technology has had a large influence in society and culture. This is a global phenomenon that impacts trust. More of our transactions happen through technological mediation. These platforms' settings and algorithms exist to drive certain behaviors.

Hence it's critical to understand the system in use (though that operates only at the tactical level), to become more self-aware of what triggers behavior (individually, but also as groups), and to make space for thought. So pause that tweet!

Technology platforms exploit

American internet entrepreneur and investor Chris Dixon illustrates how centralized platforms operate. First they attract, then exploit; they cooperate, then compete:

Centralized platforms follow a predictable life cycle. When they start out, they do everything they can to recruit users and 3rd-party complements like developers, businesses, and media organizations. They do this to make their services more valuable, as platforms (by definition) are systems with multi-sided network effects. As platforms move up the adoption S-curve, their power over users and 3rd parties steadily grows.

When they hit the top of the S-curve, their relationships with network participants change from positive-sum to zero-sum. The easiest way to continue growing lies in extracting data from users and competing with complements over audiences and profits. Historical examples of this are Microsoft vs Netscape, Google vs Yelp, Facebook vs Zynga, and Twitter vs its 3rd-party clients. Operating systems like iOS and Android have behaved better, although still take a healthy 30% tax, reject apps for seemingly arbitrary reasons, and subsume the functionality of 3rd-party apps at will.

For 3rd parties, this transition from cooperation to competition feels like a bait-and-switch. Over time, the best entrepreneurs, developers, and investors have become wary of building on top of centralized platforms. We now have decades of evidence that doing so will end in disappointment. In addition, users give up privacy, control of their data, and become vulnerable to security breaches. These problems with centralized platforms will likely become even more pronounced in the future.

These are the reasons he offers in favor of decentralization. His solution may not be your cup of espresso.

We're just scratching the surface. There are emergent peer-to-peer solutions that leverage technology, for example Holochain. Accountability ties back to values. As for value, we're still exploring how to move away from converting everything into money.

The current relationship between trust and technology

That relationship opens two main doors: security and power.

On the security side, Bruce Schneier explains how VPNs are entirely based on trust. For those not familiar, a VPN secures data flows and restricts access to what's beyond the encrypted endpoints. Knowing the owner, the location, and the laws that apply is critical.

Because you may find yourself in uncharted territory.

As for power… Ben Thompson has a detailed description of the bills that came out of the House Antitrust Subcommittee's investigation of tech companies. The argument cuts both ways. Regulation could become a bottleneck for innovation in a dynamic industry.

But also, the industry has done less exploring and more exploiting as of late.


Implications of the current (belief) system

Many organizations optimize for efficiency, but efficiency can be the enemy of trust. Like relationships, trust takes consistency and constancy of effort—take away all friction, and you have no basis for making the investment.

With trust, the issues at stake are compounded by technology. They include reliability (fulfilling expectations), accountability (keeping promises and correcting screw ups), and protection (platforms to mitigate the risk of bad things happening).

Trust is fundamental to society. In Who Can You Trust? Rachel Botsman explores the role of technology. She notes the evolution of trust from local to institutional to distributed, and how transparency is making that trust leap possible.

Even in distributed systems it's important to know who is trustworthy. Who will tell the truth about a product, service, or piece of news? We're still working out where the buck stops. Communities of practice that welcome critical thinking and include community managers are one bright spot.

There's a distributed system and platform that is greater than technology: culture. Culture is an under-appreciated factor in human evolution. And yet, it is a rich repository of human endeavor and thus filled with energy.

It has more power because it changes more slowly.


Where do we go from here?

Trustworthiness matters more than trying to change attitudes on trust. Give people adequate, useful, and simple evidence you are trustworthy. Competence, honesty, and reliability are the cornerstones of this system.

The distributed human platform is not technology, but culture.

A new research study looks at the question of how culture is driving evolution more than genetics. Timothy Waring and Zachary Wood talk about the long-term gene-culture coevolution and the human evolutionary transition:

First, culture appears to hold greater adaptive potential than genetic inheritance and is probably driving human evolution. The evolutionary impact of culture occurs mainly through culturally organized groups, which have come to dominate human affairs in recent millennia.

Second, the role of culture appears to be growing, increasingly bypassing genetic evolution and weakening genetic adaptive potential.

Taken together, these findings suggest that human long-term GCC is characterized by an evolutionary transition in inheritance (from genes to culture) which entails a transition in individuality (from genetic individual to cultural group).

The researchers have found that culture helps humans adapt to their environment and overcome challenges better and faster than genetics. The transfer of knowledge happens much faster through culture, because it happens through practice.

Genetics belongs to nature, and nature moves at a slower pace (when there is no human intervention). Cultural transmission is “based on flexible human learning and effectively unlimited with the ability to make use of information from peers and experts far beyond parents.”

Groups drive culture and then culture drives evolution. How can we make this a mutually beneficial system?


