On Preference Falsification: Private Truths and Public Lies


  Intrinsic Utility

 “He who dares not offend cannot be honest.” 

— Thomas Paine

First it was social media, and now it's technology more broadly that is supposed to make the trust leap possible through transparency. Social networks and search engines unrolled a welcome mat to an age of expressing ideas freely and being our own agents, complete with identification processes… only to turn around and sell, take, or otherwise exploit the data and information gleaned about us and our customers.

    Big tech companies Google, Facebook, Amazon, and Apple now play the leading roles, and we're supporting actors at best, when actors at all. In The Four, Scott Galloway says each of those companies captures commercial value based on who we are. How did the Four infiltrate our lives so completely that they're almost impossible to avoid (or boycott)?

    But before we try to dissect this part of our increasingly public social life, we want to turn to a discipline that is also in the business of broadcasting and consuming information: biology. There we learn that deception and self-deception have played a critical role in evolution. The very structure of our minds has been shaped from our earliest beginnings by the need to deceive.

    In Why We Lie, David Livingstone Smith gives an account of how, as our evolutionary ancestors began to gather in larger and larger groups, the increasing complexity of group dynamics led to an arms race between deception and detection that had huge implications for the evolution of human intelligence.

    Language upset the balance between deception and detection, giving an enormous advantage to the liar. “Most of us are embarrassingly inept at spotting liars.” The problem is that we tend to privilege speech over raw observation, and thereby miss the clues that give the liar away. He says:

“Once our ancestors learned to gossip, they could form secret alliances, deceive each other far more effectively about where they stood in relation to other community members, and stab each other in the back.

[…]

the power to deceive is our main weapon in the struggle for social survival.”

    Smith says self-deception was an adaptation that enabled us to better deceive others. 

“Self-deception has been a wonderful gift, but it is now destroying us. Our taste for it resembles our craving for sugar and animal fat.

[…]

the most dangerous forms of self-deception are the collective ones. Patriotism, moral crusades, and religious fervor sweep across nations like plagues, slicing the world into good and evil, defender and aggressor, right and wrong.”

    But if insects and flowers lie to propagate or to hide from predators, what makes us continue to lie once our chances of survival improved dramatically? It's complicated, because in modern society lying is mixed in with gossip, rumors, and secrets.

    Robin Dunbar, author of Grooming, Gossip and the Evolution of Language, theorizes that “gossip works in human societies the same way grooming does in primate societies, but more efficiently.” Dunbar goes so far as to theorize that language evolved so that people could gossip and more effectively establish and defend social groups.

    “A lie has no power whatsoever by its mere utterance,” says Pamela Meyer. “Its power emerges when someone else agrees to believe the lie.” Behavioral economist Dan Ariely says dishonesty is not rational. In The (Honest) Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves he says we all cheat a little, but not so much that it causes us to compromise our self-image or integrity. He calls this the “fudge factor.”

    There are things we can do to learn the truth, but as is often the case when people are involved, the answers to falsification increase in nuance and variety. In a recent conversation with Julia Galef, economist Timur Kuran speaks about the phenomenon of “preference falsification”: when we say one thing to our family, friends, and social groups while privately preferring another.

    Economists, sociologists, and psychologists each base their models on an underlying conception of man. Homo economicus is a bloodless, instantaneous calculator of costs and benefits, which we now know is a figment of our imagination thanks to the work of behavioral economist Richard Thaler; homo sociologicus is a product of social stimuli; and homo psychologicus is ruled by conscience.

    In Private Truths, Public Lies, Kuran combined these three conceptions to explain why people often mask their opinions when in groups. Even as he defines it differently in parts of the book, he uses the concept of “intrinsic utility” to mirror the idea of intrinsic motivation. Says Kuran:

Suppose I'm with a group of friends and several of them have indicated their interest in going to see a particular movie. But I'm not interested in going. Perhaps I want to stay home and read a book, or I'm interested in going to a ball game, or in watching another movie — but I think that if I admit that, if I communicate that, I will disappoint them, I'll hurt their feelings, I'll be ridiculed. So I say, “Oh, I'd love to come along.”

That is one example of preference falsification. I've indicated that I’ll do something that I really would rather not do.

    But there are consequences to this phenomenon:

The most consequential outcome is that inefficiencies persist, and patterns that many people object to, patterns that make many people uncomfortable, persist. They persist indefinitely, because people think that if they object, if they make a fuss, if they try to organize an opposition, nobody will follow them.

It's not necessarily the case that you'd have to think that you are actually in the minority in terms of what people really feel. You could fail to raise an objection even if you’re quite certain that 90 percent of the people feel exactly like you.

[…]

You might object because you've seen that people who have indicated that even that they have doubts have been crushed, have been punished, have been ridiculed. And you think that other people also understand that, and so they will not step out to defend you even if they secretly admire what you're doing, unless they sense that a critical mass has formed.

You yourself might be willing to object and take some risks if 20 percent of a particular group of a particular community has expressed opposition, or is campaigning against the status quo, against some inefficiency — but you will not if other people have not gone first. You won't take the first step. And others are doing exactly the same calculation, and so they're refraining from moving.

The negative consequence is that a policy or regime that many people dislike, that perhaps the majority intensely dislike, survives out of fear.

     There are unintended consequences to our choices of what we say and what we withhold. People may underestimate the number of others who agree with them, and may stay silent because they fear being left holding the candle… that nobody else will say a thing.
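This wait-for-others dynamic is the threshold model that Thomas Schelling and Mark Granovetter formalized and on which Kuran builds. A minimal sketch (all thresholds here are invented for illustration): each person speaks out only once the share of others already speaking exceeds their personal threshold.

```python
# Illustrative threshold-cascade sketch: a person speaks out only once
# the share of people already speaking out meets their own threshold.
# Thresholds below are made up to demonstrate the dynamic.

def cascade(thresholds):
    """Return the final share of the population speaking out."""
    n = len(thresholds)
    speaking = 0
    while True:
        share = speaking / n
        now = sum(1 for t in thresholds if t <= share)
        if now == speaking:          # no one new joins: equilibrium
            return share
        speaking = now

# An unbroken ladder of thresholds: one fearless dissenter tips everyone.
ladder = [i / 10 for i in range(10)]        # 0.0, 0.1, ..., 0.9
print(cascade(ladder))                      # 1.0: everyone speaks out

# Remove the first-movers and the same near-unanimous majority stays silent.
timid = [0.2] + [i / 10 for i in range(2, 10)]
print(cascade(timid))                       # 0.0: nobody takes the first step
```

The two populations barely differ, yet one ends in open revolt and the other in total silence: the “nobody will follow them” equilibrium Kuran describes.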

    An example of this phenomenon at work happened just this week at an airport.

    A group of passengers who had missed a previous flight due to technical glitches at security was waiting on standby to board the next flight. They would wait until the very last second before the doors closed to learn whether they had a seat. One of them confided how poorly the airline had handled the whole situation, but he said it would take critical mass to send a clear message that it's not okay to treat paying customers that way.

    Even the other two remaining passengers stranded with him would not make waves for fear of losing their potential seats. Monopolies create these kinds of situations, which are one form of regime. Scarcity creates the conditions for resignation, if not acceptance. It doesn't change the poor situation, but it explains it.

    Airlines also provide little transparency into their ticket classifications and codes, putting the burden of education on customers. No wonder the travel-hacking industry is thriving! But influence creates compound effects in both directions, positive and negative.

    Kuran says there are other situations in which we know there's cognitive dissonance. For example, when we know someone's preferences based on a private conversation:

You can be in an organization, in a department meeting and know that there are several of your colleagues who object to, let us say, the chairperson — something that the chairperson is doing. But that they are afraid to offend the chairperson. You will then falsify your preference, knowing that others are doing exactly the same.

    How do we change our minds about speaking out? There are several factors to consider:

  • new information becomes available, and our preferences may change based on what we learn
  • we get a sense of the popularity of alternative positions; maybe something has become acceptable to talk about when it wasn't earlier — for example, it is now quite acceptable to object to the preying on young women in Hollywood
  • the act of preference falsification itself creates discomfort, so we want to come clean (this varies by individual and by the type of falsification)

    If a small group finds the courage to speak up, because they have thicker skin or out of guilt over falsifying, then others may feel more comfortable speaking up too. The lid is open, and out comes the truth. It could be a combination of things: some people know more than others, or they've reached their limit of tolerance for a certain situation.

     But there's a flip side to this phenomenon, which is the reason new regimes can be so oppressive. When the wind of change blows in a different direction, the people who supported the former administration may pretend they never liked it. The new leaders, however, know this mechanism and crack down, or try to draw them out.

    This is how fear changes sides, says Kuran:

The people who had stood on the sidelines all along and had hoped that the demonstrations would not succeed, who had hoped that the status quo would persist, a point comes when they realize that a new world has been born, and that the sources of fear are different. Power has shifted. And they have to now, in self-defense, start falsifying their preferences in a different way.

To go back to an example from recent times, there must be people in Hollywood who were quite comfortable with the environment that existed, who perhaps had behaved like Weinstein and who were hoping that the public opinion would not shift. That Weinstein would prevail.

But at the moment, given that public opinion has shifted, they will not defend Weinstein. They will, in fact, argue that all along they've been quite disturbed by the predatory behavior of some people and by the tolerance shown to them, and that the only reason they had not said anything or they had not acted is because they were afraid of retaliation from people like Weinstein.

    What happens when we belong to different communities and points of reference, each of which defines part of our reputation? In On Dialogue, physicist David Bohm talks about the fragmentation of thought, and of reality as a consequence. Our social reality follows the same fragmentation as our sources of information.

    Kuran says the reason people who have self-selected into different communities have a hard time mingling with others is the greater discomfort they may create by speaking up: it requires greater preference falsification. Telling the truth hurts feelings. What is truth, if we can interpret it? On the other hand, truth is important to us.

    What if we eliminated all preference falsification? Would we be better off, or do some small falsifications, like those about someone's taste in fashion, serve a good purpose? At this point the conversation shifts to public policy and the value of putting our cards on the table.

    In Principles, Ray Dalio, founder of the successful investment firm Bridgewater Associates, talks about the value of “radical truth” and “radical transparency.” To help achieve them, he says that life, management, economics, and investing can all be systemized into rules and understood like machines, which then creates the most effective ways for individuals and organizations to make decisions, approach challenges, and build strong teams.

    Dalio believes in an idea meritocracy and has been employing computerized decision-making systems to make believability-weighted decisions. He says that striving is part of our lives and of society:

the sequence of 1) seeking new things (goals); 2) working and learning in the process of pursuing these goals; 3) obtaining these goals; and 4) then doing this over and over again is the personal evolutionary process that fulfills most of us and moves society forward.

I believe that pursuing self-interest in harmony with the laws of the universe and contributing to evolution is universally rewarded, and what I call “good.” Look at all species in action: they are constantly pursuing their own interests and helping evolution in a symbiotic way, with most of them not even knowing that their self-serving behaviors are contributing to evolution.

[…]

Self-interest and society’s interests are generally symbiotic: more than anything else, it is pursuit of self-interest that motivates people to push themselves to do the difficult things that benefit them and that contribute to society. In return, society rewards those who give it what it wants.

    Self-interest can go hand in hand with caring for others. As Alexis de Tocqueville wrote in 1835, “self-interest rightly understood” is a way of evaluating choices. Dalio calls his process, whose cornerstones are “radical truth” and “radical transparency,” productive adaptation, which helps especially during setbacks.
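Dalio's believability-weighted decision-making can be sketched as a weighted vote. This toy version (the votes and weights are made up, and Bridgewater's actual system is far more elaborate) just shows how weighted expertise can outweigh raw headcount:

```python
# Toy sketch of believability-weighted voting: each vote carries a
# believability weight, and the decision passes only if the weighted
# share of "yes" exceeds half. All numbers are illustrative.

def believability_weighted(votes):
    """votes: list of (vote_yes: bool, believability_weight: float)."""
    yes = sum(w for v, w in votes if v)
    total = sum(w for _, w in votes)
    return yes / total > 0.5

# Three junior people favor an idea, but two highly believable experts
# oppose it: by headcount it passes 3-2, by believability it fails.
votes = [(True, 1.0), (True, 1.0), (True, 1.0), (False, 3.0), (False, 3.0)]
print(believability_weighted(votes))   # False: 3.0 of 9.0 weighted votes
```

The same mechanism is one way an idea meritocracy differs from simple majority rule: opinions count in proportion to a person's track record rather than one-person-one-vote.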

    Kuran says we can test preference falsification by looking at inconsistencies, timelines, and also private vs. public statements. This is especially important if we anticipate a change is coming. Knowledge of how incentives and rewards are doled out helps, too. He says:

We just have to start anticipating the need for this type of information. We have to design our surveys accordingly and start collecting data in various ways under various conditions to be able to identify what exactly is driving changes in public preferences.

    And of course we must also be smarter about how we collect and read the emotional data in surveys, which is where smarter algorithms and careful analysis can help. On the other side of the conversation, the Internet magnifies consequences through the speed at which opinion travels.
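One crude way to quantify preference falsification from survey data of the kind Kuran describes: pair each respondent's private (anonymous) answer with their public one and report the share who flip. The data below are invented for illustration.

```python
# Illustrative measure of preference falsification: the share of
# respondents whose public statement differs from their private
# (anonymous) one. All responses below are made up.

def falsification_rate(private, public):
    """Fraction of respondents whose public answer differs from their private one."""
    flips = sum(1 for a, b in zip(private, public) if a != b)
    return flips / len(private)

private = ["oppose", "oppose", "support", "oppose", "support"]
public  = ["support", "support", "support", "oppose", "support"]
print(falsification_rate(private, public))   # 0.4: two of five flipped
```

A gap like this between private and public answers is exactly the kind of signal that, per Kuran, should make us suspicious that the visible consensus is more fragile than it looks.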

    We can consider nothing shared online private. Even as social networks have created direct messaging and groups, privacy on their platforms is an illusion, buoyed by the lack of immediate consequences and by the perception that we are alone behind our screens. This, of course, has made cautious people separate their private from their public personas.

    For as Jonathan Swift said, “Falsehood flies, and truth comes limping after it, so that when men come to be undeceived, it is too late; the jest is over, and the tale hath had its effect: like a man, who hath thought of a good repartee when the discourse is changed, or the company parted; or like a physician, who hath found out an infallible medicine, after the patient is dead.” 

    Kuran wrote Private Truths, Public Lies in 1995. A generalization of the phenomenon he describes is in Micromotives and Macrobehavior by Thomas Schelling. If he were writing the book today, he says he would:

put more emphasis on the interactions between the informational drivers of preference falsification and the reputational drivers, as well as the effects.

    Somewhere in the back of my mind I'm thinking that the whole conversation on influence (what influences us, who has influence on us, and in turn whom we influence) would be part of it, too.
