Truth, Status, and Tribes

The tensions between truth and social cohesion

Epistemic status: abstract, speculative, experimental


For the sake of this piece, let’s hold this assumption as true: people often act in the interest of increasing social status within their tribe. Not you or me, of course—we do things out of altruism, honor, and integrity—I’m referring to other people.

The idea of increasing one’s status often has negative connotations—a certain political jockeying or impurity about one’s intentions. I mean something different. Here, I refer to “increasing status” in the positive sense: accomplishing something impressive or contributing to others in a way that increases their estimation of your value. That’s the only way it works, anyway.

Paraphrasing Taleb: anything done with explicit intent to improve one’s status likely won’t improve one’s status. Status is a game where every time you think about the game, you lose. After all, the most respected people achieved their status as a byproduct of doing something great, like pursuing altruism, honor, and integrity! So throughout this piece, when you read about people trying to increase their status, see it through this positive lens: improving their reputation. Reputation and status mean very similar things, but reputation has a much more positive connotation.

Why do people aim to improve their reputation? Well, our main evolutionary goals in life are survival and reproduction—let’s hold this assumption too—and having a good reputation is the best way to get people to work with you, take care of you, and partner with you.

How do status and reputation work? This is another simplification, but let’s also concede it for the sake of the piece: Imagine we all have a number floating above our heads—that’s our status or reputation score. We can’t see it, but we all kind of know where we rank (dating apps sure do; they’ve quantified it!). Our brains work overtime to perceive this score in ourselves and in others, across a variety of contexts.

One thing that was shocking to me, but obvious in retrospect, is that we’re so attuned to our status score that we’re laser-focused on increasing it even at the expense of seeing objective reality, or truth. As Donald Hoffman writes in The Case Against Reality: “An organism that sees reality as it truly is is never more fit than an organism of equal complexity that sees none of reality and is just attuned to fitness payoffs.” Or as Hoffman puts it elsewhere: “Optimizing for truth can make you extinct.”
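To make that claim concrete, here’s a toy simulation in the spirit of Hoffman’s argument. To be clear, this is my own construction for illustration, not Hoffman’s actual model: the payoff is non-monotonic in the underlying resource (too little is useless, too much is toxic), so an agent that perceives only fitness payoffs outscores an agent that perceives the true quantities and reasonably assumes more is better:

```python
import random

# Toy model in the spirit of Hoffman's fitness-beats-truth argument.
# My own construction for illustration, not Hoffman's actual setup.
# Resource quantities run 0..10; payoff peaks in the middle: too little
# is useless, too much is toxic.

def payoff(quantity):
    return max(0, 10 - 2 * abs(quantity - 5))

def truth_tuned(options):
    # Perceives the true quantities, and (reasonably but wrongly)
    # assumes that more resource is always better.
    return max(options)

def fitness_tuned(options):
    # Perceives only the fitness payoff, nothing about the true quantity.
    return max(options, key=payoff)

def average_payoff(chooser, trials=100_000):
    total = 0
    for _ in range(trials):
        options = random.sample(range(11), 2)  # choose between two patches
        total += payoff(chooser(options))
    return total / trials

print("truth-tuned:  ", average_payoff(truth_tuned))
print("fitness-tuned:", average_payoff(fitness_tuned))
```

Run it and the fitness-tuned chooser reliably wins. The point isn’t the specific numbers; it’s that accurate perception and adaptive perception can come apart.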

It’s not that we’ll see objective reality (or truth) and consciously ignore it—our brains won’t even filter for it in the first place. In other words, we don’t first see whether something is true and then decide whether it’s good for us. We do the opposite: We first filter for what’s good for us, and only then filter for truth. (This explains, for example, why we feel emotions first and rationalize them afterward.)

To be sure, this doesn’t mean objective reality doesn’t exist. It means we often don’t perceive it, because it’s not aligned with our interests. We think people are rational, but if anything, recent scientific findings tell us most human rationality is not used to make decisions; it’s used to come up with arguments that rationalize decisions the subconscious brain has already made.

Let’s clarify these ideas through the example of freeloaders and cooperators. If everyone’s a freeloader, it’s unfit to be a freeloader: freeloading only pays when there are cooperators left to exploit. Much of our emotional life in social situations runs on reciprocity: if you cooperate, so will I; if you defect, so too will I. This predicts an evolutionary arms race: if I’m a freeloader who can successfully deceive you, I succeed. But if all freeloaders succeed in lying, they overwhelm the population. To combat this, the rest of us get better at detecting deception. And the cycle continues.
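That frequency-dependent logic is easy to see in a toy simulation. Below is a minimal sketch of the standard iterated prisoner’s dilemma with replicator dynamics (textbook machinery, sketched by me; the strategy names are mine): reciprocators punish defection with defection, and even starting from a population of mostly freeloaders, they take over.

```python
# Toy replicator dynamic over the iterated prisoner's dilemma.
# Per-round payoffs (conventional textbook values): mutual cooperation
# 3 each, mutual defection 1 each, a lone defector gets 5 and the
# exploited cooperator gets 0.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}
ROUNDS = 10

def always_cooperate(opponent_history): return 'C'
def always_defect(opponent_history):    return 'D'
def tit_for_tat(opponent_history):      # reciprocity: mirror their last move
    return opponent_history[-1] if opponent_history else 'C'

STRATEGIES = {'sucker': always_cooperate,
              'freeloader': always_defect,
              'reciprocator': tit_for_tat}

def score(a, b):
    """Payoff to strategy a over an iterated match against strategy b."""
    hist_a, hist_b, total = [], [], 0
    for _ in range(ROUNDS):
        move_a = STRATEGIES[a](hist_b)
        move_b = STRATEGIES[b](hist_a)
        total += PAYOFF[(move_a, move_b)][0]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return total

# Start with a population that is mostly freeloaders.
shares = {'sucker': 0.1, 'freeloader': 0.8, 'reciprocator': 0.1}
for generation in range(60):
    fitness = {s: sum(shares[t] * score(s, t) for t in shares) for s in shares}
    mean = sum(shares[s] * fitness[s] for s in shares)
    shares = {s: shares[s] * fitness[s] / mean for s in shares}

print({s: round(x, 3) for s, x in shares.items()})
# Reciprocators end up dominating: pure freeloading is self-defeating
# once the population can detect and punish it.
```

The qualitative result, that freeloading is self-limiting once the population can detect and punish it, doesn’t hinge on the particular payoff numbers.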

Now, the best deceiver is the person who doesn’t even know they’re deceiving. If you truly believe your lie, you won’t betray it. So not only are there selection pressures to deceive others, but also to deceive yourself. The more you deceive yourself, the better you can deceive—or convince—others of the same narrative. The best salespeople are the most fervent believers, those with no traces of doubt. And this is why making deception explicit is so taboo: We couldn’t successfully deceive ourselves if we knew we were self-deceivers!

So we self-deceive by signaling. Robin Hanson and Kevin Simler wrote a whole book, The Elephant in the Brain, on how we send signals across nearly every element of our lives. It’s true and fascinating, but many people don’t want to hear it. What a buzzkill!

But constantly self-deceiving is exhausting. For one, you have to make stuff up. You have to get used to exaggerating, repeating high-status opinions as if they were your own, and doing whatever it takes to signal your status while deceiving others and yourself about your motives. You might think this exhaustion implies humans are wired to perceive reality accurately in the first place, but that inference is spurious. As mentioned, we just aren’t evolved to prioritize objective reality ahead of increasing our status score.

Consider the desktop computer interface as an analogy. If you’re writing an email and the icon for the email is red & rectangular, that doesn’t mean the email itself in your computer is red & rectangular. It’s an interface. It’s not meant to show you the truth (the resistors, circuits, and software—complicated stuff!).

Quite the opposite, actually: the interface is there to hide objective reality and give us a way to control that reality without understanding how it works. The “truth” (how the thing works) is not only complicated, but totally irrelevant to our ability to use the computer in the first place. This is what evolution has done — it’s given us an interface to interact with reality without understanding (or caring about) that reality unless it directly affects us.

Bringing this back to status and reputation: it’s important to note that when we’re talking about increasing status, we mean increasing status within a tribe. Everything makes much more sense when we understand that people are hard-wired to choose a tribe, stick with it, and defend it unconditionally. We’re built for tribes. We can’t live without them. You need people to help you grow food, defend you from enemies, and find mates. Perhaps seeing objective reality was more evolutionarily fit for solitary humans who didn’t live in tribes, since they were too busy finding food and shelter to focus on signaling and self-deceiving. But social animals are different. They have to rise within tribes and fend off freeloaders.

To be clear, the fact that objective reality is sometimes deprioritized doesn’t mean everyone is a liar. As George Costanza once said, “It’s not a lie if you believe it.” In all seriousness, lying implies people are consciously deceiving, but since we’ve established that people deceive themselves too, that’s often not the case. Brains aren’t built to accurately represent facts; they’re built to help navigate social situations, and self-deception is a valuable skill for doing that. But self-deception isn’t quite the right term either, because deception also implies deliberateness, whereas this behavior happens subconsciously, at the filtering level. Your brain doesn’t pick between objective reality and the version of reality that increases the status score; it’s unclear whether it even filters for the former, except where it overlaps with the latter.

What does the prioritization of the status score over objective reality imply? That people who want social status will say whatever is necessary, no matter how ludicrous—and they will believe whatever’s convenient. Indeed, it’s more predictive to think of a “belief” not as something someone holds to be true, but as something someone thinks will improve their reputation. Want more evidence that people don’t prioritize objective truth? After centuries of the scientific method, people still believe in things like astrology.

And that’s exactly why the precise content of any ideological point doesn’t matter. Brains don’t care about ideology. They didn’t evolve to care about communism or capitalism. These are mostly abstractions. Indeed, many people are hypocrites with regard to their overt ideological claims when they can get away with it. Al Gore talks about global warming while living in a lavish mansion and flying private. Rich parents talk about the importance of public schools while putting their kids in fancy private schools. Cognitive dissonance? No; those cognitive systems were never connected to begin with.

Common theories of the brain imply people have ideas in their heads that have some causative effect on how they behave. This is incorrect. You don’t “have ideas in your brain” — your brain is not a hard drive. What your brain has is a tendency to adjust its behavior in the expectation that sequences of events that happened before will happen again.

This is why people aren’t open to changing their minds — because there’s no reason they should be. Ideas aren’t about logic. Ideas are badges of group membership. They are Schelling points. Ideas aren’t things we hold in our minds; they’re things we say. If saying the things the wrong people say isn’t going to make you more friends, you aren’t going to say those things. But if saying what the right people say is going to make you more friends, well, you’re probably going to say those things instead. That’s the purpose of ideas — to make better friends.

This is also why arguments are often useless and only cause people to dig into their pre-existing positions. All this stuff about steel-manning opposing arguments to find truth—it may catalyze scientific progress, but in most social circles it’s a surefire way to lose friends and alienate people. (Which is why people who do it anyway are heroes, and why communities that encourage it are foundational for progress.) Indeed, understanding how the outgroup thinks is frowned upon. You’re not supposed to try to understand what’s going on. You’re supposed to not get it and loudly condemn it. We think of this blind allegiance as harmful behavior, but it actually signals and induces loyalty to one’s in-group.

Yes, the crazier a bonding ideology is — the further it is from objective reality — the more powerful it is in forming tribes. Why? The crazier the idea, or the crazier the stuff one does to get into the tribe, the more it proves one’s loyalty to the group by shutting off one’s other options. Because defection is so common, this matters. After all, if loyalty is just a social contract of convenience and utilitarian calculation, logic tells you to look for a better deal down the road. Then, by backward induction, you reason: “Well, if there’s a better deal down the road, better for me to defect now.” But if all societies followed this thinking, they’d crumble pretty quickly. So if you don’t want societies to fall apart, and if you want something larger than a tribe and a family, you’re going to need people doing crazy stuff that burns their boats to other options to prove their loyalty. You need people showing you they have no other place to go, no other tribe to join.
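That unraveling step is standard game-theory logic, and it’s short enough to write down. A minimal sketch (the framing and function name are mine, not from any source):

```python
# A minimal sketch of the backward-induction unraveling described above.
# Standard game-theory textbook reasoning; the framing here is mine.
def rational_plan(rounds_left):
    """What a purely self-interested calculator does in a relationship
    with a known final round."""
    if rounds_left == 1:
        return ['defect']  # last round: no future left to protect
    # The next round is already doomed to defection, so cooperating
    # today buys no goodwill worth having. Defect now too.
    return ['defect'] + rational_plan(rounds_left - 1)

print(rational_plan(5))
# ['defect', 'defect', 'defect', 'defect', 'defect']
```

Burning your boats works precisely by breaking this recursion: if everyone can see you have no better deal down the road, the induction never gets started.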

Interestingly, these bonding ideas shouldn’t only be absurd — they should also be open to interpretation. If they were verifiable or falsifiable, people could adjudicate them and move on; because they’re open-ended, people can fight over interpretations for an eternity. That’s part of what makes religion so adaptive.

Consider the phrase “Abolish the police.” Even the people who say it concede that’s not what they truly mean. They mean redirecting police funds to social services. But the phrase “Abolish the police” achieves the goal of improving their reputation within their tribe because, by abandoning reasonableness, they are burning their boats to the enemy tribe, demonstrating their in-group loyalty. Your status score increases when you make yourself unsuitable for other tribes, because it proves you won’t defect and join them. Until you do that, all tribal attachments are fleeting and contingent.

Now, reciting the phrase “Abolish the police,” unless you have experienced crime (or police misconduct), doesn’t have real consequences for you. It’s just a set of words. Your reaction to that proposition doesn’t depend on any negative memory. The only real consequence of that conversation is the opinion your peers will form of you. So if any opinion contrary to abolishing the police makes your peers mad, and their anger will result in you having lower status, your reaction will likely be “sure, let’s abolish the police (and redirect the money to social services, etc.).” The vast majority of ideas don’t have physical consequences; all they have are social ones. They are status markers. Whether we abolish the police or not won’t affect you. It may over the long term, but human brains don’t work like that. You learn behaviors to avoid danger and earn pleasure. And social disapproval affects you immediately and negatively, so you avoid it.

This could be said about any political idea. The content is secondary, and the consequences are beside the point. What counts is what works in the political arena — the thing that gets retweeted, makes you friends, or is otherwise evolutionarily fit. How ideas spread depends on that, not on their internal logic or likely consequences.

So then why do we care so much about politics? What’s the point of ideology? Ideology is just the water you swim in. It’s a structured database of excuses, used to signal your allegiance or defection to the existing ruling coalition. This is why fanatics of one political stripe sometimes convert into fanatics of the other. Different ideas suit different people at different times. Ideas can adapt to people, but more often people adapt to ideas. Which is why the culture war is so existential — because the marketplace of ideas isn’t run by the scientific method; it’s run by high school politics. People are often vessels looking for a hit. They don’t really care where they get it; they only care that it’s good.

So what determines what people say? It seems clear that the default mode of thinking for humans is groupthink, and the content of that groupthink is computed with a complex algorithm taking into account one’s peer group, the loyalties one owes and to whom, who has more status, and which talking points fit best with the religion one has been dutifully drilled in since kindergarten.
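For fun, here’s that “algorithm” rendered as toy code. Every name and number below is invented for illustration; it’s a caricature, not a model:

```python
# A tongue-in-cheek rendering of the "complex algorithm" above.
# All data structures and weights are invented for illustration.
def what_to_say(talking_points, endorsements, prestige, childhood_creed):
    def social_score(opinion):
        endorsers = endorsements.get(opinion, [])
        return (sum(prestige[p] for p in endorsers)          # who has more status
                + (5 if opinion in childhood_creed else 0))  # drilled-in creed
    # Conspicuously absent from the score: any term for whether it's true.
    return max(talking_points, key=social_score)

# Hypothetical usage:
points = ['opinion A', 'opinion B', 'opinion C']
endorsements = {'opinion A': ['alice', 'bob'], 'opinion B': ['carol']}
prestige = {'alice': 3, 'bob': 2, 'carol': 4}
print(what_to_say(points, endorsements, prestige, childhood_creed={'opinion B'}))
# -> 'opinion B' (high-prestige endorsement plus childhood familiarity)
```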

To illustrate how tribal our reasoning is, I’ll refer to Scott Alexander’s old post “I Can Tolerate Anything Except the Outgroup,” which explains why the center-left seems to hate, say, Margaret Thatcher or Milton Friedman more than Osama bin Laden. It’s because the Blue Tribe’s outgroup isn’t Al-Qaeda or Antifa or campus activists — it’s the Red Tribe. The Red Tribe’s outgroup isn’t Russia or Trump or even the most extreme evangelical Christians — it’s the Blue Tribe. If you’re in the center, you join the tribe that keeps you safe. To quote SSC:

“Imagine hearing that a liberal talk show host and comedian was so enraged by the actions of ISIS that he’d recorded and posted a video in which he shouts at them for ten minutes, cursing the “fanatical terrorists” and calling them “utter savages” with “savage values”.

If I heard that, I’d be kind of surprised. It doesn’t fit my model of what liberal talk show hosts do.

But the story I’m actually referring to is liberal talk show host / comedian Russell Brand making that same rant against Fox News for supporting war against the Islamic State, adding at the end that “Fox is worse than ISIS”.

That fits my model perfectly. You wouldn’t celebrate Osama’s death, only Thatcher’s. And you wouldn’t call ISIS savages, only Fox News. Fox is the outgroup, ISIS is just some random people off in a desert. You hate the outgroup, you don’t hate random desert people.

I would go further. Not only does Brand not feel much like hating ISIS, he has a strong incentive not to. That incentive is: the Red Tribe is known to hate ISIS loudly and conspicuously. Hating ISIS would signal Red Tribe membership, would be the equivalent of going into Crips territory with a big Bloods gang sign tattooed on your shoulder.”

Ideological sincerity doesn't make sense at face value. Why would anything like that ever evolve? Given how ideologies actually work, a gene that made you consistent with your ideology couldn't possibly spread in the gene pool. But a gene for the ability to adroitly deploy Orwellian talking points to signal your loyalty to the winning tribe — now that's helpful.

To recap: truth doesn’t matter when you’re trying to arrange a society. Not only does it not matter; pervasive knowledge of the truth is quite likely deleterious to societal harmony. You basically can’t have a society, not a long-lasting one anyway, if the truth is widely known.

To be sure, some people do get status from pursuing truth: scientists, writers, and academics, to name a few. The more truth they pursue, the higher their status score, because 1/ they’re excellent at it, and 2/ they’re part of tribes that reward the pursuit of truth with high status. (Notice how those tribes often lack power…) This must be the tribe I care about impressing, as evidenced by my writing this piece (if I’m going to make vague abstractions, I can’t exempt myself…). But most people aren’t interested enough in truth-discovery, or skilled enough at it, to gain significant status from it, so they’re less likely to prioritize truth over fitting in. Because sometimes (but not always) there are trade-offs between truth and social cohesion. Why’s that? Because people don’t want to hear all the inconvenient truths. And they certainly don’t want to hear that there are truths you think they don’t want to hear.

Which makes sense on some level: in a society where people clearly understood that status is somewhat zero-sum, and that lower-status people are less likely to pass their genes to the next generation, cooperation would be hard. And so humans have evolved to ignore this somewhat zero-sum conflict, which is obvious when you think about it. But seeing the obvious is not what human nature is about. Why would it be? Human nature is about deceiving and signaling, believing it, and sticking to it, so we can all get along. People don’t want to believe this, of course. It’s part of the broader self-deception we pointed out earlier. The burden of realizing our constant self-deception is too heavy a cross to bear, since it threatens the edifice of our reality. For most people, it’s better to ignore it.

To illustrate this tension between truth and social cohesion, it’s worth diving into the difference between mistake theory and conflict theory:

Mistake theory claims political opposition comes from different understandings of the issues: if people had the same knowledge and the proper theories to explain it, they would probably agree.

Conflict theory, in contrast, states that people disagree because their interests conflict—the conflict is zero-sum, so there’s no reason to agree—which means the only question is how to resolve the conflict.

Mistake theory tends to be the more popular explanation for conflicts, since it’s true often enough and is also a more empowering lens through which to look at problems.

Now, the fight for genetic survival is somewhat zero-sum, and even in those short periods of abundance when it is not, the fight for mating supremacy is quite zero-sum. This should be somewhat obvious, but there are dangers to making this truth explicit. It’s hard to coordinate around, especially when it implies that humans are constantly in conflict. As conflict theory implies, a hidden truth is simply easier to coordinate around.

So why is there inherent zero-sumness in status acquisition? Maybe because, during most of human existence, most males weren’t able to reproduce. There simply weren’t enough eggs for every sperm out there, and females had to choose among men, which meant men had to compete. Some people like to think that if we ended material scarcity, there wouldn’t be zero-sum competition. That’s unlikely. To be sure, this doesn’t mean all of life is conflict theory. Mistake theory is often the correct mapping: if people get on the same page, they can often work through problems, and that’s encouraging. But life is not positive-sum in all situations all the time. Conflicts will continue to exist, and distracting people from the presence of zero-sum conflict is sometimes politically useful when trying to organize people around a shared mission.

To be sure, life has gotten less zero-sum over time. As we move from scarcity to abundance, hopefully status acquisition will become more positive-sum, mistake theory will increasingly be the correct lens to apply, and, as a result, seeking truth will be better aligned with social cohesion.

Maybe that’s the silver lining of this piece: As the world becomes more positive-sum over time, truth and social cohesion should coexist better.


Until next week,

Erik