Thursday, November 3, 2016

Crony Beliefs

For as long as I can remember, I've struggled to make sense of the terrifying gulf that separates the inside and outside views of beliefs.

From the inside, via introspection, each of us feels that our beliefs are pretty damn sensible. Sure we might harbor a bit of doubt here and there. But for the most part, we imagine we have a firm grip on reality; we don't lie awake at night fearing that we're massively deluded.

But when we consider the beliefs of other people? It's an epistemic shit show out there. Astrology, conspiracies, the healing power of crystals. Aliens who abduct Earthlings and build pyramids. That vaccines cause autism or that Obama is a crypto-Muslim — or that the world was formed some 6,000 years ago, replete with fossils made to look millions of years old. How could anyone believe this stuff?!

No, seriously: how?

Let's resist the temptation to dismiss such believers as "crazy" — along with "stupid," "gullible," "brainwashed," and "needing the comfort of simple answers." Surely these labels are appropriate some of the time, but once we apply them, we stop thinking. This isn't just lazy; it's foolish. These are fellow human beings we're talking about, creatures of our same species whose brains have been built (grown?) according to the same basic pattern. So whatever processes beget their delusions are at work in our minds as well. We therefore owe it to ourselves to try to reconcile the inside and outside views. Because let's not flatter ourselves: we believe crazy things too. We just have a hard time seeing them as crazy.

So, once again: how could anyone believe this stuff? More to the point: how could we end up believing it?

After struggling with this question for years and years, I finally have an answer I'm satisfied with.

Beliefs as Employees

By way of analogy, let's consider how beliefs in the brain are like employees at a company. This isn't a perfect analogy, but it'll get us 70% of the way there.

Employees are hired because they have a job to do, i.e., to help the company accomplish its goals. But employees don't come for free: they have to earn their keep by being useful. So if an employee does his job well, he'll be kept around, whereas if he does it poorly — or makes other kinds of trouble, like friction with his coworkers — he'll have to be let go.

Similarly, we can think about beliefs as ideas that have been "hired" by the brain. And we hire them because they have a "job" to do, which is to provide accurate information about the world. We need to know where the lions hang out (so we can avoid them), which plants are edible or poisonous (so we can eat the right ones), and who's romantically available (so we know whom to flirt with). The closer our beliefs hew to reality, the better actions we'll be able to take, leading ultimately to survival and reproductive success. That's our "bottom line," and that's what determines whether our beliefs are serving us well. If a belief performs poorly — by inaccurately modeling the world, say, and thereby leading us astray — then it needs to be let go.

I hope none of this is controversial. But here's where the analogy gets interesting.

Consider the case of Acme Corp., a property development firm in a small town called Nepotsville. The unwritten rule of doing business in Nepotsville is that companies are expected to hire the city council's friends and family members. Companies that make these strategic hires end up getting their permits approved and winning contracts from the city. Meanwhile, companies that "refuse to play ball" find themselves getting sued, smeared in the local papers, and shut out of new business.

In this environment, Acme faces two kinds of incentives, one pragmatic and one political. First, like any business, it needs to complete projects on time and under budget. And in order to do that, it needs to act like a meritocracy, i.e., by hiring qualified workers, monitoring their performance, and firing those who don't pull their weight. But at the same time, Acme also needs to appease the city council. And thus it needs to engage in a little cronyism, i.e., by hiring workers who happen to be well-connected to the city council (even if they're unqualified) and preventing those crony workers from being fired (even when they do shoddy work).

Suppose Acme has just decided to hire the mayor's nephew Robert as a business analyst. Robert isn't even remotely qualified for the role, but it's nevertheless in Acme's interests to hire him. He'll "earn his keep" not by doing good work, but by keeping the mayor off the company's back.

Now suppose we were to check in on Robert six months later. If we didn't already know he was a crony, we might easily mistake him for a regular employee. We'd find him making spreadsheets, attending meetings, drawing a salary: all the things employees do. But if we look carefully enough — not at Robert per se, but at the way the company treats him — we're liable to notice something fishy. He's terrible at his job, and yet he isn't fired. Everyone cuts him slack and treats him with kid gloves. The boss tolerates his mistakes and even works overtime to compensate for them. God knows, maybe he's even promoted.

Clearly Robert is a different kind of employee, a different breed. The way he moves through the company is strange, as if he's governed by different rules, measured by a different yardstick. He's in the meritocracy, but not of the meritocracy.

And now the point of this whole analogy.

I contend that the best way to understand all the crazy beliefs out there — aliens, conspiracies, and all the rest — is to analyze them as crony beliefs. Beliefs that have been "hired" not for the legitimate purpose of accurately modeling the world, but rather for social and political kickbacks.

As Steven Pinker says, "People are embraced or condemned according to their beliefs, so one function of the mind may be to hold beliefs that bring the belief-holder the greatest number of allies, protectors, or disciples, rather than beliefs that are most likely to be true."
In other words, just like Acme, the human brain has to strike an awkward balance between two different reward systems:
  • Meritocracy, where we monitor beliefs for accuracy out of fear that we'll stumble by acting on a false belief; and
  • Cronyism, where we don't care about accuracy so much as whether our beliefs make the right impressions on others.
And so we can roughly (with caveats we'll discuss in a moment) divide our beliefs into merit beliefs and crony beliefs. Both contribute to our bottom line — survival and reproduction — but they do so in different ways: merit beliefs by helping us navigate the world, crony beliefs by helping us look good.

The point is, our brains are incredibly powerful organs, but their native architecture doesn't care about high-minded ideals like Truth. They're designed to work tirelessly and efficiently — if sometimes subtly and counterintuitively — in our self-interest. So if a brain anticipates that it will be rewarded for adopting a particular belief, it's perfectly happy to do so, and doesn't much care where the reward comes from — whether it's pragmatic (better outcomes resulting from better decisions), social (better treatment from one's peers), or some mix of the two. A brain that didn't adopt a socially-useful (crony) belief would quickly find itself at a disadvantage relative to brains that are more willing to "play ball." In extreme environments, like the French Revolution, a brain that rejects crony beliefs, however spurious, may even find itself forcibly removed from its body and left to rot on a pike. Faced with such incentives, is it any wonder our brains fall in line?

Even mild incentives, however, can still exert pressure on our beliefs. Russ Roberts tells the story of a colleague who, at a picnic, started arguing for an unpopular political opinion — that minimum wage laws can cause harm — whereupon there was a "frost in the air" as his fellow picnickers "edged away from him on the blanket." If this happens once or twice, it's easy enough to shrug off. But when it happens again and again, especially among people whose opinions we care about, sooner or later we'll second-guess our beliefs and be tempted to revise them.

Mild or otherwise, these incentives are also pervasive. Everywhere we turn, we face pressure to adopt crony beliefs. At work, we're rewarded for believing good things about the company. At church, we earn trust in exchange for faith, while facing severe sanctions for heresy. In politics, our allies support us when we toe the party line, and withdraw support when we refuse. (When we say politics is the mind-killer, it's because these social rewards completely dominate the pragmatic rewards, and thus we have almost no incentive to get at the truth.) Even dating can put untoward pressure on our minds, insofar as potential romantic partners judge us for what we believe.

If you've ever wanted to believe something, ask yourself where that desire comes from. Hint: it's not the desire simply to believe what's true.

In short: Just as money can pervert scientific research, so everyday social incentives have the potential to distort our beliefs.

Posturing

So far we've been describing our brains as "responding to incentives," which gives them a passive role. But it can also be helpful to take a different perspective, one in which our brains actively adopt crony beliefs in order to strategically influence other people. In other words, we use crony beliefs to posture.

by Kevin Simler, Melting Asphalt | Read more: