Empathy is the secret ingredient that makes cooperation – and civilization – possible

Human societies are so prosperous mostly because of how altruistic we are. Unlike other animals, people cooperate even with complete strangers. We share knowledge on Wikipedia, we show up to vote, and we work together to responsibly manage natural resources.

But where do these cooperative skills come from and why don’t our selfish instincts overwhelm them? Using a branch of mathematics called evolutionary game theory to explore this feature of human societies, my collaborators and I found that empathy – a uniquely human capacity to take another person’s perspective – might be responsible for sustaining such extraordinarily high levels of cooperation in modern societies.

Social rules of cooperation

For decades scholars have thought that social norms and reputation can explain much altruistic behavior. Humans are far more likely to be kind to individuals they see as “good” than to people with a “bad” reputation. If everyone agrees that being altruistic toward other cooperators earns you a good reputation, cooperation will persist.

This universal understanding of whom we see as morally good and worthy of cooperation is a form of social norm – an invisible rule that guides social behavior and promotes cooperation. A common norm in human societies called “stern judging,” for instance, rewards cooperators who refuse to help bad people, but many other norms are possible.
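A norm like stern judging can be written down as a simple assessment rule. Here is a minimal sketch (the function name and True/False encoding are mine, not from the article): reputations and actions are binary, and a donor is judged good exactly when the action "matches" the recipient's standing.

```python
# A minimal sketch of the "stern judging" assessment rule, assuming a
# binary reputation (True = good) and a binary action (True = helped).
def stern_judging(donor_helped: bool, recipient_is_good: bool) -> bool:
    """Donor's new reputation under stern judging: helping a good person
    or refusing to help a bad person is judged good; anything else is bad."""
    return donor_helped == recipient_is_good

# The four possible cases:
assert stern_judging(True, True)        # helped a good person  -> good
assert stern_judging(False, False)      # refused a bad person  -> good
assert not stern_judging(True, False)   # helped a bad person   -> bad
assert not stern_judging(False, True)   # refused a good person -> bad
```

Other norms differ only in which of these four cases they judge good, which is what makes them easy to compare in a mathematical model.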

This idea – that you help one person and someone else helps you – is called the theory of indirect reciprocity. However, the theory has been built on the assumption that people always agree on each other’s reputations as those reputations change over time. Moral reputations were presumed to be fully objective and publicly known. Imagine, for instance, an all-seeing institution that monitors people’s behavior and assigns reputations, like China’s social credit system, in which people are rewarded or sanctioned based on “social scores” calculated by the government.

But in most real-life communities, people often disagree about each other’s reputations. A person who appears good to me might seem bad from my friend’s perspective. My friend’s judgment might be based on a different social norm or a different observation than mine. This is why reputations in real societies are relative – people have different opinions about what is good or bad.
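The disagreement is easy to see when two observers judge the same act under different norms. Below is a hedged illustration (the norm implementations and names are standard in this literature, but the specific encoding is mine): "stern judging" cares about whom you helped, while a simpler "image scoring" norm judges only whether you helped at all.

```python
# Two assessment rules applied to the same observed act.
def stern_judging(helped: bool, recipient_good: bool) -> bool:
    # Good iff the action "matches" the recipient's standing.
    return helped == recipient_good

def image_scoring(helped: bool, recipient_good: bool) -> bool:
    # Good iff the donor helped, regardless of the recipient's standing.
    return helped

# One act: a donor refuses to help someone with a bad reputation.
act = dict(helped=False, recipient_good=False)
print(stern_judging(**act))   # True  -> this observer judges the donor good
print(image_scoring(**act))   # False -> this observer judges the donor bad
```

The same act leaves the donor with a good reputation in one observer's eyes and a bad one in the other's, and each observer will treat the donor accordingly in future interactions.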

Using biology-inspired evolutionary models, I set out to investigate what happens in a more realistic setting. Can cooperation evolve when there are disagreements about what is considered good or bad? To answer this question, I first worked with mathematical descriptions of large societies, in which people could choose between various types of cooperative and selfish behaviors based on how beneficial they were. Later I used computer models to simulate social interactions in much smaller societies that more closely resemble human communities.
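A simulation of this kind can be sketched in a few lines. The toy model below is my own illustration, not the article's actual code, and all parameters are made up: every agent holds a private opinion of every other agent, updates it by stern judging based on its own (occasionally mistaken) observations, and helps only recipients it privately sees as good.

```python
import random

def simulate(n=50, rounds=20000, observe_error=0.1, seed=0):
    """Toy donation game with PRIVATE reputations under stern judging.
    Returns the fraction of interactions in which the donor helped."""
    rng = random.Random(seed)
    # opinion[i][j]: does agent i privately consider agent j good?
    opinion = [[True] * n for _ in range(n)]
    helped_count = 0
    for _ in range(rounds):
        donor, recipient = rng.sample(range(n), 2)
        helped = opinion[donor][recipient]  # help only privately "good" people
        helped_count += helped
        # Each observer independently applies stern judging, misperceiving
        # the action with probability observe_error - so opinions drift apart.
        for obs in range(n):
            seen = helped if rng.random() > observe_error else not helped
            opinion[obs][donor] = (seen == opinion[obs][recipient])
    return helped_count / rounds

print(f"cooperation rate: {simulate():.2f}")
```

Because each observer updates from its own noisy view, opinions about the same donor diverge over time, which is exactly the "moral relativity" the models were built to study.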

The results of my modeling work were not encouraging: Overall, moral relativity made societies less altruistic. Cooperation almost vanished under most social norms. This suggested that much of what we thought we knew about how social norms promote human cooperation might be wrong.