I’m just now getting around to reading Joseph Stiglitz’s book from last year, The Price of Inequality. There’s lots of interesting stuff in there, some of which I may end up talking about here on the blog. But as I was reading the other day, one particular section struck me. He’s talking about some of the things that classical economic theory (with its ruthless devotion to efficiency) has trouble with, and the example he brings up is human beings’ strange preoccupation with fairness.
According to classical economics, in any transaction people should only be interested in maximising their personal gain (properly understood). They shouldn’t care whether what they are getting is ‘fair’ in some abstract, moral sense; they should just care about the best deal they can get. They absolutely shouldn’t care about whether they are giving the other person a fair shake. Stiglitz describes some examples which seem to flatly contradict this theory. These examples come from laboratory studies of economic ‘games’. These games consist of simple scenarios designed to reveal people’s basic economic behaviour.
Stiglitz focuses on two particular games. Both are played with just two people and involve how to split a fixed pot of money, say £100. In the ‘Dictator’ game, one person simply dictates how the money is split, with the other person having no choice in the matter (I never said they were particularly fun games). If people only care about maximising personal gain, then the ‘dictator’ should always just keep all of the money. But this almost never happens. In fact, dictators on average give away xx% of the money to the other player.
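The Dictator game's rules fit in a few lines of code. Here's a minimal sketch (the function name and the specific gift of £30 are my own illustrative choices, not figures from Stiglitz):

```python
POT = 100  # the fixed pot, in pounds

def dictator_payoffs(gift):
    """The dictator keeps the pot minus whatever they choose to give away."""
    return POT - gift, gift

# Pure gain-maximisation predicts a gift of zero:
assert dictator_payoffs(0) == (100, 0)
# Yet in experiments dictators typically hand over a non-trivial share,
# e.g. a hypothetical £30 gift:
assert dictator_payoffs(30) == (70, 30)
```

The point of spelling it out is how stark the prediction is: nothing in the payoff structure rewards giving anything away, so any positive gift is a puzzle for the pure-selfishness model.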
The second game is called the Ultimatum game. Here one person still gets to decide on how the money is split, but this time the other player gets a choice. They can either accept the sum they are offered, or they can reject it. If they reject it, both players get nothing. Again, the prediction from standard economics is clear. It doesn’t make economic sense for Player 2 to reject any amount, because any amount is still better than nothing. Player 1, knowing this, should always offer the lowest possible amount. But again, this is almost never what actually happens. The first player almost always offers a reasonable split; not 50/50, but say 70/30. And they are smart to do that because Player 2, if offered a tiny amount, will often reject it, meaning both players get nothing. This isn’t strictly economically rational on Player 2’s part, but they seem to want to punish Player 1 for offering such an unfair deal.
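The Ultimatum game adds one twist to the payoff structure: a rejection zeroes out both players. A minimal sketch (again, names and the example offers are illustrative):

```python
POT = 100  # the fixed pot, in pounds

def ultimatum_payoffs(offer, accepted):
    """Return (player1, player2) payoffs, given Player 1's offer to Player 2."""
    if accepted:
        return POT - offer, offer
    return 0, 0  # rejection punishes both players equally

# The 'rational' prediction: Player 2 should accept even £1,
# since £1 beats nothing...
assert ultimatum_payoffs(1, accepted=True) == (99, 1)
# ...but real players often reject lowball offers, leaving both with nothing:
assert ultimatum_payoffs(1, accepted=False) == (0, 0)
# Which is why a 70/30 split is a sensible opening move for Player 1:
assert ultimatum_payoffs(30, accepted=True) == (70, 30)
```

Seen this way, Player 2's rejection is a costly act: they pay their whole share just to take Player 1's share away too. That's what makes it look like punishment rather than calculation.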
The key part about both of these games, and what makes the results so persuasive, is that they are not repeated. Both players play the game once, and never again. This is important because, if the games went on, seemingly irrational behaviour could become rational. Being stingy in the Dictator game, or, on the other side, being too quick to accept being shafted in the Ultimatum game, might come back to bite you later. But because these games are one-shot affairs, they should reveal people’s true selfish/rational core.
For some left-wingers, the results of these experiments are the ultimate ‘screw-you’ to classical economics. They jibe with our instinct that people are more than just selfish calculators of utility. They care about grander things like justice and fairness that economics just can’t account for. I’ve also heard it suggested that they show people have an evolved ‘instinct’ for fairness. That our evolution in small groups of hunter-gatherers has hardwired into us a sense of ‘fair-play’. That’s why, whether we’re on the giving or the receiving end, we feel unfairness on an emotional level, in our gut (though of course we can learn to ignore this sensation if it suits us).
You can even tell a plausible story about how this ‘fairness instinct’ might have evolved. In hunter-gatherer societies, food is often like that old saw about buses – you wait a long time then a lot (i.e. a big kill) comes along at once. Since it’s impossible to store meat for any period of time, whoever made the kill has to share with the rest of the group, trusting that the next successful hunter will do the same. You can imagine that, in this scenario, someone without an instinct for sharing fairly might do badly over the long haul.
The problem is that there’s another plausible explanation for the Ultimatum and Dictator findings, one that doesn’t require people to care about fairness at all: you can’t persuade people that any given interaction is truly ‘one-shot’, completely divorced from all the other interactions they’re going to have in the future. As I mentioned, if the players believe they’re going to interact again with each other, or with someone else who has witnessed their behaviour in the game, then economically irrational behaviour becomes rational. They need to make generous offers now in the hope that people will be generous with them in the future, and they need to reject low offers to avoid setting a precedent. You can tell people that their behaviour in this one game has no consequences outside it, but, deep down, will they really believe you?
So maybe the Dictator and Ultimatum games aren’t the final nails in the coffin of classical economics. But we should ask ourselves why it might be so difficult to persuade people that their behaviour has no social consequences, that they’re not part of a repeated interaction. Maybe that’s part of our evolutionary heritage. We didn’t evolve in the modern world of anonymous one-shot interactions, but in small, tightly bound groups where everyone knew everyone else and what they were up to. Maybe this has left us with an instinctive feeling that we’re always part of an ongoing relationship with the person we’re dealing with, and with everyone who might be watching. At the end of the day, maybe that isn’t so different from a ‘fairness instinct’.