What of Human Nature in the Art of the Deal?
Though deal-makers in international conflicts are often unsung and occasionally reviled, they deserve credit for one of the most extraordinary developments of our era: interstate conflicts have plummeted since the end of WW2, and civil wars have declined precipitously since the early 1990s. No small credit for the decline must go to the upsurge – over these same periods – in negotiations that stop wars and prevent them from restarting. And understanding how international actors ‘get to yes’ is an engaging intellectual problem with enormous moral consequences.
John Kay (in The Art of the Deal) elegantly lays out the rational basis of negotiation: “The art of the deal lies in structuring a negotiation in which there is a surplus to divide, so that both sides win.” In the case of war, there is almost always a surplus: it is easier to destroy than to create, so both sides are better off if they agree to refrain from destruction. Why, then, is it often so hard to drag belligerents to the negotiating table?
I suspect that the answer may emerge from an ongoing revolution in the social sciences. The idealization of human beings as rational actors is being replaced by more realistic models informed by cognitive psychology and evolutionary biology. Just as behavioural economics is showing that certain irrationalities in economic outcomes may be explained by our cognitive quirks, a new ‘behavioural political science’ may show that destructive political outcomes stem from other bugs in our psychological makeup.
An example: The evolutionary biologist Robert Trivers has noted that natural selection favours organisms that can deceive adversaries about their strength and fitness. But it also favours organisms that can see through this deception – setting up an evolutionary arms race between lying and lie detection. Since the best liar is the one who believes his own lies (so that he cannot betray himself with involuntary ‘tells’), evolution will select for a degree of self-deception. Many demonstrations in social psychology have borne this out, such as the Lake Wobegon Effect, named after the mythical town in which all the children are above average. When it comes to any desirable trait you name, a majority of people judge themselves to be above the population mean.
A lethal example of self-deception is overconfidence in war. Historians have long noted that many cataclysms have been ignited when two adversaries both believe that an impending conflict between them will be a cakewalk. (Remember “Mission Accomplished”?) The anthropologist Richard Wrangham has noted that deceiving oneself about one’s strength can make one’s threats more credible (since, in one’s mind, one is not bluffing), thereby increasing the chance that the other guy will back down. It can also mobilize one’s own forces with the promise that they will be glorious victors, rather than cannon fodder. The political scientist Dominic Johnson, working with Wrangham, has demonstrated this effect in the lab. Subjects in war games who role-played the leader of a fictitious country overestimated their chances of victory, and the greatest self-deceivers were the most likely to launch unprovoked attacks.
Another bug in our rationality is the mentality of taboo. Often, structuring a negotiation to divide a surplus does not tempt the two sides to claim their winnings, but rather inflames them (and unaffected third parties) into moral outrage. Just think about how you might react to a proposal that markets be set up for kidneys, votes, military service, jury duty or babies given up for adoption. The psychologist Philip Tetlock has argued that people treat certain values as sacred. They do not just refuse to trade them off against other goods in life, but are offended at being given the opportunity – and revile other people who consider it. This mindset may have evolved to protect our personal relationships: no one would befriend us, marry us or invite us into a coalition if they had to worry that we were constantly weighing the value of the relationship against other temptations. But, of course, this can get in the way of rational political and economic arrangements in the modern world.
A recent study by Jeremy Ginges, Scott Atran, Douglas Medin, and Khalil Shikaki shows how the psychology of taboo can confound the best-laid plans of rational deal-makers. They asked Israeli West Bank settlers and members of Hamas whether they would accept various versions of a negotiated two-state solution to the Israel-Palestine conflict. Not surprisingly, a significant proportion of the Palestinians adamantly rejected the possibility of a two-state solution in which they would ‘recognize the historic right of the Jewish people to Israel’ or give up all claims to sovereignty over East Jerusalem. However, a significant proportion reacted with even greater anger and disgust if the compromise was sweetened by Israel paying Palestine a billion dollars a year for ten years! The Israeli settlers – also not surprisingly – rejected a two-state solution that required Israel to give up ‘Judea and Samaria’ or ‘recognize the legitimacy of the right of Palestinian refugees to return’ (with the agreement not actually requiring Israel to absorb the refugees). But they were even more opposed if the deal included additional American aid of a billion dollars a year for a century, or a guarantee of living in peace and prosperity.
Fortunately, humans are not just moralists, but also hypocrites, as in the joke about the woman who considered an indecent proposal to sleep with a man for a million dollars, but reacted to his offer of a hundred dollars with, “What kind of a woman do you think I am?” (Reply: “We’ve already established that; we’re just haggling over price.”) Tetlock finds that people retreat from their taboos when a tradeoff pits a sacred value against another sacred value, rather than against a secular one like money. For instance, people will soften their opposition to a voluntary market for transplant organs when they are told that it would save many lives. The Ginges team found that the Palestinian and Israeli absolutists toned down their fury over the possibility of compromising their sacred values in a two-state deal when it would require the other side to compromise on their sacred values. For the Palestinians, this meant that Israel would ‘recognize the historical and legitimate right of the Palestinians to their own state, and would apologize for all the wrongs done to the Palestinian people;’ for the settlers, the Palestinians would ‘recognize the historic and legitimate right of the Jewish people to Eretz Israel.’
The challenge for deal-makers, then, is not just to structure a negotiation in which two rational adversaries would win, but to develop work-arounds for features of human nature that prevent adversaries from being thoroughly rational.
Steven Pinker is Harvard College Professor of Psychology at Harvard University, and author of many books on language, mind, and human nature, including How the Mind Works, The Blank Slate, and The Stuff of Thought.