Traveler's Dilemma: When it's smart to be dumb
Some game theory paradoxes can be resolved by assuming that people adopt multiple personae, and aren’t rational.
GAME THEORY MATRIX
Game theory typically describes games using a matrix like the one above. In this game, you and your opponent each choose to say “yes” or “no.” The first number shows the amount you’d win and the second number shows the amount your opponent would win. If you are both playing rationally, game theory recommends that you say “yes” (and get 6 points) and that your opponent say “no” (and get 1 point), because neither of you can get more points by changing your own move while the other’s move stays fixed. If you alone switched, your payoff would drop from 6 to 4, and if your opponent alone switched, his payoff would drop from 1 to 0. A play like this is called a “Nash equilibrium.”
Adapted from a graphic by David H. Wolpert

Now that the airlines are done with it, your suitcase looks like a gorilla stomped on it. The antique vase you’d packed so carefully is smashed.

Never fear, the airline representative reassures you. The airline will reimburse you for the value of the vase. You just have to agree on what it’s worth, and the representative has a scheme to figure it out. One of your fellow passengers, it turns out, had an identical vase, which is now identically smashed. Each of you will say what you think the vase is worth, between \$2 and \$100.  If the two prices are the same, the representative will assume you’re telling the truth. If the values differ, he’ll figure the lower value is accurate and reimburse each of you that amount — with a \$2 bonus for “honesty” for the lower bidder and a \$2 penalty for the higher one.

What will you say the vase is worth?

Being the greedy type, you first figure you’ll bid \$100. But then a sneaky thought occurs to you. If you bid \$99 instead, you’ll come out at least as well, no matter what your fellow passenger does. Here’s your reasoning: First suppose she bids \$100. Then you’ll get \$99 plus the \$2 bonus — better than the \$100 you’d get by bidding \$100. If she bids \$99, you’ll get \$99 rather than the \$97 (her \$99 bid less the \$2 penalty for bidding higher than her). And if she bids anything less than \$99, you’re stuck with her bid less the penalty — just the same as you’d have gotten for bidding \$100.
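The representative’s scheme, and the claim that a \$99 bid never does worse than \$100, can be checked with a few lines of code. This is just a sketch of the reimbursement rule as described above; the function name and the \$2 bonus parameter are illustrative, not taken from the study.

```python
def payoff(my_bid, her_bid, bonus=2):
    """Reimbursement under the representative's scheme: both travelers
    get the lower bid; the lower bidder earns a bonus, the higher
    bidder pays a penalty of the same size."""
    low = min(my_bid, her_bid)
    if my_bid < her_bid:
        return low + bonus   # you bid lower: honesty bonus
    if my_bid > her_bid:
        return low - bonus   # you bid higher: penalty
    return low               # equal bids: taken at face value

# Bidding $99 is at least as good as $100 against every possible bid...
assert all(payoff(99, b) >= payoff(100, b) for b in range(2, 101))
# ...and strictly better when she bids $99 or $100.
assert payoff(99, 100) == 101 and payoff(100, 100) == 100
assert payoff(99, 99) == 99 and payoff(100, 99) == 97
```

For any bid of hers below \$99, both \$99 and \$100 leave you with her bid minus \$2, which is why the dominance is only weak.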

You open your mouth to say \$99… and then close it. You know who your fellow passenger is — you sat next to her on the flight. She is both wickedly smart and ruthless, and this analysis won’t have escaped her. She’s got to know that a \$99 bid will get her at least as much money as a \$100 one, so she definitely won’t bid \$100. In that case, you’re better off bidding \$98, for the same reason that \$99 was better than \$100.

But wait! She’ll know that too. Better bid \$97… No, \$96… No, \$95…

IRRATIONALITY PAYS OFF
Your opponent will make more money in this game if he chooses to behave irrationally and pick "yes" or "no" at random. If you know that's his strategy, you're better off changing your answer to "no." In that case, half the time you'll get 4 points and half the time you'll get 5, for an expected payoff of 4.5. If you said "yes," your expected payoff would be only 3 (the average of 6 and 0). Then, because your strategy makes your move fixed, his expected payoff will be the average of 5 and 6, or 5.5. If he had played rationally, on the other hand, you would have gotten 6 and he would have gotten 1. So you'll get less money, and he'll get more.
Adapted from a graphic by David H. Wolpert
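Putting the two captions together, the payoff matrix can be reconstructed and both claims checked. The figure itself is not reproduced here, so the exact numbers below are an inference from the values quoted in the captions rather than a copy of the original graphic.

```python
# Payoff matrix inferred from the two captions: payoffs[(you, opp)]
# gives (your points, opponent's points).
payoffs = {
    ("yes", "yes"): (0, 0),
    ("yes", "no"):  (6, 1),
    ("no",  "yes"): (5, 5),
    ("no",  "no"):  (4, 6),
}

def is_nash(you, opp):
    """Neither player can gain by changing only their own move."""
    mine, theirs = payoffs[(you, opp)]
    flip = {"yes": "no", "no": "yes"}
    return (payoffs[(flip[you], opp)][0] <= mine and
            payoffs[(you, flip[opp])][1] <= theirs)

# ("yes", "no") is the unique pure-strategy Nash equilibrium.
assert [cell for cell in payoffs if is_nash(*cell)] == [("yes", "no")]

# Against an opponent who picks at random, "no" beats "yes" for you...
exp_no  = (payoffs[("no", "yes")][0] + payoffs[("no", "no")][0]) / 2
exp_yes = (payoffs[("yes", "yes")][0] + payoffs[("yes", "no")][0]) / 2
assert (exp_no, exp_yes) == (4.5, 3.0)

# ...and the random player then expects 5.5 points, more than the
# 1 point that rational play would have earned him.
exp_opp = (payoffs[("no", "yes")][1] + payoffs[("no", "no")][1]) / 2
assert exp_opp == 5.5
```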

Follow this reasoning, and your bid will spiral down to \$2. And according to game theory, which assumes that all the players in a game are both perfectly rational and perfectly selfish, that’s just what you should do.
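The downward spiral is easy to mechanize: against any bid above the \$2 floor, the best response is to undercut by exactly \$1, and iterating mutual best responses from \$100 walks the bids all the way down. A sketch, using the reimbursement rule described earlier:

```python
def best_response(her_bid, low=2, high=100, bonus=2):
    """Your payoff-maximizing bid if you knew her bid in advance."""
    def pay(mine):
        if mine < her_bid:
            return mine + bonus    # undercut her: bonus
        if mine > her_bid:
            return her_bid - bonus # overbid her: penalty
        return mine                # match her bid
    return max(range(low, high + 1), key=pay)

# Starting from $100, repeated best responses ratchet down to the floor.
bid = 100
while best_response(bid) != bid:
    bid = best_response(bid)
assert bid == 2  # the game-theoretic "rational" bid
```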

But that’s crazy!

Indeed, when economists have tested this scenario, called the Traveler’s Dilemma, on real people in the lab, the players hardly ever follow the game-theory prescription. The average payment turns out to be not far from \$100. Most people choose the maximum bid — even though \$99 always brings in as much money, and sometimes more. The only situation in which people played the “rational” strategy that game theory predicted was when the penalty and reward were made very large.

To find out if that’s just because people don’t do the analysis, a research team repeated the experiment using professional game theorists playing for real money. But even among game theorists, game theory failed: Nearly a fifth chose \$100, and two-thirds chose \$95 or higher. “Game theorists don’t believe their own theory,” says David Wolpert of NASA Ames Research Center, one of the researchers on the study.

It’s hardly surprising that people sometimes behave irrationally. The odd thing, though, is that in this case, irrationality makes people richer. So sometimes, it’s smart to be dumb.

Since Kaushik Basu of Cornell University created the Traveler’s Dilemma in 1994, economists have posited that people play the game as they do because they’re altruistic, or well-socialized, or just haven’t reasoned it out. The harder question, though, is how to fix up game theory so that it can explain and predict how people find and agree upon these better solutions.

Now Wolpert and his colleagues have come up with a solution, one that may have the power to resolve many other game theory paradoxes. They figure that people adopt different personae for different circumstances.

Here’s one example. Have you ever known someone who you just knew was smart, but who consistently acted ditzy? As a result, you have no choice but to treat the person like a ditz. And here’s the aggravating part: sometimes, precisely because you’ve adjusted your behavior to compensate for that person’s foolishness, the dingbat comes out better than you do! Wolpert says he can explain that using game theory. For certain games, ignoring your own apparent self-interest and choosing a play randomly can increase your expected payoff — and decrease that of your opponent. The player is adopting a “persona.”

The key thing for this to work is that you need to know which persona your opponent is using and know that your opponent is going to stick with it no matter what. If you know that your opponent really will choose randomly, then it may be in your interest to change your play in response.

When Wolpert and his colleagues applied this theory to the Traveler’s Dilemma, it perfectly explained the data on how real people play the game. If you know your opponent is going to be irrational and choose randomly, you’re best off playing rationally and bidding around \$97 or \$98. Most likely, you’ll bid higher than she does, so you’ll lose the penalty. But that’s better than bidding less than she does and losing the entire difference between your bid and hers.
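One simple reading of “choose randomly” is a bid drawn uniformly from \$2 to \$100. Under that assumption (the researchers’ persona model is more elaborate, so this is only a sketch), a brute-force check puts the best response in the mid-90s, close to the \$97–\$98 figure quoted above:

```python
def payoff(mine, hers, bonus=2):
    """Traveler's Dilemma reimbursement, as described in the story."""
    if mine < hers:
        return mine + bonus
    if mine > hers:
        return hers - bonus
    return mine

def expected(mine, low=2, high=100):
    """Expected payoff against a uniformly random opponent bid."""
    return sum(payoff(mine, b) for b in range(low, high + 1)) / (high - low + 1)

best = max(range(2, 101), key=expected)
assert best in (96, 97)              # the optimum sits in the mid-90s
assert expected(best) > expected(2)  # far better than the "rational" $2 bid
```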

Furthermore, as soon as you admit the possibility that your opponent might not play rationally, the \$2 play becomes silly. Bidding \$2 only makes sense when you and your opponent are certain that you’re each perfectly rational and selfish. If you’re not certain of that, Wolpert and his colleagues calculate that players will bid in the high 90s about three-quarters of the time—which is very similar to what experiments show.

The researchers then tried making the penalty and reward bigger than \$2. As it grew, the optimal bid against an irrational opponent shrank. Eventually, it dropped all the way down to \$2, just as in the experiments.
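The same brute-force calculation, with the bonus/penalty as a parameter, reproduces this trend. Again, the uniformly random opponent is only an assumption for illustration, not the model used in the experiments.

```python
def payoff(mine, hers, bonus):
    """Traveler's Dilemma reimbursement with an adjustable bonus/penalty."""
    if mine < hers:
        return mine + bonus
    if mine > hers:
        return hers - bonus
    return mine

def best_bid(bonus, low=2, high=100):
    """Bid maximizing expected payoff vs. a uniformly random opponent."""
    def total(mine):
        return sum(payoff(mine, b, bonus) for b in range(low, high + 1))
    return max(range(low, high + 1), key=total)

assert best_bid(2) in (96, 97)      # small penalty: bid stays high
assert best_bid(10) < best_bid(2)   # larger penalty: optimal bid drops
assert best_bid(50) == 2            # big penalty: all the way to the floor
```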

Wolpert admits that their model is crude in that it adopts only two rigid personae (random and perfectly rational), and he doesn’t claim that the model captures our conscious reasoning process as we play the game. But the model does manage to encapsulate the combination of rationality and irrationality that we expect in other humans. “Your experience with other human beings is that they’re not perfectly rational and, intuitively, this means they don’t go down 100 steps of ratcheting,” Wolpert says. “The average of being rational and irrational is to go down only a couple of steps.”

“It’s a very exciting new direction,” says Basu. Personae, he says, have the potential to shed light on many of the famous paradoxes in game theory. Wolpert and his colleagues have already successfully applied their theory to two other games, the Ultimatum Game and a generalized version of the Prisoner’s Dilemma.

Comments

• This is just ridiculous, when the answer is totally obvious. The reason people choose \$100 is that NOBODY CARES about the \$2, so it doesn't matter if one person gets more or less. As the penalty increases people decide to get something rather than nothing.

This blithering dimwittery about "personae" by "game theorists" is just another indication that science is no longer attracting people who care about solving problems and actually answering them! Science is, all too often today, not attracting anything like the brightest people. There were many giants in the last century who actually figured things out. Today, far too many successors in science are second to fifth rate sycophants who got where they are by kissing up.

Not all - not yet, but far too many in sciences are in it because they can't make it on the outside. A rapidly metastasizing cancer in science is people gaming the grant system for money, because THAT is what is rewarded.

The hearings that Grassley's congressional committee are holding on corruption in biomedicine are the tip of the iceberg. The real story is that gaming the grant system is becoming the real purpose of "doing science."
Dec. 6, 2008 at 2:46pm
• The problem with their arguments is that they assume that people care what the other person gets. People just care about how much they get. A bid of \$99 yields the highest payoff, and people assume other people will act just like themselves, securing a value close to, or more than the full \$100. It seems that game theorists need to reevaluate their definition of rationality.
Robbie Gleichman
Dec. 7, 2008 at 12:39am
• The other comments posted seem to ignore the fact that Game Theory is not a social science. It is a branch of mathematics. Game theory has some very hard and fast rules. If those rules don't seem to be borne out in real life, it represents an opportunity to improve the understanding of the difference between the theory and real life. That's what these researchers are doing. It is a very reasonable subject to explore.
jim.brookhyser
Dec. 7, 2008 at 10:38am
• Wow, how ignorant of John Toradze to use Game Theory as an example of what science is "no longer" doing. Game Theory is a very old subject, which can be very entertaining if you recognise it for what it is.

A comment like that would be expected in a forum of people ignorant of science, not here.
Brian Williams
Dec. 7, 2008 at 11:54am
• I am always amazed at game theorists' definitions of rationality. If a person wants to maximize their gain in the traveler's dilemma, the rational answer is not \$2; that is a logical fallacy. The reason is that the problem is not a pure logic problem, it is an odds problem. Rationally, one bets the odds that the other person will want as much money as possible, which can't happen if they bid ridiculously low. The correct approach is to assume an asymptotic odds curve in which the chances of the other person bidding a certain number decrease the lower the number. Each player has to calculate how low the other person will go before they decide the risks of losing outweigh the costs of "winning". If one assumes the other person is greedy and wants the most, the number will not be too low. The reason the calculation changes when the penalty and rewards are increased is that the risk is proportional to the relative amount, not strictly loss or gain. No one will care about a loss of a couple of dollars (2% in this case), but if the cost is relatively high (say 40%), then people get much more conservative and "logical" according to game theory. If the penalty for losing is everything, then people will willingly accept the small amount as the most likely outcome over losing everything if they are wrong. I'm not a mathematician, so I don't know how to write that up mathematically, but the principle is pretty simple and logical if one uses the proper math, which game theory doesn't, because it applies "rationality" too specifically and narrowly. Their "rational" line of reasoning is followed to ridiculous lengths and does not take into account proper psychological odds.
jdmimic
Dec. 7, 2008 at 4:32pm
• Gleichman: While I agree that game theorists fail because they have an incorrect interpretation of rationality, I disagree that game theory assumes that people care about what the other person gets. Game theory does not do so. The choices of one person in the game influence the outcome for the other player, so game theory attempts to calculate the best response given all possible plays, and then to do this in an iterative fashion back and forth between the two players.
Brookhyser: Yes, game theory is a set of mathematics, but it attempts to explain social behavior, so it is in a way a social science. The problem is that it carries out simple logic statements to the point that it becomes illogical because it is too reductionist. Applying pure, simplistic logic at each step causes the end result to be illogical because it loses sight of the overall goal. A more correct approach would be an iterative one in which the simple logic statement at each step is rechecked against progress toward the overall goal. Since game theory in the traveler's dilemma doesn't do this, it spirals into an illogical final result.
jdmimic
Dec. 7, 2008 at 4:49pm
• I agree with jdmimic on the definition of rationality. I do not think it is reasonable to call a strategy of bidding \$90 irrational. Such a strategy is only irrational, using the common understanding of the term, if one expects the other player to bid less than \$90 and bids \$90 anyway. Furthermore, a player may be intelligent and may think the problem through to many levels, but may have no reason to expect A) that the other player will do so, or B) that the other player, having done so, will assume that THAT player will do so, ad infinitum (or ad nauseam :) ). In short, you can quickly run into the sort of quasi-intellectual exchange between Vizzini and Westley in the film The Princess Bride, in which you can reason to a large number of levels, your opponent can do so, and you have no idea the degree to which your opponent will reason, or the choice they will make in response — the assumptions they make about you. So it is NOT automatically irrational to assume that most people will not tie themselves in logical knots, but will instead make a high bid assuming you will do the same, and thus bid high yourself. This is, rather, a demonstration that you understand human behavior and are acting quite rationally. The \$2 bid is only rational if you are certain your opponent will bid likewise. Otherwise it is quite irrational.

I do agree that personae could play a role here if players are allowed to interact with each other, thus giving them a chance to judge each other's personae. However, in most cases I suspect that (in the absence of a large penalty) people will still favor a high bid because this makes the most common sense.

Finally, I'm not sure that Random Bid versus "Rational” (\$2) Bid is a good dichotomy. Are people in this game really going to bid randomly? I suspect (but of course cannot prove) not. However, I also know that in research it is often convenient to make simplifying assumptions, and both scenarios are probably mathematically convenient for analysis.
Stephen
Dec. 8, 2008 at 9:28am
• Ha:) the Princess Bride reference is a perfect example of logic entrapment. Wish I'd thought of that one.
jdmimic
Dec. 8, 2008 at 10:10am
• the investigation is interesting but i think the example is off the mark. the 2% bonus is devalued as compared to the 100% value. It's the same way that casinos call \$5 chips "nickels".
sss sss
Dec. 13, 2008 at 7:04pm
• I believe the largest problem facing game theory (and inherent in most social sciences) is that humans cannot function in a vacuum where only one or two variables are present. Indeed, there are many variables present even if they are not physically represented. Do they believe that lying is wrong? If so, what potential punishment do they face? Do they have positive or negative experiences with any sort of group into which the airline falls (the transportation industry, large businesses, and so on)? The rationality and irrationality of the person's entire existence can have an effect. You can try to limit variables by asking a simple question, but these factors are still present. If you administer the test via paper, then people will answer in a detached manner. If you act it out, then you introduce new variables, ranging from the airline rep's appearance and demeanor to the room and its environmental effects. Once game theory can find a way to gauge and reflect all these variables, it will become more viable.
Ansel Bailey-Mershon
Dec. 18, 2008 at 11:41am
• There is a counter to many of the arguments here, but this is a bad example.

If this were a diplomatic scenario, it might be more important to consider what your opponent is getting, since you know you will have to deal with this person again. Therefore, you may be willing to take a loss now, or at least accept less than the "optimal" solution.

I also disagree with their definition of optimal. When the reward is only 2%, reducing the bid by anything over 2% makes no sense at all, as the loss then becomes greater than the gain.
William Simpson
May. 7, 2009 at 10:56am