How rational are you?

Suppose you play the following game: There is a container with 50 purple balls and 50 white balls. You must pick a color. If the ball drawn [at random by an official] is the same color as the color you chose, you win $50,000. If it isn’t, you receive nothing.

Now you play the following game: There is a container with an unknown proportion of purple and white balls, with a total of 100 balls. The proportion is chosen randomly beforehand by an official. You must pick a color. If the ball drawn [at random by an official] is the same color as the color you chose, you win $50,000. If it isn’t, you receive nothing.

How much would you pay to play the first game? How much would you pay to play the second game?

I remember this homework problem from when I was in school!

I’m not sure what this has to do with rationality. One could rationally choose either game depending on one’s aversion to ambiguity.

But then again, I don’t think any of the alleged decision theory paradoxes are all that paradoxical. They are only paradoxes (or irrational) if you take a very narrow view of rationality/decision-making/behavior.

Isn’t the expected outcome of both games the same?

Thereby, given equal expected outcomes, the chooser should be indifferent?

Depends on how many times you get to play the game imo.

Well, expected outcomes imply it’s played an infinite or very high number of times, right?

It seems like the first draw is random 50/50 in either case. In the second case, if the outcome is not 50/50, there would need to be an argument for why picking one color is better than the other. I cannot think of such an argument.

With multiple draws, the expected outcome changes. With the first container, assuming no replacement of balls, each failed draw results in a known and increasing probability of success in the next round. With the second container, each failed draw shifts the conditional expectation of the original distribution. Each failure in the second container increases your expectation of the original number of balls of the opposite color (given that I pulled a white ball, there is a high probability that most of the balls were white to begin with).

So with multiple draws, the first game is more valuable than the second game. Unfortunately, I can’t think of how to derive the value of the second game off the top of my head. With multiple draws, the win probability in both games is obviously more than 50%.
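
For what it’s worth, you can at least approximate the value of the second game with a quick simulation. This is only a sketch under one reading of the multi-draw rules (you win if any of a fixed number of draws, without replacement, matches your color), and it assumes the second container’s composition is uniform over 0 to 100 purple balls:

    import random

    def wins(purple, draws, total=100):
        # Win if any of `draws` balls drawn without replacement is purple.
        balls = ["P"] * purple + ["W"] * (total - purple)
        return "P" in random.sample(balls, draws)

    trials, draws = 100_000, 3
    # Game 1: composition known to be 50/50.
    game1 = sum(wins(50, draws) for _ in range(trials)) / trials
    # Game 2: composition drawn uniformly from 0..100 purple balls each trial.
    game2 = sum(wins(random.randint(0, 100), draws) for _ in range(trials)) / trials
    print(f"game 1: {game1:.3f}, game 2: {game2:.3f}")

With three draws this prints roughly 0.88 for game 1 versus 0.75 for game 2, consistent with the claim that the known container is worth more once multiple draws are involved.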

Agreed - I was just trying to make the point that if you only get to play once, the first option has more certainty, which imo should be more valuable. And to your point, if you play a billion times, same diff (kind of; I’d probably still pay more for the first option). JMH made a good point about aversion to ambiguity.

What is “aversion to ambiguity”? Can’t ambiguity just be expressed in terms of probability?

The first one is less than 50/50. It does not state that the ball is replaced. Therefore, the EV is a little less than $25k, so you should pay $24,747.47 to play (which is what the question is actually asking).

ohai - Isn’t that the point? You don’t know, with certainty, the probability on each turn with the second option - that’s not the case for the first option.

…if you have no aversion, you pay the same; if you have some aversion, you pay less for the second (to what degree depends on your level of aversion).

Actually the $24,747.47 is the indifference point, so you should pay anything less than that to play.

But everything can ultimately be expressed in terms of expected value. With no information about the ratio, the game is equivalent to 50/50. With a little bit of information (the color of your first draw), the implied ratio changes to, say, 48/52. It is definitely possible to calculate the implied probability of winning the second game. The second game is worth less in subsequent rounds not because of lack of information, but because the information revealed by each failed draw implies a lower probability of future success. Thus, “aversion to ambiguity” is meaningless - it’s not a rational concept.
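
To make that update concrete, here is a rough sketch. The uniform prior over compositions is my assumption, not something given in the problem, and the size of the shift depends entirely on that prior:

    from fractions import Fraction

    N = 100
    # Assumed uniform prior over k = number of purple balls in the container.
    prior = {k: Fraction(1, N + 1) for k in range(N + 1)}

    # You picked purple and drew white (a failed draw, not replaced).
    # The likelihood of that draw given k purple balls is (N - k) / N.
    posterior = {k: prior[k] * Fraction(N - k, N) for k in range(N + 1)}
    total = sum(posterior.values())
    posterior = {k: w / total for k, w in posterior.items()}

    # Chance the next ball is purple, with the white ball removed.
    p_next = sum(w * Fraction(k, N - 1) for k, w in posterior.items())
    print(p_next)  # 1/3 under this prior -- well below 1/2

Under this particular prior, one failed draw pulls the implied probability of the next success down to 1/3; the direction of the effect is exactly as described above, even if the “48/52” magnitude depends on the prior you start from.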

I’m not sure why you’re bringing multiple draws into this - the wording (imo) implies one draw which determines a win or a loss.

“You must pick a color. If the ball drawn [at random by an official] is the same color as the color you chose, you win $50,000. If it isn’t, you receive nothing.” (seems the outcome is decided after one draw win or lose)

Bchad, the man with the PhD, please weigh in here…

Well, the discussion of multiple draws was an extension of the problem. However, even with a single draw, “aversion to ambiguity” is still irrational. With a single draw, the equivalent probability is still 50/50.

The expected return on the first one is $25,000. Depending on how much of an expected premium you’d demand to play, the max anyone should put up is $24,999.99.

Even though the second one’s distribution of colored balls is unknown, there’s still a 50% chance it’s higher than option 1 and a 50% chance it’s lower.

I think you should be completely indifferent between the two options. Most people just “feel” like option one is safer.

Disclaimer: I could be completely wrong about all of this.

ohai,

The first is a case of risk (known probability distributions) and the other is a case of uncertainty (unknown probability distributions). Risk aversion is how one trades off the expected return vs. the risk (both of which can be calculated from the distribution). Ambiguity aversion is when you are trading off some known thing (or some risky thing) vs. some other thing where the potential distribution is unknown.

When resolving these paradoxes, it helps to think like a Bayesian. If the distribution of the balls is uncertain, then I try to take a non-informative prior. For this case, you could just assume a prior that they are evenly split. That’s not quite sufficient to resolve the Ellsberg paradox (come to think of it, I’m not sure it really would resolve this one either). For that paradox (using the ball colors from Wikipedia), I assume that the split between the black and yellow balls is a uniformly distributed random variable. I can then produce a distribution for each of the gambles in terms of an actual probability distribution. Basically, I try to convert the ambiguity into a risk. I then evaluate the gambles in terms of the expected payoff vs. the risk.
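
Following that recipe for the single-draw version of this thread’s game (with a uniform prior over the 101 possible compositions as my choice of non-informative prior), the ambiguity converts to a plain 50/50 risk:

    from fractions import Fraction

    N = 100
    prior = Fraction(1, N + 1)  # uniform over k = 0..100 purple balls

    # P(win game 2 picking purple) = average over the prior of k / N.
    p_win = sum(prior * Fraction(k, N) for k in range(N + 1))
    print(p_win)  # 1/2 -- same as the known 50/50 container

Any prior that is symmetric between the two colors gives the same 1/2, which is why the single-draw games come out equal in expected value.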

I thought it was implied in the question, but let me clarify: you play each game only once.

Thank you for the explanation. I still don’t see how this relates to the question, though. Re-frame the question as “you have equal probabilities of choosing Color A and Color B”. Which is the superior color? If neither choice is superior, then they must be equal. If both choices are equal, it is equivalent to a 50/50 win probability.