Jul-23-08 | | sentriclecub: Welcome <f54280>
<I happen to have a math background and worked a bit on that paradox a few years ago,> We're not really interested in the main paradox. It's basically solved completely at http://en.wikipedia.org/wiki/Two_en... What we're really after is its implications. In other words, we try to "spin" the strange behavior that causes the original paradox into new variations and see how many times we can trump our intuition! Feel free to throw out any comments about the main version, or any thoughts about the last 5 days' posts. This is my favorite page on chessgames for the last few days! If you read the WP article, it's basically solved, and we can't really add anything to it, nor can we argue against it. We're basically just throwing out lots of side tangents and responding to each other's posts about whatever is being discussed on a given day. I'd still very much like to hear the stuff you worked on years back. I for one "solved it" using my limited math, without the help of outside references, so it may not be correct. So you can spew all the technical stuff, and I'll eagerly open my math book to decipher your posts if they're over my head. |
|
Jul-23-08 | | YouRang: <Envelope paradox>
Sometimes an EV can be represented by a simple formula, and other times it must be evaluated on a case-by-case basis. This is one of the case-by-case problems, and showing the cases (as you've done) helps illustrate it. EV for each case (in your example):
1=(1,2) EV=1.5
2=(2,4) EV=2
3=(4,8) EV=6
4=(8,16) EV=12
5=(16,32) EV=24
6=(32,64) EV=48
Once you are given a pair of envelopes, you are 'locked' into ONE of the cases above. The probability of being in that case is no longer relevant, and thus no longer appropriate to include in an EV formula such as <EV = (.5)(X/2) + (.5)(2X) = 1.25X>. That formula amounts to saying that the expected value of the other envelope is (.5) times its value in <the case that I AM in>, plus (.5) times its value in <a case that I'm NOT in>. Anyway, this is the simplest way I can think of to explain why the original argument in favor of 'always switching' is flawed. |
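(To see the case-by-case picture numerically: a small sketch in Python, assuming the six equally likely cases above. It tabulates, for each amount you might open, the true conditional EV of switching against the naive 1.25X. The two agree except at the endpoints $1 and $64, which is exactly where the blanket always-switch argument breaks in a bounded game.)

    from fractions import Fraction
    from collections import defaultdict

    # The six equally likely cases: (1,2), (2,4), ... (32,64).
    pairs = [(2 ** n, 2 ** (n + 1)) for n in range(6)]

    partners = defaultdict(list)   # amount seen -> possible other-envelope values
    for lo, hi in pairs:
        partners[lo].append(hi)
        partners[hi].append(lo)

    for x in sorted(partners):
        ev = Fraction(sum(partners[x]), len(partners[x]))
        print(f"see ${x:>2}: switch EV = {float(ev):6.2f}   naive 1.25X = {1.25 * x:6.2f}")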
|
Jul-24-08 | | sentriclecub: <Yourang>
Glad to have you back; you haven't posted in a long time. Your input is correct. Making use of limited information can lead to improper decisions: if you are in case 3 and hold $8, the logic says to switch, but doing so loses $4, since the EV of case 3 is only $6. Your "loss from switching" is equal to the other person's "gain from switching," be it the house (casino) or another player. The loss can be calculated as the in/out cash flows...
Open envelope: $8.
That's $8 more than you woke up with.
Swap, lose, and go home with only $4 more.
Or hold the $8, where the naive formula says the other envelope is worth 1.25 x $8 = $10.
Swap, and your $4 is $6 less than the $10 you expected. However, what would <you> do if you were actually at the casino and your gut instinct is telling you +EV! switch! Even after all this refuting, I'd still switch, for reasons of loss aversion. http://en.wikipedia.org/wiki/Loss_a... i.e. (what if it really had $16?) poor me! |
|
Jul-24-08 | | sentriclecub: The game show Deal or No Deal is played as follows: everybody quits when there is only one large amount left relative to the whole board. E.g., if the 3 highest amounts are $750,000, $500,000, and $100,000, 90% of players just go until they open all but one top amount. But that's why the game is interesting: the banker offers "higher than EV" when only one top amount is left and "lower than EV" when multiple high amounts are left. The show is most interesting when one high amount remains and each "open a case" could bring that dying feeling for the contestant. From a business POV, you want to maximize risk for the contestant, as that is what's exciting (after all, Evel Knievel jumped over a school bus or a pool of sharks). The game is set up to hurry through the early rounds (open 6 cases at a time in the first 2 rounds) to get to the riskiest part of the game, and then drag that part out as long as possible. I spent several shows figuring out whether it is in the show's best interest to offer less than the EV, only to find that when the game is most risky they actually offer more than the EV. At first I thought the "then the bank would have offered..." statements were a lie (after all, once the contestant has taken the deal, the banker can say anything he wants), but I've found them to be consistent for players who get to the risky stage of the game (one top case left), because that is the most lucrative part of the show (ratings, commercials, etc.!). After all, the show isn't trying to minimize the average payout per contestant; instead it wants to rush the contestant through the early part of the game (giving offers that are way below EV), not to "screw the contestant" but to hurry them along, making their choice to "not take the deal" easier to make, and without prolonged reservation. Also, the girls are nice, I guess. |
|
Jul-24-08 | | sentriclecub: Also <sneaky>
can you simulate the strategy
"never switch when you're holding $16"
"always switch on $1, $2, $4, $8, $32, and $64"
and give it maybe 10 million runs.
Also, if those results favor switching, try running it on $8 instead of $16, to see if it's just a lucky-number issue. |
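(For concreteness, a minimal sketch of the simulation being requested, assuming the casino game discussed above: roll a fair die N from 1 to 6 and stuff the envelopes with 2^(N-1) and 2^N dollars. A strategy is encoded as the set of opened values you hold rather than switch; the helper names and the trial count are illustrative.)

    import random

    def play_round(hold_values):
        """One round: roll a die N (1-6), stuff the envelopes with 2^(N-1)
        and 2^N dollars, open one at random, then switch unless we see a
        value we have decided to hold."""
        n = random.randint(1, 6)
        envelopes = (2 ** (n - 1), 2 ** n)
        pick = random.randrange(2)
        if envelopes[pick] in hold_values:
            return envelopes[pick]       # hold what we opened
        return envelopes[1 - pick]       # take the other envelope

    def average_profit(hold_values, trials=1_000_000):
        return sum(play_round(hold_values) for _ in range(trials)) / trials

    # "never switch when you're holding $16", switch everything else:
    print(average_profit({16}))
    # "always switch on $16" and nothing else (hold all other values):
    print(average_profit({1, 2, 4, 8, 32, 64}))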
|
Jul-24-08
 | | Sneaky: <sentriclecub> OK, after 10,000,000 trials... Strategy: "Never switch a $16 envelope, but switch all other envelopes." Avg. profit per round = 15.0822508
* * *
Strategy: "Always switch a $16 envelope, but never switch any other envelopes." Avg. profit per round = 16.4215198
* * *
I see your point here. Switching when you draw a $16 envelope (and never any other time) nets $16.42, which outperforms simply never switching (which returns $15.75 per trial, right?). So switching a $16 is a good thing. Q.E.D. |
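(Those averages can also be checked exactly, without sampling, by enumerating the 12 equally likely draws: two envelopes for each of the six die rolls. A sketch, with exact_avg an illustrative helper name:)

    from fractions import Fraction

    pairs = [(2 ** n, 2 ** (n + 1)) for n in range(6)]   # (1,2) ... (32,64)

    def exact_avg(switch_on):
        """Average payoff when we switch exactly on the values in switch_on,
        over all 12 equally likely (case, opened-envelope) draws."""
        total = Fraction(0)
        for lo, hi in pairs:
            for held, other in ((lo, hi), (hi, lo)):
                total += other if held in switch_on else held
        return total / 12

    everything = {v for pair in pairs for v in pair}
    print(float(exact_avg(everything - {16})))   # 181/12 = 15.0833...
    print(float(exact_avg({16})))                # 197/12 = 16.4166...
    print(float(exact_avg(set())))               # 189/12 = 15.75 exactly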
|
Jul-24-08
 | | Sneaky: <sentriclecub> I've watched a few episodes of Deal or No Deal and observed pretty much what you did. The EV is so easy to figure out that you'd have to be pretty dense not to be able to play the optimum strategy.
I've discovered that in the beginning, the "deals" the banker offers are quite unfair, and the player would be crazy to stop. And very few of them do. But then, when it gets interesting, there is usually a round where the bank offers MORE than the fair EV. At this point anybody who is smart should stop playing and accept the offer. But a lot of people let greed get the best of them and continue further. Once the banker has made his "better than fair EV" offer, he tends to suddenly become stingy and actually lower his offers relative to the EV. Once I saw it get down to two briefcases, one had the million dollars in it, and the other had some trivial amount. So you would expect a banker offer of $500,000, right? But no, he said $464,000. Let's face it, most of us would take a guaranteed $464,000 over a 50% chance to win a million dollars, because this is no longer a mathematical exercise but rather a life-changing decision. |
|
Jul-24-08 | | sentriclecub: <But then when it gets interesting there is usually a round where the bank offers MORE than the fair EV. At this point anybody who is smart should stop playing and accept the offer.> I interpret this a little differently. I think it is a "signal" that the next offer will also be MORE than the fair EV. Say there are the following cases left... $1, $2, $5, $500K, $750K
The EV is about $250K ((1 + 2 + 5 + 500,000 + 750,000) / 5 is roughly $250,002).
If they offer $273K, that is a "tell" meant to induce rationalizing: "if my case really does hold a top amount" (which they keep getting brainwashed into believing), then they figure they have a 3/4 chance of surviving a single-case round, and since they concurrently believe they hold a top amount, going again looks like better EV. Also, related to music theory and the "consonance and dissonance relationships of musical frequencies," certain numbers are more likely to induce a feeling of worry versus a feeling of "rightness" or pleasantness. After all, if you're going to splurge and buy a PC tonight, which of these three prices "looks most aesthetically pleasing"?
$3,575.00
$4,199.99 after a $400 rebate (original $4,599)
$153.00
Doesn't the last one just twist your stomach? Does the middle one's rebate almost counter the disincentive of the price difference between it and the one above it? I've noticed amounts like X80,100 (where X = 1/2/3/4/5/etc.) to be very "soothing," not begging to be resolved upward or downward. Amounts which I speculate cause "no deal" are multi-hundred-thousand amounts ending in X11,000, X13,000, X17,000, X21,000, X23,000 to X27,000. Prime numbers don't really look "safe" to accept; 21 looks prime-ish, so I included it (I saw it and it stuck out at the time I was hypothesizing prime thousands after X). I am kinda curious, since I don't know 3-digit prime numbers, how many large prime numbers (between $100 and $1000) followed by ,000.00 they have actually offered. What if the rho^2 was like .90? |
|
Jul-25-08 | | Duck McCluck: Wanted to chime in that the envelope problem posts, and especially the link to the Wikipedia article, were very good reads. |
|
Jul-26-08 | | ganstaman: Deal or No Deal is very interesting. If there were 2 cases left, $1 and $1 million, they could probably offer $50,000 and I'd still take the deal. Am I wrong for doing this? At the least, the fact that they don't do this tells you that they aren't just minimizing payouts. Doing straight-up EV calculations on each round is incorrect, IMO. Just to make up some numbers, let's say that there is $3, $10, and $20 left. The average is $11. So many would think that if he offered $11, taking the deal or not taking the deal would be equivalent -- in the long run, you'd make $11 per trial on average. I don't think we can take this for granted. Sure, if you take the deal, you leave with $11 per trial. But if you turn down the deal, why do you expect to leave with the average value of the remaining cases? For one thing, psychology does play a role; it can lead you to make less-than-optimal decisions later on. Also, as long as the banker does not offer the average value of the remaining cases, the EV of turning down a deal is not the average value of the remaining cases. I think this is what makes the game interesting, and more than a simple math equation: the math is complex because it has to take into account future situations and human psychology. On a related note, people don't value money linearly. That is, the difference between $10 and $20 matters more than the difference between $10,010 and $10,020. The deals should take this into account at least somewhat, which would mean giving different-than-even weights to each value remaining. Also, most of us can only play the game once. There is no getting to the long run. That's why I'd take the $50,000 from the beginning of my post: I can't wait for the $1 million and $1 to average out over time -- I want my $50,000 now. Surely this plays a role too. |
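(ganstaman's non-linearity point can be made concrete with a toy model. The sketch below scores outcomes with log utility over current wealth plus the prize; the log-utility choice, the $1,000 wealth figure, and the helper name certainty_equivalent are all illustrative assumptions, not anything from the show or this thread.)

    import math

    def certainty_equivalent(prizes, wealth=1_000):
        """Guaranteed amount whose utility equals the average utility of
        the equally likely prizes, for someone already holding `wealth`."""
        avg_u = sum(math.log(wealth + p) for p in prizes) / len(prizes)
        return math.exp(avg_u) - wealth

    prizes = [1, 1_000_000]
    print(sum(prizes) / len(prizes))            # raw EV: $500,000.50
    print(round(certainty_equivalent(prizes)))  # ~$31k: far below the raw EV

Under these assumptions a sure $50,000 really does beat the 50/50 gamble on $1 vs. $1 million, exactly as the post argues.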
|
Jul-28-08
 | | Sneaky: In the famous poker strategy book "Super System" by Doyle Brunson, Doyle says that he had a friend whose net worth was about 1 million dollars, mostly due to gambling and a few clever investments. His friend said: "If I had the opportunity to stake my entire net worth on a coin flip, and was given 10 to 1 odds if I win, I would still refuse the bet." His friend's reasoning is obvious: he is already living comfortably with a million dollars, and even with an extra 9 million dollars his lifestyle would not improve drastically. Sure, he'd live more comfortably with 10 million than with 1 million, but not 10 times more comfortably. On the other hand, the difference between having a million dollars in the bank and being flat broke is enormous. Brunson then went on to say that if he were given the same opportunity he WOULD flip the coin, because he's a true gambler at heart. |
|
Jul-28-08 | | YouRang: IMO, if you come to a point where you are willing to gamble more than you can afford to lose, then you have a gambling problem. |
|
Jul-28-08
 | | Sneaky: Very true.
Not only shouldn't you gamble more than you can afford for the obvious reason (that you might lose, and it will impact your life negatively), but you will also be playing at a handicap, because you can't make cool, emotionless decisions based on the raw mathematics of the situation. This is especially true at poker, where fortune favors the bold and "scared money" gets scooped up by those who are comfortable with the stakes. |
|
Jul-29-08 | | sentriclecub: I think that gambling has its pros and cons. For me, poker was one of the things I realized I'm above average at. Poker is like insight into nature. Some of the fundamental rules that govern "rational utility optimizers", and how to exploit them ;), reveal themselves through poker. By creating strategies and playing them, you consciously analyze equations, do conditional probabilities (my happy point), and apply educated guesses to otherwise unanswerable mathematical questions. |
|
Jul-30-08 | | YouRang: <I think that gambling has its pros and cons.> And some of the best gambling pros ARE cons! ;-) |
|
Aug-08-08
 | | Sneaky: Getting back to our favorite topic, the Envelope Paradox... <f54280> summarized my understanding of it well when he wrote this: <When one says "the envelope contains X dollars", our mind assumes an uniform distribution (ie: that 100$ is as likely as 200$). But, it is impossible to have an unbounded uniform distribution (where 100$ is as likely as 3.14159 trillions), hence the paradoxical result.> However his conclusion, <You have to draw the X dollars from some distribution ... Hence, no paradox anymore>, is not clear to me. I am still not convinced that there isn't SOME distribution which can be formally defined, allows any number to be chosen, and still leaves the paradox intact (i.e., that for any amount of money X you discover in the envelope, it is in your best interest to switch). Now, getting to <sentriclecub>'s recent posts on the subject: I promised to address them, and now I have some time. He wrote <I'd have to explain my solution to the problem to get you to understand how I mitigated the effect of infinite... here it goes.> OK, roll up sleeves, let's tackle this. I hope sentriclecub doesn't mind if I grab snippets out of several of his posts in non-chronological order; I am trying to condense the meat of the discussion here. <what about generating a random number between 1 and infinite? e.g.
1 = (1,2)
2 = (2,4)
3 = (4,8)
. . . ∞ = (2^(∞-1), 2^∞)
Also, don't make it a "fair dice" but a crooked dice, where the probability of it landing on X follows a normal distribution (guassian curve). ... that way every number has a discrete probability. i.e. the area under the curve between (x) and (x+1)> It's good that they aren't fair dice, because it's impossible to fairly pick a number from 1 to ∞. But generating a number from 1 to ∞ with crooked dice is child's play. I think it's important to mention that, depending on how rapidly the distribution decays, the total EV for this distribution may or may not converge. Wait a minute--let me clarify. When I say "compute the total EV for this distribution" I mean: what is the EV if you simply receive an envelope and get to keep its contents? (Forget the switching part of the game for a moment.) To put it yet another way: how much would you pay for an unopened envelope? If we wanted to compute the EV for the entire curve, we'd take each number that can be output by the distribution (1...∞), multiply it by the probability that the number occurs, and add them all up. The procedure as I just described is rife with infinities, but we can handle that, because we took calculus in college. Now, whether that EV is infinite comes down to a race: the distribution swoops gently and gracefully down to zero, while the values in the envelopes explode terrifically fast towards ∞, and what matters is which effect wins in the tail. If the tail decays geometrically at a ratio of 1/2 per step or slower (anything at least as fat as halving with each step), the product of probability and payoff will NOT swoop towards zero, the terms of the EV series stop shrinking, and the EV is infinite. (A strict Gaussian tail, mind you, decays like e^(-X^2) and eventually crushes the 2^X growth, so that particular choice would give a finite EV; to make the EV infinite for envelopes stuffed with [ 2^X, 2^(X+1) ], the crooked dice need a sufficiently fat tail.) But so what? Why is an infinite EV a problem? In my opinion, it's not. The Wikipedia article (http://en.wikipedia.org/wiki/Two_en...) states <In the words of Chalmers this is “just another example of a familiar phenomenon, the strange behaviour of infinity.” This resolves this paradox according to some authors. But in every actual single instant when an envelope is opened, the conclusion is justified: the player should switch! Not many authors have addressed this case explicitly trying to give a solution.> It goes on to say <However, Clark and Shackel argue that this blaming it all on "the strange behaviour of infinity" doesn't resolve the paradox at all.> I have to side with Clark and Shackel here, whoever they are. (continued...) |
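(A numeric sketch of that race between the tail and the 2^N growth; the weight functions, the cutoff, and the helper name partial_evs are all arbitrary illustrative choices:)

    import math

    def partial_evs(weight, n_max=300, every=60):
        """Partial sums of the envelope-game EV series: probability of case n
        times the mean envelope value for case n, which is 1.5 * 2^(n-1)."""
        w = [weight(n) for n in range(1, n_max + 1)]
        z = sum(w)                         # normalize into a probability
        ev, out = 0.0, []
        for n, wn in enumerate(w, start=1):
            ev += (wn / z) * 1.5 * 2 ** (n - 1)
            if n % every == 0:
                out.append((n, round(ev, 2)))
        return out

    geometric = lambda n: (5 / 6) ** n           # fat tail: partial sums explode
    gaussian = lambda n: math.exp(-n * n / 50)   # Gaussian tail: sums settle fast

    print("geometric:", partial_evs(geometric))
    print("gaussian: ", partial_evs(gaussian))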
|
Aug-08-08
 | | Sneaky: I believe we can design a distribution so that when we play the game, in each instance the value of X is a known constant, a finite value, and logic dictates that we must switch. What is disturbing here is that our decision to switch is independent of X; we ALWAYS want to switch. <If you assign exorbitantly-diminishing marginal probabilities for values of X that you want to define as "in the tail", then you can do so, by stipulating that the area of the tail is exorbitantly small compared to the range you want to focus on.> Agreed, but in my mind that's a lot like arbitrarily picking a top limit, like in my casino version where the limit was 64. In your game there isn't a hard limit, since any number can be chosen, but you are suggesting we wipe out the effect of the higher numbers by making them ridiculously unlikely. Therefore, if somebody knows what your arbitrary cut-off point is (the point that you pick as the start of the deflated tail), they should incorporate that knowledge in their strategy. <Therefore, I can deal with infinite indirectly. I can define the curve to let .000000001% of the area to be to the right of 100 trillion dollars.> I see what you're getting at, but let's not awe ourselves by tossing around big numbers. You could just as easily say "I define the curve to let 1% of the area be to the right of 64 dollars." The principle would be the same. Your point is (correct me if I'm wrong) that we can define the tail so that it is not a beast with infinite EV--it's an infinitely long tail with a finite and even insignificant EV. I agree, we can define it that way. To keep the paradox alive we must reach the conclusion that "It is ALWAYS profitable to switch, even if you know the exact rules (i.e. distribution) of the game, and regardless of what you find in your envelope." If we ever design the game so that this statement is true, we are staring madness in the face. I predict that when we attempt to define the rules (distribution) in an attempt to keep the paradox alive, one of four things will happen: <1> You find that your probability distribution doesn't add up to 1 (it's either more or less); in other words, it's not a legitimate distribution. And so, no paradox. <2> You find that the EV of switching = X; it's exactly what's in your envelope already. Therefore, it doesn't matter if you switch or not. And so, no paradox. <3> You find that it's only profitable to switch if X is less than some constant. This is what happens when you deflate the tail sufficiently so that it has a finite EV. You are essentially reducing the problem to my casino version, except that instead of an optimum strategy of "always switch, except when X=T" you end up concluding "always switch, except when X>T". And so, no paradox. <4> To your horror/amazement, you discover that for all values of X, the EV of switching is greater than X. And so, PARADOX! We stare madness in the face! The question now is: find a distribution for which case <4> occurs. (continued...?) |
|
Aug-08-08
 | | Sneaky: So now our job is to find a distribution so that the paradox is alive and my case <4> becomes a reality. Let's take a roundabout way to do this... first, let's look at a distribution where the paradox is destroyed. Consider the distribution that we could call "the number of times we flip a coin until we get heads." E.g., if you flip a coin and it comes up heads, then N=1. If you flip a coin and it comes up tails first, then heads afterwards, N=2. If you get 27 tails in a row, then heads on the 28th flip, N=28. The probability p(N) for various values of N:
p(1) = 1/2
p(2) = 1/4
p(3) = 1/8
p(4) = 1/16
. . . p(N) = 1/(2^N) = 2^-N
Now suppose that we randomly select a value N with this distribution, then stuff the envelopes with [ 2^N, 2^(N+1) ]. Clearly the probabilities all add up to exactly 1, so we know it's a valid distribution. In this variation of the game, we always open an envelope to find an integral power of two: 2, 4, 8, 16, 32, etc. Suppose we opened an envelope to find $16 inside. Should we switch? Well, we know that either we are looking at N=3 (8,16) or N=4 (16,32). But the probabilities of N=3 and N=4 are not the same; N=4 is half as likely as N=3. That means that there is a 2/3 probability that N=3 and only a 1/3 chance that N=4. So if we switch the envelope, we'll end up with either $8 or $32. What's the EV? Simple: (2/3) * 8 + (1/3) * 32 = 16
Sacre bleu! It's exactly the same! Switching offers no advantage (or disadvantage), just as we would assume is the case. This is my case #2 from the previous post. Let's solve this in the general case. If you open an envelope to find X dollars, you'll be able to pinpoint N to two possible values. Call these possible values q and q+1. The probability that N=q+1 (and not q) is
p(q+1)
-------------------
p(q) + p(q+1)
which equals
2^(-q-1)
-------------------
2^-q + 2^(-q-1)
which equals
2^(-q-1)
-------------------
2^(-q-1) * ( 2^-q / 2^(-q-1) + 1)
which equals
1 / ( 2^-q/2^(-q-1) + 1)
which equals
1 / (2+1)
which equals
1/3
(Sorry about walking through some terribly trivial algebra but I really want to make sure that everything works out here so I'm "showing my work" like my math teachers always insisted.) Ergo, there's a 1/3rd chance of N being the larger value, and a 2/3rds chance of N being the smaller value. The EV when you switch an envelope containing X is: EV(X) =
(1/3)*2X + (2/3)*(X/2) =
2X/3 + X/3 =
X
This is the borderline case: the situation where it is neither good nor bad to switch envelopes. With the coin-flipping distribution, there is no paradox. (continued...) |
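(A quick empirical check of that borderline case, conditioning on rounds where the opened envelope happens to show $16; a sketch with an illustrative trial count:)

    import random

    def flips_until_heads():
        """N = number of fair-coin flips up to and including the first head."""
        n = 1
        while random.random() < 0.5:
            n += 1
        return n

    keep_total = switch_total = rounds = 0
    for _ in range(2_000_000):
        n = flips_until_heads()
        pair = (2 ** n, 2 ** (n + 1))
        pick = random.randrange(2)
        if pair[pick] != 16:               # only look at rounds where we see $16
            continue
        rounds += 1
        keep_total += pair[pick]           # always exactly 16
        switch_total += pair[1 - pick]     # 8 (two-thirds) or 32 (one-third)

    print(keep_total / rounds)             # 16.0
    print(switch_total / rounds)           # also ~16: no edge either way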
|
Aug-08-08
 | | Sneaky: NOW THEN ... let's try a more graceful distribution that doesn't decay so rapidly. Ever play craps in a casino? If you play craps the way most people do, you make a lot of money by NOT rolling a seven. With that in mind, let's make a distribution equal to "the number of rolls we make until we get a seven." So if you get a 7 on the very first roll, N=1. If you roll one non-seven and then roll a seven, N=2. If you roll 27 non-sevens, then roll a seven on the 28th toss, N=28. (For those who don't know, the odds of rolling a seven with two dice are exactly 1 in 6.) The probability p(N) for various values of N:
p(1) = 1/6
p(2) = (5/6) * (1/6)
p(3) = (5/6) * (5/6) * (1/6)
. . . p(N) = (5/6)^(N-1)/ 6
(NOTE: I really should demonstrate that this adds up to exactly 1, to make sure it's a legitimate probability distribution. However, it should be obvious from the definition: after all, a seven has to show up eventually! If anybody has their doubts we can walk through that formality later.) Now suppose that we determine N by counting dice tosses until a seven shows up, then stuff the envelopes with [ 2^N, 2^(N+1) ]. Let's dive right in and solve this for the general case. If you open an envelope to find X dollars, you'll be able to pinpoint N to two possible values. Call these possible values q and q+1. The probability that N=q+1 (and not q) is
p(q+1)
-------------------
p(q) + p(q+1)
which equals
(5/6)^q / 6
-------------------
(5/6)^(q-1) / 6 + (5/6)^q / 6
which equals
(5/6)^q
-------------------
(5/6)^(q-1) + (5/6)^q
which equals
1 / ( (5/6)^(q-1)/ (5/6)^q + 1)
which equals
1/ ( 6/5 + 1)
which equals
5/11 or 0.4545454545...
NOW, let's compute the EV for switching an envelope with X dollars. Given that X can either be the result of N=q or N=q+1, we know that EV(X) = (6/11) * (X/2) + (5/11) * (2X) =
3X/11 + 10X/11 =
(13/11) * X
Because 13/11 * X is bigger than X, we've done it.
Do you understand what this means? WE'VE DONE IT! THE PARADOX LIVES AND BREATHES! The "craps distribution" keeps the paradox in full force. If the envelopes are stuffed using the craps distribution, you will ALWAYS want to switch envelopes if you are given the opportunity, even if you have not yet peeked inside the envelope. Conclusion: although the paradox has been resolved in some trivial forms (e.g. my casino game, or the poker article's format, which deals with a selection between $50 and $800), the paradox is still in full force when you apply a distribution like the craps method shown above. It doesn't matter if your envelope contains $2 or $65,536 ... you want to switch. What's more, the paradox has nothing to do with a "doomsday envelope" at the very end of the distribution. There is no maximum envelope. No matter how big a sum you've got in your hand, there might just be (with probability 5/11) a sum twice as large waiting for you in the other envelope. This means that if you were given the opportunity to play this game a million times over, keeping all the money you collected from each round, you would net an average of 18.2% more by switching every single time, as opposed to keeping every single time. Even if you haven't looked inside your envelope, you still want to switch. I know this doesn't make any sense whatsoever, but I can't think of a good reason why it shouldn't be true. 13/11 * X is bigger than X, so switch, of course! And once you've switched, you should switch back. Keep switching all day long, back and forth, if they let you. That's what the math says, so it must be true. Is this mind-blowing? Have we finally stripped the paradox naked for all to see? What will happen if I write a software simulation of the game and test various strategies (e.g. always switch, never switch, switch twice, switch five billion times...)? Comments? |
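(The full game can't be averaged by brute force, its EV being infinite, but a cross-section of it can: condition on rounds where the opened envelope shows a fixed value, say $16, and compare keeping against switching. A sketch with an illustrative trial count; expect the switch average to come out near 16 * 13/11, about 18.91:)

    import random

    def rolls_until_seven():
        """N = rolls of two dice up to and including the first seven."""
        n = 1
        while random.randint(1, 6) + random.randint(1, 6) != 7:
            n += 1
        return n

    keep_total = switch_total = rounds = 0
    for _ in range(2_000_000):
        n = rolls_until_seven()
        pair = (2 ** n, 2 ** (n + 1))
        pick = random.randrange(2)
        if pair[pick] != 16:               # cross-section: we opened $16
            continue
        rounds += 1
        keep_total += pair[pick]           # always exactly 16
        switch_total += pair[1 - pick]     # 8 (prob 6/11) or 32 (prob 5/11)

    print(keep_total / rounds)             # 16.0
    print(switch_total / rounds)           # ~18.91 = (13/11) * 16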
|
Aug-12-08 | | ganstaman: <Sneaky> Very nice work. It's always a bit hard to follow this sort of math on a forum like this, but I'm pretty sure everything you've written has been clear so far. If I didn't feel so strongly that switching is never ever ever correct, I think I would have been convinced by your work. But I looked hard at every line. Unfortunately, it's getting late and I need to sleep, so I was not able to fully think this part through yet: is the following assumption that you asserted actually correct: <The probability that N=q+1 (and not q) is p(q+1)
-------------------
p(q) + p(q+1)
> ? |
|
Aug-12-08 | | ganstaman: Well, I can't sleep for the moment because I keep thinking about this. Anyway, I think that unless p(N=q+1 and not q) = 1/3, we won't get a neutral EV for switching over not switching based on these calculations, so maybe my previous question was irrelevant (after all, for every distribution you come up with, will p(N=q+1 and not q) really be 1/3??). |
|
Aug-12-08
 | | Sneaky: Some interesting developments. After I posted all that stuff I went online and read a number of articles on this subject. After much reading I concluded that the Wikipedia article is a good summary of the multiple opinions on the subject, and it fairly states that there is no consensus of opinion. I also came across some much deeper analysis than Wikipedia allows for, which sheds even more light on this weird paradox. More on this soon, including the links. <ganstaman> About the equation p(q+1)
-------------------
p(q) + p(q+1)
It looks a little weird, but it makes perfect sense in this special scenario. Keep in mind, my method of stuffing envelopes requires generating a whole number N (based on a distribution) and ends up stuffing each envelope with an integral power of two (2, 4, 8, 16, 32, etc.). Therefore once you open an envelope you can deduce what N is, but not exactly--there are always two possibilities. E.g. if you see 16 in an envelope, you know it could be either N=3 (corresponding to envelopes containing 8 and 16) or N=4 (in which case the envelopes contain 16 and 32). However, due to the properties of these distributions across (1...∞), it's no longer correct to say "it is equally likely that the other envelope contains half as much, as opposed to twice as much." This is because the probabilities of N=3 and N=4 are not the same. Let me walk through this very scenario where N must be either 3 or 4 as an example: The probability that N=3 is 1/8.
The probability that N=4 is 1/16.
(That distribution is what I call the "coin-flip distribution", as opposed to the "craps distribution" that followed.) But we know that N must equal either 3 or 4. Therefore the probability that N=4 is 1/16
-------------------
1/8 + 1/16
which equals 1/3.
This is a little counterintuitive, at least for some people. You picked an envelope at random, and yet once you see that it contains 16, you are fairly confident that the other envelope contains 8 and not 32. This is because the (8,16) configuration comes up more frequently than the (16,32) configuration. This is what is called (I think) a "conditional probability." It's like saying: the chance of you drawing a card at random and it being the ace of spades is 1 in 52. But if I know that you drew an ace, I can say that the chance that you have drawn the ace of spades is 1 in 4. |
|
Aug-12-08
 | | Sneaky: Time to digress ...
It turns out, to my surprise, that the envelope-switching paradox has more in common with another paradox than I ever dreamt possible. This other paradox is popularly called the "St. Petersburg Paradox." I think it is necessary to discuss the St. Pete paradox to achieve any advanced understanding of the envelope paradox. <The St. Petersburg Paradox> A casino in St. Petersburg wants to offer a game to its patrons based on their ability to flip a fair coin and have it come up "heads" many times in a row. The player flips a coin until tails appears, and then the game is over and they get paid (or not). The payoff structure for the game is like this:
If you can't flip any heads at all, you get nothing.
If you can flip 1 head, you get $1.
If you can flip 2 heads, you get $2.
If you can flip 3 heads, you get $4.
If you can flip 4 heads, you get $8.
If you can flip 5 heads, you get $16.
. . . If you can flip N heads, you get 2^(N-1) dollars.
The casino owner hires you, a mathematician, to figure out how much to charge people to play this game so that it is a "fair bet". Then the casino will predictably charge people slightly more to play, so that they can eke out a profit. You set to work, calculating the EV for the St. Petersburg game in the normal manner: you multiply each payoff tier by the odds of it happening, then add up the numbers. Since there is no upper limit on how much you can win with this game, it is an infinite series. (1/2) * $0 = $0.00 (flipping no heads)
(1/4) * $1 = $0.25 (flipping 1 head)
(1/8) * $2 = $0.25 (flipping 2 heads)
(1/16) * $4 = $0.25 (flipping 3 heads)
(1/32) * $8 = $0.25 (flipping 4 heads)
etc.
Therefore our infinite series equals 0.25 + 0.25 + 0.25 ... which of course is equal to ∞. The St. Petersburg game has an INFINITE EV!!! You turn to the casino owner and tell him "Your game is flawed. Even if you charge patrons one hundred million dollars to play, the house will go bust in the long run." Now why is this a paradox at all? Maybe it's not; perhaps it's just a poorly constructed game. But it leads us to lots of very counterintuitive conclusions. For example: <Even if you demand nearly impossible performances before you get paid, it's still infinite.> The casino owner says, "In that case let's make the game harder on them. Let's say that they get no payoff at all until they flip one hundred heads in a row, and only then do they get the $1 payoff. Then the progression continues as normal." Sorry, mister casino owner, it's STILL INFINITE. All you've done is add a hundred "zero" terms to the beginning of the infinite series, but then you have to add up all of those $0.25 terms that continue on to infinity. <When you play the game, no matter how much you get paid, it's a disappointment.> Let's say that the casino owner doesn't believe his on-staff mathematician, and decides to offer the game anyhow. Each person who rents a room in the St. Petersburg Hotel gets a free ticket which allows them to go to the main floor, flip coins, and then get paid based on the rules. You get your ticket, and you go on an incredible coin-flipping streak. 21 heads in a row!! You get paid $1,048,576 (which is 2^20). Just as they are counting out your winnings, another patron of the hotel walks up to you with one of these tickets, and says "I will trade you your million-dollar winnings for this unclaimed ticket." Logically, you should say "yes" to this other patron. Even though your payoff is very large and you were very lucky to have received it, that uncashed ticket has an INFINITE EV! This is true whether you get paid two dollars or two trillion dollars. No matter how large your payoff is, you should be happy to trade your winnings for the chance to play the game again with a fresh ticket. Do you see the connection to the envelope-switching paradox? In the St. Pete game you are always willing to switch your ticket, once you know what it's worth, for an unclaimed ticket of unknown value. If you don't know what your ticket is worth, then switching is silly and a waste of energy. But once you know, you want to switch. LINK: http://consc.net/papers/stpete.html
"The St. Petersburg two-envelope paradox" by David J. Chalmers |
|
Aug-12-08
 | | Sneaky: With the background of the St. Pete paradox, and the concept of "infinite EV", we are ready to discuss what I consider the big revelation in the envelopes paradox. As we've seen, attempts to define the envelope paradox using a normal limited distribution, such as the rolling of a single die (1..6), are met with failure. My casino version of the envelope game and the computer simulation I wrote convinced me of this. It doesn't matter if the die has 6 sides or 6000 sides; the existence of that top envelope ruins everything, as far as the paradox is concerned. Some people like <sentriclecub> have suggested using a different distribution which has no top limit (that is, it ranges over 1...∞) but is still a sound mathematical distribution. And he's absolutely right--if you use a distribution like that, carefully selected, you will keep the paradox in full force. The trick is to have an infinitely long tail of finite total probability that decays at a nice slow rate. Also as we've seen, we can apparently keep the paradox in full force by using a more crafty distribution such as my example "craps distribution". (That distribution is only an example; there are of course many others which also keep the paradox alive.) However, my craps distribution has the weird property of having an infinite EV, and to some people that is somewhat unsettling. Here's what is interesting: it has been PROVEN mathematically that any distribution which keeps the envelope paradox intact absolutely must have an infinite EV. The proof deals with infinite series and what we call "convergence tests", so it's a little hard to follow if you haven't brushed up on your calculus lately, but here is the link: <"The Two-Envelope Paradox: A Complete Analysis?" by David J. Chalmers>
http://consc.net/papers/envelope.html
So where does this leave us? Here is my new summary of the situation as best I understand it: <1> The original and trivial form of the envelope paradox is flawed, because it implies that the envelopes are stuffed with X and 2X dollars, where X is "any number". But there is no such thing as a distribution where any number is equally as likely as any other number, e.g. where 100 dollars is no more or less likely than 1.378 trillion dollars. Therefore the problem is poorly stated and can be dismissed on those grounds alone. <2> Attempts to "fix" the original statement by defining a simple distribution, e.g. rolling a 6-sided die, will not leave the paradox intact. When you compute the EV using a simple finite distribution, it is no longer true that switching always beats keeping. <3> It is however possible to restate the problem using a carefully selected infinite distribution, so that the paradox seems to be in force. HOWEVER, in all such cases, the distribution chosen will have the curious property of having an infinite EV. Therefore, it becomes an analog of the St. Petersburg paradox, in which any known payoff is deemed to be worth less than an unknown payoff. |
|
Aug-12-08
 | | Sneaky: (conclusions continued)
<4> The jury is still out on the meaningfulness of "infinite EV" situations. Some mathematicians have described the concept as "absurd", while others say that if you can define it mathematically it is no more absurd than transcendental numbers, or "i", or the concept of infinity itself. Whatever you believe, it is clear that this concept goes hand in hand with the envelopes paradox. <5> In a previous post I wrote with glee: <Have we finally stripped the paradox naked for all to see? What will happen if I write a software simulation of the game and test various strategies?> Sadly, there will be no simulations of the new craps-distribution envelope paradox in its entirety; there cannot be, due to the infinite EV. Any finite number of trials that I run will be insufficient. Why is it absurd to write a computer simulation of an infinite-EV situation? Well, these computer simulations are just a form of statistical analysis by brute force and large sample sizes, right? So by comparison, what if I wanted to determine the EV of purchasing a ticket for the Florida Lottery, and told you "I will buy three tickets and divide my total winnings by three to determine the EV"? Of course, that's statistically insignificant! Three tickets are no good when the chances of hitting the jackpot are 14 million to 1. But with a situation like the St. Petersburg game, or any infinite-EV scenario, no finite number of trials will ever be statistically significant. <6> Although I cannot simulate the craps-distribution version of the envelope paradox in total, I still could simulate a tiny cross-section of it. For example, only analyze cases where you open an envelope to find $16 inside. If we did this, I am quite positive that we would learn that switching is advantageous. Likewise, we could do the same experiment with $8, or $32, or $65,536, or any number you can name, and we'd get the same result: switching is good! So we are still left with the bothersome core of the paradox: that regardless of what sum you find in your envelope, it's to your advantage to trade it for the other envelope. Since this is true for "whatever sum you find", it stands to reason that you should even want to switch without looking. This, of course, is absurd. <7> So where does this leave us? Is math flawed? Well, maybe it is, in a way. The decision-making ability of math seems to break down in scenarios where infinite EV is introduced. But we must temper this with the realization that infinite-EV situations have no meaning in the "real world", where monetary supplies are finite, and the universe will die a heat death before any infinite series can be totaled. |
|