< Earlier Kibitzing · PAGE 45 OF 58 ·
Later Kibitzing> |
Jul-21-08
 | | rinus: <Sneaky> By picking 1 envelope you get information! What information??? - There is or there is not $64 in it. The same thing goes for $1! |
|
Jul-21-08 | | ganstaman: <Sneaky> Your modified version of the problem shows exactly why you shouldn't switch. Perform an EV calculation over the whole range of envelopes where you always switch and compare it to never switching. You'll find that the value gained by switching when your envelope contains 1, 2, 4, 8, 16, or 32 is exactly cancelled by the value you lose by switching when your envelope contains 64. Therefore, if you have no idea what the range of envelopes is and cannot predict it to any degree, then the overall EV for switching is the same as the EV for not switching. That is, you will often gain some money by switching, but the times that you are at the top of the range of envelopes, you will lose a lot, and this keeps your EV at nothing. I would consult the links I provided some time ago, as they basically stated this, likely more clearly. |
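ganstaman's cancellation claim can be checked exactly. Below is a quick sketch (my own illustration, not code from the thread) that enumerates the twelve equally likely (die roll, first pick) outcomes of Sneaky's casino game and tallies the expected gain from switching, broken down by the amount held:

```python
from fractions import Fraction as F

# Die roll k stuffs the envelopes with (2**(k-1), 2**k), k = 1..6.
pairs = [(2**(k - 1), 2**k) for k in range(1, 7)]

# Expected gain of "always switch" over "never switch", broken down
# by the amount you are holding; each (roll, pick) outcome has prob 1/12.
gain_by_held = {}
for lo, hi in pairs:
    for held, other in ((lo, hi), (hi, lo)):
        gain_by_held[held] = gain_by_held.get(held, F(0)) + F(1, 12) * (other - held)

print(gain_by_held)                # positive at $1..$32, -8/3 at $64
print(sum(gain_by_held.values()))  # 0
```

The gains +1/12, +1/12, +2/12, +4/12, +8/12 and +16/12 at $1 through $32 are wiped out exactly by the -32/12 at $64.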
|
Jul-22-08 | | sentriclecub: <Let's say that the casino version of the game works like this. The player pays some sum of money for the right to play the game. Then the croupier rolls a 6 sided die in secret. The croupier then consults a chart which tells him what sums to put in the envelopes: 1 = (1,2)
2 = (2,4)
3 = (4,8)
4 = (8,16)
5 = (16,32)
6 = (32,64) >
Very creative, let me think about this one. First of all, I assume that the casino always has a slight edge <even if> the patron uses optimal strategy... but it is a very interesting reapplication of what we were talking about. You have done a great job of addressing the source of the paradox. The EV of "switch unless you open $64" is, as die roll / payout:
6 / 64
5 / 24
4 / 12
3 / 6
2 / 3
1 / 1.5
mean / $18.42
The "fair price" to charge for the game, assuming it isn't a crooked die, is $18.42 |
|
Jul-22-08 | | sentriclecub: $15.75 if you open whatever you have and keep it, and $15.75 if you open and are forced to switch (even if you open $64!) |
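Both $15.75 figures, and the $18.42 above, can be confirmed by exact enumeration rather than simulation. A minimal sketch (assuming a fair die and a uniformly random first pick):

```python
from fractions import Fraction as F

pairs = [(2**(k - 1), 2**k) for k in range(1, 7)]  # die roll k -> (2^(k-1), 2^k)

def ev(switch_rule):
    """Exact average payout over the 12 equally likely (roll, pick) outcomes."""
    total = F(0)
    for lo, hi in pairs:
        for held, other in ((lo, hi), (hi, lo)):
            total += F(1, 12) * (other if switch_rule(held) else held)
    return total

print(ev(lambda x: False))    # never switch     -> 63/4 = 15.75
print(ev(lambda x: True))     # always switch    -> 63/4 = 15.75
print(ev(lambda x: x != 64))  # switch unless 64 -> 221/12, about 18.42
```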
|
Jul-22-08
 | | Sneaky: I was about to say "Aha! But if you say that the $64 envelope is such a big factor, then we can wipe that factor out and turn it into a non-issue." And how do we do that?
Well, my first idea was simply to roll a bigger die than a 6-sider: roll a 20-sided die, or generate a random number from 1 to 1000. That way the boundary condition will only occur very rarely. e.g.
1 = (1,2)
2 = (2,4)
3 = (4,8)
. . . 1000 = (2^999,2^1000)
PROBLEM: Even though the grand prize occurs only 1 time in 1000, it's still the BIGGEST prize, by a factor of two, so it will greatly affect the EV even though it only happens once in a thousand times. So this method of crystallizing the paradox is a failure. I thought it was worth mentioning, though. |
|
Jul-22-08 | | sentriclecub: Lastly, as a tangent thought. Strictly answering the problem you presented: if the casino tells you how much they charge you to play the game (assuming they don't ask you to pay an unknown amount), obviously you could infer some information about the amounts just from how much they charge you. Assuming the casino takes at most a 20% theoretical profit off the game, then you ALWAYS switch if you open a number less than you paid to play, and ALWAYS keep the envelope if it is 100x more than you paid. Obviously, you can't make a certain judgment if you opened 1.5x whatever you paid in, which is close to the situation of opening the max payout (the one time you guarantee negative EV by switching). Actually, the casino could charge the "profit-free price", since if they deny you knowledge (i.e. "hand us your credit card and we'll deduct a variable amount"), they will make EV profit in the sense that only they know what optimal play is; anyone who plays the game will do so suboptimally, so they can make profit simply because no one else knows the optimal play. Any envelope game is about how to infer information about the range of values which have a higher probability of being the "doom envelope" (the only envelope where you can't double). If you get any information which allows you to assign even the slightest probability to (1) what range of amounts the doom envelope is likely to take, or (2) given the amount you opened, a non-zero certainty of it being the doom envelope, then you can make a slight +EV by switching/keeping accordingly. |
|
Jul-22-08
 | | Sneaky: If you were to play my casino version, you could make a table and keep track of your average profit depending on what the initial envelope contains. Take my casino game and instead of analyzing the EV of every possible round, from the $1 pick to the $64 pick, let's just analyze a subset: the subset where the player opens an envelope with exactly $16 in it. Now we know that in 50% of these cases the dealer has rolled a 4, and in the other 50% of these cases the dealer has rolled a 5. Does it not stand to reason that the player who keeps the $16 envelope every single time will fare worse than the player who switches the $16 envelope every time? Let's see. EV for keeping the $16 envelope = $16. (duh)
EV for switching the $16 envelope = (1/2)*8 + (1/2)*32 = $20. Ergo, the player who switches every time will profit an average of $20 every time he picks a $16 envelope. This means more profit than if you simply keep the $16 envelope every time. Sorry if I'm being dense, but please explain to me again why this is false. |
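The conditional claim holds up numerically. A seeded Monte Carlo sketch (my own illustration) that looks only at the rounds where the opened envelope holds $16:

```python
import random

random.seed(1)
pairs = [(2**(k - 1), 2**k) for k in range(1, 7)]  # die roll k -> (2^(k-1), 2^k)

keep_total = switch_total = n = 0
for _ in range(1_000_000):
    lo, hi = random.choice(pairs)             # croupier's roll
    held, other = random.sample([lo, hi], 2)  # player's blind pick
    if held == 16:                            # study only the $16 pickups
        n += 1
        keep_total += held
        switch_total += other

print(keep_total / n)    # 16.0, of course
print(switch_total / n)  # close to 20 = (1/2)*8 + (1/2)*32
```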
|
Jul-22-08 | | sentriclecub: Re: your recent post
what about generating a random number between 1 and infinity? e.g.
1 = (1,2)
2 = (2,4)
3 = (4,8)
. . . n = (2^(n-1), 2^n), with no upper bound on n
Also, don't make it a "fair die" but a crooked die, where the probability of it landing on X follows a normal distribution (Gaussian curve). That way you can use calculus to determine the probability of it landing on any number (any number! no matter how high!), so that every number has a definite probability, i.e. the area under the curve between (x) and (x+1). This lets in ANY number, no matter how high (a number can't be "beyond the range", since the probability of successive integers (or irrational numbers) diminishes at a diminishing rate but never reaches zero). |
|
Jul-22-08 | | sentriclecub: <EV for keeping the $16 envelope = $16. (duh) EV for switching the $16 envelope = (1/2)*8 + (1/2)*32 = $20. Ergo, the player who switches every time will profit an average of $20 every time he picks a $16 envelope. This means more profit than if you simply keep the $16 envelope every time. Sorry if I'm being dense, but please explain to me again why this is false.> Where specifically (in my posts) are you referring to? |
|
Jul-22-08 | | ganstaman: <Sneaky: If you were to play my casino version, you could make a table and keep track of your average profit depending on what the initial envelope contains. Take my casino game and instead of analyzing the EV of every possible round, from the $1 pick to the $64 pick, let's just analyze a subset: the subset where the player opens an envelope with exactly $16 in it. Now we know that in 50% of these cases the dealer has rolled a 4, and in the other 50% of these cases the dealer has rolled a 5. Does it not stand to reason that the player who keeps the $16 envelope every single time will fare worse than the player who switches the $16 envelope every time? Let's see. EV for keeping the $16 envelope = $16. (duh)
EV for switching the $16 envelope = (1/2)*8 + (1/2)*32 = $20. Ergo, the player who switches every time will profit an average of $20 every time he picks a $16 envelope. This means more profit than if you simply keep the $16 envelope every time. Sorry if I'm being dense, but please explain to me again why this is false.> This isn't false. It is correct. But it doesn't mean that switching is always profitable, as explained in my previous post. |
|
Jul-22-08 | | ganstaman: <sentriclecub: what about generating a random number between 1 and infinity? Also, don't make it a "fair die" but a crooked die, where the probability of it landing on X follows a normal distribution (Gaussian curve). That way you can use calculus to determine the probability of it landing on any number (any number! no matter how high!)> Since the upper limit is unbounded, but the lower limit is bounded, does this still work? I don't know the answer -- I've forgotten too much math. |
|
Jul-22-08
 | | Sneaky: <sentriclecub: Re: your recent post what about generating a random number between 1 and infinity? ...
Also, don't make it a "fair die" but a crooked die, where the probability of it landing on X follows a normal distribution (Gaussian curve). That way you can use calculus to determine the probability of it landing on any number (any number! no matter how high!)> I thought about this, but there is a wrinkle. Let's look again at the situation where the player opens an envelope containing $16. Again, we know that the dealer (croupier) rolled either a 4 or a 5; however, if the rolls follow a Gaussian curve then the probabilities of 4 and 5 are NOT exactly equal. We can no longer claim that it is equally likely for the other envelope to contain $32 as to contain $8; no, in fact the $8 is the more likely outcome. This still doesn't mean that the EV for switching is less than $16 (I'd have to go back to my college statistics book and refresh on the Gaussian distribution to answer that), but conceivably it could be, if the seen amount sat far enough out on the curve. <Where specifically (in my posts) are you referring to?> I wasn't referring specifically to your post. You are one of the mavericks who advocates switching, i.e. sticking to the math and ignoring what might be called "gut intuition". It's just that I feel that the attitude that "switching is profitable" leads you to a contradiction, even if you add the caveat "except when you open a $64 envelope." We run into a situation where there are two envelopes, A and B, and logic (math) dictates that if you open A, you will want to switch it for B, and if you open B, you will want to switch it for A. Suppose it wasn't you but instead the croupier who had the option of switching. Would the croupier (working in the best interests of the house) want to switch envelopes? Yes! For the exact same reason that the player wants to switch envelopes: a higher EV. But how can both players in a zero-sum game conclude that a certain move has a positive EV? |
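Sneaky's hunch ("conceivably it could be...") can be made concrete. In the sketch below, the roll weights P(roll = k) proportional to (1/3)^k are a hypothetical choice of mine, not anything proposed in the thread; the point is that whenever P(k+1) < P(k)/2, the expected value of switching on a seen non-boundary amount x falls below x:

```python
from fractions import Fraction as F

# Weighted "die": P(roll = k) proportional to (1/3)**k, k = 1..6.
# Roll k stuffs the envelopes with (2**(k-1), 2**k).
w = {k: F(1, 3**k) for k in range(1, 7)}

# You open an envelope containing x = 2**k (non-boundary, so 1 <= k <= 5):
# either roll k happened and you hold the larger amount,
# or roll k+1 happened and you hold the smaller one.
for k in range(1, 6):
    x = 2**k
    p_larger = w[k] / (w[k] + w[k + 1])  # posterior that the pair was (x/2, x)
    ev_switch = p_larger * F(x, 2) + (1 - p_larger) * 2 * x
    print(x, float(ev_switch))  # always 7x/8 here, i.e. less than x
```

With this decay every non-boundary switch is worth exactly 7x/8, so under this weighting you should never switch.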
|
Jul-22-08 | | sentriclecub: <Since the upper limit is unbounded, but the lower limit is bounded, does this still work? I don't know the answer -- I've forgotten too much math.> Well, the reason I chose a Gaussian curve instead of a flat curve (equal probability regardless of the highest # in the set) is because it has a tail, which drops off SIGNIFICANTLY according to your equation. The beauty is that I can tell my equation to drop off so that the SUM of the area to the right of $100,000,000,000,000.00 is a constant. Therefore, I can deal with infinity indirectly. I can define the curve to let .000000001% of the area be to the right of 100 trillion dollars. Therefore, I have exorbitantly diminished the tail, to any desirable level, so that my calculations focus on a "finite set of numbers"; even though this set includes ∞, I have eliminated virtually all of its "weight" in the weighted average, because I told the "tail" to only have .000000001% of the total area under the curve. This way...
actually, I'd have to explain my solution to the problem to get you to understand how I mitigated the effect of infinity... here goes. The +EV gets distorted because the regions get divided thinner as the value of the regions gets taller, so the "area" of the tail never gets resolved. The poker article explained that the sum of all +EV switchings gets wiped out by the lone occurrences of holding the "doom envelope". If you assign exorbitantly diminishing marginal probabilities to the values of X that you want to define as "in the tail", then you can stipulate that the area of the tail is exorbitantly small compared to the range you want to focus on. Here's a situation that occurs in business finance. (1 + 1/x) ^ x
Here are some sample values of x
0.00 -> 1
0.25 -> 1.49
0.50 -> 1.73
0.75 -> 1.89
1 -> 2
2 -> 2.25
4 -> 2.44
10 -> 2.59
100 -> 2.70
No matter how large you choose x, you can always make it bigger by substituting x+1. Both the input set and the output set have infinitely many terms. This limit exists, fortunately. But if the limit doesn't exist, you can mitigate the high values of x by stipulating that "x-values higher than C1 shall lie in the tail of the curve, and their total area under the curve is .0000000000001% of the total area under the curve". This method of shrinking the output more than wipes out the effects of ∞. "Shrinking the tail" is one method for mitigating the effects of outliers. Shrinking the tail is like adding a clause: "now keep the general shape of the trend, but flatten it so much that you can choose how much effect these outliers will have on the main focus of the results". <Sneaky>
This calculus stuff is only to get around the "unattainable premise" for the version of the problem we talked about 3 days ago. To answer your question about player vs the house...
Let's say the die rolls 3 and neither side knows it. The house gets $4 and the customer gets $8. They will both want to switch, because the house's EV is +$1 and the customer's EV is +$2. They both want to switch because it is not a zero-sum game. One person's loss is NOT the other person's gain. You could think that if one envelope has $8 and one envelope has $4, then the person with $4 gains +4 and the person with $8 gains -4. Not the case, because the person holding $4 (expecting $5, the weighted average of the two conditional probabilities) has a gain of +3 (expecting $5 and getting $8), and the house's loss is -6 (expecting $10 and getting $4). Let's now deal the house and the player envelopes according to a die roll of 4. $8 in one envelope
$16 in the other
The house holding $8 is expecting $10
The person holding $16 is expecting $20
The patron will "lose" 12 dollars
The house will gain 6 dollars
Notice how the $8 envelope shows up in both examples.
Figure this out on paper, and you'll see a new property of this paradox. Now imagine the house opens $64 and the house knows all the information about the game. What is the player's expected value? It is +8 dollars. However, he isn't allowed to switch because the house won't switch; thus he "loses" 8 dollars and has to keep $32. But aha! What if the house opens $16 and he opens $32? Since he knows the house won't be willing to switch if the die rolled 6, he will avoid the loss; thus he "expects $32", since he hypothesized how a 6-roll would play out. So now, whether he holds $64, $32, or $1, he'll know the outcomes. Now the only envelopes he'll switch on are $1, $2, $4, $8, and $16. These are the set of envelopes where you gain +EV by switching. Thus he'll never run into the problem of switching for a lower-EV envelope. Two new things to think about. I'm going to sleep, be back tomorrow, hope to see what you guys come up with! |
|
Jul-22-08
 | | Sneaky: <You could think that if one envelope has $8 and one envelope has $4, then the person with $4 gains +4 and the person with $8 gains -4.> Exactly how I did think about it, and since (+4) + (-4) = 0, that made me say "it's a zero-sum game." For simplicity, imagine that a casino was allowing people to play the game for free, one time, if they rent a room at their hotel. The hotel knows they will be giving away money, and of course they want to minimize their losses with this promotion. I mention this specifically to obliterate the arguments about "switch if your envelope is less than the entry fee." <Not the case because the person holding $4 (expecting $5, the weighted average of the two conditional probabilities) has a gain of +3 (expecting $5 and getting $8), and the house's loss is -6 (expecting $10 and getting $4).> Now I'm confused. You're comparing the player's real earnings (he walks away with $8 in his pocket) against his EV to conclude that he only won 3. Likewise, the house, which really did lose $8, somehow should be chalked up at minus 6? Is this normal procedure? It seems we have a different definition of "zero sum". I'll try to digest and respond to your other comments later. |
|
Jul-22-08
 | | Sneaky: Also: I speculate that if I write a computer program to play the casino version of the game, it will NOT perform better with an 'always switch unless I picked $64' strategy. It will be the same either way. I know the math doesn't justify what I'm saying but my guts do. If I write the program and am proven wrong then I'll be the first to admit it. And continue scratching my head. |
|
Jul-22-08 | | Ziggurat: <Also: I speculate that if I write a computer program to play the casino version of the game, it will NOT perform better with an 'always switch unless I picked $64' strategy.> Really? I would be truly flabbergasted if it did not perform better. I guess we have different gut feelings here. (BTW, I have been lurking here for the whole envelope debate, but haven't felt like I was able to formulate any better arguments than some of those already made.) |
|
Jul-22-08
 | | Sneaky: Wait a minute, I rescind my previous statement. Of course the "always switch unless I pick $64" will outperform the "never switch" strategy. If you "always switch unless I pick $64" then you'll get the $64 prize 1 time in 6. If you "never switch" you'll get the $64 prize 1 time in 12. And we know that your performance in this game will be strongly related to the frequency with which you hit the top prize. But the program might still be worth writing, just to analyze the specific behavior at a middlish figure, like the $16 envelope. Since the switching strategy is really just a bid to get your hands on the top prize, it would seem that looking at the middle numbers would not be affected. This is frustrating, like the lump in my mattress: every time I try to flatten it out, it just reappears somewhere else. |
|
Jul-22-08
 | | Sneaky: <Here's a situation that occurs in business finance. (1 + 1/x) ^ x >
Oooh! I know that equation. That's like saying, "If you put 1 dollar in the bank today and receive 100% interest compounded yearly, you'll end up with 2 dollars. If your bank compounds the money semiannually you'll get 1.5*1.5 = 2.25 dollars. If your bank compounds it quarterly you'll get 1.25*1.25*1.25*1.25 = 2.44. So what do you end up with if your bank compounds the interest "nanosecondly" ? Answer = e dollars (2.718281828...)" |
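A quick numerical check of the compounding story:

```python
import math

# Compounding $1 at 100% nominal interest, n times per period: (1 + 1/n)**n.
for n in (1, 2, 4, 365, 10**6):
    print(n, (1 + 1 / n) ** n)

print(math.e)  # the limit: 2.718281828...
```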
|
Jul-22-08 | | sentriclecub: Also, one very important thought is this one. Let's consider the set of all numbers, both rational and irrational. Every number can be represented as a constant times 2^N. $8,374,120.50 is a number just shy of 2 raised to the 23rd power. In fact, it is
exactly 2^22.99750625 (equivalently, 2^23 is 1.001730032 times bigger),
and if you made a mission to conquer logarithms, this translates to a base of .998272955 on the 2^x exponential growth, i.e. .998272955 * 2^23, which corresponds neatly into dice stuff
<And how do we do that?
Well my first idea was simply to roll a bigger die than a 6-sider. Roll a 20 sided die, or generate random number from 1 to 1000. That way the boundary condition will only occur very rarely. e.g.
1 = (1,2)
2 = (2,4)
3 = (4,8) >
e.g.
1 = (.998272955, 1.996545911)
2 = (1.996545911, 3.993091822)
3 = (3.993091822, 7.986183643)
4 = (7.986183643, <.998272955*2^x>)
...
...
...
23 = (4187060.25, 8374120.5)
So if you open an envelope with $8,374,120.50, that means the die landed on either 23 or 24. This shows how all numbers can be broken down as a function of a die roll and a variable base close to 1, times 2^x. In other words, all numbers can be constructed as a base number multiplied by 2^x (where x is the number the die rolls), and the "doom envelope" is handed out to one person when the infinitely-sided die lands on the highest roll possible. This last post probably indicates how much time I put into solving the problem to the limit my math skills allowed. First you roll an infinitely-sided die whose values are between .5 and 1 and include <all irrational numbers>, and the second die roll is a multiplication factor 2^x, where <x> can be any positive integer <1, 2, 3, 4, etc...>; thus the doomsday envelope is given out when the <second die> reaches its maximum value. The first die isn't weighted evenly, because if it were, then the function F(die 2, die 1) wouldn't return a randomly generated number between 1 and ∞. The weighting of the first die depends on the roll of the second die. If the second die rolls 23, then the first die must only fill in the gaps between <all real numbers> without overlapping numbers generated by a 24 on the second die, or a 22. The output should thus "fill the positive real number line" smoothly. A flat curve has height 1/(b-a), where (b-a) is the range, such as 1 to ∞ or 0 to ∞; thus the area of the curve is a rectangle of height 1/(b-a) and width (b-a), so the area is (b-a)/(b-a) = 1. Now your second die has the range 1 to N
N solves the equation log(base 2) of b
b is the highest value in the range
it is when BOTH dice roll their maximum value.
There is a special number for when the 2nd die rolls the highest value and die 1 rolls the lowest value. Let's call that term the "minimum doomsday amount". Now continue this line of thinking and consider the 2nd-highest roll on die 2 and the highest roll on die 1. This number is the highest non-doomsday amount. As a matter of fact, it is 1/2 of the highest doomsday amount, and is virtually equal to the "minimum doomsday amount". They are, in fact, successive values on the real number line. Once you construct the output as a function of a 2-variable equation F(die 2, die 1) to give an output of continuous, equally probable real numbers, then the 2nd die can be the only one to study. If anybody still follows, I'll write more, but it's extremely tough not to make mistakes in explaining the super-fine details of defining how to generate random numbers between 0 and ∞. Basically, if you understand
<$8,374,120.50 is a number just shy of 2 raised to the 23rd power.> why the second die couldn't have rolled 22 but with base 1.996545911, and couldn't have rolled 24 with base .499136478, and why the first die must be weighted so that every number can be fairly selected using a base (determined by die 1) and a power of 2 (determined by die 2), such that the outcome <F(die 1, die 2)> is a completely fair number picked at random between 1 and ∞. This lets us separate ∞ into a manageable range. If anybody follows but isn't fully clear, let me know, since these details are super wordy, and I'll proceed. Basically the thing to realize is that there are a finite number of events in which a "doomsday envelope" is assigned to one of the participants. |
|
Jul-22-08 | | sentriclecub: The # of doomsday envelopes is set when the 1st die lands on its highest value; then the number of faces on the second die = the # of doomsday envelopes that exist, considering all numbers near ∞. The probability of triggering a doomsday envelope is 1 divided by the # of faces on die 1. I may have made a mistake and been inconsistent about what die 1 and die 2 represent. Die 1 is an infinitely-sided die of integers only; this is used in the formula 2^x. The second die is between .50 and 1.0, such that F(die 1, die 2) has an output of continuous real numbers, all equally likely to be generated, by rolling die 1 first, and then die 2. |
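As an aside, the "base in [.5, 1) times a power of two" decomposition sketched in the last two posts is exactly how binary floating point stores numbers, and Python's math.frexp recovers it; shown here for the $8,374,120.50 example (my illustration, not from the thread):

```python
import math

x = 8374120.50
m, e = math.frexp(x)  # x == m * 2**e with 0.5 <= m < 1
print(m, e)           # 0.998272955... 23
print(m * 2**e == x)  # True
```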
|
Jul-22-08 | | sentriclecub: <Ziggurat: <Also: I speculate that if I write a computer program to play the casino version of the game, it will NOT perform better with an 'always switch unless I picked $64' strategy.> Really? I would be truly flabbergasted if it did not perform better. I guess we have different gut feelings here. (BTW, I have been lurking here for the whole envelope debate, but haven't felt like I was able to formulate any better arguments than some of those already made.)> Feel free to participate. I value all constructive input, even if you contribute a question. The whole debate is about math vs. intuition, and whether this is or isn't a paradox. And if it isn't a paradox (which is only a perspective error, since it can't actually be a paradox), then what are its implications? What are the most interesting implications?
Which implications seemingly contradict "gut feeling" the strongest? |
|
Jul-22-08
 | | Sneaky: <sentri> It will take me a few reads to catch up to what you're saying but I'm confident I can. Just going through what you posted once I see a lot of the ideas that have been in the back of my mind, about creating a long sloping distribution and cutting off the tiny tail so that there are true boundaries on the envelope contents. I've probably taken as much math as you have in college, but it's been so long since I've given it a workout that you have the edge here. <meanwhile...> I went ahead and wrote my "Casino envelope-game simulator." Here are the preliminary results. To reiterate, this program simulates a game wherein the house randomly stuffs two envelopes with values determined by a 6 sided die: 1=(1,2)
2=(2,4)
3=(4,8)
4=(8,16)
5=(16,32)
6=(32,64)
The player chooses an envelope, peeks inside of it, then gets the chance to switch for the other envelope or keep the contents of what he's holding. First I did 10 million trials using the "never switch strategy", and the average income per round was: 15.7475914 (Very close to the predicted $15.75 mentioned earlier, so this gives me a good feeling that I didn't write a buggy program.) Next I did 10 million trials using the "always switch strategy", mind you here we even switch (dumbly) if we open a $64 envelope. The average income per round = 15.7501645. Identical to the above within the expected deviation. So we know that "ALWAYS SWITCH" and "NEVER SWITCH" fare the same. This makes sense, since they both have their drawbacks. "Never switch" stupidly keeps the $X when it could receive an EV greater than $X, and "Always switch" stupidly switches the $64 for a $32 every single time. We may call these strategies "stupid" because they both make manifest strategy errors, but if the game was designed so that you are not allowed to peek in the envelopes they would both be equally valid strategies. Now here's the $64 thousand dollar question (or should I say the $64 dollar question? har har) What happens with the strategy "Always switch, unless you opened up the $64 envelope?" After 10 million trials I get a result = 18.4285629, almost exactly equal to the anticipated $18.42 mentioned previously. Just for the sake of completeness, let's figure out what strategy would be the absolute worst for the player. The player who abhors money and wants to minimize his income would never switch at all (for that would only increase EV) except when he picks the $64, in which case he'd switch it for the $32. After 10 million trials using the "idiot strategy" I get a mean income of 13.0874898. (It would be easy to do the math to verify this result, but since the first three were verified already I'm willing to take this one at face value.) 
I'm starting to see that there is no paradox in the casino version of the game that I presented. I'm not quite there yet, but getting there. |
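For readers who want to reproduce these results, here is a minimal Python version of such a simulator (a sketch, not Sneaky's actual program; a strategy is a rule mapping the opened amount to switch or keep):

```python
import random

random.seed(42)
pairs = [(2**(k - 1), 2**k) for k in range(1, 7)]  # die roll k -> (2^(k-1), 2^k)

def simulate(switch_rule, trials=500_000):
    """Average payout when the player switches iff switch_rule(opened_amount)."""
    total = 0
    for _ in range(trials):
        lo, hi = random.choice(pairs)             # croupier's roll
        held, other = random.sample([lo, hi], 2)  # player's blind pick
        total += other if switch_rule(held) else held
    return total / trials

print(simulate(lambda x: False))    # never switch       -> about 15.75
print(simulate(lambda x: True))     # always switch      -> about 15.75
print(simulate(lambda x: x != 64))  # switch unless $64  -> about 18.42
print(simulate(lambda x: x == 64))  # "idiot" strategy   -> about 13.08
```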
|
Jul-22-08 | | sentriclecub: Run a "switch always except on $16" |
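That one doesn't even need a simulation run; exact enumeration over the same casino setup gives it directly (a sketch of mine):

```python
from fractions import Fraction as F

pairs = [(2**(k - 1), 2**k) for k in range(1, 7)]  # die roll k -> (2^(k-1), 2^k)

total = F(0)
for lo, hi in pairs:
    for held, other in ((lo, hi), (hi, lo)):  # 12 equally likely outcomes
        total += F(1, 12) * (held if held == 16 else other)

print(total, float(total))  # 181/12, about 15.08
```

Interestingly, 181/12 (about $15.08) is worse than both never switching and always switching ($15.75), because the $16 envelope is itself one of the +EV switching spots.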
|
Jul-22-08 | | f54280: Sorry if everybody already agreed on that paradox solution (I didn't read all the pages since the original paradox). I happen to have a math background and worked a bit on that paradox a few years ago, so I can't resist posting here... The paradox was: <If you are holding an envelope containing X dollars, and the other envelope contains either double or half of X, then the expectation value of switching envelopes is (0.5)*X*2 + (0.5)*X/2 = X*1.25
Ergo, it's 25% more profitable to switch.>
It is an extremely interesting and difficult paradox. It is, of course, not profitable to switch, but it is hard to see why, because the problem definition is incorrect. When one says "the envelope contains X dollars", our mind assumes a uniform distribution (i.e. that $100 is as likely as $200). But it is impossible to have an unbounded uniform distribution (where $100 is as likely as $3.14159 trillion), hence the paradoxical result. You have to draw the X dollars from some distribution, and as soon as you do that, the probability that the other envelope contains 2*X is dependent on the value of X [think about it: if, for instance, the probability that X is big gets lower and lower, then, when you find an envelope with X, it is more probably an instance of (X/2, X) than an instance of (X, X*2)], so you cannot simply use (0.5)*X*2 + (0.5)*X/2. Now, you don't need to use that X in the reasoning; just use the mean of the distribution of the dollars in the smaller envelope (which we will call E). 1/ How much will you get if you stick to your envelope? You had a 50% chance of picking either one, so it is: 0.5*E + 0.5*2*E = 1.5E. 2/ How much will you get if you swap envelopes? Well, 0.5*2*E + 0.5*E = 1.5E. Hence, no paradox anymore.
If one wants to deal with the problem the way the original problem did (by picking an X and finding probabilities from that), it is much more difficult, and you would need calculus for a rigorous general approach (and I don't feel like writing integrals in ASCII, unless someone really really wants to see me do it :-) ). Anyway, if we stick to a simple distribution, we can play the game with numbers: let's say that the smaller envelope contains between $0 and $1000, with a uniform distribution (hence, the bigger envelope contains between $0 and $2000). If you are holding an envelope that contains X dollars, how much will you get if you switch? If X is bigger than $1000 (as in 25% of the cases), you will get X/2. As the expectation for X is $1500, on average you'll get 1500/2 = $750.
=> In 25% of the cases, you get $750
If X is smaller than $1000 (as in 75% of the cases), the probability that you have the smaller envelope is 2/3, so you'll get 2/3*X*2 + 1/3*X/2 = 1.5X, with X being $500 on average. So, on average, you also get $750.
=> In the other 75%, you get $750
So, if you switch, you get $750
How much would you have if you didn't switch? Well, if you were stuck with the small envelope, you would have an expectation of $500, and with the bigger one, $1000; so, on average, you get $750 if you don't switch. No paradox. English is not my first language, so if anything is unclear, just ask for clarifications. |
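f54280's worked numbers are easy to spot-check with a seeded simulation (smaller envelope uniform on $0 to $1000):

```python
import random

random.seed(7)
trials = 1_000_000
keep = swap = 0.0
for _ in range(trials):
    small = random.uniform(0, 1000)                     # smaller envelope ~ U(0, 1000)
    held, other = random.sample([small, 2 * small], 2)  # blind pick
    keep += held
    swap += other

print(keep / trials)  # close to 750
print(swap / trials)  # close to 750
```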
|
Jul-22-08
 | | Sneaky: <f54280> English may not be your first language, but you expressed the ideas behind the paradox as clearly as any of us have so far. What I'm still muddy on is whether or not it is possible to construct a distribution where, as you say, it's no longer simply EV = (1/2)(X/2) + (1/2)(2X), but you STILL retain an EV for switching greater than X. If you can do that, then the paradox remains, even though the distribution is uneven. Since I haven't proposed such a distribution (my "roll a die" method proved to be a failure upon analysis), I can't put my finger on the paradox. Maybe what I will learn is that it's impossible to construct such a distribution whose total probability converges to 1, like all good probability distributions do. |
|