Jul-17-08 | | YouRang: <rinus> Interestingly enough: User: cantor And even his pal,
User: hilbert |
|
Jul-17-08
 | | rinus: <YouRang>
<It would be nice to have a finite domain, but can we really? We want to select a random number X from some finite domain, and then have 2X also be in that same finite domain, with the same probability distribution for each value. It can't be done.
Suppose our domain is integers from a low value of L to a high value of H. If we pick X=H (or any value above [(L+H)/2]), then 2X > H. Also, for any value of X we pick, the value 2X excludes all odd numbers. If the user opens an envelope and sees the number 23 (say), they know for sure that the other envelope must have 46.> Finite domain of reals; 0 <= 2X <= H
Pick at random 2X; now X has a value too; I don't see any problem. |
|
Jul-17-08 | | YouRang: <rinus> ♔ Finite domain of reals; 0 <= 2X <= H ♔ But if they're reals, it's not finite. There are infinitely many real numbers between 0 and H (assuming H is positive). |
|
Jul-17-08
 | | rinus: <YouRang> The domain is finite (bounded), not the number of reals. |
|
Jul-17-08 | | YouRang: <rinus> It's bounded, but the fact that it's an infinite set leaves the basic problem unresolved. You still can't pick a number at random from this set, with equal probability for every number. BTW, another problem is that 2X should be a member of the same set that X is in. After all, you should be able to look at one of the numbers and not know if the other number is half that or double. In your case, if the number we see is greater than H/2, we know it must be the 2X value, and that the other number MUST be half of that. |
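As a rough illustration of that point (my own sketch, not from the thread; the bounds L and H and the uniform choice of the smaller amount are assumptions), here is a quick check of how often the opened amount alone gives the other envelope away in a bounded integer version of the game:

```python
import random

# Toy bounded game (assumed setup): the smaller amount X is a uniform
# integer in [L, H//2], so that 2X still fits below H.
L, H = 1, 100

def open_one_envelope():
    x = random.randint(L, H // 2)       # the smaller amount
    return random.choice([x, 2 * x])    # the amount we happen to open

trials = 100_000
revealed = 0
for _ in range(trials):
    a = open_one_envelope()
    # If a is odd it cannot be 2X; if a > H//2 it cannot be X.
    # Either way, the contents of the other envelope are fully determined.
    if a % 2 == 1 or a > H // 2:
        revealed += 1

print(f"opened amount gives the other envelope away in {revealed/trials:.1%} of games")
```

In this setup the opened amount settles the question about half the time, so the clean "50/50 double or halve" premise cannot hold for every observation.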
|
Jul-17-08 | | hms123: <yourang> cantor and hilbert haven't been that active!--hms |
|
Jul-17-08 | | hms123: <yourang> <rinus> these infinities are tricky business. Ordinary logic doesn't work at all. I think if an infinite set has an upper bound then it also has a mean (and thus an EV). I was raised on George Gamow's book "one, two, three...infinity" (1947)--very strange world--classes of infinities and all that. |
|
Jul-17-08 | | YouRang: <hms123: <yourang> cantor and hilbert haven't been that active!--> Yes. Unfortunate that. :-(
Then again, considering that they've both been dead for many decades, one must be rather impressed that they even registered at all. :-) |
|
Jul-17-08 | | sentriclecub: You guys have made extreme progress at resolving this. <BTW, another problem is that 2X should be a member of the same set that X is in. After all, you should be able to look at one of the numbers and not know if the other number is half that or double.> That is the key idea in putting this thing to bed. If you hold an envelope, and there is NOT 50% probability that the other envelope is 2x and 50% probability that the other is x/2, then you do not gain +.25 EV by switching. The one time you hold the highest envelope in the set, your negative EV by switching wipes out ALL of the sums of .25 EV's that you stand to make by switching smaller elements. The poker analogy was vital to showing that. Your sums of EV's at smaller envelopes get wiped out the one lone time you hold the highest envelope, when you have 0% probability of doubling. This is the only way to show the EV example of the x, 1000x, x/1000 case. In the Saudi royal prince example, if you open $40 and have 50% chance of going to $40,000 and 50% chance of going to 4 pennies, you are correct not to switch, because the EV you lose when you're holding the highest envelope is precisely equal to the sum of ALL the +EV gains you make by correctly switching at lower envelopes. Olbers' paradox reads as something unrelated but interesting. Here, the paradox lies in the fact that the premise is unreachable. The idea is so simple, to be given an envelope with a 50/50 chance to double or halve--yet there is no possible way to set up a game that meets those criteria. http://en.wikipedia.org/wiki/St._Pe...
<In a game of chance, you pay a fixed fee to enter, and then a fair coin will be tossed repeatedly until a tail first appears, ending the game. The pot starts at 1 dollar and is doubled every time a head appears. You win whatever is in the pot after the game ends. Thus you win 1 dollar if a tail appears on the first toss, 2 dollars if on the second, 4 dollars if on the third, 8 dollars if on the fourth, etc. In short, you win 2^(k-1) dollars if the coin is tossed k times until the first tail appears.> The problem is that if you do conditionally accept the premise, then YES, you do gain by switching (interestingly though, you gain by looking first, then switching!). If you debate the premise, then you have debated the correct part of the paradox. The result is not debatable by itself. The full list of "answers" is indeed accurate at the wikipedia article. http://en.wikipedia.org/wiki/Two_en...
Please go on to read it as it is presented; it "solves" the simplest case, then asks "but what if you don't inspect one envelope, is it still worth it to switch?" and then solves incrementally more challenging variations of the paradox. Soon, though, the paradoxes become too overtly mathematical and are no longer enjoyable, but the first 3-4 are great. |
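The St. Petersburg game quoted above is easy to simulate; the sketch below (mine, not from the thread) pays out 2^(k-1) dollars after k tosses and shows why the sample average refuses to settle down: the theoretical expectation is a divergent sum of halves.

```python
import random

def petersburg_round():
    """Flip a fair coin until the first tail; pay 2^(k-1) dollars for k flips."""
    pot = 1
    while random.random() < 0.5:    # heads: double the pot and flip again
        pot *= 2
    return pot

trials = 1_000_000
average = sum(petersburg_round() for _ in range(trials)) / trials
print(f"average payout over {trials:,} rounds: {average:.2f} dollars")
# Theoretical EV = sum over k of (1/2)^k * 2^(k-1) = 1/2 + 1/2 + 1/2 + ...,
# which diverges, so the printed average keeps drifting upward as trials grow.
```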
|
Jul-17-08 | | sentriclecub: Also, my posts have been "role playing", trying to stir up excitement and enthusiasm. The only way to want to learn more is to have your interest piqued. The conclusion at the poker article is invaluable for poker players and game theorists. The conclusion here for chess is that if intuition has ever contradicted math, then you're safe to question your intuition about all parts of life. |
|
Jul-17-08 | | ganstaman: Yeah, this has all been fun, but I feel that wikipedia goes over it all rather well and leaves me, at least, with no questions. |
|
Jul-18-08 | | YouRang: <Sneaky> Thanks for letting us pollute your forum. :-) It's nice to have that paradox finally solved.
...or IS IT?! :-o |
|
Jul-18-08 | | YouRang: Hey, I just noticed: That last post above was #6000! :-D |
|
Jul-18-08
 | | Sneaky: <ganstaman: I feel that wikipedia goes over it all rather well and leaves me, at least, with no questions.> I like their article too but I certainly don't think it resolves the paradox. After all, it begins by stating: <The two envelopes problem is a puzzle or paradox within the subjectivistic interpretation of probability theory ... This is still an open problem among the subjectivists as NO CONSENSUS HAS BEEN REACHED YET.> (my emphasis) |
|
Jul-18-08
 | | rinus: <YouRang: <rinus> It's bounded, but the fact that it's an infinite set leaves the basic problem unresolved.> Due to the discretization into dollar amounts, even the resulting set is finite. <You still can't pick a number at random from this set, with equal probability for every number.> So the problem above is resolved.
<BTW, another problem is that 2X should be a member of the same set that X is in. After all, you should be able to look at one of the numbers and not know if the other number is half that or double.> I cannot conclude that from the initial setting of the problem. <In your case, if the number we see is greater than H/2, we know it must be the 2X value, and that the other number MUST be half of that.> I said that we could reach useful conclusions only if the domain for 2X was finite. ********
The paradoxical problems with infinite spaces were very well addressed in obtuse.doc. BTW, IF 0 < X < inf THEN my expectation for X or 2X in the initial envelope is very, very, very large too; finding 5 dollars in the first envelope would make me want the contents of the second one. |
|
Jul-18-08 | | YouRang: <rinus: <YouRang: <rinus> It's bounded, but the fact that it's an infinite set leaves the basic problem unresolved.> Due to the discretization in amount of dollars even the resulting set is finite.> Wait, earlier, you wanted to avoid the 'odd number' problem by going with 'real' numbers. Okay, that does eliminate the 'odd number' problem. Now you are limiting the set to discrete monetary amounts instead of real numbers in order to make the set finite. But in so doing, you've brought back the 'odd number' problem. :-) If I open an envelope and see $5.17, then (assuming that the discretization point is pennies) I know that the other envelope must have $10.34. |
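A tiny sketch of that point (my own illustration): once the amounts are whole numbers of pennies, any observation with an odd number of pennies settles which envelope it is.

```python
def other_envelope_known(cents: int) -> bool:
    # 2X is always an even number of pennies, so an odd observation must be X.
    return cents % 2 == 1

print(other_envelope_known(517))   # $5.17 -> True: the other envelope must hold $10.34
print(other_envelope_known(518))   # $5.18 -> False: the other could hold $2.59 or $10.36
```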
|
Jul-18-08
 | | Sneaky: I am confused by the explanation found in the Wikipedia article. Most posters here seem to agree with Wikipedia, but it's not obvious to me. With the "no peek" version of the paradox, Wikipedia says: <1. Denote by A the amount in the selected envelope...
7. So the expected value of the money in the other envelope is (1/2)*(2A) + (1/2)*(A/2) = (5/4)*A
...
The most common way to explain the paradox is to observe that A isn't a constant in the expected value calculation, step 7 above. In the first term A is the smaller amount while in the second term A is the larger amount. To mix different instances of a variable or parameter in the same formula like this shouldn't be legitimate, so step 7 is thus the proposed cause of the paradox.> The part that confuses me is the claim "to mix different instances of a variable or parameter in the same formula like this shouldn't be legitimate". Why shouldn't it be? Isn't that the normal way to compute EV? I'm thinking of various EV computations I've done or have seen, such as the EV computation which tells you not to buy insurance when playing Blackjack. In that formula you are betting $X and you have two different instances, that of the dealer having Blackjack and that of the dealer not having Blackjack, side by side in the same formula. So what? Since when is this cause for alarm? Perhaps I am being confused over the meanings of simple terms like "variable" and "constant." That sounds silly, but it's not as crystal clear as some believe. For example, I have a jar of coins on my desk, but nobody in the world knows the exact total amount. Is the amount of money in the jar a "variable" or a "constant"? I can think of good arguments either way. As an aside: am I the only one who thinks it's weird that Wikipedia used the word <shouldn't> instead of <isn't> in the part I quoted? It's as if they aren't sure of themselves. |
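For contrast with step 7, here is a rough sketch (my numbers, assuming the usual 2:1 insurance payout and an effectively infinite deck, which is a simplification) of the Blackjack insurance EV Sneaky mentions. The key difference is that the bet B means the same dollar amount in both terms, whereas in step 7 the symbol A stands for the smaller amount in one term and the larger amount in the other.

```python
# With an ace showing, roughly 4 of the 13 ranks give the dealer a ten-value
# hole card (infinite-deck approximation), so insurance wins 2:1 about 4/13
# of the time and loses the bet otherwise.
p_blackjack = 4 / 13

def insurance_ev(bet: float) -> float:
    # The same bet appears in both terms; it does not change meaning per outcome.
    return p_blackjack * (2 * bet) + (1 - p_blackjack) * (-bet)

print(insurance_ev(10))   # about -0.77 dollars: a losing side bet on average
```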
|
Jul-18-08 | | YouRang: <Sneaky> Perhaps things clear up a little if you consider two *similar* cases, and perceive the difference between them:
CASE #1 -- You hold an envelope containing A dollars. I have another envelope which contains, depending on the result of a secret coin toss, either A/2 (if heads) or 2A (if tails). You have the option to swap your envelope for mine.
CASE #2 -- There are two envelopes on a table. One of them contains an amount X and the other contains double that amount, 2X. You pick up one of them (depending on the result of a coin toss) and call its amount A, while I take the other one. Again, you have the option to swap your envelope for mine.
Do you see the difference in how A is defined in each case? In case #1, A is a constant: its value is the same regardless of my coin toss. In case #2, X is a constant, but A might be either X or 2X, depending on the coin toss. This is why A is not a constant. As they say, the A/2 term presumes that A=2X, while the 2A term presumes that A=X. It's two separate cases represented in the same formula.
<As an aside: am I the only one who thinks it's weird that Wikipedia used the word <shouldn't> instead of <isn't> in the part I quoted? It's as if they aren't sure of themselves.> I suppose that's because they are presenting different explanations of the paradox, and some explanations may disagree with others. The encyclopedia writers desire to appear as unbiased as possible, so they choose the less committal language. That's my guess anyway. :-) |
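A Monte Carlo sketch of those two cases (my own construction; the starting amounts of $100 are arbitrary) makes the difference concrete: in CASE #1 switching really is worth about 25% more, while in CASE #2 it gains nothing.

```python
import random

TRIALS = 1_000_000

# CASE 1: you hold A dollars; a coin toss decides whether my envelope holds
# A/2 or 2A. Here A genuinely is the same quantity in both branches.
A = 100.0
keep1 = A
swap1 = sum((2 * A if random.random() < 0.5 else A / 2) for _ in range(TRIALS)) / TRIALS

# CASE 2: the envelopes hold X and 2X; a coin toss decides which one you picked.
# "A" is whichever you hold, so it is not the same quantity in both branches.
X = 100.0
keep2 = swap2 = 0.0
for _ in range(TRIALS):
    mine, other = (X, 2 * X) if random.random() < 0.5 else (2 * X, X)
    keep2 += mine
    swap2 += other
keep2 /= TRIALS
swap2 /= TRIALS

print(f"case 1: keep {keep1:.2f}, swap {swap1:.2f}")   # ~100 vs ~125
print(f"case 2: keep {keep2:.2f}, swap {swap2:.2f}")   # ~150 vs ~150
```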
|
Jul-18-08
 | | rinus: <YouRang> <Now you are limiting the set to discrete monetary amounts instead of real numbers in order to make the set finite. But in so doing, you've brought back the 'odd number' problem. :-)> Take a random real ... 0 <= 2X <= H ...> Fill one envelope with Round(2X) and the other with Round(X); put this in the game conditions. |
|
Jul-18-08 | | YouRang: <rinus> Well, now there are two problems: Problem #1: <Take a random real ... (in the range from 0 to H)> You are asking me to randomly pick a number from an infinite set. There are infinitely many real numbers between 0 and H (I assume that H>0), so it cannot be done. ~~~~~~~~~~~~
Problem #2: <Fill one envelope with Round(2X) and the other with Round(X)> Firstly, assuming that Round(x) means "x rounded to the nearest integer", why not just select a random *integer* in the first place? Secondly, I gather that after rounding, 2X may be any integer from 0 to H, right? If 2X is an even number, then X is an integer between 0 and H. If 2X is an odd number, then X is an integer between 0 and H, plus one half (.50) -- for example if H is 100, and 2X is 61, then X is 30.50. This is just a variation of the odd number problem. If I open one envelope and see that it contains any value ending with .50 (like 33.50), then I KNOW that the other envelope contains 67 (since it can't be 16.75). ~~~~~~~~~~~~
There are other problems too, related to the boundaries. What if I select 2X=0? Now both envelopes are 0, and opening one reveals the other. Also, if I open one and it has the value H, I know that the other must be H/2 (since 2H is outside our range). Anyhoo, I think you're trying to do the impossible here. :-) |
|
Jul-18-08 | | YouRang: <rinus> I just thought of another problem with your proposed method of filling the envelopes. Ignoring (for now) the problem of selecting a random number from the range 0-H, suppose I happen to get the 'real' number 8.62 (What luck! It can be represented with finite decimal digits!) So 2X = 8.62, which means that X = 4.31, right?
Now we round them:
First envelope = Round(8.62) = 9
Second envelope = Round(4.31) = 4
Oops! We've now lost our property of having one envelope contain double the value of the other, since 9 != 2*4. :-( |
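A quick check of how often this happens (my own sketch; the upper bound H = 100 and the uniform draw are assumptions) suggests the rounding destroys the exact doubling relation about half the time:

```python
import random

H = 100.0
trials = 100_000
broken = 0
for _ in range(trials):
    two_x = random.uniform(0, H)     # "pick a random real 0 <= 2X <= H"
    big = round(two_x)               # e.g. round(8.62) = 9
    small = round(two_x / 2)         # e.g. round(4.31) = 4
    if big != 2 * small:
        broken += 1

print(f"doubling relation broken in {broken/trials:.1%} of fills")
```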
|
Jul-18-08
 | | rinus: <YouRang>
Put this in the game conditions, and do it cent/penny-wise.
|
|
Jul-18-08
 | | rinus: 2X = 8.134567; X = 4.067...
In the envelopes: 8.13 and 4.07 |
|
Jul-18-08
 | | rinus: <YouRang> Ever heard of Monte Carlo methods? |
|
Jul-18-08 | | YouRang: <rinus> Rounding to the penny doesn't really change anything, mathematically. The same amounts can be expressed in pennies (e.g. $8.12 = 812 pennies), so whatever problem you have with integers still exists with pennies. Anyway, if I understand you correctly, it doesn't matter because you've now changed the problem. We no longer require one envelope to contain double the amount of the other. I guess we can say that one is half of the other, rounded to the nearest 1/100. You're now heading into the murky land of approximations, and I'm not sure I want to follow. But I'll give one example: if we see that one envelope has 0.03 in it, then the other envelope might have either 0.01, 0.02, 0.05, 0.06 or 0.07 (each with varying degrees of probability). Do you really want to deal with such possibilities? And you still have problems at the boundaries. For example (just considering the lower boundary), if one envelope has 0.00, the other envelope can only be the same or higher. |
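Since rinus mentioned Monte Carlo methods, here is one way to check that 0.03 claim by simulation (my own sketch; the dollar cap of $1.00 and the fill rule of rounding 2X and X to the nearest penny are assumptions matching the proposal above):

```python
from collections import Counter
import random

partners = Counter()
for _ in range(2_000_000):
    two_x = random.uniform(0, 1.0)        # small cap, just for the demo
    a = round(two_x * 100)                # the larger amount, in pennies
    b = round(two_x * 50)                 # the smaller amount, in pennies
    if a == 3:                            # we opened 0.03 and it was the larger amount
        partners[b] += 1
    if b == 3:                            # we opened 0.03 and it was the smaller amount
        partners[a] += 1

print(sorted(partners))   # roughly [1, 2, 5, 6, 7] pennies, with unequal frequencies
```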
|