< Earlier Kibitzing · PAGE 41 OF 58 ·
Later Kibitzing> |
Jul-15-08
 | | Sneaky: <hms123: <All> I couldn't stand it another second. Your intuitions are right but much of the math is wrong. .5*X+.5*2X = 1.5X
>
I don't understand: if you are currently holding X, there is a 50% chance of getting ONE HALF OF X, and a 50% chance of getting 2X. So it should be .5*(X/2) + .5*(2X) = 1.25X,
like I said before, right?
<YouRang: <sentriclecub><When you hold X dollars there are 3 outcomes... half-x, double-x, or keep your envelope.> I think you're confusing the 2-envelope case (which <hms123> answered) with the 3-envelope case> I don't think he is. In the three-envelope case there are 4 possibilities. He's just highlighting that there are three possible outcomes, if you include "sticking to your guns" as an outcome. <sentriclecub> I'm still absorbing all of your input on this subject. I'd like to propose two different paradoxes. One is the version of the envelope-switching paradox where you do NOT look at what is in your envelope, and the other is where you do look. Why should this make a difference? Frankly I'm not sure. But like you said, if you have a very low sum then it makes sense to switch, because even the billionaire Arab has a limit to his fortunes. You start to run into the problem that there is no such thing as a purely "random number" without first specifying the distribution for the number to fall into. |
|
Jul-15-08
 | | Sneaky: <hms123> I think I see the cause for our conflict: my formula is based on the assumption that you are holding an envelope containing X and you are given the option of switching with the other envelope. If you define "X" to be the envelope with the lesser amount of money, then your formula is correct. |
|
Jul-15-08 | | YouRang: <Sneaky: <hms123> I think I see the cause for our conflict: my formula is based on the assumption that you are holding an envelope containing X and you are given the option of switching with the other envelope. If you define "X" to be the envelope with the lesser amount of money, then your formula is correct.> Right. And we can't have two EV formulas that produce different results, and are both correct, can we? :-) So the question is: which one is correct?
I think <hms123> has it right. As I said above, the fault with the <.5*(X/2) + .5*(2X) = 1.25X> formula (in the 2-envelope case) is that it is assigning a .5 probability to BOTH (X/2) and (2X), even though one of them is impossible. Just because we don't know which one is impossible doesn't make them both possible. In the 2-envelope case, there's just X and 2X (or if you prefer, X and .5X), but you don't have X and 2X and .5X. |
|
Jul-16-08 | | sentriclecub: <Just because we don't know which one is impossible doesn't make them both possible. In the 2-envelope case, there's just X and 2X (or if you prefer, X and .5X), but you don't have X and 2X and .5X.> NO! If you have $5 and are given the opportunity to "double or halve," you should take the opportunity! The math has gotten too complicated! What do we do when the math gets too complicated and we lose sleep? Let's forget about it, and create an answer to a simplified version! NO! ! ! ! As a matter of fact, <yourang>, if you have a paypal account, I'll send you $5 and let <Sneaky> flip a coin. If it lands heads, you send me $2.50, and if it lands tails you send me $10.00. I'll pay for the paypal fees (which will be covered by the +EV). If you believe your math, then let's do it! After all, I am willing to pay the paypal fees of 25 cents. To anyone who says the "one switch variation" isn't a true paradox: I'll gladly bite the paypal fees to let you learn the hard way :) Same goes to all. If nobody accepts my challenge, then I'll take it as a sign that if given the opportunity to "double or halve" or keep your envelope, you gain 25% +EV by switching. Also, to be fair, I'll allow 14 days once I send my initial $5 for you to let the money clear and put it into your checking account, and let you stuff $10 into one envelope and $2.50 into another envelope, and you are allowed to randomly mail me an envelope (and I can mail you the $5 via postal mail if you think the "trick" lies within the spatial properties of the envelope itself). So long as I have 50/50 odds of "doubling or halving," which this argument keeps trying to dodge... Any takers? Also, here are my poker results at $1/$2 NL holdem at the Tampa Seminole Hard Rock casino since they opened up "no limit" poker a year and a half ago. -66
+350
-185
+180
-10
-180
+300
+200
+300
+200
-160 (Dec 31st)
+373 (July 2nd)
-15 (July 6th)
Each "sesh" was ~8 hours, 35 hands/hr. Relevance--to show I have a appreciation for +EV. |
|
Jul-16-08 | | sentriclecub: Alright, alright. If you guys can wait until July 18th, I'll post the answer. For those who want to begin solving it, here's the way to begin. If you have $20 in your envelope, then your opponent's options are "switch $40 for $20 or switch $10 for $20." However, if the opponent is holding $10, his perceived options are $5 for $10 or $20 for $10. If the opponent is holding $40, his options are $80 for $40 or $20 for $40. However, if the opponent's opponent is holding $80, then... I'll continue to the final answer in a couple days. The current blaze is the fallout over DanL now that he has been exposed as a likely mole, as all the backstage email going around is getting people to sit and spend 30 minutes combing through his posts with a magnifying glass, to see what he actually does here and what he contributes. |
|
Jul-16-08
 | | Sneaky: OK, time for some comments.
sentri: You write
<Your computer program is FLAWED!!!!! You misstate the "1 switch version" of the paradox (the 1 switch version is the only "valid" paradox). My computer program was written out by hand.
I receive an envelope with $5 and switch it one time. I ran this experiment 1,000,000 times using "keep the envelope" and 1,000,000 for switch.> I argue that it's you who is misstating the paradox. I completely agree with you that if you write a program to play the game for $5 each time, with the possibility of switching to either $2.50 or $10, it would perform better if it switches every time. I can't imagine anybody arguing that this is not the case. But the paradox as I understand it is that two envelopes are loaded with money so that one contains exactly twice the other, they are shuffled, the player chooses one at random, and is then given the opportunity to switch. If you play the game that way, you'll never end up looking at exactly $5 every single time you play. My program worked like this:
For each iteration, it would follow these steps:
(1) Put $10 in envelope (variable) $a.
(2) Then it would flip a digital coin. If it came up heads, then it would put $20 in envelope $b. If it came up tails, it would put $5 in envelope $b.
(3) Then it would flip another digital coin, and if it came up heads it would swap the values of $a and $b. This is what I call "shuffling the envelopes", and is probably an unnecessary step, but I wanted to emphasize that we don't know which envelope is the greater of the two.
(4) Then it presents the "player" the opportunity of picking an envelope. The player, having no idea which one is best, flips a digital coin: heads means "I pick A" and tails means "I pick B". (I could have just as easily made the player pick A every single time, but again I wanted to emphasize the pure randomness of the game.)
(5A) For one run of the program (across a million trials) the player simply kept the value in the envelope thus chosen. I called this the "stick to your guns" strategy.
(5B) For the other run of the program the player would pick an envelope as normal, but then change his mind and switch. I called this the "waffle" strategy.
Although the final figure was slightly different every time the program ran (it is a game of luck, after all), after a million trials the luck factor is almost entirely obliterated. It turned out that the player earns roughly 15 million dollars regardless of whether strategy 5A or 5B is used, just as predicted by hms123's formula, which concluded that EV = 1.5X (where X = the value of the lesser envelope). This behavior was intuitively obvious, because the action of switching is no different from declaring that heads=B and tails=A instead of the other way around. As a friend of mine put it to emphasize the irony of the switching strategy, "If you planned to switch envelopes then why didn't you just pick the OTHER envelope in the first place?" Note that in my program, the "no peek rule" was in effect at all times.
The player was not allowed to look into the envelope and see what it held before deciding whether to switch or not. Sentri, please don't misunderstand my opinion of all of this. When mathematics contradicts common sense, I tend to place my money with mathematics. I hate it when I hear people say things like "the Monty Hall Paradox" (referring to the famous 'do you want to switch' game involving the three-way decision), or the "Birthday Paradox" (referring to the fact that if 23 people are in a room it's likely that two of them share a common birthday). These are not paradoxes! It's just that human intuition commonly fails in these situations, except for bright people who see through the subterfuge and use math to get at the truth. This situation is different, however, in a bothersome way. The math leads me to an answer which I want to believe, but it is so crazy I can't accept it. It's demonstrably incorrect, as my above program proved. As far as I can tell, this is still an unresolved paradox, as close to a true paradox as you'll ever run into. |
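A reconstruction of the simulation described above might look like this in Python. It is a sketch, not Sneaky's original code: to match hms123's formula directly it fixes the pair at $10 and $20 (the (X, 2X) loading); the coin-flipped $20-or-$5 loading in step (2) gives a different constant but the same keep-versus-switch equality.

```python
import random

def play(trials, switch, seed=0):
    """Two-envelope game with the pair fixed at $10 and $20,
    i.e. hms123's (X, 2X) setup with X = 10."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        envelopes = [10.0, 20.0]
        rng.shuffle(envelopes)      # "shuffling the envelopes"
        pick = rng.randrange(2)     # player picks one at random
        if switch:
            pick = 1 - pick         # the "waffle" strategy
        total += envelopes[pick]
    return total / trials

keep = play(100_000, switch=False)  # "stick to your guns"
waffle = play(100_000, switch=True)
print(round(keep, 2), round(waffle, 2))  # both land near 15.0
```

Per trial the average is about $15 = 1.5 × $10 for both strategies, which over a million trials is the "roughly 15 million dollars" reported.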
|
Jul-16-08
 | | Sneaky: <hms123: <All> I couldn't stand it another second. Your intuitions are right but much of the math is wrong. Here is the situation: 1. There are two envelopes, one with X dollars and one with 2X dollars. 2. First case--you have the one with X. If you flip a coin either you will hold (50% probability) or switch (50%). Your expectation is .5*X+.5*2X = 1.5X 3. Second case--you have the one with 2X. Your expectation (hold or switch) is .5*2X+.5*X = 1.5X 4. It is exactly the same in each case no matter what X happens to be. Thanks--hms>
I love where you are going with this, but I was confused temporarily where you say <If you flip a coin either you will hold (50% probability) or switch (50%)>. The point of the paradox is that the player has the choice of switching. The player doesn't flip a coin to determine if he switches. But there is a coin-flip involved; it's not about whether you switch or not, it's about which envelope you picked in the first place. The math works out exactly the same, but I think the way you stated it was a little confusing. (Or maybe I'm just easily confused; YouRang followed your argument fine.) So with all due respect let me restate your idea so I think it reads better: <♔ The hms123 argument ♔> DEFINITIONS/EXPLANATION: X dollars are placed in an envelope, and 2X dollars are placed in another envelope. The envelopes are shuffled so nobody knows which contains the larger sum. You are given a choice of envelopes, and you pick one at random. You are then given the opportunity to either keep the money in your envelope or switch it for the other one. Assess these two strategies (keeping and switching) and determine which, if either, is more profitable. <STRATEGY ONE: KEEPING> First case (50% likely): you have the envelope that contains X. You stick to your guns, open the envelope, and keep X. Second case (50% likely): you have the envelope that contains 2X. You stick to your guns, open the envelope, and keep 2X. Therefore the EV for this strategy is (.50)*X + (.50)*2X = 1.5X. <STRATEGY TWO: SWITCHING> First case (50% likely): you have the envelope that contains 2X. You waffle, take the other envelope, and keep X. Second case (50% likely): you have the envelope that contains X. You waffle, take the other envelope, and keep 2X. Therefore the EV for this strategy is (.50)*2X + (.50)*X = 1.5X. <CONCLUSION>
Your EV is 1.5X if you keep your first pick, and 1.5X if you switch. There is no advantage (or penalty) in switching. |
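The restated argument is small enough to verify by exhaustive enumeration rather than simulation (a sketch, taking X = $1 for concreteness):

```python
X = 1.0
envelopes = (X, 2 * X)  # the two stuffed envelopes

def ev(switch):
    # Average over the two equally likely first picks.
    total = 0.0
    for pick in (0, 1):
        kept = envelopes[1 - pick] if switch else envelopes[pick]
        total += kept
    return total / 2

print(ev(switch=False), ev(switch=True))  # 1.5 1.5 -- no edge either way
```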
|
Jul-16-08
 | | Sneaky: Now <hms123> I love your line of reasoning and agree 100% with your computations, but I still don't see this as a resolution to the paradox. Let's keep your (X,2X) as values of the envelopes, but let's play the game slightly differently, this time where you are allowed to look at what you picked before you decide if you want to swap. You take an envelope at random and it contains (for example) $40. Seeing this, we now have learned something about the scenario. Either (1) the other envelope contains $20 and you picked the bigger one, or (2) the other envelope contains $80 and you picked the smaller one. Both (1) and (2) are equally likely possibilities with a 50% chance of each. <STRATEGY ONE: KEEPING> You keep $40. EV = 40. Nothing more to say.
<STRATEGY TWO: SWITCHING> The other envelope contains either $20 or $80, with equal probabilities of each. Therefore the EV should be: (0.50)*20 + (0.50)*80 = 50.
Since $50 is more than $40, the smart gambler would want to switch. <CONCLUSION>
Although <hms123> showed positive proof that switching confers no advantage, here is positive proof that switching is profitable. That's why this is a paradox. There are always multiple ways of solving a math problem, but the answer should conform to reality and not be dependent upon the method used. So if the hms123 argument is correct, then the above argument MUST have a flaw. But where is the flaw? Also... <sentriclecub> suggested something cunning: exaggerating the effect by playing a game where one envelope contains a very large multiple of the other. So instead of playing "double or half" we can play "1000x or 1/1000th" with the billionaire Arab. I like this approach because it really makes you FEEL the paradox in your bones. Suppose you play the version with (X, 1000X) with the rich Arab. You open an envelope and again find $40. The other envelope might contain 1/1000th of the prize (a check for four cents), or it might contain $40,000. Who in their right mind wouldn't risk $39.96 for the chance to win $40,000? Is it not true that you would stand a 50% chance of scoring the 40K prize? I know what User: SwitchingQuylthulg would do... he'd SWITCH! |
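The switching argument above leans on a hidden assumption: that after seeing $40, the pairs ($20, $40) and ($40, $80) were equally likely to have been stuffed in the first place. Under that assumed prior the arithmetic does check out (the prior itself, not the arithmetic, is the contested part):

```python
# Assumed prior: the stuffer picked the pair (20, 40) or (40, 80) with
# equal probability, then we drew one envelope of the pair at random.
pairs = [(20.0, 40.0), (40.0, 80.0)]

# Condition on the event "we are holding $40": it occurs once per pair,
# so both cases carry equal posterior weight.
other_values = [pair[1 - i] for pair in pairs
                for i in range(2) if pair[i] == 40.0]

ev_switch = sum(other_values) / len(other_values)
print(ev_switch)  # 50.0, versus 40.0 for keeping
```

Whether such a fifty-fifty prior can hold for every observable value at once is exactly what the later posts question.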
|
Jul-16-08 | | sentriclecub: Yeah, agreed, I was just role-playing, trying to add some life to the subject and generate more interest. I'm amazed by your 3-page post. I'm glad each time I come back to this page and see yourang or hms123 or yourself has left a reply that I can tell you thought a great deal about. I'm going to spam some other users' pages who might want to put in their 2 cents (double or halve!) before I give my 2 solutions (which I'm no expert on, nor have I actually confirmed). I am VERY glad that the (x, 1000x) version clarified what I thought I couldn't explain! The problem is "you can double or halve the contents of your envelope." There is no "do you want envelope A or B," which automatically eliminates uncertainty. You have a 50% chance of (switch: x, keep: 2x) and a 50% chance of (switch: 2x, keep: x), which obviously relies on luck! Luck is being lucky enough to either switch the bad envelope or keep the good envelope. It's no longer a question of uncertainty, because there are only 2 conditional probabilities, and they're easily solved. This is NOT a solution to the "1 switch double/halve variation". As a matter of fact, the wikipedia article I just checked for "envelope paradox" is not the same as the one here--they apparently updated the article to reflect consensus (i.e. the paradox is presented in the way that most people understand it--simplifying it to envelope A or B). The one at the ed-trice forum is the one I laid out, which I had always been familiar with (and the one which tortured me!). As a matter of fact, I used the same things you guys are saying, saying that it's like paper-rock-scissors or a coin flip. If there is no upper bound on the highest amount you can win (or keep), then the paradox holds. As soon as you say between $5 million and $50 million, and, well, your envelope has $36 million, then it's an easy keep. In conclusion, I'm super happy you "saw it" in the (x, 10x) problem with no upper bound.
Now, try to "see it" for (x, 9x) and then (x, 8x), and work down to (x, 2x), and fight those mental obstacles. It's like 3-d glasses the first time: for the first 5 seconds, you can't relax and let your eyes adjust--you think too hard and work against your understanding. Lastly, for anyone else joining who wants yet another optional twist: pretend you can buy "insurance" from an insurance broker who charges 1% commission plus the cost of doing business (i.e. the average cost of taking millions of these risks from different people, all at different times). In other words, his business has no overhead costs and only wants a 1% profit, so you're getting a great deal. Thus you can try to figure out under what payout structures it's worth it to try to scam the insurance guy by unloading all your risk on him at 1% profit to him. When would you keep the risk, and just go for it? Also, if stock A goes up 10% in year one, then down 8% in year two, and stock B gains a steady 9% per year, which is the better investment for a 2-year investment term? Also, if you had $1000 and could create a mix of all three stocks and buy/sell freely and unlimited--how would you tweak your expected value (+EV)? (Don't necessarily actually answer this; just read it and focus on it if you need to reapproach the double/halve problem and want to erase your short-term memory.) |
|
Jul-16-08 | | YouRang: <sentriclecub><As a matter of fact, <yourang> if you have a paypal account, i'll send you $5 and let <Sneaky> flip a coin. If it lands heads, you send me $2.50 and if it lands tails send me $10.00> But let's look at the situation you've just described. It's essentially what I've been calling the <3-envelope> case, and in THAT case I agree that you <should> exchange your $5 for an opportunity to halve or double it. Note the mechanics of it:
1. I have TWO envelopes, (1) $2.50 and (2) $10, and you have your one envelope that you know contains $5. 2. Your choice is to decide between (A) keeping your $5 or (B) swapping it for one of my envelopes, randomly chosen by Sneaky's coin toss. But that's <NOT> the same as the <2-envelope> case that was originally presented. The 2-envelope case goes like this: There are two envelopes, where one contains double the amount of the other (i.e. one contains X and the other 2X). You pick one. Clearly the EV is 1.5X. Now, you might say it differently. You might say that one of the two envelopes is 'in your hand' and the other is 'on the table', and your choice is to (1) keep or (2) swap. But it amounts to the same thing: you're just picking one of the two. I think it is interesting to look at the different mathematical constructs for the 2-envelope case: CONSTRUCT #1: (introduced by <hms123>) <
1. There are two envelopes, one with X dollars and one with 2X dollars. 2. First case--you have the one with X. If you flip a coin either you will hold (50% probability) or switch (50%). Your expectation is .5*X+.5*2X = 1.5X 3. Second case--you have the one with 2X. Your expectation (hold or switch) is .5*2X+.5*X = 1.5X 4. It is exactly the same in each case no matter what X happens to be.> CONSTRUCT #2: This is what <sentriclecub> appears to be using: <There are two envelopes. The one on your left contains X, and the one to your right contains either .5X or 2X, with a 50% chance either way. Therefore, you take the one on the right, since the EV for the left one is X while the EV for the right one is 1.25X (i.e. .5(.5X) + .5(2X) = 1.25X)
>
So which construct is correct for the 2-envelope case? They can't both be correct. The problem with CONSTRUCT #2 is the part about the right envelope. It can't contain a 'probabilistic' value. We don't know whether it contains .5X or 2X, but we know that its value doesn't change as the result of a coin flip. (If it did, it would be equivalent to the 3-envelope case.) With X defined as the value of the left envelope, we can't use it in the EV formula shown above. If X is the larger of the two envelope values, then the formula is flawed because there is no "2X" value, and if X is the smaller value, it's flawed because there is no ".5X" value. We don't know if it is the smaller or larger value, but our ignorance doesn't remove these flaws. The fallacy of CONSTRUCT #2 can also be seen by merely switching the envelopes, putting the left one on the right and the right one on the left. You can use the same reasoning to argue that you should still take the envelope on the right (which previously was the rejected one on the left). ~~~~~~~~~~~~~~~~~~~~
But the interesting part is what happens if you open an envelope, as <Sneaky> suggests. Suppose we open the left envelope and see $5. Now <hms123>'s EV formula doesn't work, because X is not a constant. It is not valid to say that in the First case $5 = X and in the Second case $5 = 2X. So, <after opening an envelope>, I'm back to my original statement that you should swap once, if we can still assume that there are equal chances that the other envelope contains either $2.50 or $10. However, I think <Sneaky> may be right in suggesting that once you know a value, the 'equal chances' condition may not remain in effect. It seems unreal that the 'value space' of the problem can be infinitely large. Also, is there no 'lowest possible value', say one penny? If I open the envelope and see a penny, is there really a chance of getting a half-penny? Anyway, my head is starting to spin. :-P |
|
Jul-16-08 | | ganstaman: <YouRang: But the interesting part is what happens if you open an envelope, as <Sneaky> suggests? Suppose we open the left envelope and see $5. Now, <hms123>'s EV formula doesn't work because X is not a constant. It is not valid to say that in the First case, $5 = X and in the Second case, $5 = 2X. So, <after opening an envelope>, I'm back to my original statement that you should swap once, if we can still assume that there are equal chances that the envelope can contain either $2.50 or $10.> This is incorrect. You are suggesting that if you look at the value in the envelope, then it becomes correct to switch. Note that it doesn't depend on the value that you see. So you don't care if you see $5 or $100; you'd want to switch based on the math. Then why do you even look inside? You are able to ignore the value you see when you look anyway, so simply seeing the number can't actually affect the correct decision. Let's say that we play this game together. We pick an envelope together. You look inside and I don't. Would it make sense for it to be neutral EV for me to keep or switch, and at the same time +EV for you to switch? |
|
Jul-16-08 | | ganstaman: I think this is a fairly good explanation of what's going on, and why it's never correct to switch envelopes unless you are very bored. http://www.poker1.com/absolutenm/te... Some quotes:
<I've assigned the 50% probabilities to having the higher and lower values arbitrarily, with insufficient information. This is really a Bayesian problem, and since Moss didn't tell me the prior distribution on the amounts in the envelopes, I can't solve it.> Oops, that was only the statement of the problem. Possible solution: http://www.poker1.com/absolutenm/te... with quotes:
<Johnny decides that a range of $50 to $800 is about right for this experiment.> <When I'm finished, I've created four pairs of envelopes. Now I randomly select one pair.> <Now we see that there are two exceptions to your lose-half-or-double expectation. If you open a $50 envelope, there's only one thing that can happen by switching: You gain $50. And if you open an $800 envelope, there's only one thing that can happen by switching: You lose $400.> So does this work for an infinitely bankrolled envelope stuffer? |
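Under the bounded scheme the column describes, the whole-game accounting can be done exactly. This sketch assumes the four doubling pairs are $50/$100, $100/$200, $200/$400, and $400/$800, with pair and envelope each chosen at random:

```python
# Four doubling pairs in the assumed $50-$800 range.
pairs = [(50, 100), (100, 200), (200, 400), (400, 800)]

# All 8 (pair, envelope) combinations are equally likely.
keep_ev = sum(amount for pair in pairs for amount in pair) / 8
switch_ev = sum(pair[1 - i] for pair in pairs for i in range(2)) / 8

print(keep_ev, switch_ev)  # identical -- always-switching gains nothing
```

The boundary cases do the canceling: on a $50 envelope switching always gains $50, on an $800 envelope it always loses $400, and those offsets exactly absorb the apparent +EV at the interior values.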
|
Jul-16-08 | | sentriclecub: <But that's <NOT> the same as the <2-envelope> case that was originally presented. The 2-envelope case goes like this:> Yeah, I was continuing from over at the ed-trice forum. When you stuff the 2 envelopes before the person receives one, then there is no uncertainty, just luck. However, let's go along with it: suppose the other envelope is destroyed, and at random is replaced with an envelope that contains either double or half. Sorry Yourang, I only realized that you have brought this up like 3 times, and I just now paid attention to it (and compared it to the other two times it was brought up, and indeed both times by you). I don't read the usernames, I just read the most recent post, since I'm trying to keep this environment in a slight climate of confusion, to help keep the brain open. |
|
Jul-16-08 | | amadeus: <Sneaky: Either (1) the other envelope contains $20 and you picked the bigger one, or (2) the other envelope contains $80 and you picked the smaller one. Both (1) and (2) are equally likely possibilities with a 50% chance of each.> I don't think so. One is 100%, and the other is 0%, the choice of values was already set. That's why this problem is different from flipping a coin to determine if the next envelope will be half or double. Switch always and you'll still get only (Big Money+little money)/2. Well, just my 2 (-1 +2) cents.
KEEP THE ENVELOPE:
50% Big Money
50% little money
SWITCH THE ENVELOPES:
50% Big Money => little money
50% little money => Big Money
Expected Value = (Big Money + little money)/2 |
|
Jul-16-08 | | ganstaman: Right, I think this all makes sense now. With <amadeus>'s explanation (others have said it before, I believe, but I don't want to go searching for names, sorry), we are working with precisely 2 envelopes. The math assumes there are only 2 envelopes and works out pretty well. But, if you say "I have $X and can switch for 2X or X/2" then you are dealing with a range of envelopes. Then, you can't just do the math for this one instance to get "switch for 1.25X." Instead, you have to do the math for the whole range of envelopes as shown in the links I provided just before. Then the math will also show that switching is a waste of time. |
|
Jul-16-08 | | hms123: <egads> If you open the envelope and look inside then you have additional information that changes the situation. As an example, (assuming whole numbers) sometimes you will look and find $5, then you will be certain that other envelope has $10 (because $2.50 is not allowed. If you want to allow pennies, then imagine that the first envelope has $7.13 in it). In short, sometimes you will be certain that you should switch--thus giving the edge to switching. <Sneaky> Your re-write was great. I tend not to explain enough--and you guys seemed to be on top of things anyway. <All> I hope I can drop in occasionally. thanks--hms |
|
Jul-16-08 | | hms123: <ganstaman> I read the explanations in the poker column--the argument is similar to mine. Basically, opening the envelope gives you useful information some of the time. But that's all you need to get an edge. |
|
Jul-16-08 | | ganstaman: <hms123: <ganstaman> I read the explanations in the poker column--the argument is similar to mine. Basically, opening the envelope gives you useful information some of the time. But that's all you need to get an edge.> Except your conclusion is the opposite of what the links say. This is because you are assuming factors into this problem that aren't stated anywhere else. No one said anything about whole numbers only, or even not allowing half-pennies. All we know is that 1 envelope contains half as much money as the other. Under only these assumptions, there is absolutely no edge to switching. |
|
Jul-16-08 | | YouRang: <ganstaman><So, <after opening an envelope>, I'm back to my original statement that you should swap once, if we can still assume that there are equal chances that the envelope can contain either $2.50 or $10.> <This is incorrect. You are suggesting that if you look at the value in the envelope, then it becomes correct to switch.> Two points:
- My main concern here is that the mathematics (first mentioned by <hms123>) used to show that both envelopes have the same EV breaks down once you know the value of one of the envelopes. So in that sense at least, knowing the contents of an envelope does change things. - Also, notice that I put a conditional in my statement: <"if we can still assume that there are equal chances that the envelope can contain either $2.50 or $10">. It was not my intent to suggest that this assumption is correct. Conversely, I went on to suggest that this assumption may be invalid. If indeed the assumption is invalid, then it's another way that knowing the contents changed things. |
|
Jul-16-08 | | hms123: <ganstaman> Again, I wasn't clear enough (or possibly not right enough!)
What I meant was that opening the envelope changes the problem in ways that give information--that information would let you know to switch (or not). Whether the info is in the form of odd numbers or low numbers (poker example) really doesn't matter, I think. thanks--hms |
|
Jul-16-08 | | hms123: <sc> One way to solve probability problems is to write down ALL of the cases. The "ALL" is the hard part. It is easy to forget one, or to conflate two. As a trivial example, if asked to list the cases for flipping two coins, someone might write down "2 heads", "1 head, 1 tail", and "2 tails", forgetting that "1 H, 1 T" is really two cases (HT, TH). |
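hms123's coin example is small enough to enumerate outright, which makes the easily-forgotten case visible (a quick sketch):

```python
from itertools import product

outcomes = list(product("HT", repeat=2))  # all ordered results of two flips
one_each = [o for o in outcomes if set(o) == {"H", "T"}]

print(len(outcomes))   # 4 equally likely cases, not 3
print(len(one_each))   # "1 head, 1 tail" is really 2 of them: HT and TH
```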
|
Jul-16-08 | | YouRang: Another little thought:
Forget money. In order to make the logic/mathematical arguments work cleanly, we have to eliminate the issues with pennies, odd numbers, lower bounds, etc. Suppose then, that each envelope contains a piece of paper with a number written on it, and that its value may be <any positive rational> number (*). We must assume then that in order to fill the envelopes according to the terms of the problem, we must select some positive rational number at random for the first envelope, and then double that number for the second envelope. BUT THIS RAISES A QUESTION: Is it really possible to pick any positive rational number at random? If not, then the 2-envelope problem, as posed, is subtly fallacious. Side question: If we could pick a rational number at random, can we safely assume that it can be expressed in a finitely sized envelope? (*) A rational number is any number that can be expressed as P/Q, where P and Q are both integers. Examples of positive rational numbers:
0.00003445,
83982393288.92383289,
7.7843843843843... [endlessly repeating 843..] |
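The question about picking a rational at random has a clean negative answer, which matters here because the paradox's implicit prior depends on it: no uniform probability distribution exists on a countably infinite set such as the positive rationals. A sketch of the standard argument:

```latex
\text{Enumerate } \mathbb{Q}^{+} = \{q_1, q_2, q_3, \dots\} \text{ (it is countable).}
\quad \text{If uniformity forced } P(q_i) = p \text{ for every } i, \text{ then}
\]
\[
1 \;=\; \sum_{i=1}^{\infty} P(q_i) \;=\; \sum_{i=1}^{\infty} p \;=\;
\begin{cases} 0 & \text{if } p = 0,\\[2pt] \infty & \text{if } p > 0, \end{cases}
\]
\[
\text{a contradiction either way.}
```

So "select some positive rational number at random", taken uniformly, is not a well-defined experiment; any actual stuffing procedure must favor some values over others, which is the loophole the bounded $50-$800 example exploits.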
|
Jul-16-08 | | sentriclecub: <Well my reasoning is the same but my conclusion is the opposite> that's quite an exception! (not exact phrase, but wanted to add emphasis!!!) PARADOX! |
|
Jul-16-08
 | | rinus: <YouRang>
<BUT THIS RAISES A QUESTION: Is it really possible to pick any positive rational number at random?> Ever heard of the 'obtuse problem' from Lewis Carroll? www.ratio.huji.ac.il/dp_files/dp235.doc
|
|
Jul-16-08
 | | rinus: Same thing over there. |
|