< Earlier Kibitzing · PAGE 243 OF 243 ·
Later Kibitzing> 
Oct-15-14
  al wazir: <Sneaky>: I went back and read the description of the second envelope problem carefully. (The first time I only skimmed it, and that's why I misunderstood what the problem poser meant by "random." Sorry about that.) It's really a cute problem. Let me try to analyze it step by step. As he says, if the envelope you get contains 1, then the other is certain to contain 3, so of course you switch. Now suppose the envelope you get first contains 3. Then the other one can contain either 1 or 9; there are no other possibilities. If it contains 1, the first toss of the coin must have yielded a head, i.e., the sequence of tosses must have been simply H; if 9, then the sequence of tosses must have been TH. The probability of the first sequence is 1/2; the probability of the second sequence is 1/4, so the probability that the second envelope contains 1 is twice the probability that it contains 9. Therefore the probability that it contains 1 is 2/3 and the probability that it contains 9 is 1/3, because 2/3 is twice 1/3, and the two probabilities have to add up to 1. This means your *expectation* when you switch is (2/3)x1 + (1/3)x9 = 3 2/3. Since 3 2/3 is more than 3, it makes sense to switch. And likewise, if your envelope contains 9, the other one contains either 3 or 27, with respective probabilities of 2/3 and 1/3. So the expected value of the contents of the other envelope is (2/3)x3 + (1/3)x27 = 11, which is more than 9, so again you switch. And in general, if the envelope you start with contains 3^N, then the other one contains either 3^(N-1) or 3^(N+1), and the expected value of the contents resulting from a switch is (2/3)x3^(N-1) + (1/3)x3^(N+1) = 2x3^(N-2) + 3^N, so by switching you stand to gain 2x3^(N-2). (This formula reduces to the result for the two special cases I already discussed.) Simple. BUT WAIT A MINUTE. If you always gain by switching, you don't even have to look at the contents of the first envelope. 
You should *automatically* switch, every time. And on average, you must come out ahead. Huh?
The resolution of this paradox is similar to that for the old envelope paradox. Whatever units the numbers in the envelopes refer to (dollars, pounds, euros, rubles), the total number in existence is finite. Under some circumstances three times the number in your envelope *might* exceed the total in the entire known universe. That is, if the result of the coin tosses is TTT...TH (a long, long, long string of tails before the first head), the number 3^(N+1) might be so big that it exceeds the total of the dollars, pounds, etc., that exist. In that case, the number in the second envelope *must* be smaller, and if you switch you'll take a bath. You'll lose the biggest fortune anyone ever lost in the history of the world. Of course that TTT...TH string of tosses is highly unlikely, but in the highly unlikely case that it occurs, the amount you'll lose will just balance the gains from all the other swaps that can occur. And in fact, it doesn't take an impossibly long streak of tails: 3^19 is more than a billion, but the odds against the corresponding sequence of tosses (18 tails before the first head) are "only" a little more than a half million to one. Now I'll go back to the poser's site and read his answer. Once again, his explanation exceeded my attention span, so I'll stick with my own. 
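al wazir's conditional probabilities (2/3 for the smaller value, 1/3 for the larger) are easy to check by simulation. A minimal sketch in Python (the function name, seed, and trial count are my own choices, not anything from the thread):

```python
import random

def deal_pair(rng):
    """Flip a fair coin until heads; n tails means the pair is (3^n, 3^(n+1))."""
    n = 0
    while rng.random() < 0.5:  # tails
        n += 1
    return 3 ** n, 3 ** (n + 1)

rng = random.Random(1)
held_3_trials = 0
other_is_1 = 0
for _ in range(200_000):
    pair = list(deal_pair(rng))
    rng.shuffle(pair)          # you receive one envelope of the pair at random
    held, other = pair
    if held == 3:
        held_3_trials += 1
        other_is_1 += (other == 1)

# Given that you hold 3, the other envelope should contain 1 about
# twice as often as it contains 9, i.e. roughly 2/3 of the time.
print(other_is_1 / held_3_trials)
```

The same tally for any held value 3^N shows the 2/3 and 1/3 split, which is what drives the (2/3)x3^(N-1) + (1/3)x3^(N+1) expectation above.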

Oct-15-14
  Sneaky: <Whatever units the numbers in the envelopes refer to (dollars, pounds, euros, rubles), the total number in existence is finite.> I think you have the right idea in mind, but just complaining that we might run out of money isn't good enough. For the sake of argument let's say that this is a game show and the values in the envelopes are merely "points" that you accumulate on the show. If you get more points than the other contestant you win a brand new car. So now you should have no problem if we have any arbitrarily large number n with ridiculously huge 3^n envelope values. But the fact that 3^n gets big so quickly is at the heart of this riddle. Here's my resolution: You compute that if the envelope you have contains X, the other one has an EV of 11/9 * X. Since that's bigger than X, you conclude it's smart to switch. However, what happens when you try to solve the question "What is the EV of one randomly selected envelope of a pair of envelopes?" You learn that it has an infinite EV, just like the EV in the St. Petersburg paradox. So given that, we know that the above conclusion 11/9 * X > X is as meaningless as saying 11/9 * infinity > infinity. That cursed infinity has led us astray once more. 
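Sneaky's infinite-EV claim can be seen from the sum itself: with probability (1/2)^(n+1) the pair is (3^n, 3^(n+1)) and each envelope is equally likely, so the n-th term of the EV of a random envelope works out to (3/2)^n, which grows instead of shrinking. A quick numerical sketch:

```python
# EV of one randomly selected envelope, summed term by term:
# P(n tails then heads) = (1/2)^(n+1), pair = (3^n, 3^(n+1)),
# each envelope equally likely -> term = (1/2)^(n+1) * (3^n + 3^(n+1)) / 2.
partial = 0.0
for n in range(20):
    term = (0.5 ** (n + 1)) * (3 ** n + 3 ** (n + 1)) / 2
    partial += term            # term simplifies to (3/2)^n, so the sum diverges
print(partial > 1000)          # True already after 20 terms
```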

Oct-15-14
  al wazir: <Sneaky: I think you have the right idea in mind but just complaining that we might run out of money isn't good enough.> Yes, that was more in the nature of a motivation than a proof. <For the sake of argument let's say that this is a game show and the values in the envelopes are merely "points" that you accumulate on the show. If you get more points than the other contestant you win a brand new car.> Suppose you are a contestant in that game show. There must be an (unknown, possibly unknowable) number beyond which any number is "too big" to be contained in an envelope, even in the game-show version of the problem. Suppose you have been *told* what that biggest number is. Suppose the biggest number an envelope can possibly contain is 3^20 (approximately 3.5 billion). You open your envelope and see that the number it contains is 59049 (3^10). Do you switch? Yes, you know that on average switching will increase your point total. But two times out of three it will get you a smaller number, 19683. You are competing against other contestants who have their own envelopes and have to make their own decisions. They *might* have the same number you have, but how likely is that? It would have taken ten coin tosses to generate your number, 59049, and the odds against that are more than 1000 to 1. Unless there are at least a thousand contestants competing against you, or the game lasts more than a thousand rounds, or some combination of these, your number is probably going to win. It will be almost as likely to win even if scores are accumulated over many rounds. And in the extremely unlikely case that one of your competitors has an envelope containing a bigger number, you can't do better than earn a tie even by switching. So you would be foolish to switch. <"What is the EV of one randomly selected envelope of a pair of envelopes?" You learn that it has an infinite EV, just like the EV in the St. Petersburg paradox.> I thought of pointing out the connection with martingales, but for that I would have had to do a formal calculation. And then I would have gotten bogged down in something as hard to plow through as the answer the proponent of the problem gave. I will simply state, again without proof, that once an upper limit is imposed, the expected value of switching is zero, because the extremely unlikely risk of a big loss from switching away from the biggest value cancels out all the (more likely but smaller) expected gains from switching when the first envelope contains a smaller number. 
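al wazir's cancellation claim can be made exact in a capped version of the game. The sketch below (Python; the capping convention, lumping every run of T-1 or more tails into the top pair, is my own modeling choice) computes the expected gain of switching conditional on each held value: it is positive for every value except the largest, and the one big loss at the top cancels the rest exactly.

```python
from fractions import Fraction

def switching_gains(T):
    """Capped game: n tails gives the pair (3^n, 3^(n+1)), n = 0..T-1,
    with the probability mass of all longer tail runs lumped into n = T-1.
    Returns {held value: (P(hold it), E[gain of switching | held it])}."""
    joint = {}  # P(held value, other value)
    for n in range(T):
        p = Fraction(1, 2 ** (n + 1)) if n < T - 1 else Fraction(1, 2 ** (T - 1))
        lo, hi = 3 ** n, 3 ** (n + 1)
        joint[(lo, hi)] = joint.get((lo, hi), Fraction(0)) + p / 2
        joint[(hi, lo)] = joint.get((hi, lo), Fraction(0)) + p / 2
    out = {}
    for v in sorted({h for h, _ in joint}):
        p_v = sum(p for (h, _), p in joint.items() if h == v)
        g_v = sum(p * (o - h) for (h, o), p in joint.items() if h == v) / p_v
        out[v] = (p_v, g_v)
    return out

gains = switching_gains(10)
overall = sum(p * g for p, g in gains.values())
print(overall)   # 0 -- blind switching gains nothing on average
```

Exact rational arithmetic (Fraction) is used so the cancellation comes out as a literal zero rather than a floating-point residue.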

Oct-15-14
  Sneaky: <Suppose you are a contestant in that game show. There must be an (unknown, possibly unknowable) number beyond which any number is "too big" to be contained in an envelope, even in the game-show version of the problem.> I thought of that while composing my game show metaphor. True, even if it's just a number written down on a card, or displayed by a computer, even if the number is presented in scientific notation, or with any fancy big-number notation you can devise, it's still possible to have a number that can't be expressed that way using every particle in the universe. But this is an idealized math problem; we can ignore such physical constraints for the sake of argument, can't we? <You are competing against other contestants who have their own envelopes and have to make their own decisions. They *might* have the same number you have, but how likely is that?> I had thought for a moment of stipulating that both contestants receive the same envelopes simultaneously, e.g. my "n" value is your "n" value each round. But I didn't want to complicate matters needlessly. I reasoned that if switching conveys a benefit, it would do so with or without that stipulation. <It would have taken ten coin tosses to generate your number, 59049, and the odds against that are more than 1000 to 1. Unless there are at least a thousand contestants competing against you, or the game lasts more than a thousand rounds, or some combination of these, your number is probably going to win.> I see what you're saying, but I think you've taken the game show metaphor further than is useful. I'm just trying to answer the question: what strategy will net you the most points, on average? To pursue that, we have to assume the contestants are trying to maximize their total score (even though, as you pointed out, that's not a good strategy for actually winning the car). 
If you want, perhaps stipulate that there is some bonus you get if you win by a very large amount, so they have a motivation to turn a big score into a bigger one. <the expected value of switching is zero, because the extremely unlikely risk of a big loss from switching away from the biggest value cancels out all the (more likely but smaller) expected gains from switching when the first envelope contains a smaller number.> The fellow at http://web.mit.edu/~emin/www.old/wr... covered the upper-limit scenario to help clarify the paradox, but he reached a different conclusion. However, his conditions were different. In his variation, the upper limit is known to the contestants. He writes: <Let us modify the original game so that we always stop after T coin flips. This modification prevents infinities from entering the picture and significantly affects the situation. Specifically, in the modified game the always switch strategy is no longer optimal. This is because the maximum possible amount of money in an envelope is 3^T. Thus the optimal strategy is to keep an envelope if its value is 3^T and otherwise exchange it.> Let's look at a very small case, the most degenerate case that still has meaning. Suppose T=3, so the only possible values are 1, 3, and 9. I'm sure we are both in agreement that if you get a 1 you should switch it, and if you get a 9 you should keep it. The question then is, what to do when you get a 3? According to the author, you should switch anything except the 9s, so the 3 should be switched. I find it hard to argue against his logic, given those very specific conditions. But here's where the weirdness creeps back in: if T=3, we say "always switch unless you get 3^3." If T=100, we say "always switch unless you get 3^100". So what do we do if T is infinity? Always switch! 
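The degenerate case (values 1, 3, 9 only) is small enough to enumerate exactly. A sketch in Python (the joint distribution and the lambda-encoded strategies are my own framing): the pair is (1, 3) or (3, 9) with probability 1/2 each, and you hold either member of the pair with probability 1/2.

```python
from fractions import Fraction

# Joint distribution of (held value, other value) for the two-pair game.
joint = {(1, 3): Fraction(1, 4), (3, 1): Fraction(1, 4),
         (3, 9): Fraction(1, 4), (9, 3): Fraction(1, 4)}

def ev(switch):
    """Expected payoff when switch(held) decides whether to trade envelopes."""
    return sum(p * (other if switch(held) else held)
               for (held, other), p in joint.items())

print(ev(lambda v: False))   # never switch: 4
print(ev(lambda v: True))    # always switch: also 4 -- blind swapping is neutral
print(ev(lambda v: v < 9))   # switch unless you hold the maximum: 11/2
```

This matches the MIT page's prescription: with a known cap, keep only the top value; the whole edge over always-switching comes from holding on to the 9.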

Oct-15-14
  al wazir: <Sneaky: If T=100, we say "always switch unless you get 3^100". So what do we do if T is infinity? Always switch!> And this is paradoxical. But as you yourself have pointed out, when infinity is treated as if it were an actual, attainable number, paradoxes always result. I don't think we're disagreeing here. 

Oct-16-14
  Sneaky: No, we're not disagreeing; I just took some exception to the way you were expressing your discomfort with the problem. The mere existence of infinities in a problem doesn't render it meaningless, but it does mean that you have to be on your toes, lest you arrive at absurd conclusions. I'm sure we can agree with what David Chalmers said of the more orthodox envelope problem: this is "just another example of a familiar phenomenon, the strange behaviour of infinity." (By the way, I said T=3 in my previous post, but of course I meant T=2. T can be 0, 1, or 2.) 

Oct-16-14
  WannaBe: Anyone who disputes/disagree or find faults with a MIT paper must be superbrilliant guy. 

Oct-16-14
  al wazir: <WannaBe: Anyone who disputes/disagree or find faults with a MIT paper must be superbrilliant guy.> Or from Princeton. 

Oct-16-14
  Shams: <al wazir> Did you ever work with anyone at the IAS? 

Oct-16-14
  al wazir: <Shams>: I had to google to find what 'IAS' refers to, which should tell you something. Princeton University and the Institute for Advanced Study are separate organizations, though there are some people who have connections with both. I know Freeman Dyson from when he was working there, but in connection with something else, not from collaborating with him in research. Possibly someone I have worked with at some time has been employed there (academics move around a lot), but I can't think of anyone offhand. My work has been almost entirely in applied physics, which is by definition not the "advanced" kind. 

Oct-17-14
  Sneaky: You know Freeman Dyson? I am humbled. 

Oct-17-14
  al wazir: <Sneaky: You know Freeman Dyson? I am humbled.> It's no big deal. Why, Dyson's next-door neighbors, his grocer and bank teller and dentist and postman know him. Lots of people do. 

Oct-23-14
  al wazir: It's time for another stumper.
You've probably all seen those calendar cubes that are used to show the date. (Maybe you even have them where you work.) Two cubical blocks made of wood or plastic, with a single digit printed on each face, nestle in a little tray that holds them so that each block shows just one face. By turning the blocks every day to show different faces and sometimes interchanging the two cubes, it is possible to display every pair of numbers from 0 1 to 3 1 and hence every possible date. So the first question is, what are the digits printed on the two blocks? And how many different numbers can be displayed altogether? That one is really easy (hardly worth calling a stumper, especially if you've already examined the blocks, as I did this afternoon, in which case you already know the answer), but the next one is much harder. Suppose that each face displays a *binary* number, i.e., a number of the form 0, 1, 10, 11, 100, 101, etc. You read the number shown by the two cubes together by *concatenating* the numbers on the two faces that can be seen. For example, if the cubes display 11 (left) and 100 (right), you read that as 11100, the binary version of 28. Is that clear? So the question is, how many *consecutive* numbers can be displayed this way, starting with 1? The catch is, you have to make sure you don't skip any. You're allowed to choose the 12 binary numbers (one on each of the six faces on each cube) any way you like. This is really tough. I don't know the answer, but I can prove it's less than 288. 
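For anyone who wants to tinker, here is a sketch of a scoring function for the binary question, under one reading of the rules (both cubes always used, no flipping of faces; the thread goes on to debate exactly these points). The toy face lists are mine, not a serious attempt at the optimum.

```python
from itertools import product

def longest_run(cube_a, cube_b):
    """Faces are binary strings.  A display concatenates one visible face
    from each cube, in either left-right order.  Return the largest N such
    that every number 1..N can be displayed."""
    reachable = set()
    for a, b in product(cube_a, cube_b):
        reachable.add(int(a + b, 2))
        reachable.add(int(b + a, 2))
    n = 0
    while n + 1 in reachable:
        n += 1
    return n

cube_a = ["0", "1", "10", "11", "100", "101"]   # toy choice
cube_b = ["0", "1", "10", "11", "110", "111"]   # toy choice
print(longest_run(cube_a, cube_b))              # 15 for this pair
```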

Oct-23-14   beatgiant: <al wazir>
Are we allowed to flip the cubes to use the same face for two different numbers? 

Oct-23-14   beatgiant: <al wazir>
If not allowed to flip, seems to me it couldn't be more than 84. 

Oct-24-14   beatgiant: <al wazir>
I would get 288 as the upper bound by assuming we are allowed to flip, but we aren't allowed to use only one cube. Were those your assumptions too? 

Oct-24-14
  al wazir: <beatgiant: Are we allowed to flip the cubes to use the same face for two different numbers? I would get 288 as the upper bound by assuming we are allowed to flip, but we aren't allowed to use only one cube. Were those your assumptions too?> You're working on my second question, but you haven't answered the first one yet. 

Oct-24-14   Schwartz: Nice trick in the first one.
In the second case I reached 101 1010 or 170 with:
000, 001, 010, 011, 101, 111
0001, 0010, 011, 0101, 111, 1001 

Oct-24-14
  al wazir: <Schwartz: I reached 101 1010 or 170> I don't understand. The largest number I can make with those choices is 122. 

Oct-24-14   Schwartz: Oh.. oops! Embarrassing. Seems I need to relearn to count. My apologies, <al wazir>. Okay, so 90, not 170, is the number to beat. 

Oct-24-14
  Shams: <al wazir> My ears prick up whenever you say a problem is "really easy". Thanks for the Monday-level puzzle. 

Oct-24-14
  Sneaky: I had a devil of a time just figuring out the decimal calendar cubes problem. I've seen them in banks (etc.) but never played with one to see how it works. Here's what I finally came up with: cube 1 = 1, 2, 3, 4, 5, 6
cube 2 = 0, 1, 2, 7, 8, 9
It's worth mentioning that there will be situations where you can't actually use both cubes, for example the number 8 cannot be expressed as "08", you have to just show the cube with 8 on its face, and hide the other one until it's needed again. I thought of one other method which eliminates the problem of not using two blocks, but it has a kind of a cheat: cube 1 = 0, 1, 2, 3, 4, 5
cube 2 = 0, 1, 2, 7, 8, 9
The cheat is, there is no face that reads "6" but you can just turn the 9 upside down. 

Oct-24-14
  Sneaky: Just to make sure we're asking the same question on the binary-cubes problem: #1: <Are we allowed to flip the cubes to use the same face for two different numbers?> In other words, can I use the same face as both 1101 and 1011? #2: It's perfectly OK to only use one of the two blocks, right? So if I have a 001 on one block, I can use that for my number one, right? I don't need to also have a 0 on the other block to stick in front of it, or do I? 

Oct-24-14
  al wazir: <Sneaky: Here's what I finally came up with: cube 1 = 1, 2, 3, 4, 5, 6
cube 2 = 0, 1, 2, 7, 8, 9
It's worth mentioning that there will be situations where you can't actually use both cubes, for example the number 8 cannot be expressed as "08", you have to just show the cube with 8 on its face, and hide the other one until it's needed again.> By saying "every <pair> of numbers from 0 1 to 3 1," I hinted that you have to use both cubes (if you used only one in the bank, where would the other cube be, in a vault?), but maybe I should have made that condition explicit. <I thought of one other method which eliminates the problem of not using two blocks, but it has a kind of a cheat: cube 1 = 0, 1, 2, 3, 4, 5
cube 2 = 0, 1, 2, 7, 8, 9
The cheat is, there is no face that reads "6" but you can just turn the 9 upside down.> Bingo! 

Oct-24-14
  al wazir: For the record, in the decimal problem both cubes have to have 1 and 2 so that you can make 11 and 22. There has to be a 0 to allow for 10 and 20, and the 3 has to be on the other block to allow for 30. So now you have 0, 1, 2, _, _, _ and 1, 2, 3, _, _, _.
That leaves six spaces for the six remaining digits: for example, 4, 5, and 6 on the first one and 7, 8, and 9 on the other. But you're required to use both cubes, which means there's no way to make 04, 05, and 06. There has to be a 0 on the second cube too. Now you don't have enough places for all the digits. That's where the trick of inverting the 9 to make a 6 comes in. So the two cubes contain the numbers 0, 1, 2, 3, 4, 5 and 0, 1, 2, 6/9, 7, 8, respectively. This is what I found on the calendar cubes I saw yesterday in a doctor's office. But they could just as well have had 3, 5, and 7 on one and 4, 6, and 8 on the other, or other arrangements. And of course the order in which they're placed on the faces doesn't matter, in contrast to ordinary dice, where the numbers on opposite faces add up to 7 (1 and 6, 2 and 5, 3 and 4). On to the binary problem! 
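al wazir's arrangement is easy to verify mechanically. A short sketch (Python; the function is my own) that reports which of 01 through 31 cannot be displayed, treating a 6/9 face as either digit:

```python
def missing_dates(cube_a, cube_b):
    """Return the dates in 1..31 that cannot be shown with one face
    from each cube; a 6 or 9 face may be inverted to show the other."""
    def readings(face):
        return {6, 9} if face in (6, 9) else {face}
    shown = set()
    for a in cube_a:
        for b in cube_b:
            for x in readings(a):
                for y in readings(b):
                    shown.add(10 * x + y)  # cube_a face on the left
                    shown.add(10 * y + x)  # cube_b face on the left
    return [d for d in range(1, 32) if d not in shown]

# the doctor's-office arrangement, with the 9 doubling as a 6
print(missing_dates([0, 1, 2, 3, 4, 5], [0, 1, 2, 9, 7, 8]))  # []
```

An empty list means every date from 01 to 31 is covered, including 06, 16, and 26 via the inverted 9.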




