ariels noded the two envelope paradox. This is a spoiler if you haven't read that or thought about it for yourself.

The assertion "proved" in the paradox is of course wrong. It's easy to come up with an alternative line of reasoning, showing you profit nothing by switching envelopes. Malik has done this below, if you can't come up with the "proof" yourself. But where does the reasoning of the two envelope paradox break down?

It's actually more of a trick than a paradox. The "proof" depends on the (implied) assertion that any sum is equally likely to be in the envelope. If you knew the distribution according to which the sum in both envelopes was chosen and you saw $x in your envelope, you could calculate the a posteriori probability of the other envelope holding $2x or $0.5x (just by comparing the probabilities of the total sum being $3x or $1.5x). Half of the time you'd have to guess you probably got the smaller sum, and half of the time the larger.

Now, it seems at first sight to be a fair solution to say "but I don't know the distribution by which they chose the sum, so we'll just say those probabilities are the same". But it turns out there just isn't any probability distribution like that! If you want "all x to have equal probability", call that probability p. Either p>0, in which case the integral (sum) of the distribution is infinite when it should be 1, or p=0, in which case the integral is 0, which is again wrong.
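
For what it's worth, here's a small Python sketch of that a posteriori calculation under one made-up prior (whole-dollar amounts, the smaller sum uniform on $1..$100). Everything about the prior is my own assumption, purely to show that once you fix a real distribution the two cases stop being automatically 50/50:

    # Hypothetical prior: the smaller amount N is a whole number of dollars,
    # uniform on $1..$100.  The pair of envelopes is then (N, 2N).
    PRIOR_MAX = 100

    def prob_holding_smaller(x):
        """P(your envelope is the smaller one | you opened it and saw $x)."""
        # You see x either because N = x and you drew the small envelope, or
        # because N = x/2 and you drew the large one.  Each draw has chance 1/2,
        # so those factors cancel and only the prior weights matter.
        weight_as_smaller = 1.0 / PRIOR_MAX if 1 <= x <= PRIOR_MAX else 0.0
        weight_as_larger = 1.0 / PRIOR_MAX if x % 2 == 0 and 1 <= x // 2 <= PRIOR_MAX else 0.0
        if weight_as_smaller + weight_as_larger == 0:
            return None                   # impossible amount under this prior
        return weight_as_smaller / (weight_as_smaller + weight_as_larger)

    for x in (3, 40, 150):
        print(x, prob_holding_smaller(x))
    # 3   -> 1.0  (an odd amount can only be the smaller envelope)
    # 40  -> 0.5  (both explanations carry equal weight under this flat prior)
    # 150 -> 0.0  (over the prior's range, so it must be the doubled sum)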


The new PHB comes into the office one morning. The sales force is despondent. The tech force is out of ideas. But the powers on high tell him that everything will be just fine. They show him his new office. He opens the main drawer, and there are two envelopes. One is labeled, "Open after 3 months." The other is labeled, "Open after 9 months." So, out of respect for whoever left the envelopes, he obeys.

After 3 months, things are worse than ever. The heat is coming down hard and heavy from the suits upstairs. He opens the first envelope. It reads: "Blame your predecessor." So he proceeds to do just that. "It was all his fault," is all he repeats day after day. This works for a while. But things continue to deteriorate.

After 9 months, he's at his wit's end. He shakily reaches for the second envelope and opens it slowly. It reads: "Prepare two envelopes...."

Since I wasn't satisfied with the previous response, I thought I'd give it a try, although I'm not the best at this sort of thing.

Read the two envelope paradox if you haven't already, since I'm going to refer to it a lot without repeating it here.

Okay, there are two odd conclusions here... The first is that you will always want to switch envelopes sight unseen, no matter which one you start with. That's a pretty funny conclusion, so let's have a look at the reasoning. I think the fallacy here is substituting M for "N or 2N", so let's try it with just N's.

We have four cases:

  • You get N and you stay: You have N.
  • You get 2N and you stay: You have 2N.
  • You get N and you switch: You have 2N.
  • You get 2N and you switch: You have N.

As you can see, the results are exactly the same whether you switch or not. If you always stay you get N or 2N with probability .5 each; if you always switch you likewise get 2N or N with probability .5 each. The expected value is therefore 1.5N regardless of switching.
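
If you want to see that come out of a quick simulation, here's a sketch; the amount N, the seed, and the round count are arbitrary choices of mine:

    import random

    rng = random.Random(1)     # arbitrary seed, for repeatability
    N = 100                    # arbitrary amount; the envelopes hold N and 2N
    rounds = 100_000

    stay_total = switch_total = 0
    for _ in range(rounds):
        envelopes = [N, 2 * N]
        rng.shuffle(envelopes)
        first, other = envelopes
        stay_total += first     # keep the envelope you were handed
        switch_total += other   # trade it for the other one

    print(stay_total / rounds, switch_total / rounds)   # both come out near 1.5 * N = 150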

So why is this different from the $100 scenario? In the second scenario we don't always start out with $N. In order to make the initial value $100 for all cases, we would have to change the value of N for two of the cases, and that changes the amounts at stake. That's what happened with the unknown "M" in the above example: "M" was N or 2N, depending on whether the first envelope was the "low" or the "high" envelope. So this (perhaps unintentionally) tricks us into thinking the second problem is the same as the first. Effectively, we're lowering the stakes whenever the "high" envelope is chosen, so the apparent gain from switching is inflated even though the probabilities themselves are unchanged.
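
To see the difference in miniature, here's a toy comparison (the specific dollar figures, and the equal weighting of the two possible pairs in the second game, are my own choices for illustration):

    import random

    rng = random.Random(2)     # arbitrary seed
    rounds = 100_000

    # Game 1: the pair is fixed at ($100, $200); open one at random, always switch.
    game1 = 0
    for _ in range(rounds):
        pair = [100, 200]
        rng.shuffle(pair)
        first, other = pair
        game1 += other
    print(game1 / rounds)      # ~150, exactly what always staying would give

    # Game 2: you always *see* $100 ("M"), so the pair must have been
    # ($50, $100) or ($100, $200), here with equal probability; M is the
    # high envelope of one pair and the low envelope of the other.
    game2 = 0
    for _ in range(rounds):
        other = rng.choice([50, 200])
        game2 += other
    print(game2 / rounds)      # ~125 -- switching "wins", but only because the
                               # stakes were lowered whenever M was the high envelope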

As for the second part... at first glance the reasoning seems sound despite the bizarre conclusion, but in actuality there's a difference between the first switch and any subsequent switches. Obviously switching twice is the same as not switching at all, so the expected gain will be 0 when making the second switch.

Two cases for switching once:

  • You had N and switched. You have 2N.
  • You had 2N and switched. You have N.
Both cases have probability .5, so the expected result is 1.5N.

Two cases for switching twice:

  • You had 2N and switched twice. You have 2N.
  • You had N and switched twice. You have N.
So no matter what you do you still have what you started with; expected result is 1.5N.

The fallacy here is a continuation of the previous one: substituting M for a mixture of cases with their own probabilities and then comparing the result to the initial problem.

Too bad, because it'd be a good way to pick up some extra cash if it worked.

I think part of the reason people are confused by this is that they didn't read the problem closely:

I give you the first envelope (with $100); you may either keep it or switch.

You know you have $100, which means you can think objectively about the game. I think of it as a simple gambling game - a coin toss. The rules (taken from the original problem) are simple, and there's a rough code sketch of them right after the list:

1: You must always bet half, and only half, of your money.
2: Heads, you win at 2:1 (i.e. a $50 bet wins you an extra $100).
3: Tails, you lose your stake.
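
Here is that sketch of the three rules (the seed and the number of rounds are my own picks):

    import random

    def play_round(bankroll, rng):
        # Rule 1: stake exactly half of what you hold.
        stake = bankroll / 2
        if rng.random() < 0.5:           # Rule 2: heads pays 2:1 on the stake
            return bankroll + 2 * stake  # i.e. the bankroll doubles
        return bankroll - stake          # Rule 3: tails, the stake is gone (bankroll halves)

    rng = random.Random(0)               # arbitrary seed, for repeatability
    bankroll = 100.0
    for i in range(20):                  # 20 rounds, an arbitrary choice
        bankroll = play_round(bankroll, rng)
        print(f"after round {i + 1:2d}: ${bankroll:,.2f}")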

Looking at it like this, we can see two things:
1: You will never lose all your money - you could end up with a minute fraction of a cent, but you can still play with half of it.
2: The game favours the player - the odds are 50/50, yet the game is paying 2:1. 50/50 odds normally (in a casino) pay 1:1 (except nothing in a casino is actually 50/50).

If you were lucky enough to be playing this game, the strategy is simple:
1: If you have less than $100 you should play. The $100 was given to you, so you have nothing to lose.
2: Only ever quit when you are ahead.

Rule 2 is the catch - since the odds are 50/50, playing a huge number of rounds should leave you right around your original $100 (roughly as many wins as losses, and a win and a loss cancel out: doubling and then halving puts you back where you started). You have to decide when you feel you have enough money.

The game is a free ride. You only need to win four times in a row (from $100) to leave the game with $1,600.

I looked around the internet for solutions to this problem and everything I found written about it was muddled and didn't really resolve the implicit conflict. I worked through the problem myself and found something that I haven't seen anyone else here mention, so I'll give it a shot. (Feel free to yell if you disagree with my analysis; I'm far from a perfect being.)

Since the two avenues of logic involved are contradictory (that switching cannot matter, and that switching must be in your best interest), clearly there must be an implicit assumption somewhere that is wrong. You KNOW switching can't matter; that is utterly, utterly intuitive and indisputable. However, you also KNOW that given a 50/50 shot of doubling or halving, it is beyond doubt that you should change. Both of those statements seem positively true, and it turns out that they both are. So what gives? It turns out that the key is this:

Your odds are NOT 50/50 of getting a larger or smaller amount if you switch!

Turns out that, magically, it is actually MORE LIKELY that you have chosen the envelope that contains more money. This is, of course, given a couple of reasonable assumptions about the problem.

Assumption 1: The person stuffing the envelopes has a finite (but unknown) amount of money at his disposal.

If you knew the amount he had, it would be easy. If he has only $150 to spare, and you open $100, you wouldn't switch, because you would always lose. If he has $1000, and you open $100, then you would switch, because the switching argument will hold true. It turns out that if you know the limit L, you should always switch if you've opened less than L/2, and stay if you've opened more.
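
For what it's worth, here's a rough simulation of that known-limit rule, assuming the stuffer draws the smaller amount uniformly below L/2 (essentially Assumption 2 below); L, the seed, and the round count are my own picks:

    import random

    rng = random.Random(3)               # arbitrary seed
    L = 1000                             # the most that could ever be in an envelope
    rounds = 200_000

    stay = informed = 0.0
    for _ in range(rounds):
        smaller = rng.uniform(1, L / 2)  # assumed uniform, as in Assumption 2 below
        pair = [smaller, 2 * smaller]
        rng.shuffle(pair)
        first, other = pair
        stay += first                                    # never switch
        informed += other if first < L / 2 else first    # switch only when under L/2
    print(stay / rounds, informed / rounds)
    # Never switching averages roughly $375; knowing L and switching only
    # below L/2 averages roughly $470.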

Also, the problem really doesn't make sense if you're working with an infinite amount of money, because for one, all sorts of weird contradictions happen with infinity, and besides, there really is a finite amount of money in the world.

Assumption 2: The person stuffing the envelopes is equally likely to put any amount into the envelopes, up to the limit of the money he has.

Meaning, if the guy has $300 to work with, he's just as likely to fill them with $100 and $200 as $1 and $2 or anywhere in between.

So here's the problem with 50/50. Let's say, for example, that I am stuffing the envelopes and I only have $1500 to blow. (The real limit doesn't matter; it only matters that there is some limit, as you'll see.) This means the most you'll ever see in the two envelopes is $500 and $1000.

Say I randomly choose an amount between $1 and $500 to put in Envelope A, and then double that amount and stick it in Envelope B. It turns out that you are twice as likely to open an envelope with $500 or less as you are to open one with between $500 and $1000. The reason is that only B will ever contain one of those higher amounts (since if it were A and you saw $600, then B would have to hold $1200, which is beyond my means).

However, either A or B could contain any amount below 500. If you open and see $400, it could either be A you've opened, in which case the other has $800, or it could be B, in which case the other has $200. Since in effect more envelopes (twice as many) will contain amounts in this bottom half, it follows that given a random envelope, two-thirds of the time it will be <$500.

Now that you know this, the math checks out. You pick an envelope. Call x the amount in the envelope, and L the (unknown) largest amount you could ever find in an envelope. If you just keep the envelope, you have made x, plain and simple. What if you always switch?

Two-thirds of the time, x < L/2, and the other envelope is equally likely to hold 2x or x/2, so switching gains on average. One-third of the time, x > L/2, the other envelope must hold x/2, and switching loses.

The expected value of the other envelope: (2/3)((2x + x/2)/2) + (1/3)(x/2) = 5x/6 + x/6 = x

...and order is restored.
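
And a rough numerical check of that bottom line, using the $1500 budget from the example above (the seed and round count are my own picks):

    import random

    rng = random.Random(4)               # arbitrary seed
    rounds = 200_000

    stay = always_switch = 0.0
    for _ in range(rounds):
        smaller = rng.uniform(1, 500)    # Assumption 2: smaller amount uniform up to $500
        pair = [smaller, 2 * smaller]    # Assumption 1: never more than $1000 in an envelope
        rng.shuffle(pair)
        first, other = pair
        stay += first
        always_switch += other
    print(stay / rounds, always_switch / rounds)
    # Both averages come out around $376: blindly switching gains nothing.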
