Post by Russell Letson on Jun 17, 2009 20:54:48 GMT -5
If this game is held in many parts of the world, the unchosen goat is the lucky one--first goat gets the epitaph "Mighty good eatin'."
Post by millring on Jun 17, 2009 21:16:46 GMT -5
Maybe it makes more sense using 10 doors. Actually, that doesn't help one bit. The only thing that swayed me at all was when Russell queried about the meaning of "odds". That is, I suddenly remembered what little I had of probability and statistics, and a flicker of recognition occurred, which is simply this: odds (probability) is SUPPOSED to be based on a HUGE (infinite) sampling wherein all the anomalies and clumpings can be evened out. The fact that the first choice made by the contestant is a 1 in 3 probability is based not just on the notion that it is one of three choices, but "probability-wise" on the idea that the closer we came to an infinite number of games played, the more likely it is that, with all the anomalies and clumps evened out, we'd see a 1 in 3 payoff realized.

By exactly that same token, and no other token, and no other explanation... the other two choices that were not the initial choice of the contestant will ALWAYS still be the other 2 out of the three choices. That means that in an infinite number of games played, the probability lies in those two doors always remaining the 2 out of 3 that is the exact complement to the contestant's initial 1 out of 3 choice. As the contestant no longer has to decide between the two doors that make up that 2 out of 3 (Monte does that for him), he gains the advantage of probability.

Still, I'm with Russell in his last summary, and I wonder just how many games would have to be played before there was a statistical difference detectable between switching and not switching.
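A minimal sketch, in Python (nothing anyone in the thread actually posted), of that "huge sampling" idea: track how often the contestant's first pick holds the car as the number of simulated games grows. The first pick settles toward 1 in 3, which leaves the complementary 2 in 3 sitting on the other two doors.

```python
import random

def first_pick_hit_rate(num_games: int) -> float:
    """Fraction of games in which the contestant's initial 1-in-3 pick holds the car."""
    wins = 0
    for _ in range(num_games):
        car = random.randrange(3)    # prize placed before anyone chooses
        pick = random.randrange(3)   # contestant's initial choice
        wins += (pick == car)
    return wins / num_games

for n in (10, 100, 1_000, 10_000, 100_000):
    rate = first_pick_hit_rate(n)
    print(f"{n:>6} games: first pick wins {rate:.3f}, other two doors hold the car {1 - rate:.3f}")
```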
Post by Supertramp78 on Jun 17, 2009 22:16:35 GMT -5
Not many. If I give you one game where you will win 33% of the time and another where you will win 66% of the time, how many times do you think you would have to play it before you figured out that you won one of them more often than the other? In reality you could play both about ten times and count up the results on your fingers. Assuming you have 10 fingers.
Let's put this a different way.
You will win twice as often if you switch.
I think you would be able to detect that pretty fast.
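A quick sketch of that count-it-on-your-fingers point, again in hypothetical Python rather than anything posted in the thread: play ten games with each strategy a few times over, and the switch column usually pulls ahead right away.

```python
import random

def play_once(switch: bool) -> bool:
    """One round of the game; True means the contestant drives home in the car."""
    car = random.randrange(3)
    pick = random.randrange(3)
    # Monty opens a door that is neither the contestant's pick nor the car.
    monty = next(d for d in range(3) if d not in (pick, car))
    if switch:
        pick = next(d for d in range(3) if d not in (pick, monty))
    return pick == car

for _ in range(5):
    stay = sum(play_once(False) for _ in range(10))
    swap = sum(play_once(True) for _ in range(10))
    print(f"out of 10 games: stay won {stay}, switch won {swap}")
```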
Post by omaha on Jun 17, 2009 22:21:50 GMT -5
Tramp's right, John.
Go to the simulation site I linked to earlier. Try each strategy. You'll see almost immediately how much difference there is.
Post by Fingerplucked on Jun 18, 2009 7:18:07 GMT -5
"Tell you what. Anyone who thinks the odds are different than what Jim and I are saying, I will gladly play this game for money. Here's the deal. It costs $100 per game to play. If you win, I'll pay out $250. If you believe this is a 50/50 game, then you should love those odds. The only stipulation is that you have to stick to the 'hold on to your initial choice' strategy. No switching."

I'm surprised no one took you up on this. If you play one game, you'll probably win $100. But there's still a 33% chance that you'll have to pay out $250. Those are your best odds. From there it all goes downhill. If you play 9 games, you should win 6 of them and win $600. However, the other guy will win three of them. Since you're paying $250, he'll win $750. The more games you play, the more you'll pay.
Post by billhammond on Jun 18, 2009 7:19:10 GMT -5
"Tell you what. Anyone who thinks the odds are different than what Jim and I are saying, I will gladly play this game for money. Here's the deal. It costs $100 per game to play. If you win, I'll pay out $250. If you believe this is a 50/50 game, then you should love those odds. The only stipulation is that you have to stick to the 'hold on to your initial choice' strategy. No switching."

"I'm surprised no one took you up on this. If you play one game, you'll probably win $100. But there's still a 33% chance that you'll have to pay out $250. Those are your best odds. From there it all goes downhill. If you play 9 games, you should win 6 of them and win $600. However, the other guy will win three of them. Since you're paying $250, he'll win $750. The more games you play, the more you'll pay."

I think I know how you can finance an OX, Jim.
Post by Fingerplucked on Jun 18, 2009 7:25:38 GMT -5
I know. But I'd feel like a traitor.
Still, if we play enough games, I could get an OX, a Cargo, and a GX.
Post by billhammond on Jun 18, 2009 7:30:15 GMT -5
"I know. But I'd feel like a traitor. Still, if we play enough games, I could get an OX, a Cargo, and a GX."

Drive to Madison, Wis. -- there is a CA dealer there.
Post by omaha on Jun 18, 2009 7:42:50 GMT -5
"Tell you what. Anyone who thinks the odds are different than what Jim and I are saying, I will gladly play this game for money. Here's the deal. It costs $100 per game to play. If you win, I'll pay out $250. If you believe this is a 50/50 game, then you should love those odds. The only stipulation is that you have to stick to the 'hold on to your initial choice' strategy. No switching."

"I'm surprised no one took you up on this. If you play one game, you'll probably win $100. But there's still a 33% chance that you'll have to pay out $250. Those are your best odds. From there it all goes downhill. If you play 9 games, you should win 6 of them and win $600. However, the other guy will win three of them. Since you're paying $250, he'll win $750. The more games you play, the more you'll pay."

Think you're going to beat the house? Any time, any place, any stakes.
Post by millring on Jun 18, 2009 8:13:53 GMT -5
"Go to the simulation site I linked to earlier. Try each strategy. You'll see almost immediately how much difference there is."

Of course you understand that that simulation site is a plant by the cult of Montehall? It's rigged. You get the results that the Montehallics want you to get to prove their religion. The car moves from door to door as needed and the goats are under-fed and angry.
Post by omaha on Jun 18, 2009 8:18:46 GMT -5
It's hard to overestimate the extent of the conspiracy.
It wasn't enough to rig a few computer simulations. That was child's play. The real challenge was altering the very nature of reality, logic, and mathematics so that when simulations are run "off-line" (like Russell's proposal of doing it with Scrabble pieces), the illusion still holds.
Post by Greg B on Jun 18, 2009 8:20:04 GMT -5
Will a large number of iterations finally reveal a 2/3 success rate when the always-switch strategy is used? THAT is the question!

And here is the answer. I ran my program to try each strategy 100000 times.

First: We don't switch.
Results for shouldSwitch = false
Total Test = 100000
Wins = 33204
Losses = 66796

Next, we do switch.
Results for shouldSwitch = true
Total Test = 100000
Wins = 66765
Losses = 33235

Lastly, after Monty removes a door we toss a coin and take a 50%-50% chance at a door.
Fully random choice
Total Test = 100000
Wins = 49898
Losses = 50102

If you don't switch you lose 2 out of 3 times. If you do switch then you win 2 out of 3 times. The coin toss, a totally random selection between the two remaining doors, wins about half the time.
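Greg B's source isn't posted, so the following is only a guess at what a program like his might look like, written in Python; the shouldSwitch label and the 100000-game count are modeled on his output, but none of it is his actual code.

```python
import random

def play(should_switch: bool, coin_toss: bool = False) -> bool:
    """One game: returns True when the contestant ends up with the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # Monty opens a goat door the contestant did not pick.
    opened = random.choice([d for d in doors if d not in (pick, car)])
    other = next(d for d in doors if d not in (pick, opened))
    if coin_toss:                 # forget the history, flip between the two closed doors
        pick = random.choice([pick, other])
    elif should_switch:
        pick = other
    return pick == car

def report(label: str, n: int = 100_000, **kwargs) -> None:
    wins = sum(play(**kwargs) for _ in range(n))
    print(f"{label}: Total Test = {n}, Wins = {wins}, Losses = {n - wins}")

report("shouldSwitch = false", should_switch=False)
report("shouldSwitch = true", should_switch=True)
report("coin toss after Monty opens a door", should_switch=False, coin_toss=True)
```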
Post by millring on Jun 18, 2009 8:24:36 GMT -5
"It's hard to overestimate the extent of the conspiracy. It wasn't enough to rig a few computer simulations. That was child's play. The real challenge was altering the very nature of reality, logic, and mathematics so that when simulations are run 'off-line' (like Russell's proposal of doing it with Scrabble pieces), the illusion still holds."

I just hate conspiracies that have the power to alter reality. They are the most believable and insidious conspiracies EVER.
Post by millring on Jun 18, 2009 8:25:29 GMT -5
"And here is the answer. I ran my program to try each strategy 100000 times."

When did you find time to sleep?!
Post by aquaduct on Jun 18, 2009 8:49:01 GMT -5
"Tell you what. Anyone who thinks the odds are different than what Jim and I are saying, I will gladly play this game for money. Here's the deal. It costs $100 per game to play. If you win, I'll pay out $250. If you believe this is a 50/50 game, then you should love those odds. The only stipulation is that you have to stick to the 'hold on to your initial choice' strategy. No switching."

"I'm surprised no one took you up on this. If you play one game, you'll probably win $100. But there's still a 33% chance that you'll have to pay out $250. Those are your best odds. From there it all goes downhill. If you play 9 games, you should win 6 of them and win $600. However, the other guy will win three of them. Since you're paying $250, he'll win $750. The more games you play, the more you'll pay."

No, I'm still working through it (damn you for pointing out that he'll lose either way). I think Tramp's big chart a couple pages ago pegs the error. It essentially cherry-picks 3 of the 9 independent possible outcomes of the game (3 doors means 3 possible contestant picks times 3 possible prize locations = 9 independent outcomes). If the chart were expanded to include the other 6 outcomes, it would be easy to cross out the six that are eliminated with the contestant's first pick and then the seventh that is eliminated with the opening of the door. I'm pretty sure that the problem statement as diagrammed is then the systemic reason for the overestimation of probability. Note that it lays out 3 outcomes that cannot exist together once the contestant choice is made. That has the effect of implying that Monty can move the car after the game starts.

The best way to prove this would have been to make an analysis of the couple of decades' worth of shows that Monty tried this with, but I guess that data doesn't exist. The next best option would be some kind of Monte Carlo simulation that literally plays the game 50 or 60 thousand times to see if the outcomes trend to 50% winning. I'm suspicious of computer algorithms because the same bias can be inherent in the problem statement. I suspect the algorithm does something to inadvertently cull possible outcomes, leading to the skewed game. That's my next quest (I note that Jeff's link has some information on the program, maybe even the code). Once I've established that the game is accurate, I'll take Jeff's wager. Not for money, since that's just too messy. I'll settle for public humiliation on an internet forum.

Why? With the recent financial meltdown and the climate change debate, I've become fascinated with the ways statistics and computer modelling are used and abused, more often than not by supposed "experts" (academics, economists, and commentators). Just gotta run this one down.

Note to FP: The statistics work the same way with 10 doors, only you'll have 100 (10 choices x 10 prize locations) possible outcomes, 10 of them winners. The door choice knocks it down to 10 possible outcomes. Monty's showing you one door takes out one possible outcome, leaving you with 2 chances in 9 of winning. However, both of those chances have an equal shot at winning. So switch or don't switch. That move doesn't increase (or decrease) your odds.
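The culling aquaduct is worried about can also be checked without any random-number generator at all. The sketch below (mine, in Python; not the code behind the simulation site linked earlier) simply walks all nine equally likely (car location, first pick) combinations, with the car fixed before the pick and never moved. Six of the nine start on a donkey, and in every one of those six the switch lands on the car.

```python
from itertools import product

stay_wins = 0
switch_wins = 0
# All nine equally likely (car location, first pick) combinations.
# The car is placed before the contestant picks and never moves afterwards.
for car, pick in product(range(3), repeat=2):
    # Monty opens a door that is neither the pick nor the car.
    legal_goat_doors = [d for d in range(3) if d not in (pick, car)]
    opened = legal_goat_doors[0]   # when two goat doors are available, the tally is the same either way
    switched_to = next(d for d in range(3) if d not in (pick, opened))
    stay_wins += (pick == car)
    switch_wins += (switched_to == car)

print(f"stay wins {stay_wins} of 9, switch wins {switch_wins} of 9")   # 3 of 9 vs 6 of 9
```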
Post by millring on Jun 18, 2009 9:10:18 GMT -5
"Note that it lays out 3 outcomes that cannot exist together once the contestant choice is made. That has the effect of implying that Monty can move the car after the game starts."
That's one of the things that bugged me about the charts. And I still say that some of the explanations for why it pays to switch are dead wrong. And part of me still wonders if the entire question is other than how it's been framed so far. And I concur that if the program written to project results is written from a pov that accepts a false premise to begin with, then it is just as flawed as the reasoning -- like a power tool, it can just make mistakes faster than we can by hand.
But to me, the clincher for the "it pays to switch" thesis is no more complicated than this: The contestant's first choice was one in three.
Unless there is some other way of looking at the problem wherein that simple statement is disproved, then I think there's something to the switching thing. I'd be happy to be proven wrong because part of me still reads what I wrote yesterday and "gets" that side of the debate too.
And that, as an aside, is another thing I think is fascinating. The folks who have the right answer (switching is better) can't frame the other side of the debate.
Post by Supertramp78 on Jun 18, 2009 9:33:23 GMT -5
NOBODY is saying the player's first choice ISN'T one in three. It is. And the odds of picking a Donkey is 2 in 3 (agreed?) regardless of how you arrange the doors or where you put what. The odds of picking a car initially is 1 in 3 (agreed?) That's why expanding it out to all the possible options doesn't make any difference. But, once he picks, THE OTHER DONKEY is removed from the options and he can pick again.
Pick again or switch. Two doors. You have picked one. Behind the two doors is EITHER a car or a donkey. The second donkey is no more. So whatever you START with will NOT be what you end up with if you switch. Remember that.
Now the situation is keep your pick or go to the alternative. You can't go from donkey to donkey because the other donkey is gone. If you picked a donkey and switch, you get a car. If you picked a car and switch, you get a donkey. Again, doesn't matter how many possible options and arrangements there are for starting points. Since the odds of picking a donkey initially is 2 in 3, the odds of getting a car after a switch is 2 in 3. The odds of getting a car if you don't switch remain the odds you had at the beginning for picking a car - 1 in 3.
Look it up. Do the math. Consult experts. Run the simulations. Whatever.
I'm still considering Omaha's payouts to see if I wouldn't mind getting in on that action. If his payout for him losing is more than twice what I pay to lose then I'm all over it. If he keeps the $100 ante regardless of who wins, no way. The assumption is I pay $100 and never switch my pick. For every car I get he pays me $250 BUT he keeps my original $100 entry fee so I only net $150 which is less than the two times it would have to be for me to break even in this game.
For every donkey I get he keeps my $100. I will win this game 33% of the time if I never switch. Over 99 games I'll win 33 of them and he will pay me $8,250 BUT he will have kept my original $100 entry fee for each game which would be $9,900. He comes out ahead to the tune of $1,650.
Post by omaha on Jun 18, 2009 9:53:04 GMT -5
"I'm still considering Omaha's payouts to see if I wouldn't mind getting in on that action."

It's this simple: Every game played, the house (that would be me) receives $100. Every third game, the player wins $250. But over the three games it takes (on average) for the player to win, the house collects $300. $300 > $250. For every cycle of three games, the house is $50 ahead. If I keep them playing, I can't lose.

******************************

BTW, here's another way to look at it: Simplify the game. Just say "There are three doors. Two have goats, one has a prize. The game is you pick one and get to keep whatever is behind the door." That's indistinguishable from the "hold" strategy, and it should be obvious that in such a game the player has a 1/3 chance of winning. Which means as long as the house pays less than 3x the bet for a win, the odds are in the house's favor.
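Omaha's per-cycle arithmetic, written out as a few lines of Python purely as an illustration of the terms as he states them (the house keeps the $100 every game, pays $250 when the hold-your-first-pick player wins, and that player wins 1 game in 3):

```python
from fractions import Fraction

entry = 100              # collected by the house every game
payout = 250             # paid out when the hold-your-first-pick player wins
p_win = Fraction(1, 3)   # chance the player's first pick is the car

house_per_game = entry - p_win * payout
print(f"house edge per game: ${float(house_per_game):.2f}")        # $16.67
print(f"per three-game cycle: ${float(3 * house_per_game):.2f}")   # $50.00 ($300 in, $250 out)
```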
Post by aquaduct on Jun 18, 2009 10:00:57 GMT -5
"Since the odds of picking a donkey initially is 2 in 3, the odds of getting a car after a switch is 2 in 3."

That is simply not true. Once one of the donkeys is eliminated, you've got a 50% shot at getting a car no matter which strategy you choose.

I have. And I am. Your turn to try.
Post by Supertramp78 on Jun 18, 2009 10:17:28 GMT -5
"That is simply not true. Once one of the donkey's is eliminated, you've got a 50% shot at getting a car no matter which strategy you choose."
Sigh......
Do we agree with the statement that the odds of picking a donkey with your first pick is 2 out of 3? I mean that is pretty damn easy to understand, right? There are TWO donkeys and ONE Car. 2/3. Easy.
I'm now going to give you a choice. Stay where you are or flip whatever you have to what it isn't. If you flip, you get a car if you started with a donkey. You get a donkey if you started with a car. Since the odds of STARTING with a donkey is 2/3, the odds of ENDING with a car if you switch is the same - 2/3.
Watch this.
or this....
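Tramp's flip argument, reduced to a few lines of Python for illustration (again, not something posted in the thread): whatever the probability is that the first pick is a donkey, that is by definition the probability that switching ends on the car.

```python
from fractions import Fraction

p_first_pick_is_donkey = Fraction(2, 3)   # two donkeys, one car
p_win_if_switch = p_first_pick_is_donkey  # switching turns a donkey into the car, and vice versa
p_win_if_stay = 1 - p_first_pick_is_donkey
print(p_win_if_switch, p_win_if_stay)     # 2/3 1/3
```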