Innumeracy, Gambling, and Absorption Walls
Swift
Written by Dr. Steven Novella   
My second course for The Great Courses, Your Deceptive Mind, has recently been released (shameless plug). It's a skeptical romp through the workings of the human brain, and a good primer on skeptical thinking.

In the course I cover the topic of innumeracy - the fact that humans have a poor innate grasp of math and statistics. I expected to get a disproportionate amount of feedback from these two lectures, based on my experience with the SGU. The topic that has led to the greatest number of e-mails over the years has been the Monty Hall problem.

OK, quickly, here it is again for those who don't know. You are on the show, Let's Make a Deal, and you are offered three doors. Behind two doors there is a goat, behind the third is a new car. Monty Hall, the host, knows what is behind each door. After you pick a door Monty Hall opens one of the other doors to reveal a goat (which he always does), and then asks if you would like to switch your pick to the other remaining door. What should you do? The answer is that you should switch, because it will increase your chances of winning from 1/3 to 2/3. Some people have a hard time grasping the statistics involved - and that's the point.  
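If the logic feels slippery, a quick Monte Carlo sketch makes the 1/3 versus 2/3 split concrete (the door labels, seed, and trial count here are arbitrary):

```python
import random

def monty_trial(switch, rng):
    doors = [0, 1, 2]
    car = rng.choice(doors)      # car placed uniformly at random
    pick = rng.choice(doors)     # contestant's initial pick
    # Monty opens a door that is neither the pick nor the car
    opened = rng.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

rng = random.Random(42)
n = 100_000
stay = sum(monty_trial(False, rng) for _ in range(n)) / n
switch = sum(monty_trial(True, rng) for _ in range(n)) / n
print(f"stay wins:   {stay:.3f}")   # close to 1/3
print(f"switch wins: {switch:.3f}") # close to 2/3
```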

I discuss the Monty Hall problem in the new course and warned the publisher that this would likely result in a great deal of feedback. However, my first e-mail regarding innumeracy concerns a separate issue. In the course I claim that if the odds in a casino were exactly even then the casino would still make money over the long haul. The reason is this: if the odds are truly even and fair, and we assume that magic, lucky charms, prayer, and psychic powers do not work (just for the sake of argument), then individual winning and losing will follow a drunkard's walk of randomness. This will average out over any stretch of time, with the error bars narrowing toward zero over longer stretches of gambling.

However, there is an asymmetry to this process - the point at which people stop gambling. In order for winning and losing to average out across all gamblers, there must be no asymmetry in when winners and losers stop gambling. If gamblers stop at a random point, or after a set point (by time, number of bets, or number of free drinks - whatever) - that is, at a point that has nothing to do with whether they are currently winning or losing - then the casino will break even over time.

However, the above scenario is not the case. There is an asymmetry. Some gamblers go bust. They lose all their money (or all the money their spouse lets them bet) and then are forced to stop gambling. The house, however, never goes bust, because its pockets are just too deep and there are limits on how much individuals can bet. You can describe this as most gamblers having an absorption wall on the losing side of their drunkard's walk of winning and losing.

I did not think that this claim would turn out to be controversial, but I recently received the following e-mail:  

"In Lecture 11 of the Critical Thinking Series, around 5 minutes in, you discuss the Absorption Wall, and state that the casinos would still rake in lots of money because of it." 

"An Absorption Wall is there alright, but it has absolutely no effect on the casino winnings. If the Expected Value of a fair coin toss is $0.00, it will still be 0 with a player that has been losing or a player that has been winning, as you've said yourself many times before explaining other fallacies. But then you go on to confound regression to the mean, thinking that if a winning player keeps playing, he will go back to his average of zero dollars won in the long run. But the regression to the mean will make sure that his average winning per bet, in all his sample size, will keep decreasing and reach 0 at infinity, but not his absolute winnings. In fact, if he won $5000 so far, he has exactly 50% chance of having more than $5000 after 100 more bets; what will probably go down is his average winning per bet, not his absolute winnings."

"I made a program to simulate that just to make sure I was not crazy or missing something, by the way, which confirmed I was not. After several minutes simulating, the casino busted more than 9k gamblers, but is still losing 200k for an average of -$0.00006 per 10 dollar bet... Some lucky gamblers have a lot of money, and while their average winnings per bet are appropriately approaching 0 as they keep playing, the money they won so far does not."  

First, I do agree with many of his premises. It is true that once someone has won $5000, if you start tracking statistics at that point then $5000 becomes his new baseline and he is 50% likely to have more or less than $5000 at some arbitrary point in his gambling future. This reflects the fact that each bet is an independent event, and does not "remember" the result of previous bets. I also agree that regression to the mean affects average winnings, and not total winnings - if we reset our baseline.  

I admit there is more than one way to look at this problem, with different approaches giving different intuitive answers. The e-mailer is looking at the problem this way – the absorption wall removes gamblers from the pool of all gamblers (because they go broke). However, from the point at which they are removed they are just as likely to win and regain their losses as they are to lose even more money, therefore their removal has no effect on the casino's net profits.

You can look at the problem another way, however. Because of the absorption wall each individual gambler is more likely to walk away a loser than a winner. Think about it this way – if you had $10 to gamble in Vegas and were vacationing there for a week, what are the odds that you would go home a winner? It seems obvious that it’s pretty small, as the chances that you would go bust are pretty high with such a small starting amount.

The same is true, however, of larger amounts of money – hundreds or even thousands of dollars, even though the chance of going bust is smaller than if you start with $10.

The absorption wall of going bust, therefore, makes it more likely that you will go home a loser than a winner, and since you are playing against the house it makes sense that the house has a greater chance of being a winner than a loser.  
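This intuition can be tested with a rough simulation (the $10 bankroll, $1 fair bets, 1,000-bet trip length, and seed are all illustrative assumptions):

```python
import random

def trip(bankroll, max_bets, rng):
    """Make $1 fair bets until going broke or the trip ends."""
    money = bankroll
    for _ in range(max_bets):
        if money == 0:
            break  # the absorption wall: busted gamblers must stop
        money += 1 if rng.random() < 0.5 else -1
    return money

rng = random.Random(1)
start, bets, n = 10, 1000, 20_000
finals = [trip(start, bets, rng) for _ in range(n)]
busted = sum(f == 0 for f in finals) / n
ahead = sum(f > start for f in finals) / n
avg = sum(finals) / n
print(f"busted: {busted:.2f}  went home ahead: {ahead:.2f}  average final bankroll: {avg:.2f}")
```

Most simulated gamblers do go home losers. Note, though, that the average final bankroll stays right around the starting stake - the few big winners balance the many busts.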

This effect could be offset if winners win more than losers lose, to compensate for the greater number of losers. This may be true. The absorption wall has two effects - it increases the chance of going home a loser, but it also sets a limit on how much an individual can lose. There is no practical limit on how much a winner can win, however, since the casino has too much money to go bust.  

This latter view is based on a classic statistical problem known as the gambler’s ruin. In this problem you have two gamblers each with different amounts of starting money. They will compete against each other in a game of chance until one goes broke. The problem is to devise an equation that will give the probability of either gambler winning.  

The statistics here clearly show that the gambler with the larger amount of starting money is more likely to win. In statistical discussions of the gambler's ruin the implication for casinos is often raised. For example:

"And that is why you are statistically doomed to lose in Las Vegas. Even if the odds are not against you, unless you can match the resources of the house you will eventually end up on the short end."

But this is not an exact analogy to the question at hand – because in this statistical model the gambling continues until one person loses everything. If that were the case, then you would have no chance against the house. But that is not how Vegas works for most people. Most gamblers are visiting for a finite period of time, so they will stop gambling when they either run out of money or go home (again, assuming they do not self-impose a stopping point).  

The e-mailer also mentions a computer simulation, which could be a nice way to resolve the dilemma. However, the details of the simulation are important. Are gamblers who go broke replaced from an infinite pool of new gamblers, or is there a finite pool of gamblers with multiple stopping points - going home or going bust? Then again, perhaps it doesn't matter - but that is the very thing we would be testing.

Here is a related thought experiment. The gambler's ruin result implies that a gambler with a finite bankroll, playing indefinitely against an opponent with effectively unlimited funds, will eventually go broke. So let's say we have a hypothetical casino in which gamblers enter with a finite stake and always play until they go broke. When they do, they are eliminated and replaced by another gambler with a finite stake. Therefore all gamblers enter the casino with a finite amount of money and leave with no money.

It seems that in such a scenario, even with 50/50 odds, the casino must be making money. But I suppose it's possible the influx of money is not going to the casino but is being held by temporary winners.
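This hypothetical casino is easy to sketch in code (the $1 stakes, 100 seats, 200,000 bets, and seed are all arbitrary assumptions). The thing to watch is where the money sits at the end:

```python
import random

rng = random.Random(7)
n_seats = 100
gamblers = [1] * n_seats   # every gambler enters with a $1 stake
entered = n_seats          # total money brought into the casino so far
for _ in range(200_000):
    i = rng.randrange(n_seats)          # a random gambler makes a $1 fair bet
    gamblers[i] += 1 if rng.random() < 0.5 else -1
    if gamblers[i] == 0:                # busted: leaves with nothing,
        gamblers[i] = 1                 # and a fresh $1 gambler takes the seat
        entered += 1
held = sum(gamblers)
# By conservation, money not held by active gamblers went to the house
casino_net = entered - held
print(f"entered: ${entered}  held by active gamblers: ${held}  casino net: ${casino_net}")
```

Every dollar that enters is either still in some active gambler's pocket or has gone to the house. In runs like this the house's net hovers near zero (fluctuating on the order of the square root of the number of bets) even though many gamblers bust, which is consistent with the "held by temporary winners" suspicion.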

Here is one more thought - let's say you flip a fair coin a random number of times. With more flips the outcome approaches the predicted outcome of 50% heads and 50% tails. But let us also say that you employ one rule - you must always stop after flipping heads. Wouldn't that bias the statistics in favor of heads, even a little bit?  

Likewise, if a gambler goes broke, then it stands to reason that they lost their final bet. Wouldn't that also bias the win-loss ratio toward the gambler losing? Would this explain how the casino could win even with 50/50 gambling odds?  
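The coin question, at least, can be probed directly. Here is a quick sketch (fair coin, always stop at the first heads, with an arbitrary 50-flip cap so every run terminates):

```python
import random

rng = random.Random(3)
heads = tails = 0
for _ in range(100_000):
    for _ in range(50):           # flip until the first heads (capped)
        if rng.random() < 0.5:
            heads += 1
            break
        tails += 1
print(f"heads: {heads}  tails: {tails}")
```

The totals come out essentially equal. Every run ends on a head, but tails pile up in the runs that start badly, and in expectation the two effects cancel - so this particular stopping rule, at least, does not appear to bias the totals.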

Conclusion  

After researching this problem further and thinking it through as above, I now believe that the e-mailer may be correct. I could not find a definitive answer online, but I did find many sites repeating my original claim - that casinos would still make money even with even odds.

Sites seemed to be split between those citing my original logic, and those citing the e-mailer's logic. That logic, I admit, is compelling - every bet has even odds and so is statistically neutral for the casino, regardless of who is making the bet, their current bankroll, and their prior history of winning and losing. Each bet is a statistically independent event.  

But I am not convinced that the phenomenon of going bust does not create a statistical bias toward gamblers losing, perhaps for the reason I stated above (always losing the final bet).  

So I leave it to my readers to further explore this dilemma and see if we can come up with a definitive solution.

I guess I can take some solace in that the whole controversy proves my original point - people generally lack an intuitive sense of probability and statistics.

 

Steven Novella, M.D. is the JREF's Senior Fellow and Director of the JREF’s Science-Based Medicine project.

Dr. Novella is an academic clinical neurologist at Yale University School of Medicine. He is the president and co-founder of the New England Skeptical Society and the host and producer of the popular weekly science show, The Skeptics’ Guide to the Universe. He also authors the NeuroLogica Blog.

Comments (40)
written by Zounds, March 24, 2012
I think you may be confusing two separate issues: the probability of individual gamblers going broke and the overall profitability of the casino.

I agree with your analysis of the chance of individuals going broke. Even if some casinos have a smaller account than some individuals, the casinos can restrict the size of bets to ensure that longer trends are in their favour and in general most gamblers will not be super wealthy.

However, I disagree with your claims that this will affect the overall profitability of the casino. If each bet is an independent event then when one gambler loses and goes bust, another will have won and will keep playing. When one gambler busts and leaves, another will take his place and if the odds are 50/50 then there will also be a large number of gamblers that are up overall. The net effect of these independent events is that, while individuals may bust, the casino will only break even. If the casino only breaks even on gambling winnings, it will lose slowly as marketing, wages and upkeep slowly drain its reserves.

As a practical example, poker tables are a constant source of discussion in casinos. The house takes a small rake, which means the casino can't lose; however, the rake is often so small relative to the size of the pot that poker is one of the least profitable games in a casino. Poker is still offered as a goodwill gesture to the hard-core gamblers, and because it ironically leads to more slot play: people will play slots more often and for longer if their partners can play poker.
Flipping a coin until you land heads
written by daveFNbuck, March 24, 2012
If you flip a coin until you get a heads, the expected number of heads and tails are both 1. This is normally proven by pointing out that the expected number of heads and tails is equal at each flip, but since this was specifically mentioned as not being a proof in the article, here is another proof. We know the expected number of heads is 1 by design. There is a 1/2 probability of 0 tails, a 1/4 probability of 1 tails, a 1/8 probability of 2 tails, etc so the expected number of tails is the infinite sum

(1/2 * 0) + (1/4 * 1) + (1/8 * 2) + ...

which sums to 1.

Even if we assume a finite stopping point, we still get equal expected numbers of heads and tails. The sum for the expected number of tails is no longer infinite and therefore less than 1, but we're also no longer guaranteed a heads.
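The partial sums of that series can be checked numerically:

```python
# (1/2)*0 + (1/4)*1 + (1/8)*2 + ... : the expected number of tails,
# where term k is (probability of exactly k tails) * k = k / 2**(k+1)
total = sum(k / 2 ** (k + 1) for k in range(200))
print(total)  # converges to 1.0
```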
written by Willy K, March 24, 2012
I bet that this post gave me a headache!
written by Mark P, March 24, 2012
The Monty Hall problem does cause trouble, and I've struggled to explain it without use of formal probability. This version often works for people who really won't believe:

You have 1,000 doors. Behind one is a car, and behind the other 999 are goats. You may pick one.

The host then opens 998 doors with goats. You are invited to keep the door you picked, or switch to the one remaining unopened door.


People realise the chance of the first pick being right was only one in a thousand, so they know they must change their pick. It's obvious to even the most innumerate that the car is almost certainly not behind the door they picked first.

Now do the same with ten doors. Again, they follow that you should change your door to the other unopened one.

Then you do the same maths, but with three doors. Now they can follow the logic, because you have broken their previous (false) reasoning.
More simulation...
written by sturmrutsturm, March 24, 2012
I don't know a lot about statistics, but I tried following up on some of your ideas with a computer simulation. I played around with a few variables: how many rounds a player plays until they quit, how big their starting bankroll was, and some betting behavior. No matter what I did, with a 50% chance of winning, the end results always came out about break-even (+/-0.005%).

In some tests, most players walked away (~1% going bankrupt) and in other cases most players went bankrupt (~80%)... no matter what I did, it ended up break-even.
written by OldProf, March 24, 2012
As they say, the devil is in the details.

For example, the classic Monty Hall problem isn't as clear as you present, as it depends on the assumptions:
a) Does Monty ALWAYS open a losing door?
b) If you have picked the winning door, how does Monty choose which door to open?

A simple answer to your question - Suppose that each casino has a large number N of patrons, each with a bankroll of B units, playing the Gambler's ruin scheme (double or go bust, then stop) in an even-money bet with no house advantage. The probability of double/bust is 1/2, so each casino will see a sample from a symmetric Binomial distribution, with mean 0.
RE:Devil in the details
written by MikeCargal, March 24, 2012
>for example, the classic Monty Hall problem isn't as clear as you present, as it depends on the assumptions:
>a) Does Monty ALWAYS open a losing door?
Yes, that's part of the definition of the Monty Hall problem
>b) If you have picked the winning door, how does Monty choose which door to open?
In that situation, it does not matter. Monty selects either at random and it has no impact on the results.
Gambler's ruin only looks at probability, not expected value
written by daveFNbuck, March 24, 2012
The gambler's ruin problem is really just about the likelihood of an individual going bankrupt; it doesn't take into account the magnitude of the upside for a player who doesn't. Let's consider a simple example. You have 10 dollars, I have 1 dollar. We repeatedly wager 1 dollar on the result of a fair coin flip until one of us goes bankrupt. To determine the probability that I win, let's compute the probability that I end up with n+1 dollars before going bankrupt, given that I currently have n dollars:

For n = 0, that probability is clearly 0 as you've already gone bankrupt. For n > 0, let's call this probability p(n). If we have n dollars, we have a 1/2 probability of immediately increasing to n+1. In the alternative where we decrease to n-1 dollars (with probability 1/2 as well), we have to first make it from n-1 to n, then from n to n+1. By definition, this has probability p(n-1)*p(n). So we get the formula

p(n) = 1/2 + 1/2*p(n-1)*p(n)

which solves to

p(n) = 1/2 / (1 - 1/2*p(n-1))

So knowing p(0), we can compute p(1) = 1/2 / (1 - 1/2*0) = 1/2, p(2) = 1/2 / (1 - 1/2 * 1/2) = 2/3, p(3) = 1/2 / (1 - 1/2*2/3) = 3/4, etc. It turns out p(n) = n/(n+1) (which we can prove by the same induction). So what's the probability that I win? First, I have to get to 2 dollars, then 3, then 4, and so on all the way up to 11 dollars (my 1 plus your 10). So the probability is just the product p(1)*p(2)*...*p(10) = 1/2 * 2/3 * 3/4 * ... * 10/11. Most of the numbers in this product cancel out nicely to give us a result of 1/11.

So I have a 10/11 probability of losing my dollar, but a 1/11 probability of winning 10 dollars. So my expected win is 1/11 * 10 - 10/11 * 1 = 0. While I'll probably lose, this is completely canceled out in expectation by the upside of my winning. In other words, this is not a good way for a casino to make money in the long term against a lot of players.
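One bookkeeping note: the total pot is $11 (my $1 plus your $10), so taking everything means reaching $11, and the telescoping product p(1)*p(2)*...*p(10) = 1/2 * 2/3 * ... * 10/11 = 1/11. Both the closed form and the game itself are easy to check in Python (seed and trial count arbitrary):

```python
import random

# Verify the recurrence p(n) = 1/2 / (1 - p(n-1)/2) matches n/(n+1)
p = 0.0  # p(0): already bankrupt, no chance of climbing
for n in range(1, 11):
    p = 0.5 / (1 - 0.5 * p)
    assert abs(p - n / (n + 1)) < 1e-12

# Monte Carlo of the $1-vs-$10 duel: $1 fair bets until someone is broke
def duel(me, you, rng):
    while me > 0 and you > 0:
        if rng.random() < 0.5:
            me, you = me + 1, you - 1
        else:
            me, you = me - 1, you + 1
    return me  # 0 if I'm ruined, 11 if I take everything

rng = random.Random(5)
trials = 50_000
wins = sum(duel(1, 10, rng) == 11 for _ in range(trials))
print(f"I take everything in {wins / trials:.4f} of games")  # about 1/11
```

The expected win works out to (1/11)*10 - (10/11)*1 = 0, so the game is still fair despite the lopsided ruin probability.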
Monty hall with different reveal strategies
written by daveFNbuck, March 24, 2012
MikeCargal - In the Monty Hall problem, if the host always picks door number 2 if you pick door 1 and both 2 and 3 have goats, this can change the odds a bit. Now if you pick door 1 and the host reveals a goat behind door 3, you know that door 2 has the car with 100% probability. This information comes at a cost, though. If the host shows door 2, you now have equal probability of winning regardless of whether you switch or stay because it's equally likely that the car is behind door 1 or 3 (which is not the case if we only pick 2 half the time when the car is behind door 1). Switching still has a 2/3 probability of winning overall, but not once you know which door has been revealed.
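This variant is easy to confirm by simulation (assume the contestant always picks door 1 and the host opens door 2 whenever the rules allow; seed and counts arbitrary):

```python
import random

rng = random.Random(11)
# For each door the host might open: [times switching wins, times opened]
stats = {2: [0, 0], 3: [0, 0]}
for _ in range(100_000):
    car = rng.choice([1, 2, 3])
    # Contestant picks door 1. The host may open neither the pick nor
    # the car, and prefers door 2 whenever he has the choice.
    opened = 3 if car == 2 else 2
    switch_door = 3 if opened == 2 else 2
    stats[opened][0] += (car == switch_door)
    stats[opened][1] += 1
for opened, (wins, total) in sorted(stats.items()):
    print(f"host opened door {opened}: switching wins {wins / total:.3f} of the time")
```

Overall switching still wins 2/3 of the time, but conditioned on which door was opened it splits into 1/2 (door 2) and certainty (door 3), as described.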
written by Caller X, March 24, 2012
I could not find a definitive answer online...
So I leave it my readers to further explore this dilemma and see if we can come up with a definitive solution.


You couldn't find the answer online, so you throw the question to the mob? Why not just call one of those math guys, you know, those guys who do math stuff? What are they called? Mathematicians? You did have a mathematician fact-check your book, right?
Your correspondant is correct
written by mdw, March 25, 2012
The casino cannot have a positive expected return on a fair game, no matter what the gambler's leaving conditions are. You state:
"You can look at the problem another way, however. Because of the absorption wall each individual gambler is more likely to walk away a loser than a winner."
However, the winners will each on average have won more than the losers have each lost. There is a bound on how much anyone can lose, but no bound on what they can win.
Here's a quick example. The gambler starts with $1. They bet ($1 each time) on the result of a fair coin toss, winning on tails. They leave when they go broke (i.e. as soon as there have been more heads than tails tossed), or after 4 coin tosses. Here are all the outcomes:
Tosses payout
HHHH -1
HHHT -1
HHTH -1
HHTT -1
HTHH -1
HTHT -1
HTTH -1
HTTT -1
THHH -1
THHT -1
THTH 0
THTT +2
TTHH 0
TTHT +2
TTTH +2
TTTT +4

We have 10 losers, 2 who break even, and only 4 winners - but the casino exactly breaks even.
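The table is easy to verify by enumerating all 16 sequences under the same rules (start with $1, win $1 on tails, stop on bust or after 4 tosses):

```python
from itertools import product

payouts = []
for seq in product("HT", repeat=4):
    money = 1
    for toss in seq:
        money += 1 if toss == "T" else -1
        if money == 0:
            break  # busted: the gambler stops early
    payouts.append(money - 1)  # net result relative to the $1 stake
print("losers:", sum(p < 0 for p in payouts),
      "winners:", sum(p > 0 for p in payouts),
      "casino net:", -sum(payouts))
# losers: 10 winners: 4 casino net: 0
```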
Yes, in your scenario where gamblers play until they're broke, their losses are temporarily held by the winners. Again, we'll gamble by coin toss, and gamblers start with $1 each. There are always (say) 100 active gamblers, and as each goes broke they leave and a new gambler with $1 replaces them.
Now let's change the situation a little. Each gambler comes in with $1, makes *just one bet* and then leaves, with either $0 or $2. However, before the bet they put on a hat with a number on it. After the bet, if they won, they increase that number by 1. If they lost, they decrease the number by 1. Then they hand the hat on to the next gambler in line behind them. HOWEVER, if they received a hat with the number '1' on it, and lost (reducing the number to 0), instead the hat is thrown away, and the next gambler gets a new hat with the number 1.
It is easy to see that the casino is not (on average) making any money from this: gamblers enter with $1, and leave with on average $1. However, now consider the situation from the point of view of the hats: each hat starts life with number 1, and goes on a random walk until it hits the boundary 0 and gets discarded. This is exactly your scenario: a gambler comes in with $1 and gambles until they reach $0 and leave.
From the casino's point of view, it makes no difference whether you consider the players to be the endless stream of one-bet gamblers, or the hats which they wear (which persist until they 'go broke.') Either way, it is a fair game, and the casino, on average, breaks even.
written by mdw, March 25, 2012
OldProf: "As they say, the devil is in the details."

However, the author has been around this particular bush before, and addresses both of your objections: it is specified that Monty knows where the car is, and that he always opens a door to reveal a goat. I agree that sometimes people are imprecise in presenting the Monty Hall problem and leave open other possibilities, but this statement of the problem does not. It falls just short of perfection in that it does not explicitly state that you, the contestant, also know these two facts.
daveFNbuck is correct that the details change a little bit if Monty has a pattern to which door he opens (when he has a choice.) But switching is still the dominant strategy (to use a game theory term): it is never worse than not switching, and is sometimes better.
written by rwpikul, March 25, 2012
The way I explain the Monty Hall problem is based on looking at it as a comparison between picking one door and picking two from the get-go.

If you know you are going to switch, then your initial choice is really a choice of which door you think doesn't have the prize.
written by OldProf, March 26, 2012
@mdw - I was not arguing that switching is not the correct strategy, just that the probabilities are not necessarily 1/3 and 2/3. For example, if Monty has the choice of two doors with goats, suppose that he always opens the larger number door, and the contestant knows this?

Then suppose that the contestant chooses door #2. If Monty opens door #1, the contestant knows for certain that the prize is behind door #3.
written by Puddinhead, March 26, 2012
It seems that the odds of the Monty Hall problem are dependent on how observant the contestant is; the awareness of the contestant is one of the defining aspects of the problem. If a second person walked into the studio after Monty exposed one of the doors, and without knowing which door the contestant had chosen, their chance of finding the car would be 1/2. The two events, the first choice and the second choice, are tied together (not independent) via Monty giving pertinent information by removing all of the wrong answers but one after the first choice. If the contestant is unaware that his second choice is dependent on the first, I'd suggest that he is then dealing with an independent problem, and his chance of guessing correctly is 1/2.

I'm sure some of you can point out a flaw in my reasoning here, but it would appear to be a neat situation where awareness of the nature of the problem actually defines the problem, and the correct answer depends on an assumption being made about the contestant.
Monty Hall Problem is not Real
written by Michael Dawson, March 26, 2012
Who cares about the odds of prevailing in an undisclosed trick? The real odds in that scenario are always 50/50, so switching is irrelevant.
written by Throckmorton, March 26, 2012
Quote: "However, there is an asymmetry to this process - the point at which people stop gambling."

There is also asymmetry in winning. During the course of a casino's operation there are people who win big. For example, imagine someone who was playing roulette for an hour and has turned $100 into $200. He then decides to quit, but first he placed $50 on his lucky #22 saying, "I'm up today, so really I'm just playing with my winnings here". He wins at 38 to 1 (or whatever the fair odds are) and leaves.

Sometimes the pattern of how winners win affects their behavior, which is not mentioned in the original article and could contribute to the "asymmetry"; my example shows an effect that would decrease the house's take.
written by Puddinhead, March 26, 2012
written by Michael Dawson, March 26, 2012
Who cares about the odds of prevailing in an undisclosed trick? The real odds in that scenario are always 50/50, so switching is irrelevant.


The trick is to realize that Monty is keeping the right answer and getting rid of all of the possible wrong answers but one. It's not so obvious when you are dealing with only three doors but, as pointed out, if you look at 100 doors, it becomes clear. One car, ninety-nine goats. Pick any one of the 100 doors, say door "A", and your chance of finding the car behind door "A" is 1/100, while the odds that it is behind one of the other doors is 99/100. Monty does you a big favor by getting rid of 98 of the remaining doors, leaving behind only 1 and telling you that either the door you picked at 100:1 odds, or the only one left from the other 99, MUST have the car behind it. All of the distractors have been eliminated for you, so the odds are 99/100 that the door you did not originally pick has the car. It works the same with 3 items.
Monty Hall still gets them
written by torlackware, March 26, 2012
Puddinhead

The contestant's knowledge of probability doesn't play into it. The contestant picking a door and then MH picking another will always be dependent on each other as long as the person making the switch decision is aware of which door the contestant picked (that person usually being the contestant, but that isn't required). You are correct that if someone doesn't know which door the contestant picked, then he has a 50/50 chance of guessing the correct door.

Michael D.

The problem is basically an issue of partitioning. When the contestant picks a door, he has split the probabilities into two distinct partitions: the door he picked, which has a 1/3 chance of being correct, and the doors he didn't pick, which together have a 2/3 chance of having the prize. The key is that when MH opens one of the two remaining doors, the probabilities assigned to the two partitions do not change. He still has a 1/3 chance of being correct with his guess. Thus the remaining door he didn't pick has a 2/3 chance of being the correct one. Switching is always the best choice.
written by torlackware, March 26, 2012
How about this for the people who don't believe the simple version of the MH problem.

What if there were 1,000,000 doors and you picked one. Then MH opens 999,998 doors leaving only your door and the one that wasn't opened. Do you really think that you still have a 50/50 shot at guessing correctly?
written by Caller X, March 26, 2012

What if there were 1,000,000,000,000,000,000,000,000 doors and you picked one. Then MH opens all but two of the doors leaving only your door and the one that wasn't opened. Do you really think that you still have a 50/50 shot at guessing correctly?

written by OldProf, March 26, 2012
Here's the paradox for the problem, illustrating a problem in Bayesian analysis:

Take the 100 doors. Person A chooses door M, and Monty opens 98 of them, leaving doors M and N. At this point, person B comes on stage, and can only see doors M and N.

To person A, the probability that the prize is behind door N is 99/100

To person B, that probability is 1/2
written by BBC, March 26, 2012
One should also be skeptical about how much probability and statistics have to do with casino-style gambling. I once saw a croupier demonstrate how he could land the roulette ball on any number he wanted by timing the release against the speed the wheel was turning. He couldn't do it 100% of the time, but often enough to play havoc with what you think the odds are. This also makes them very adept at keeping away from numbers they don't want to see come up.
Simulation
written by MikeCargal, March 26, 2012
(Apparently, putting a URL in your comment sends it to a black hole for admin review.)

I've written a ruby simulation of the Absorption Wall question. You can see the code at http://pastie.org/3674809.
Here's an excerpt from a run of 100 simulations...

Casino won (net:15980.0)
Winning Gamblers=2322 of 5000
Casino won (net:19110.0)
Winning Gamblers=2301 of 5000
Casino lost (net:-190.0)
.
. deleted 95 iterations
.
Casino won (net:8440.0)
Winning Gamblers=2332 of 5000
Casino won (net:890.0)
Winning Gamblers=2301 of 5000
Casino lost (net:-21870.0)
Winning Gamblers=2375 of 5000
Casino won 49 out of 100 simulations
Casino net:440920.0
total losers:266459
total winners:233541
percentage of winners:46.7082
winning bets to losing bets:0.999961898327434 (to evaluate overall 'fairness' of simulation bets)

Bottom line... You do wind up with more gamblers losing than winning due to the absorption wall taking them out of the game when they go broke. However, if the game is really fair (i.e. truly even odds), then the Casino doesn't come out ahead. (So, obviously, winning gamblers win more on average than losing gamblers lose on average).

Feel free to pick at the code and point out any problems.
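For readers who can't reach the linked code, here is an independent minimal sketch of the same idea in Python (parameters and names are mine, not MikeCargal's): each gambler makes fair even-money bets until going broke or running out of rounds.

```python
import random

def absorption_wall(n_gamblers=5000, bankroll=10, max_bets=1000, seed=1):
    """Fair coin-flip betting against the house. Each gambler bets one
    unit per round until broke (the absorption wall) or out of rounds.
    Returns (casino_net, winners): the house's net take and the number
    of gamblers who finished ahead."""
    rng = random.Random(seed)
    casino_net = 0
    winners = 0
    for _ in range(n_gamblers):
        money = bankroll
        for _ in range(max_bets):
            if money <= 0:
                break  # absorbed: a broke gambler must stop playing
            money += 1 if rng.random() < 0.5 else -1
        casino_net += bankroll - money   # what this gambler left behind
        winners += money > bankroll
    return casino_net, winners

# Typical result: far fewer than half the gamblers finish ahead, yet the
# casino's net hovers near zero, because winners win more on average.
```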
written by Puddinhead, March 26, 2012
Here's the paradox for the problem, illustrating a problem in Bayesian analysis:

Take the 100 doors. Person A chooses door M, and Monty opens 98 of them, leaving doors M and N. At this point, person B comes on stage, and can only see doors M and N.

To person A, the probability that the prize is behind door N is 99/100

To person B, that probability is 1/2


Precisely. However, If person A misses the whole value of the set-up and pulls out a coin, it is still just a 50/50 chance he'll get it correct, hence my suggestion that person A's awareness of the set-up is important in determining the odds of him picking the right door. For those of us in the audience watching the whole thing, and following the set-up, we know which door to pick with odds of 2/3 or 99/100, or whatever...

Hold on, I found my problem. The nature of the question is not simply "which door?", but "Do you stick with what you have, or take the other option?". Ostensibly the same question at first glance, but "Stick or change" necessitates some prior event which you are using as a reference point for your choice. Person A's awareness of the situation is presupposed by the wording of the problem.
written by thebeeg, March 26, 2012
While I appreciate the various scenarios and their effect on the player, in the end the probability doesn't change. The player has two choices, each with a 50% probability of being correct. History does not affect probability.
written by lytrigian, March 26, 2012
Computer simulations can be helpful. As a callow college student I was sufficiently unconvinced by the correct answer to the Monty Hall problem that I wrote one to show how wrong it was. Unfortunately for my way of thinking, the correct answer fell out in the refactoring stage, and I didn't even have to run the thing to understand that I'd been mistaken.

Concrete examples can help a lot. Physicist Richard Feynman, brilliant as he was, claimed he never could properly understand a problem without an example in front of him. It's not so easy to reason in the abstract.
written by Puddinhead, March 26, 2012
written by thebeeg, March 26, 2012
While I appreciate the various scenarios and their effect on the player, in the end the probability doesn't change. The player has two choices, each with a 50% probability of being correct. History does not affect probability.


It's in the definition of the problem. It is not a question of the probability of WHETHER he will find it based on a choice, but WHERE it is before the choice. If he runs through this 100 times and simply guesses each time (coin flip), he'll find it 50% of the time. That doesn't change the fact that it will be located behind the door he did not originally pick 66% of the time.
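The distinction can be made concrete with a quick simulation; a minimal Python sketch (the strategy names are mine): a contestant who decides by coin flip wins half the time, even though the car sits behind the unpicked door 2/3 of the time.

```python
import random

def trial(strategy, rng):
    """One 3-door game; Monty always opens a goat door among the two
    doors the contestant did not pick."""
    car = rng.randrange(3)
    pick = rng.randrange(3)
    # The remaining unpicked door hides the car exactly when the
    # first pick was wrong (probability 2/3).
    other_has_car = (pick != car)
    if strategy == "coin":
        # contestant flips a fair coin to decide whether to switch
        strategy = "switch" if rng.random() < 0.5 else "stay"
    return other_has_car if strategy == "switch" else not other_has_car

rng = random.Random(0)
rates = {s: sum(trial(s, rng) for _ in range(100_000)) / 100_000
         for s in ("switch", "stay", "coin")}
# rates come out near switch: 0.667, stay: 0.333, coin: 0.500
```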
written by OldProf, March 26, 2012
@BBC - I am very skeptical of this claim, for a reasonable rate of spin. Even in the Eudaemonic Pie, they had to wait until the wheel had slowed down.
Sons and daughters
written by Gaius Cornelius, March 27, 2012
I have wondered before about the "keep tossing a coin until you get heads" type of question in relation to population control in a culture that values one gender of babies over the other. Simply limiting the number of children a couple can have can lead to problems such as a gender imbalance, because people will resort to artificial means to select the gender of a child.

Assuming that there is a natural 50% chance of either gender, a culture could enforce a rule that allowed couples to have as many children as they wished until they had a child of the favoured gender. This way, all couples would get one child of the favoured gender and there would be no imbalance in the number of babies of each gender.

At least, I think that is statistically right...
written by OldProf, March 27, 2012
@thebeeg - "History does not affect probability".

Probability is simply a matter of measuring uncertainty before the fact. The more information one has, the closer the measurement.
@OldProf
written by BBC, March 27, 2012
"I am very skeptical of this claim, for a reasonable rate of spin…"

It was demonstrated on an old Japanese program called "Happy Family Plan." A regular guy off the street would be selected, and he would have one week to learn to duplicate a feat demonstrated by an expert. It tended to be things like riding a unicycle around an obstacle course, or throwing a playing card across a room to cut a cucumber in two. (Which they did!) I don't remember if anybody ever hit the number they were going for in the three tries they were given, but I know they came close. (They would issue the same challenge multiple times.) The croupier, of course, was always much better. Keep in mind that when somebody does the exact same thing over and over for years, they can develop skills that seem really extraordinary. (Ever tried to do in less than a week what an experienced drywaller does in thirty minutes? Talk about humbling.)

By the way, I'm not the one that voted you down. It is a legitimate skepticism. I am relying on the veracity of Japanese gameshows.
Roulette and "hitting the absorption wall" with regard to gambling
written by JasonEngland, March 27, 2012
I've been studying gambling and cheating at gambling in all its forms for about 20 years now. No case has ever surfaced in which a croupier was determined to have been controlling the ball to land in a specific number at roulette with any accuracy beyond what chance indicates. When the wheel is working properly and spun at the correct speed, and the ball is spun in the opposite direction at the correct speed, it's difficult enough for a modern computer to predict the area of the lower wheel head where the ball will impact first. And that takes into consideration the known velocities of both the wheel and the ball. Now imagine that you're not trying to determine a given impact point, but trying to actually create that impact point with precise speeds and a perfectly timed release.

It's a myth - pure and simple.

Now I'll tell you what does happen from time to time....

A tourist walks up to an empty roulette table and makes some comment about being down to his last few bucks. The croupier says conspiratorially, "Put ten bucks on 17 and I'll see what I can do for you." The tourist does so and the croupier now has what we call a "free roll." If the ball misses 17, he shrugs it off and says, "Sorry, I tried." If by some chance 17 comes up, the tourist pockets 350 bucks (minus a hefty tip for the dealer) and walks away with a story he'll tell for the rest of his life. He never realizes he was the victim of a simple but effective "tip hustle."

Any dealer that tries this a few times a day just for fun will have a small army of "believers" wandering about after 6 months or so.

With regard to gambling against a casino using a game that contains no inherent house advantage: If Dr. Novella believes that his hypothesis is true (it isn't), then in theory the casino's daily/weekly/monthly/yearly "take" or profit could be calculated. Let's assume for the sake of argument that this casino/game situation exists and that the profits added up to 2% of all money gambled at the casino in a given year. That would mean that the casino could offer games in which they actually had a 1% disadvantage and they would still make money due to the "absorption wall" theory.

You can't have it both ways. Either the players (collectively) will win when they have a 1% edge or they will lose in the long run. Which is it? As has been stated earlier by others, Dr. Novella is confusing the likelihood of an individual going broke with the likelihood of all of the other gamblers going broke.

Incidentally, Dr. Novella has stumbled upon a serious issue that plagues professional gamblers (card-counters, primarily). That subject is capitalization. If an individual isn't properly capitalized he/she runs a significant risk of going broke even when playing with an edge of .5% - 1% or even slightly more. Many professional counters join forces and begin playing on teams to pool their resources and lower their variance over a given period.

Although it's a complex subject, anyone interested in reading more about how to determine if you're properly capitalized for a given gambling session and how to divide your individual bets for maximum profits should read the section entitled, "Bankroll Management" in David Sklansky and Mason Malmuth's How To Make $100,000 a Year Gambling For a Living. Despite the overly grand title, this isn't a "get rich quick in Vegas" book by a long shot. It's a good treatment of professional gambling (mostly card counting and professional poker) written by two professional gamblers and top-notch mathematicians and theoreticians. You can also look up the "Kelly criterion" online and spend some time reading about bet sizing and how it relates to both bankroll and your edge in a given game.
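For a single bet the Kelly criterion has a simple closed form: on a bet that pays b-to-1 and wins with probability p, the optimal fraction of bankroll to wager is (bp - q)/b, where q = 1 - p. A minimal sketch (the numbers are illustrative, not from the book mentioned above):

```python
def kelly_fraction(p, b):
    """Optimal Kelly wager as a fraction of bankroll for a bet paying
    b-to-1 with win probability p. Zero or negative means don't bet."""
    q = 1.0 - p
    return (b * p - q) / b

# A counter with a 1% edge on an even-money bet (p = 0.505, b = 1)
# should wager about 1% of bankroll per hand.
```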

Jason England
written by Steven Novella, March 28, 2012
Thanks for all the feedback. I definitely agree that the absorption wall does not result in the house winning, for the reasons stated. Most importantly, the wall increases the number of losers but also limits the amount people can lose, so winners win more.

Regarding the Monty Hall problem - there is no question, switching gives you a 2/3 chance of winning. It does assume that Monty chooses the goat door at random when there are two, so that his choice does not provide any more information. It also assumes the contestant knows the rules. In this case switching is identical to being offered both of the two unchosen doors in exchange for your one. Of course you would switch. The fact that Monty opens one to reveal a goat changes nothing, because no matter what your initial choice, he can and will do this.
Easier way of looking at the problem
written by GrahamZ, March 29, 2012
The answer may not seem obvious when you look at it from the point of view of a single gambler. But if you look at it from the point of view of the casino, the answer is a lot clearer. From the point of view of the casino, you have lots of gamblers who can quit gambling at any time, either through choice, or through running out of cash. The reason why a gambler stops playing in no way affects the odds for the casino. So the results are simply determined by a large set of even bets.

You posed the question in a good way -- what's the outlook for the casino. But then you confused yourself by looking at individual gamblers instead.

The fun thing about problems like this is not the actual math, but figuring out how to properly pose the question so that, through simple logic, the answer becomes obvious.
written by GrahamZ, March 29, 2012
@Gaius Cornelius
You have the probability a little bit wrong there. This actually is similar to an old brain teaser that goes like this:
A King rules a kingdom that is almost always at war, so he wants to encourage the birth of male children so that the nation can have more soldiers in the army. He sets a rule stating that a couple may continue to have children as long as they wish, so long as those children are male. A couple bearing a female child will no longer be permitted to have children. The brain teaser goes on to ask, "Statistically, would the King's policy most likely result in more male children than female children?"

The answer is not surprisingly (if you are familiar with probability) no.

Look at it this way -- say there are 1000 couples that want to have children. After one cycle of births, you'll average about 500 male and 500 female children. That leaves you with 500 couples still legally able to bear children. So those 500 can then have up to 500 children, half of whom will still be male, the other half female. And so on, and so on.

If the odds are 50/50, then taking someone out of the pool does not change those odds, and thus, does not change the expected ratio of males to females. That's because each birth is an independent event, and having a male child does not affect future odds.
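The independence argument is easy to verify; a minimal Python sketch (the function name is mine) of the "stop after the first child of the favoured gender" rule:

```python
import random

def stopping_rule_counts(couples=100_000, seed=0):
    """Each couple has children until the first boy, then stops.
    Returns (boys, girls) across the whole population."""
    rng = random.Random(seed)
    boys = girls = 0
    for _ in range(couples):
        while rng.random() < 0.5:  # girl: the couple tries again
            girls += 1
        boys += 1                  # first boy ends the family
    return boys, girls

# boys equals couples exactly; girls also averages one per couple,
# so the population ratio stays very close to 50/50.
```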
Sorry...
written by GrahamZ, March 29, 2012
@Gaius Cornelius
I realized after my post that I had misread what you wrote (sorry, not enough sleep). You actually said the same thing as I did.
@GrahamZ
written by Gaius Cornelius, March 29, 2012
OK. Yes, that is what I meant. Somewhat counter intuitively it should be possible for a community to control its population and maintain a 50/50 gender balance while still allowing couples to have one child of the preferred gender. Furthermore, this is done without resorting to selective abortion or infanticide. Of course, this does not take human nature into account, but some communities may find it makes sense to encourage this pattern of behaviour.
NY Times Article
written by huw, March 29, 2012
I love the following article from the New York Times on "Monty Hall".
www.nytimes.com/1991/07/21/us/...swer.html
It just shows, once again, that you have to be really careful with these kinds of problems.
Huw
Re: NY Times Article
written by huw, March 29, 2012
Hmm. URL got chopped for some reason. The suffix was:
behind-monty-hall-s-doors-puzzle-debate-and-answer.html
