From: collins@newton.apple.com (Scott Collins)
To: ph@netcom.com (Peter Hendrickson)
Message Hash: c8eb75b432819e446cee5dc2d65700273b3536af2fc7e9571d19089fdcc5ca06
Message ID: <9404190216.AA04828@newton.apple.com>
Reply To: N/A
UTC Datetime: 1994-04-19 04:25:13 UTC
Raw Date: Mon, 18 Apr 94 21:25:13 PDT
Subject: Re: 15 out of 16 times (math, not laundry)
MIME-Version: 1.0
Content-Type: text/plain
>Actually, the casinos win in Las Vegas because the odds of almost
>every bet are in their favor.
In most cases the odds favor the house---I never claimed otherwise---and
that certainly speeds up the inevitable process of cash extraction.
>Larger capital allows you to affect the distribution of winnings, but
>not whether or not the underlying bet is a good one.
If the difference in bankrolls exceeds a tolerance related to the `odds',
the quality of the bet is immaterial.
The direct implication of the weak law of large numbers is: a) the longer
you play, the more certain you are to experience a `run of bad luck'; b)
the party with less money goes broke waiting for their `run of bad luck' to
end. When one party goes broke, the game is over, even if the distribution
of winnings does not match the theoretical expectations (and in the case of
going broke, it can't ... or you wouldn't have played).
>Every casino, in effect, takes on the whole world. As all the bets
>are independent, it doesn't matter if they are played by one player or
>by a new player every time. The world has much more capital. Yet the
>casinos consistently win.
No. The whole world doesn't go broke as a unit. Individuals stop playing,
leaving their money in an unexpected distribution, when they _personally_
go broke.
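This can be sketched numerically. The simulation below is my own
illustration, not from the original discussion; the bankrolls ($10 for the
player, $1000 for the house) and the 200-trial count are arbitrary. The
game is a perfectly *fair* even-money coin flip at $1 a round, yet the
smaller party is nearly always the one ruined (theory: 1000/1010, about 99%
of the time).

```python
import random

def smaller_party_ruined(player, house, rng):
    """Fair even-money coin flips, $1 per round, until one side is broke.
    Returns True if the player (the smaller bankroll) went broke."""
    while player > 0 and house > 0:
        if rng.random() < 0.5:
            player += 1
            house -= 1
        else:
            player -= 1
            house += 1
    return player == 0

rng = random.Random(1994)  # fixed seed so the sketch is repeatable
trials = 200
ruined = sum(smaller_party_ruined(10, 1000, rng) for _ in range(trials))
print(ruined / trials)  # close to 1000/1010, i.e. ~0.99
```

Note the bet itself has zero house edge; unequal capital alone decides who
walks away with the money.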
In fact, most gambling decisions are related in some way to cash resources
of the participants. For example, I propose a hypothetical game where you
(the player) flip a fair coin. If it comes up heads on the first toss, I
pay you $2; game over. If it comes up heads on the second, I pay you $4;
game over. $8, $16... How much would you pay me (the house) to play this
game? The theoretical value is infinite; you could win any amount of money
at this game -- 1/2 the time $2, 1/4 of the time $4, 1/8 of the
time $8... expectation = Sum_{n=1}^{\infty} (1/2^n)(2^n)
= Sum_{n=1}^{\infty} 1, which diverges.
Let's say I'm an actual casino, and could reasonably pay out winnings up to
but not beyond $4.3 billion. You should pay no more than $33 for a chance
at that money. The derivation is left as an exercise for the reader.
Consider this from the perspective of the house. The house is using the
Martingale system against you, doubling its bet every time it loses until
it gets that $33. That means that to launder $33, one party could
conceivably lose $4.3 billion. Obviously no mathematicians work at my
casino. They all left to pursue jobs that ensure a paycheck.
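For readers who skip the exercise, the $33 figure can be checked with a
short calculation (my sketch; only the $4.3 billion cap comes from the
text). Each round n whose payout 2^n fits under the cap contributes
(1/2^n)(2^n) = 1 to the expectation, and every longer run pays the capped
amount:

```python
def fair_price(cap):
    """Expected payout of the doubling coin-flip game when the house
    pays at most `cap`: a first head on flip n pays min(2**n, cap)."""
    ev, n = 0.0, 1
    while 2 ** n <= cap:
        ev += 1.0             # (1/2**n) * (2**n) = 1 per affordable round
        n += 1
    ev += cap / 2 ** (n - 1)  # all longer runs pay the cap
    return ev

print(fair_price(4.3e9))  # a little over 33
```

With a $4.3 billion cap, 2^32 (about $4.29 billion) is the largest payout
the house can cover, so the sum has 32 whole terms plus a small capped
tail: just over $33, as stated above.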
These are _not_ my personal conclusions. This is sound, if disturbing,
probability theory---known for at least 250 years. This particular effect
goes by many names including "Gambler's Ruin". Given the odds, and the
respective bankrolls, you can calculate the probability that any given
party will go broke in extended play. The problem of "Duration of Play"
was solved by Bernoulli and published posthumously in 1713.
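The "Gambler's Ruin" probability mentioned above has a standard closed
form, which I'll sketch here (textbook formula, not from the post): a
player betting one unit per round, winning each round with probability p,
with bankroll a against an opponent holding b, is ruined with the
probability computed below.

```python
def ruin_probability(p, a, b):
    """Probability that the player with bankroll `a` goes broke before
    taking the opponent's bankroll `b`, betting 1 unit per round and
    winning each round with probability p (classical Gambler's Ruin)."""
    total = a + b
    if p == 0.5:
        return 1 - a / total        # fair game: success prob is a/total
    r = (1 - p) / p                 # ratio of loss to win probability
    success = (1 - r ** a) / (1 - r ** total)
    return 1 - success

# Fair game, $25 vs $75: ruined 3 times out of 4.
print(ruin_probability(0.5, 25, 75))
# Equal $50 bankrolls but a 1% edge against you: ruin is likely.
print(ruin_probability(0.49, 50, 50))
```

Even a tiny edge compounds viciously over extended play: at p = 0.49 with
equal $50 bankrolls, the disadvantaged player is ruined roughly 88% of the
time.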
Scott Collins | "That's not fair!" -- Sarah
| "You say that so often. I wonder what your basis
408.862.0540 | for comparison is." -- Goblin King
................|....................................................
BUSINESS. fax:974.6094 R254(IL5-2N) collins@newton.apple.com
Apple Computer, Inc. 5 Infinite Loop, MS 305-2D Cupertino, CA 95014
.....................................................................
PERSONAL. 408.257.1746 1024:669687 catalyst@netcom.com