1994-04-21 - Gambler’s Ruin, 15 out of 16, and a Probability Parable

Header Data

From: collins@newton.apple.com (Scott Collins)
To: ph@netcom.com (Peter Hendrickson)
Message Hash: ce609b8372d9cecbd95918f9c2ac7bd4ed78e76de9d81226ae0cbc6372967229
Message ID: <9404212055.AA18745@newton.apple.com>
Reply To: N/A
UTC Datetime: 1994-04-21 21:34:37 UTC
Raw Date: Thu, 21 Apr 94 14:34:37 PDT

Raw message

From: collins@newton.apple.com (Scott Collins)
Date: Thu, 21 Apr 94 14:34:37 PDT
To: ph@netcom.com (Peter Hendrickson)
Subject: Gambler's Ruin, 15 out of 16, and a Probability Parable
Message-ID: <9404212055.AA18745@newton.apple.com>
MIME-Version: 1.0
Content-Type: text/plain


Howdy Peter,

OK, though it's been enjoyable, I won't try any further to convince you.

I peppered this message with smileys to let you know that I think
mathematical debates are about differing observations, not differing
values.  In such conversations, it's easy to lose sight of that and take
something the wrong way.  Please don't; it has been fun, and just because
neither of us has convinced the other (yet) doesn't mean I think the less
of you (or, hopefully, the reverse... I know that, to you, I must seem
pretty `thick').

I, myself, would like a little more explanation of _your_ point of view
(see my question below beginning with "Why?").  I will recapitulate the
high points of my problems with your previous arguments so that you can
clear them up for me in private e-mail.  I also quote some equations that
summarize the point I was trying to make, so that you can examine them and
offer up alternatives that represent your point.  I am cc'ing cypherpunks
on this final message so that they can see these equations.  Here we go :-)


I wrote a conjecture:

  SC>A.1   As parishioners play and leave, the division of wealth approaches the
  SC>      `odds' of the game.

Which you agreed with:

  PH>I agree with both conjectures.

I then repeated the conjecture in my argument:

  SC> [A.1] predicts that as ... the number of players goes to infinity,
  SC> ... the fraction of money won by the church approaches ... the probability
  SC> the church will win a single trial.

Which you do _not_ accept as the statement you agreed with:

  PH>There is a slight difference between [A.1] and
  PH>this statement.  [A.1] predicts that as ... the number of bets
  PH>goes to infinity the fraction of bets won will approach ... the
  PH>probability that the church will win a single trial.

On the probability of the player's ultimate ruin you say:

  PH>Each parishioner has a high probability of losing their savings and a
  PH>low probability of winning everything owned by the church.  It is
  PH>possible for any single parishioner to win everything, but it is
  PH>unlikely.

Why?  Why is the probability not almost `even', like the odds of the game,
.51 vs .49?  What other information influences this _new_ probability, the
probability of the player going broke, if it is not---as I say---the
difference in cash resources between the player and the house?

I didn't ask you this question in my earlier messages---I thought I was
supplying the answer---but you did provide an alternate explanation:

  PH>This player wins because he or she was fortunate enough to place the
  PH>first bet in the series [of sufficient consecutive bets lost by the house].

  PH>The player needs to be lucky.

To paraphrase my "Why?" question above: can you qualify `lucky'?  How
`lucky' does the player have to be?  I submit to you that, given individual
trials where the player's probability of winning a single unit in a single
trial is p, the total amount of money at stake in the series of trials is
C, the amount currently held by the player is d, and the amount held by the
house is C-d=D, the ultimate chance of the player's ruin is given by the
equation (from [Weaver], cited in an earlier message):

                                             r^C - r^d
       R_d (prob. of ruin given d capital) = ---------
                                              r^C - 1

                      1-p
            where r = ---
                       p

In the limit of a fair game (p = 1/2), this reduces to the friendlier form:

                        d
              R_d = 1 - -
                        C

...and, of course, at the other extremes, where p=1 or p=0, the player
never or always goes broke, respectively.
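
For the curious, here is a minimal Python sketch (mine, not from [Weaver])
that evaluates both forms of the formula; the function name and the sample
numbers are only illustrative:

      # Evaluate R_d = (r^C - r^d) / (r^C - 1), with r = (1-p)/p,
      # falling back to the fair-game form R_d = 1 - d/C when p = 1/2.
      def ruin_probability(p, d, C):
          """Probability that a player holding d of the C total units
          eventually goes broke, winning each one-unit bet w.p. p."""
          if p == 0.5:
              return 1.0 - float(d) / C
          r = (1.0 - p) / p
          return (r**C - r**d) / (r**C - 1.0)

      # Even with the `odds' slightly in the player's favor (p = 0.51),
      # a 10-unit stake against an ever-larger total is usually ruined:
      for C in (20, 100, 1000, 10000):
          print(C, ruin_probability(0.51, 10, C))

The printed ruin probabilities climb from roughly .40 at C=20 toward
r^d (about .67) as C grows.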

These equations are consistent with the proposition that the probability of ruin
depends on both the odds of the game _and_ the initial distribution of
capital.  Note their behavior as C increases with respect to d.  Soon, this
difference dominates even in the face of good `odds'.  I invite you to
experimentally verify, at your leisure, the `fair game' version with two
players and different numbers of pennies, where each bet is a single penny
and is decided by a coin toss.
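
If pennies are scarce, the same experiment runs quickly in simulation.
This is a rough sketch of my own (the function names are placeholders,
not anything from the cited references), pitting a 10-penny player
against a 90-penny opponent on fair coin tosses:

      import random

      # Play one fair penny game to completion; return True if the player
      # (starting with d of the C pennies in play) ends up broke.
      def play_until_ruin(d, C, p=0.5):
          while 0 < d < C:
              d += 1 if random.random() < p else -1
          return d == 0

      # Fraction of games lost over many trials; the fair-game formula
      # above predicts this should land near 1 - d/C.
      def estimate_ruin(d, C, trials=10000):
          losses = sum(play_until_ruin(d, C) for _ in range(trials))
          return losses / float(trials)

      print(estimate_ruin(10, 100))   # expect a value near 1 - 10/100 = 0.90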

Finally, you offer me this comfort :-)

  PH>This can be very confusing.  I've seen two professional mathematicians
  PH>and a futures textbook make this mistake.

Thank you ;-)  If I, two professional mathematicians, a textbook, a book I
cited to you, and several other cypherpunks all erred similarly, then it
must be a treacherously easy mistake to make; I don't feel any shame.  But,
I would also relate this little probability parable (again, from
[Weaver])---of course drawing no comparisons:

  In the card room of the Quadrangle Club at
  the University of Chicago, years ago, a hand con-
  sisting of thirteen spades was dealt.  The celebrated
  mathematician Leonard Eugene Dickson was one of
  the players.  (Those who know his interest in bridge
  realize that the probability of his being one of the
  players was not far below unity.)  At the request of
  his companions, he calculated the probability of this
  deal.  (It is roughly 10^-13.)  A young know-it-all gaily
  reported at lunch the next day that he had calculated the
  probability of dealing thirteen spades, and had found that
  Dickson had made a mistake.  Another famous
  mathematician, Gilbert Bliss, was present; he
  properly dressed down the youngster by saying,
  "Knowing that Dickson calculated a probability and
  got one result, and you had tried to calculate the
  same probability but got another result, I would
  conclude that the probability is practically unity that
  Dickson was right and you are wrong."


Be happy and keep wondering---that's what makes us great,



Scott Collins   | "That's not fair!"                         -- Sarah
                | "You say that so often.  I wonder what your basis
   408.862.0540 |  for comparison is."                 -- Goblin King
................|....................................................
BUSINESS.    fax:974.6094    R254(IL5-2N)    collins@newton.apple.com
Apple Computer, Inc.  5 Infinite Loop, MS 305-2D  Cupertino, CA 95014
.....................................................................
PERSONAL.    408.257.1746       1024:669687       catalyst@netcom.com






