The National Coin Flipping League Championship Series

No tech today, but a little basic math.

In baseball, a sport I know little about, the Boston Red Sox have apparently just come back from a three-game deficit to win a best-of-seven series against their traditional rivals, the New York Yankees.

Baseball is a game which attracts statisticians, and many have noted that this is the first time in major-league baseball history that a team has won a best-of-seven series after being down three games to none.

However, it has happened twice in hockey.

I have a modest proposal. Suppose once a year, the National Hockey League and Major League Baseball decide all their various championships without going to all the trouble and expense of playing the game. Rather, they could simply hold a best-of-seven coin-flipping championship. (Call it the Numismatic Hockey League if you'd like.)

Suppose Boston calls heads. The odds of Boston flipping T T T and then coming back to win with H H H H are one in 128.

Therefore, there should be one such occurrence on average every 128 series. There are four such series a year: the American and National League finals, one "world" series (for which only North American teams are eligible, strangely enough), and one Stanley Cup. You'd expect to wait 128 / 4 = 32 years on average between occurrences.
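
The arithmetic above is easy to check, both exactly and by simulation. Here's a quick Python sketch (the Monte Carlo setup is mine, not part of the original argument):

```python
import random

# The chance of any one specific 7-flip sequence of a fair coin
# is (1/2)^7, so T T T H H H H comes up one series in 128.
exact = (1 / 2) ** 7
print(exact)  # 0.0078125, i.e. 1 in 128

# Monte Carlo sanity check: simulate many 7-flip series and count
# how often the exact sequence T T T H H H H appears.
random.seed(0)  # seeded only so the run is reproducible
trials = 200_000
target = "TTTHHHH"
hits = sum(
    "".join(random.choice("HT") for _ in range(7)) == target
    for _ in range(trials)
)
print(hits / trials)  # close to 1/128, about 0.0078
```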

We've been playing pro baseball and hockey, what, about a hundred years in North America?

Three such series, in about a hundred years -- or, roughly one every 32 years. It seems like the math works out rather nicely. Maybe they have been deciding the games via coin flipping and just not telling anyone. Hmm...

Is Boston's victory really that impressive? I mean, the last time I played Risk I rolled three sixes on three dice and England crushed Iceland -- odds of that are 1/216, almost twice as long as Boston coming back from a three tail deficit in the National League Coin Flipping Championship. That's because my blue plastic army guys really worked together as a team and gave 110%!

And yet it didn't make headlines in even the local paper.

In related news, if Houston wins their championship, and it ends up being Texas vs. Massachusetts in both baseball AND the presidential election, that's going to be freaky weird. What are the odds of that?

  • This topic should make some interesting contributions to "Riddle Me This, Google: Part Three".
  • One certainly hopes so.

    Between May and August of this year I was averaging 200-300 Google hits a day.

    It was over _600_ yesterday, and some of those have got to be gems.
  • I sense a great disturbance in the force... as if billions of statistics professors cried out at once, and then fell silent. Your attempt to analyze a sports event using this statistical model is flawed at its core:
    1. In statistics, we assume that a coin toss has exactly a 50% chance of each outcome. This is not true in a sports game.
    2. In statistics, a series of coin tosses is "memory-less". The coin doesn't remember what happened the last time it was tossed, so it has the same 50% chance of each outcome on every toss. Humans do have memory.

    Those are the two mathematical errors in your model. There are also errors in your analysis, such as the fact that there are far more 7-game series than you had mentioned, but that's not a mistake in the model; only in its validation.

    In summary: never trust the words of a man who writes in a purple font...
  • Flawed at its VERY CORE, eh?

    Your first complaint is the same one I addressed forty comments ago. Don't like the odds? Pick different odds. Weight the coin differently. The chances of getting a TTTHHHH series change, but not by much.

    Your second point is also easily dispensed with. Suppose for instance a team has a .5 chance of winning the first game, and then a .75 chance of winning the next game if they won the previous game. In such a system, the chances of a TTTHHHH series go UP, not DOWN! Pick any conditional probability model you want, it's only going to make such events more common.
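
    That claim can be checked with a few lines of Python (the 0.5 and 0.75 figures are the ones assumed above; this "momentum" model is only a sketch):

```python
# Streaky model: game 1 is fair; afterwards, the winner of the
# previous game wins the next one with probability 0.75.
p_first = 0.5          # game 1: 50/50
p_repeat = 0.75        # previous winner wins again
p_flip = 1 - p_repeat  # previous loser wins: 0.25

# Boston's T T T H H H H: the Yankees take game 1 (0.5), repeat
# twice (0.75 each), Boston breaks the streak (0.25), then
# repeats three times (0.75 each).
streaky = p_first * p_repeat**2 * p_flip * p_repeat**3
fair = 0.5**7

print(streaky)  # about 0.0297, roughly 1 in 34
print(fair)     # 0.0078125, 1 in 128
```

    So under this momentum model the comeback sequence is nearly four times as likely as with fair coins.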

    No, the real flaw is in the validation. The model is insufficiently complex to explain on probabilistic grounds why there have been so few TTTHHHH series, given that there are more seven-game series than I stated.

    We could come up with a new model in which all these concerns were met. Divide all pairings of teams into "thoroughly outclassed" and "about even".

    In the vast majority of "thoroughly outclassed" matchups, one team has such a higher probability of winning that four-game series are practically inevitable.

    But in a few "about even" matchups, we should expect to see a proper coin-flipping distribution. About 1/8 of them should be sweeps, about 1/64 of them should be three losses followed by four wins, etc.
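
    In numbers, for the "about even" case (a Python sketch using exact fractions; no simulation needed):

```python
from fractions import Fraction

half = Fraction(1, 2)

# For an "about even" matchup modelled as fair coin flips:
# one particular team sweeps with probability (1/2)^4 = 1/16,
# so either team sweeps with probability 1/8.
p_sweep_either = 2 * half**4

# One particular team drops three and then wins four with
# probability (1/2)^7 = 1/128, so either team manages it
# with probability 1/64.
p_comeback_either = 2 * half**7

print(p_sweep_either)     # 1/8
print(p_comeback_either)  # 1/64
```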

    As argued in the comments above, clearly Boston and the Yankees were in the "about even" category.
  • Eric,

    <quote>
    You'll note that I did not mention WHEN I last played Risk. It was probably eight or ten years ago.

    How is it then that I so clearly remember my blue armies in England crushing Iceland? Because my Risk strategy is:

    * always play blue, and
    * get into a huge, endless fight over Europe, sapping my strength until eventually the guy who's held Russia/Asia for the last ten turns cashes in 120 armies worth of cards (card limits are for wimps!) and sweeps across Europe in a bloody rampage.

    So at some point England ALWAYS crushes Iceland with triple sixes.

    It's not an _effective_ winning strategy, but it's fun. Particularly if you can make a last stand on Iceland with a hundred armies or so.
    </quote>

    ROFL...
  • In the spirit of this thread, here's an interesting logical/statistical conundrum: suppose I offer you the choice of one of two identical sealed envelopes, telling you that both contain an unspecified amount of money, but that one contains double the amount of the other. Obviously the choice is random.

    Now suppose that before you open the envelope I give you the chance to exchange it. If we assume your envelope contains $X, then the other envelope contains either half of X or double X. So you've got a 50% chance of losing X/2, and a 50% chance of gaining X. So it would seem that, contrary to reason, it makes sense to exchange the unopened envelopes.

    And say I offered you the chance again, and again, and again. It would seem that we would stand there forever, exchanging the unopened envelopes. What’s wrong here?
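
    A small simulation makes the situation concrete (Python; the dollar amounts and their distribution are my assumption, since the puzzle leaves them unspecified):

```python
import random

# Two-envelope setup: the smaller amount is drawn from $1..$100
# (an arbitrary assumed distribution); the other envelope holds
# double. Compare "always keep" vs "always switch" before opening.
random.seed(1)  # seeded only for reproducibility
trials = 200_000
keep_total = switch_total = 0
for _ in range(trials):
    smaller = random.randint(1, 100)
    envelopes = [smaller, 2 * smaller]
    random.shuffle(envelopes)
    keep_total += envelopes[0]    # the envelope you picked
    switch_total += envelopes[1]  # the one you'd switch to

print(keep_total / trials)    # roughly 75.75
print(switch_total / trials)  # roughly the same: switching gains nothing
```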
  • This is a fairly well-known paradox. Google "Newcomb's Paradox" for another.

    The resolution of this paradox lies in the fact that you're using "X" to mean two different things in one sentence. In one half of the sentence, X represents the larger amount, and in the second half it represents the smaller amount, so of course they're different.

    Stop using "X" and start using dollar amounts and the paradox goes away.
  • But what if instead of picking heads or tails, Boston and New York each have to pick a sequence of three outcomes (e.g. HHT), and New York has to pick first, and whoever's sequence appears first wins ...

    :)
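
    That's Penney's game: whoever picks second can always choose a pattern that beats the first pick. A Python sketch (the particular patterns HHT and THH are just one illustrative pairing):

```python
import random

def first_to_appear(pat_a, pat_b):
    """Flip a fair coin until pat_a or pat_b shows up; return the winner."""
    window = ""
    while True:
        window = (window + random.choice("HT"))[-3:]
        if window == pat_a:
            return "a"
        if window == pat_b:
            return "b"

# If New York picks HHT first, Boston can respond with THH:
# any run of HH must eventually be preceded by a T, so THH
# tends to appear first.
random.seed(2)  # seeded only for reproducibility
trials = 100_000
wins = sum(first_to_appear("THH", "HHT") == "a" for _ in range(trials))
print(wins / trials)  # close to 0.75: THH beats HHT about 3 times in 4
```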
  • Is this what you C&O people do for fun?
  • The Envelope Paradox gets into knots because of the probabilities. So let's state it without probabilities.

    Statement 1: Let the amount in your envelope be X. You either gain X by switching, or lose 1/2 X by switching. Since it's obviously better to gain X than lose 1/2 X, you should switch.

    Statement 2: Let the amount in the smaller envelope be Y. You either have Y or 2Y. If you switch, you might gain Y or you might lose Y. There's no benefit to switching.

    Clearly at least one of these statements is wrong because they contradict each other. But now the misdirection is obvious:

    "You either gain X (=Y) by switching, or lose 1/2 X (=Y) by switching."

    X and X/2 can't _both_ be Y in the same sentence!
  • Eric, one of the better answers that I got. Good for you!
  • I helped an elderly gentleman win thousands of dollars at a casino last year... I rolled double 6's at one point. He looked me in the eye, said "I trust you, I know you'll do it again" and put $50 on it... I rolled it again. He put another $50 on it, and I rolled it a third time. He collected his wad of dough and then left, thanking me - with words.
  • I don't think Mr. Lippert explained the envelope problem.

    > "You either gain X (=Y) by switching, or
    > lose 1/2 X (=Y) by switching."
    > X and X/2 can't _both_ be Y in the same
    > sentence!

    True but so what? The "quoted" sentence has this meaning:
    "Either X = Y or X/2 = Y. If you open the envelope then you know the value of X but you don't know the value of Y until later.[*] You either gain X (and later find out that X = Y) by switching, or lose X/2 (and later find out that X/2 = Y) by switching."
    The paradox is not in that explanation.

    The actual explanation was determined by some participants in rec.puzzles a few years ago. I almost remember enough math to understand it. There is no such thing as a uniform distribution over the set of integers.

    I can add a bit to the explanation. Pretend that the assumed probability distribution (uniform over the set of integers) were possible. Your expected gain is infinity divided by infinity, or infinity minus infinity, however you want to express it.

    [* Yes, part of the apparent paradox is that you don't have to open the envelopes in order to decide that you want to switch back and forth an infinite number of times. Nonetheless it is true that IF you open your envelope THEN you will know X while still not knowing Y. This still explains why the possibility of either X matching Y or X/2 matching Y is not a paradox.]