*Posted by Dr Fro 9:36 PM*

A couple of months ago, I was at an event with a speaker from the Baker Panel (the one that investigated the explosion at the BP refinery in Texas City). She is a professor at some jack-ass school in Cambridge, Mass, but she seems to know what she is talking about.

One of her main points was about the fallacy surrounding multiple "low-probability, high-consequence events." For example, let's say the only way a refinery can blow up is if Systems A, B, and C all stop working (each with a failure probability of 1%). What is the probability of the refinery blowing up?

Well, naive math would say 1% x 1% x 1%. That math would be fine as long as the failures were independent of one another. But that is rarely the case. To take an extreme case of dependence, assume that a single factor can knock out all three systems (say, some fuse blows, with a probability of 1%). Then the probability of the refinery blowing up is exactly 1%. That is dramatically different from the 0.0001% you would (incorrectly) calculate if you assumed the failures were independent.
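If you want to see the gap for yourself, here is a quick back-of-the-envelope simulation (the 1% numbers and the shared-fuse setup are just the hypotheticals from above, not real refinery data):

```python
import random

random.seed(1)
TRIALS = 1_000_000
P = 0.01  # each system fails with probability 1%

independent_blowups = 0  # Systems A, B, C each fail on their own coin flip
dependent_blowups = 0    # one shared fuse takes out all three at once

for _ in range(TRIALS):
    # Independent case: all three separate 1% failures must happen.
    if random.random() < P and random.random() < P and random.random() < P:
        independent_blowups += 1
    # Fully dependent case: the single 1% fuse failure is all it takes.
    if random.random() < P:
        dependent_blowups += 1

# Independent: roughly one blowup per million trials (0.0001%).
print(independent_blowups / TRIALS)
# Fully dependent: roughly one blowup per hundred trials (1%).
print(dependent_blowups / TRIALS)
```

Same three "1% systems" either way; the only thing that changed is whether the failures are tied together, and the risk moves by four orders of magnitude.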

As with all things in life, my mind quickly drifted to a poker application. How many big-ticket things affect each probability to such an extent that they are no longer independent? I came up with a few. I'll let you think about it and email me with your thoughts. Thus far, I have come up with drinking and just generally not paying attention.