Tuesday, February 16, 2016

Evidence and conviction - more not always better



More is always better, especially when it comes to data. Unfortunately, that may not be the case. A recent paper called "Too good to be true: When overwhelming evidence fails to convince" suggests that if the data are overwhelmingly in favor of a choice, there may be a problem. Too much confirming evidence should reduce confidence and suggest that there is a hidden bias. Call it the "paradox of unanimity".

This is a complex issue rooted in Bayesian decision-making, but think of it this way: if you have overwhelming data supporting an argument, it may simply mean that the observations are not independent but closely correlated or biased. You may be drawing from a single group that lacks diversity, or the near-perfect agreement itself may be evidence that the testing process is flawed.

Why is this important for quant analysts? There is a clear focus on the power of tests and the ability to have confidence in a conclusion, but you should be aware that data can be too good to be true. For example, if a large number of witnesses in a supposedly random police line-up all identify the same person, there is a real likelihood that the line-up was not random but biased. In this case, unanimous agreement should actually make us less certain of our conclusion, as the sketch below shows. While not exactly the same idea, as the adage goes, "If everyone is thinking the same, then someone is not thinking."
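To see the logic, here is a minimal Bayesian sketch of the line-up example. The numbers are assumptions for illustration only, not taken from the paper: each witness is 90% accurate when the line-up is fair, the identified suspect is guilty with a 50% prior, and there is a small (1%) prior chance that the line-up is biased so every witness points to the same person regardless of guilt.

# Sketch of the "paradox of unanimity" with assumed numbers.
P_ACCURATE = 0.9   # chance a witness is right, given a fair line-up (assumption)
P_BIASED = 0.01    # prior chance the line-up itself is biased (assumption)
P_GUILTY = 0.5     # prior chance the identified suspect is guilty (assumption)

def prob_guilty_given_unanimous(n_witnesses: int) -> float:
    """Posterior probability of guilt after n unanimous identifications."""
    # Likelihood of unanimity if the suspect is guilty:
    # a biased line-up always agrees; a fair one needs every witness correct.
    like_guilty = P_BIASED + (1 - P_BIASED) * P_ACCURATE ** n_witnesses
    # Likelihood of unanimity if the suspect is innocent:
    # a biased line-up still agrees; a fair one needs every witness wrong.
    like_innocent = P_BIASED + (1 - P_BIASED) * (1 - P_ACCURATE) ** n_witnesses
    return (P_GUILTY * like_guilty) / (
        P_GUILTY * like_guilty + (1 - P_GUILTY) * like_innocent
    )

for n in (1, 3, 5, 10, 30, 100):
    print(f"{n:3d} unanimous witnesses -> P(guilty) = {prob_guilty_given_unanimous(n):.3f}")

Under these assumed numbers the posterior probability of guilt climbs with the first few unanimous witnesses, peaks around 98%, and then falls back toward 50% as unanimity grows, because ever-larger unanimous agreement increasingly points to a biased line-up rather than a guilty suspect.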
