Here is an excellent article on Bayesian statistics and decision-making: How an 18th-century priest gave us the tools to make better decisions. Ironically, this might be my confirmation bias talking, but in my interview with Sid Mukherjee, he referred to Bayesian logic as perhaps one of the great neglected ideas in medicine, and in the world in general.
“Everything has priors,” Sid says, “and you need to understand those priors before you can understand the posteriors.” Posterior probability is the revised probability of an event after new information is taken into account. An oft-used example (I believe a version appears in Nassim Taleb’s The Black Swan) is a man at a street fair tossing a supposedly fair coin, which lands on heads 99 times in a row. The man asks the crowd: what is the probability that the next flip lands on tails? The “logical” answer that any mathematician would tell you is 50%. But a child with zero training in mathematics could give you the more accurate answer: the probability is somewhere very close to zero, because 99 straight heads is overwhelming evidence that the coin is not fair at all.
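The coin example above can be made concrete with a quick Bayesian update. This is a minimal sketch, assuming a uniform Beta(1, 1) prior on the coin’s heads probability; the choice of prior and the function name are my own illustration, not something from the article or the interview.

```python
# Hedged sketch: a Beta-Binomial update for the street-fair coin.
# Assumes a uniform Beta(1, 1) prior on P(heads); with a Beta(a, b)
# prior, observing `heads` heads and `tails` tails gives the posterior
# Beta(a + heads, b + tails), and the predictive probability that the
# next flip is tails is (b + tails) / (a + heads + b + tails).

def posterior_predictive_tails(heads: int, tails: int,
                               prior_a: float = 1.0,
                               prior_b: float = 1.0) -> float:
    """Probability that the next flip is tails, given the observed counts."""
    return (prior_b + tails) / (prior_a + heads + prior_b + tails)

# Before any flips, the uniform prior gives the "logical" 50/50 answer:
print(posterior_predictive_tails(heads=0, tails=0))   # 0.5

# After 99 straight heads, the child's answer wins: about 1/101.
print(posterior_predictive_tails(heads=99, tails=0))  # ≈ 0.0099
```

Even this crude uniform prior is enough to push the answer from 50% down to about 1%: the data overwhelm the assumption of fairness.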
Without revealing too much more detail, Sid also counters the Bayesian yin with an important yang; together they make up two of the three laws of medicine he proposed in his eponymous 2015 book.
Some gems below from the Vox article above (read the entire article, but I couldn’t help myself from dropping these here):
- “The phenomenon of being swayed from accurate belief-building by our personal desires or emotions is known as motivated reasoning, and it affects every one of us, no matter how rational we think we are.”
- “Take confirmation bias, for example. This is our inclination to eagerly accept any information that confirms our opinion, and undervalue anything that contradicts it. It’s remarkably easy to spot in other people (especially those you don’t agree with politically), but extremely hard to spot in ourselves because the biasing happens unconsciously. But it’s always there.”
- “If there is one thing Bayes can teach us to be certain of, however, it is that there is no such thing as absolute certainty of belief. Like a spaceship trying to reach the speed of light, a posterior likelihood can only ever approach 100 percent (or 0 percent). It can never exactly reach it.”
- “[S]cience never officially ‘proves’ anything — it just seeks evidence to improve or weaken current theories until they approach 0 percent or 100 percent. This should serve as a reminder that we should always remain open to the possibility of changing our minds if strong enough evidence emerges.”
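The “spaceship” point in the excerpts above can be checked directly. Here is a minimal sketch, using two hypotheses of my own choosing (they are not from the article): the coin is either double-headed or fair, with even prior odds. Each observed head doubles the odds in favor of “double-headed,” so the posterior climbs toward 100 percent but, for any finite run of heads, never gets there.

```python
from fractions import Fraction  # exact arithmetic, so "never exactly 1" is visible

def posterior_double_headed(n_heads: int) -> Fraction:
    """P(coin is double-headed | n_heads straight heads), with 50/50 prior odds.

    Likelihood ratio per observed head: P(H | double-headed) / P(H | fair)
    = 1 / 0.5 = 2, so the posterior odds after n heads are 2**n : 1.
    """
    odds = Fraction(2) ** n_heads
    return odds / (1 + odds)

# The posterior approaches certainty but never reaches it exactly:
for n in (1, 10, 99):
    assert posterior_double_headed(n) < 1

print(float(posterior_double_headed(10)))  # ≈ 0.999 — close to 1, but not 1
```

Exact fractions are used here deliberately: in floating point, the posterior after 99 heads would round to 1.0 and hide exactly the asymptote the quote is describing.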