Bernoulli's Fallacy
Statistical Illogic and the Crisis of Modern Science
By: Aubrey Clayton
Narrated by: Tim H. Dixon
There is a logical flaw in the statistical methods used across experimental science. This fault is not a minor academic quibble: It underlies a reproducibility crisis now threatening entire disciplines. In an increasingly statistics-reliant society, this same deeply rooted error shapes decisions in medicine, law, and public policy, with profound consequences. The foundation of the problem is a misunderstanding of probability and its role in making inferences from observations.
Aubrey Clayton traces the history of how statistics went astray, beginning with the groundbreaking work of the 17th-century mathematician Jacob Bernoulli and winding through gambling, astronomy, and genetics. Clayton recounts the feuds among rival schools of statistics, exploring the surprisingly human problems that gave rise to the discipline and the all-too-human shortcomings that derailed it. He highlights how influential 19th- and 20th-century figures developed a statistical methodology they claimed was purely objective in order to silence critics of their political agendas, including eugenics.
Clayton provides a clear account of the mathematics and logic of probability, conveying complex concepts accessibly for listeners interested in the statistical methods that frame our understanding of the world. He contends that we need to take a Bayesian approach - that is, to incorporate prior knowledge when reasoning with incomplete information - in order to resolve the crisis. Ranging across math, philosophy, and culture, Bernoulli’s Fallacy explains why something has gone wrong with how we use data - and how to fix it.
PLEASE NOTE: When you purchase this title, the accompanying PDF will be available in your Audible Library along with the audio.
©2021 Aubrey Clayton (P)2021 Audible, Inc.
Clayton's work greatly enriched my journey through this domain, giving me essential historical context for the Bayesian-versus-frequentist debate and underscoring Jaynes' work as a pivotal intellectual achievement.
Entitled "Bernoulli’s Fallacy," the book adeptly traces the trajectory of statistical thought, journeying from Bernoulli's pioneering efforts to the unsettling application of statistics in the pursuit of eugenic agendas. It also confronts the contemporary "crisis of replication" afflicting various research fields, a crisis stemming from an excessive dependence on statistical significance and p-values in hypothesis evaluation.
In its initial chapters, the book articulates its core concepts, which, though not revolutionary, remain critical and frequently misunderstood in modern discussions. These concepts pivot around the idea of probability as a subjective belief informed by available knowledge, the imperative of articulating assumptions in probability statements, and the transformation of prior probabilities into posterior probabilities via observation. The book underscores that data alone cannot yield inferences; rather, it reshapes our existing narratives based on their plausibility.
A pivotal insight from the book is the acknowledgment that improbable events do indeed transpire. This realization challenges the practice of deducing the truth or falsity of hypotheses solely from the likelihood of observations. Instead, it advocates for adjusting our subjective belief in the plausibility of a hypothesis relative to other competing hypotheses.
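The updating rule the reviewer describes can be made concrete with a minimal sketch (my own illustration, not an example from the book; the coin biases and prior beliefs are assumed numbers). Observing 8 heads in 10 flips is improbable under a fair coin, yet the posterior still favors fairness once priors and rival hypotheses are weighed:

```python
# Minimal Bayesian update over two competing hypotheses (illustrative).
# H_fair: P(heads) = 0.5; H_biased: P(heads) = 0.9 (assumed for the sketch).
from math import comb

def binom_likelihood(p, heads, flips):
    """P(data | hypothesis): probability of `heads` heads in `flips` flips."""
    return comb(flips, heads) * p**heads * (1 - p)**(flips - heads)

priors = {"fair": 0.99, "biased": 0.01}   # assumed prior beliefs
p_heads = {"fair": 0.5, "biased": 0.9}
heads, flips = 8, 10

# Bayes' rule: posterior is proportional to prior x likelihood,
# normalized across the competing hypotheses.
unnorm = {h: priors[h] * binom_likelihood(p_heads[h], heads, flips)
          for h in priors}
total = sum(unnorm.values())
posterior = {h: v / total for h, v in unnorm.items()}
print(posterior)  # "fair" still dominates despite the surprising data
```

Even though 8-of-10 heads has only about a 4% probability under the fair-coin hypothesis, the data alone do not refute it: the posterior for "fair" remains above 95%, because the rival hypothesis started out far less plausible.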
Moreover, the book elucidates a critical distinction: Bayesian and frequentist methods are not merely two different perspectives. Rather, the Bayesian approach forms the bedrock of probability understanding, with the frequentist method emerging as a historical aberration, a special case within the expansive Bayesian paradigm.
It was particularly enlightening to learn how a small cadre of British mathematics professors, namely Galton, Fisher, and Pearson, engineered an entire statistical school of thought. This school, founded on flawed and convenient principles, served to justify and rationalize their eugenic and racist viewpoints, reinforcing the Victorian-era racial supremacy of the British upper class through a veneer of mathematical rationalization. The book offered a fascinating glimpse into a quasi-scientific method employed by researchers who, standing on shaky ground, resort to limited group sampling and mathematical subterfuge to lend false precision and authority to their biased models and probability findings.
Statistical method based upon Racist Justification
I confess that the middle sections dragged when discussing the various factions within the frequentist camps, and some very long equations there slow the text down in audiobook format. Still, this was deeply enjoyable and worthwhile. Recommend it!
Loved It
Eye-Opening
Unhinged and thought-provoking.
Rigorously Bayesian