• Bernoulli's Fallacy

  • Statistical Illogic and the Crisis of Modern Science
  • By: Aubrey Clayton
  • Narrated by: Tim H. Dixon
  • Length: 15 hrs and 14 mins
  • 4.6 out of 5 stars (107 ratings)


Publisher's summary

There is a logical flaw in the statistical methods used across experimental science. This fault is not a minor academic quibble: It underlies a reproducibility crisis now threatening entire disciplines. In an increasingly statistics-reliant society, this same deeply rooted error shapes decisions in medicine, law, and public policy, with profound consequences. The foundation of the problem is a misunderstanding of probability and its role in making inferences from observations.

Aubrey Clayton traces the history of how statistics went astray, beginning with the groundbreaking work of the 17th-century mathematician Jacob Bernoulli and winding through gambling, astronomy, and genetics. Clayton recounts the feuds among rival schools of statistics, exploring the surprisingly human problems that gave rise to the discipline and the all-too-human shortcomings that derailed it. He highlights how influential 19th- and 20th-century figures developed a statistical methodology they claimed was purely objective in order to silence critics of their political agendas, including eugenics.

Clayton provides a clear account of the mathematics and logic of probability, conveying complex concepts accessibly for listeners interested in the statistical methods that frame our understanding of the world. He contends that we need to take a Bayesian approach - that is, to incorporate prior knowledge when reasoning with incomplete information - in order to resolve the crisis. Ranging across math, philosophy, and culture, Bernoulli’s Fallacy explains why something has gone wrong with how we use data - and how to fix it.
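
As a rough sketch of the Bayesian updating the summary describes (the base rate and likelihoods below are invented for illustration, not taken from the book), here is how prior knowledge and new evidence combine into a posterior probability:

```python
# Minimal sketch of Bayesian updating: combine prior knowledge with new evidence.
# All numbers are hypothetical, chosen only to make the arithmetic visible.

prior = 0.01               # prior probability the hypothesis is true (e.g., a 1% base rate)
p_data_given_h = 0.95      # probability of the observed evidence if the hypothesis is true
p_data_given_not_h = 0.05  # probability of that same evidence if the hypothesis is false

# Bayes's rule: P(H | data) = P(data | H) * P(H) / P(data)
p_data = p_data_given_h * prior + p_data_given_not_h * (1 - prior)
posterior = p_data_given_h * prior / p_data

print(f"Posterior probability of the hypothesis: {posterior:.3f}")  # ~0.161
```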

PLEASE NOTE: When you purchase this title, the accompanying PDF will be available in your Audible Library along with the audio.

©2021 Aubrey Clayton (P)2021 Audible, Inc.

What listeners say about Bernoulli's Fallacy

Average customer ratings
Overall
  • 4.5 out of 5 stars
  • 5 Stars
    79
  • 4 Stars
    17
  • 3 Stars
    6
  • 2 Stars
    1
  • 1 Star
    4
Performance
  • 4.5 out of 5 stars
  • 5 Stars
    73
  • 4 Stars
    16
  • 3 Stars
    1
  • 2 Stars
    3
  • 1 Star
    1
Story
  • 4.5 out of 5 stars
  • 5 Stars
    67
  • 4 Stars
    17
  • 3 Stars
    5
  • 2 Stars
    2
  • 1 Star
    3

Reviews

  • Overall
    5 out of 5 stars
  • Performance
    4 out of 5 stars
  • Story
    5 out of 5 stars

Statistical method based upon Racist Justification

Immersed in the labyrinthine realms of statistical theory, I found myself captivated by the nuanced debate between the frequentist and Bayesian schools of thought. In the book I had the pleasure of reviewing, Clayton masterfully illuminates the stark incompatibilities that lie at the heart of these two methodologies. His adept critique of frequentist assertions, which he then artfully deconstructs, proved both enlightening and accessible, demanding no more than a foundational understanding of undergraduate statistics.

My intellectual voyage through this domain was profoundly enriched by Clayton's work, which bestowed upon me the essential historical context of the Bayesian versus frequentist discourse, underscoring Jaynes' work as a pivotal intellectual achievement.

Entitled "Bernoulli’s Fallacy," the book adeptly traces the trajectory of statistical thought, journeying from Bernoulli's pioneering efforts to the unsettling application of statistics in the pursuit of eugenic agendas. It also confronts the contemporary "crisis of replication" afflicting various research fields, a crisis stemming from an excessive dependence on statistical significance and p-values in hypothesis evaluation.

In its initial chapters, the book articulates its core concepts, which, though not revolutionary, remain critical and frequently misunderstood in modern discussions. These concepts pivot around the idea of probability as a subjective belief informed by available knowledge, the imperative of articulating assumptions in probability statements, and the transformation of prior probabilities into posterior probabilities via observation. The book underscores that data alone cannot yield inferences; rather, it reshapes our existing narratives based on their plausibility.

A pivotal insight from the book is the acknowledgment that improbable events do indeed transpire. This realization challenges the practice of deducing the veracity or fallacy of hypotheses solely based on the likelihood of observations. Instead, it advocates for adjusting our subjective belief in the plausibility of a hypothesis in relation to other competing hypotheses.
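
The reviewer's point can be made numerically. In the sketch below (the figures are hypothetical, not drawn from the book), a surprising observation has only a 0.1% chance under a well-supported hypothesis, yet the hypothesis survives because the rival explanation fares little better:

```python
# Sketch of the reviewer's point, with invented numbers: data that are improbable
# under a hypothesis do not refute it unless a rival hypothesis explains them better.

prior_h1, prior_h2 = 0.99, 0.01   # prior plausibility of two competing hypotheses
lik_h1, lik_h2 = 0.001, 0.002     # probability of the surprising observation under each

# Bayes's rule in odds form: posterior odds = prior odds * likelihood ratio
posterior_odds = (prior_h1 / prior_h2) * (lik_h1 / lik_h2)
posterior_h1 = posterior_odds / (1 + posterior_odds)

print(f"Posterior probability of H1: {posterior_h1:.3f}")  # ~0.980
# The observation had only a 0.1% chance under H1, yet H1 remains overwhelmingly
# plausible: the rival explains the data only twice as well and started out a
# hundred times less plausible.
```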

Moreover, the book elucidates a critical distinction: Bayesian and frequentist methods are not merely two different perspectives; rather, the Bayesian approach forms the bedrock of probability understanding, with the frequentist method emerging as a historical aberration, a specific instance within the expansive Bayesian paradigm.

It was particularly enlightening to learn how a small cadre of British mathematics professors, namely Galton, Fisher, and Pearson, engineered an entire statistical school of thought. This school, founded on flawed and convenient principles, served to justify and rationalize their eugenic and racist viewpoints, reinforcing the Victorian-era racial supremacy of the British upper class through a veneer of mathematical rationalization. The book offered a fascinating glimpse into a quasi-scientific method employed by researchers who, standing on shaky ground, resort to limited group sampling and mathematical subterfuge to lend false precision and authority to their biased models and probability findings.


  • Overall
    5 out of 5 stars
  • Performance
    5 out of 5 stars
  • Story
    5 out of 5 stars

Amazing book

Great read and a must-have for everyone in the risk management community. Yet another wake-up call to the flaws in many traditional risk analysis techniques.


1 person found this helpful

  • Overall
    5 out of 5 stars
  • Performance
    5 out of 5 stars
  • Story
    5 out of 5 stars

Changes World Views

Occasionally one finds a book or audio presentation that challenges the roots, the rock, on which all you thought and believed is based, until that rock dissolves and is taken away. For me, “Bernoulli’s Fallacy: Statistical Illogic and the Crisis of Modern Science” is that kind of book.

As a child, I always wanted to be a scientist when I grew up. Even though I never worked as a scientist, science was my passion, and the ability to use numerical analysis to aid in understanding the world, business, finance, production control, and scientific research and publications was the rock I based my view of reality on. From my earliest learning to graduate school in philosophy, it was what could be counted on and trusted: logic, mathematics, and philosophy could be used to solve any problem. Then I read both the text and digital versions and listened to the audio rendition once, twice, and now many more times.

Slowly, with the precision of a surgeon's knife, Aubrey Clayton has cut the roots of my knowing and smashed the rock on which they were anchored.

Coming to see the logical fallacy upon which much of modern statistics (the orthodox frequentist methods) rests has shown me that many of my key beliefs and understandings are built on logical errors, errors that, under some conditions, approximate what is correct or valid. However, when applied in general as the prescribed method of analysis, the criterion for publication, and the preferred approach above all others, these methods lead to many issues and often bogus or even silly conclusions.

Even worse, these methods are all that has been taught at every level of education in statistics departments. Starting from logical errors, everything that follows leads to asking the wrong questions, designing the wrong experiments, and analyzing results incorrectly. The orthodox methods lend themselves to easy manipulation, to uncertainty, and to clever hand-waving with complex techniques that can support the most absurd of all possible conclusions, some of which may result in millions of deaths.

Hopefully more will read and study the text and ideas and arrive at conclusions that aid them in doing better science, living more wholesome lives, and having a deeper appreciation for clear and accurate thinking.


  • Overall
    5 out of 5 stars
  • Performance
    5 out of 5 stars
  • Story
    5 out of 5 stars

A strong case for Bayes

Good intro to Bayesian statistics, but the descriptions of equations and graphs were distracting; I bought the book for those.


  • Overall
    5 out of 5 stars
  • Performance
    5 out of 5 stars
  • Story
    5 out of 5 stars

No punches pulled!

There has been some effort to make frequentist and Bayesian approaches seem compatible in the last few years. But they really aren’t compatible. Clayton gives a full explanation of why this is the case. The reader should know introductory statistics at the undergraduate level well to appreciate the arguments, but more advanced understanding beyond that is not required. Clayton is very generous in recapping basic claims in frequentist statistics before turning them upside down and demonstrating their absurdity.


4 people found this helpful

  • Overall
    5 out of 5 stars
  • Performance
    5 out of 5 stars
  • Story
    5 out of 5 stars

Excellent and persuasive

I read the book along with listening to the Audible narration. I'm a big Edwin Jaynes fan, so this was preaching to the choir. In particular, it reads like a Presbyterian sermon drawn from Probability Theory, driving home its themes thoroughly.


  • Overall
    5 out of 5 stars
  • Performance
    5 out of 5 stars
  • Story
    5 out of 5 stars

The best introduction to Bayesian stats I’ve read

The walk through the history of stats was very enlightening, and the discussion around frequency and probability explains why I've always had a hard time with stats in the past.


  • Overall
    5 out of 5 stars
  • Performance
    5 out of 5 stars
  • Story
    5 out of 5 stars
  • M
  • 01-06-23

Explanation of Bayesian (Jaynesian) statistics

The "Fallacy" in the title is this: The validity of a hypothesis can be judged based solely on how likely or unlikely the observed data would be if the hypothesis were true. The author, Aubrey Clayton, calls it Bernoulli's Fallacy because Jacob Bernoulli's Ars Conjectandi is devoted to determining how likely or unlikely an observation is given that a hypothesis is true. What we need is not the probability of the data given the hypothesis, but the probability of the hypothesis given the data.

In the preface, Clayton describes the Bayesian vs Frequentist schism as a "dispute about the nature and origins of probability: whether it comes from 'outside us' in the form of uncontrollable random noise in observations, or 'inside us' as our uncertainty given limited information on the state of the world." Like Clayton, I am a fan of E.T. Jaynes's "Probability Theory: The Logic of Science", which presents the argument (proof really) that probability is a number representing a proposition's plausibility based on background information -- a number which can be updated based on new observations. So, I am a member of the choir to which Clayton is preaching.

And he is preaching. This is one long argument against classical frequentist statistics. But Clayton never implies that frequentists dispute the validity of the formula universally known as "Bayes's Rule". (By the way, Bayes never wrote the actual formula.) Disputing the validity of Bayes's Rule would be like disputing the quadratic formula or the Pythagorean Theorem. Some of the objections to Bayes/Price/Laplace are focused on "equal priors", a term which Clayton never uses. Instead, he says "uniform priors", "principle of insufficient reason", or (from J.M.Keynes) "principle of indifference".
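
To make the "uniform prior" concrete (this is the standard textbook Beta-Binomial construction, not an example quoted from the book), a uniform prior on an unknown success probability, updated on observed counts, yields Laplace's rule of succession:

```python
# Sketch of a uniform prior (the "principle of indifference") in a simple case:
# a Beta(1, 1) prior on an unknown success probability, updated on k successes in
# n trials, has posterior mean (k + 1) / (n + 2), Laplace's rule of succession.
# The counts below are hypothetical.

def posterior_mean_uniform_prior(successes: int, trials: int) -> float:
    """Posterior mean of a Bernoulli probability under a uniform Beta(1, 1) prior."""
    return (successes + 1) / (trials + 2)

print(posterior_mean_uniform_prior(9, 10))  # 0.833..., pulled toward 1/2 from the raw frequency 0.9
```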

I appreciate that it is available in audio. The narrator is fine, but I find that I need the print version too.

As someone already interested in probability theory and statistics, I highly recommend this book. I can't say how less interested individuals would like it.


3 people found this helpful

  • Overall
    5 out of 5 stars
  • Performance
    5 out of 5 stars
  • Story
    4 out of 5 stars

A well-marked path that cuts to the chase

If read/listened to attentively, it guides directly to the present [and past] day A.I. fallacy. Picture this:
Noah = Mathematics
on his barge
an Elephant = Statistics
and
a Penguin = Computer Science
Noah is pointing to their offspring, a creature with the body of a penguin [C.S.] and, attached to it, an elephant head [Statistics].
Noah [Mathematics]: "What the hell is this?!..."
E.g. Lifting oneself by one's own hair is unlikely to come down to horsepower.
[.... as Artur Avila pointed out (2014), for which he won the Fields Medal - hands down, to everyone's maximum satisfaction - putting in The Last Word on entire fields of Mathematics!]


3 people found this helpful

  • Overall
    4 out of 5 stars
  • Performance
    4 out of 5 stars
  • Story
    4 out of 5 stars

Rigorously Bayesian

Ignore the review from the snowflake triggered by the word Berkeley. This book is good. It sets up a sound logical argument against frequentist statistics. It gives interesting historical details and explains why Bayesian methods are more robust.


19 people found this helpful