A former Wall Street quant sounds an alarm on the mathematical models that pervade modern life - and threaten to rip apart our social fabric
We live in the age of the algorithm. Increasingly the decisions that affect our lives - where we go to school, whether we get a car loan, how much we pay for health insurance - are being made not by humans but by mathematical models. In theory this should lead to greater fairness: Everyone is judged according to the same rules, and bias is eliminated.
But as Cathy O'Neil reveals in this urgent and necessary book, the opposite is true. The models being used today are opaque, unregulated, and uncontestable even when they're wrong. Most troublingly, they reinforce discrimination: If a poor student can't get a loan because a lending model deems him too risky (by virtue of his zip code), he's then cut off from the kind of education that could pull him out of poverty, and a vicious spiral ensues. Models are propping up the lucky and punishing the downtrodden, creating a "toxic cocktail for democracy". Welcome to the dark side of big data.
Tracing the arc of a person's life, O'Neil exposes the black-box models that shape our future, both as individuals and as a society. These "weapons of math destruction" score teachers and students, sort résumés, grant (or deny) loans, evaluate workers, target voters, set paroles, and monitor our health.
O'Neil calls on modelers to take more responsibility for their algorithms and on policy makers to regulate their use. But in the end, it's up to us to become savvier about the models that govern our lives. This important book empowers us to ask the tough questions, uncover the truth, and demand change.
©2016 Cathy O'Neil (P)2016 Random House Audio
"Weapons of Math Destruction opens the curtain on algorithms that exploit people and distort the truth while posing as neutral mathematical tools. This book is wise, fierce, and desperately necessary." (Jordan Ellenberg, University of Wisconsin-Madison, author of How Not to Be Wrong)
"Weapons of Math Destruction shines invaluable light on the invisible algorithms and complex mathematical models used by government and big business." (Astra Taylor, author of The People's Platform)
A WMD, or weapon of math destruction, is an algorithm whose use ends up discriminating against some people, or that causes problems when deployed at wide scale. For example, an algorithm that identifies poor people can deny them the services that would help them, making them poorer. The algorithm's predictions become self-fulfilling and prevent people from improving their condition.
The premise of the book is very good, and there are indeed a lot of good examples of how the misuse of big data algorithms can wreak havoc on society. The problem is that the author's indignation pushes her away from what should have been the main subject of the book.
In the course of the book, the author raises a lot of recurring problems with WMDs, like "birds of a feather" generalization, self-fulfilling predictions, discriminating proxy variables, non-appealable conclusions, and important factors that cannot be measured. But those categories of problems, which in my opinion should have been the focus of the book, take a backseat to its real subject: how many social problems the United States has.
Each chapter is written to denounce a specific social problem in the US, like predatory ads targeting the poor, racial discrimination against minorities, terrible working hours among low-wage workers, and so on. Some of those problems are indeed caused by WMDs. But for some, the link with the purported subject of the book is a bit tenuous. In some cases, the author even admits "well, that has nothing to do with WMDs, of course". And a lot of the time, WMDs are not the root cause of the problem; they only exacerbate an existing one.
That leaves you with a book that reads more like a classical sociology book denouncing the ills of American society, with some talk about big data sprinkled on top. If, like me, you are not an American, you may feel a bit left out by this book. This is a shame, because by refocusing on the generic problems caused by WMDs that I described above, the book could have had a much broader appeal. Don't get me wrong: the problems O'Neil talks about ARE important social problems. But they are very specific to her own country, and the militant tone can become grating. I felt at times that the author was not explaining to me how WMDs work and how to deal with them, but was rather trying to force her opinion of how the world should be on me. She was dictating how I should think, rather than helping me shape my own opinion.
In the end, I would have preferred a more objective tone and a better focus on the WMDs themselves, with conclusions that can be applied more broadly to everyone, not just US citizens.
O'Neil makes a strong case for the increasing importance of ethics in data science. The evidence for discrimination, whether intentional or not, is compelling. This book is a must for data professionals and anyone concerned with growing inequality in the economy.
This book is totally worth the listen for the intro and first chapter alone. It's very well-written and easy to follow, and manages to tell clear stories about how the software we use to assess teacher performance or insurance risk is all too often encoded with the prejudices and blind spots of the people who make it. It shows how that is already damaging equality and democracy, and warns of areas where it may get worse.
As a software designer, the one thing I would have loved from this book would be a little more depth about how software products might avoid these pitfalls. However, I'm probably coming at this book with unfair expectations, and it's likely a subject I just need to research more deeply.
Overall, if you enjoy podcasts like Freakonomics and Planet Money, you'll probably love this. Happy I listened!
I like Cathy's writing and analysis. I wish they had gotten a professional reader, though; it would have made the listen more enjoyable. It's not as if Cathy is awful or anything, she is just not a professional narrator.
The story is presented as a series of topics, grouped by theme, showing how data models can be built and used in ways that damage society and make bad situations worse. Cathy O'Neil reveals how data models can be relied on with good intentions in mind, and yet, through ignorance, dismissal, or narrow-sightedness, can misrepresent, injure, and derail people and the functioning of society.
This is a must read! I thoroughly enjoyed the real world examples of how everything I do is a data point that is being used against me.
It is well written and easy to listen to.
The author is a card carrying member of Occupy Wall Street. The entire book is a rant against the unfairness in society (e.g., poor people have bad credit because they are poor and society holds them back).
The algorithms and deep learning approaches are not really analyzed in any depth, but just serve as a convenient scapegoat for injustice.
There is nothing wrong with such a diatribe, but on its own it sure wouldn't sell many books, and it has been the topic of many, many books already.
As a graduate student studying Machine Learning, this book provided me with several new outlooks. I always knew that what I work on had the potential to change the world, but I never thought about how it could change it for the worse. Though I feel the author spends too little time talking about the good, her overall point is sound.
It's a good attempt to show the limitations of a growing algorithm-dependent society, but instead of rehashing the sociopolitical impacts, it would have been great to delve into how to address the shortfalls.
A very well-explained argument about how algorithms can unintentionally be corrosive toward those who are economically challenged.
Grabs your attention
How dangerous wellness programs established under the Affordable Care Act can be to your overall freedom.
The societal power of the Data Scientists
Just simply a fun and great book.