Nobody wants to fail. But in highly complex organizations, success can happen only when we confront our mistakes, learn from our own version of a black box, and create a climate where it's safe to fail.
We all have to endure failure from time to time, whether it's underperforming at a job interview, flunking an exam, or losing a pickup basketball game. But for people working in safety-critical industries, getting it wrong can have deadly consequences. Consider the shocking fact that preventable medical error is the third-biggest killer in the United States, causing more than 400,000 deaths every year. More people die from mistakes made by doctors and hospitals than from traffic accidents. And most of those mistakes are never made public because of malpractice settlements with nondisclosure clauses.
For a dramatically different approach to failure, look at aviation. Every passenger aircraft in the world is equipped with an almost indestructible black box. Whenever there's any sort of mishap, major or minor, the box is opened, the data is analyzed, and experts figure out exactly what went wrong. Then the facts are published and procedures are changed, so the same mistakes won't happen again. By applying this method in recent decades, the industry has created an astonishingly good safety record.
Few of us put lives at risk in our daily work, as surgeons and pilots do, but we all have a strong interest in avoiding predictable and preventable errors. So why don't we all embrace the aviation approach to failure rather than the health-care approach? As Matthew Syed shows in this eye-opening audiobook, the answer is rooted in human psychology and organizational culture.
Syed argues that the most important determinant of success in any field is an acknowledgment of failure and a willingness to engage with it....
©2015 Matthew Syed (P)2015 Penguin Audio
When you begin this book, it seems as if it will be a straight comparison between the airline safety model of reviewing and learning from accidents (open) and the medical system model of covering up mistakes (closed), and it does describe a few powerful illustrative examples from each of those fields. However, it turns out to have quite a few more dimensions and lessons. For example, it also turns its focus on the criminal justice system (closed) and the political system (closed). These analyses alone would make it a good book and support a strong argument that learning from mistakes is hugely important.
However, the author takes it a step further and looks at some of the psychological reasons why all of us find it so difficult to admit mistakes (cognitive dissonance), and how we so naturally create narratives that support our original decisions. Like some of the best books in this genre, the book forces us to admit that we also are subject to the same kinds of biases that make it difficult to create and maintain "open" systems that encourage us to regularly test our ideas, even while it provides one example after another of why mistakes are essential to learning.
Simon Slater is a good narrator: his pace, accent, and expression all contribute to an excellent audiobook.
I love listening to books when cycling, paddleboarding, etc., but I press pause when I need to concentrate. It's safer and I don't lose the plot!
This book is all about failure. It’s about the fact that we hide and stigmatise failure when we should be embracing it - and using it to continuously improve all our enterprises by submitting them to trial and error.
He gives many excellent, moving and gripping examples of contexts where this approach was lacking and resulted in dire consequences. In the medical profession, senior doctors have very high status and self-esteem, and they don't like to admit their errors. They use euphemisms such as 'a complication' or an 'adverse event'. The author argues that this lack of openness about error deprives us of the opportunity to analyse what went wrong and use that information to continuously improve our systems. He gives a graphic example of a woman who needlessly died because a group of doctors had difficulty passing a breathing tube during a routine operation. They became fixated on this task and lost track of time, when they could have performed an emergency tracheotomy, a relatively straightforward lifesaving procedure. The nurse was standing there ready with the tracheotomy kit, but she only hinted instead of speaking up forcefully, because of the steep authority gradient between her and the doctors.
A second example is criminal law. Since the invention of DNA testing, it has become apparent that our jails hold many innocent people who were wrongly convicted. But the legal system has been slow to admit its errors and to introduce processes to fix this. Again, high-status people, such as investigators and prosecutors, are reluctant to admit that they are error-prone.
One industry that seems to get this right is aviation. All errors are investigated thoroughly and recommendations are made to change practice. For example, there have been many crashes caused when junior members of a crew wouldn't speak up to alert the captain to a danger, because the captain was the commander and speaking up could have resulted in a severe rebuke. So the aviation industry changed the culture to a teamwork approach and encouraged all crew members to speak up. This has been a great success, and lessons from it have now been adopted in many medical settings.
In the field of sociology, there was an initiative called 'Scared Straight', designed to put potential delinquents off serious crime by sending them to a prison for 3 hours to spend time with hardened criminals. It appeared to work, and it was subsequently adopted worldwide. But nobody actually tested whether it really worked, beyond sending out some questionnaires. Once the intervention was subjected to rigorous scientific testing using a randomised controlled trial, it was shown to actually increase criminality in the subjects by about 25%.
The point is, you don't know whether something is going to succeed or fail unless you test it. You can't predict whether something will work purely by intuition or because it seems logical; the world is just too complex and there are too many unknown variables. So you should test your idea, then change it and test it again, and so on. This process works the same way natural selection works in evolution. The entrepreneur who invented the very successful Dyson vacuum cleaner made over 5,000 prototypes, and this resulted in an excellent product. He wasn't afraid of failure; he harnessed it as a tool to drive continuous improvement.
As you have probably guessed if you've read this far, I enjoyed this book. It's interesting, and as well as giving an insight into how major institutions and industries could be improved if they embraced failure, it also shares some ideas that we can all apply in our own lives.
Qurie de Berk
I had no expectations of this book. I don't even know how I came by it. One day I just picked it up and started to read it. At the opening chapter I almost put it down. What a horrible story! But I stuck with it, and soon I was unable to put it down. It is like three books in one. The good stuff just keeps on coming. It is so rich with information that it is too much to take in in one go.
Wonderful book! This must have taken the author ages to write.
I really enjoyed this book. The concepts are very important, innovative, and current in 2016. I also enjoyed how the author ties in other great books I've been meaning to read (Creativity, Inc., Taleb, …).
As an avid nonfiction consumer, I highly recommend this book.
Let's face it, these authors aren't paying me, so there's no need to lie!!
This is one of those books that, if I were God, I would force everybody to read. It describes the motivation (spoiler alert: cognitive dissonance) behind many of the dumbest decisions that human beings make.
One strikingly egregious example is criminal prosecutors and their reluctance to immediately release someone from prison after exonerating DNA evidence has been presented post-conviction. Cognitive dissonance. It is the great destroyer of logic and rationality, and the more you learn about it, the easier it becomes to spot it and call people out on it.
Trust me, this is a book that EVERYONE should read. Women, men, kids of all ages — EVERYONE will appreciate its science, research, and conclusions.
It opened my eyes nice and wide.
It uses oversimplified assumptions as proof in extremely complex issues, and vice versa. A bit longer than it had to be and a bit too dramatic at times. Overall a pretty good book that serves as a reminder of some basics you probably already knew.
There are a lot of fad management styles out there. This book simply poses the questions "Do any of them work, and how do you know?"
This book offered me a new way of thinking and seeing the world. There are things we take for granted but are critical to the outcomes we experience.
I love the book and the lessons are very applicable.