“And this is invariably how progress happens. It is an interplay between the practical and the theoretical, between top-down and bottom-up, between creativity and discipline, between the small picture and the big picture. The crucial point – and the one that is most dramatically overlooked in our culture – is that in all these things, failure is a blessing, not a curse. It is the jolt that inspires creativity and the selection test that drives evolution.” (Black Box Thinking)
Author: Matthew Syed
Publisher: John Murray
Publication Date: 2015
Origin: There are two reasons why I read Black Box Thinking. First, I’d picked up Matthew Syed’s first book, Bounce, on a whim – and loved it. I’ve subsequently recommended it to many friends and colleagues. So, by default I’m interested in what Syed has to say. Second, and not coincidentally, I’ve read a great deal about ‘success’: what it means, how it’s achieved, the factors involved, the role of luck, and so on. Black Box Thinking promised to add to that trove of information and insight. I want to learn from my own failures and from the failures of others, and I want to make use of that knowledge both as an individual and as a member of or leader of organizations.
Summary: Learn from your mistakes!
OK, so there’s a bit more to it than that, and I don’t provide that simple summary above to be flippant or cavalier.
In Syed’s words:
“The purpose of this book is to offer a radically different perspective. It will argue that we need to redefine our relationship with failure as individuals, as organisations, and as societies. This is the most important step on the road to a high-performance revolution: increasing the speed of development in human activity and transforming those areas that have been left behind. Only by redefining failure will we unleash progress, creativity and resilience.”
That radically different perspective is a concept called ‘black box thinking’. Again, quoting Syed:
“This, then, is what we might call ‘black box thinking’. For organisations beyond aviation, it is not about creating a literal black box; rather, it is about the willingness and tenacity to investigate the lessons that often exist when we fail, but which we rarely exploit. It is about creating systems and cultures that enable organizations to learn from errors, rather than being threatened by them.”
To build, illustrate, and argue his case, Syed structures the book into six parts:
- Part one explains the logic of failure, using often-tragic examples to illustrate and to contrast different approaches. This part culminates by explaining the paradox of success: that it is built on failure.
- Part two dives into cognitive dissonance (one of the best dissonances!) to highlight that our brains are often jerks when it comes to learning from our mistakes. The examples from the criminal justice system will chill your blood – and if they don’t, well, you might want to get checked for the evil gene and see if they can CRISPR it out of you.
- Part three, “Confronting Complexity”, confronts complexity; that is, this part starts to explain how learning from mistakes is both necessary and often difficult in a very complex world filled with complex problems. We start to see the powerful combination of top-down and bottom-up approaches working in tandem to achieve what neither could alone, and we also explore some pitfalls, particularly using the example of the “Scared Straight” programs that are popular with criminal justice types.
- Part four delves into marginal gains, explaining both their power and practicality in the aforementioned, annoyingly complex world (or version of The Matrix) in which we reside.
- Part five touches on how to create productive, learning-oriented failure models without succumbing to the blame game and its finger-pointing.
- Part six, “Creating a Growth Culture”, brings it all together, so we can apply the knowledge gained and lessons learned throughout.
My Take: While I was reading Black Box Thinking, I kept mentioning it to people. That’s a pretty strong sign that a book is really resonating with me. In some cases, I was sharing some of the specific examples (e.g., the Unilever nozzle, some of the justice system points, contrasting healthcare with aviation, etc.), and in other cases it was the more general concept.
I remember distinctly speaking with a colleague about cognitive dissonance, and how loath people are to change their minds about something when their egos are involved; the colleague and I were both perplexed, as we each identify with a growth mindset, rather than an “I have to be right all the time!” mindset.
Anyway, what does this rambling represent? Most likely: that I like and recommend Black Box Thinking, and that I think folks in all manner of walks of life (hey, that’s you!) could benefit from understanding its ideas and recommendations.
Read This Book If: …You want to help create organizations that learn from mistakes, or at the very least want to be able to understand why many organizations (and people) don’t.
Notes and Quotes:
“A progressive attitude to failure turns out to be a cornerstone of success in any institution.”
- p12, after a heart-wrenching introductory story about a routine medical operation gone tragically wrong: “It sounds simple, doesn’t it? Learning from failure has the status of a cliché. But it turns out that, for reasons both prosaic and profound, a failure to learn from mistakes has been one of the single greatest obstacles to human progress. Healthcare is just one strand in a long, rich story of evasion. Confronting this could not only transform healthcare, but business, politics and much else besides. A progressive attitude to failure turns out to be a cornerstone of success in any institution.”
- p16: “It is probably worth stating here that nobody wants to fail. We all want to succeed, whether we are entrepreneurs, sportsmen, politicians, scientists or parents. But at a collective level, at the level of systemic complexity, success can only happen when we admit our mistakes, learn from them, and create a climate where it is, in a certain sense, ‘safe’ to fail.” Perhaps worth adding that in Lean Startup methodology, hypotheses are consciously tested, and failure is seen as a tool to validate learning.
- p19: note to self
- p19, after continuing the introductory anecdote to illustrate the reasoning and the language people use when discussing failure: “This kind of reasoning represents the essential anatomy of failure-denial. Self-justification, allied to a wider cultural allergy to failure, morphs into an almost insurmountable barrier to progress.”
- p27, quoting Eleanor Roosevelt: “Learn from the mistakes of others. You can’t live long enough to make them all yourself.”
- p29, after describing some aviation disasters: “In each case the investigators realised that crews were losing their perception of time. Attention, it turns out, is a scarce resource: if you focus on one thing, you will lose awareness of other things.” I have two thoughts here: first, this behaviour is the negative flipside of Mihaly Csikszentmihalyi’s well-examined concept of “flow”. Second, we must endeavour to train ourselves to be able to step back and think in a wider context when we’re in pressure situations, else we’ll fall victim to losing perception of time.
- p30, on mitigated language: “This is now a well-studied aspect of psychology. Social hierarchies inhibit assertiveness. We talk to those in authority in what is called ‘mitigated language’.”
- p33, defining the titular term (note that “black box” is used in the sense of a data recorder, not an unknown process): “This, then, is what we might call ‘black box thinking’. For organisations beyond aviation, it is not about creating a literal black box; rather, it is about the willingness and tenacity to investigate the lessons that often exist when we fail, but which we rarely exploit. It is about creating systems and cultures that enable organizations to learn from errors, rather than being threatened by them.”
- p36-38 explains the well-known story of Abraham Wald, and how he contributed to understanding aircraft vulnerabilities during wartime. I love the story (and its lessons) and have told it in professional settings a few times over the years after first learning about it in Brilliant Blunders.
“You have to take into account all the data, including the data you cannot immediately see, if you are going to learn from adverse incidents.”
- p39 summarizes: “This is a powerful example because it reveals a couple of key things. The first is that you have to take into account all the data, including the data you cannot immediately see, if you are going to learn from adverse incidents. But it also emphasises that learning from failure is not always easy, even in conceptual terms, let alone emotional terms. It takes careful thought and a willingness to pierce through the surface assumptions. Often, it means looking beyond the obvious data to glimpse the underlying lessons. This is not just true of learning in aviation, but in business, politics and beyond.”
- p44, quoting Captain Chesley Sullenberger a few months after his famous landing on the Hudson river, underscores both the cost and the returns of black box thinking: “Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died… We have purchased at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and have to relearn them.”
- p51, a note for anyone who has to conduct performance evaluations: “Feedback, when delayed, is considerably less effective in improving intuitive judgement.”
- p52, TIL about Phlogiston Theory
- p52: “Success is always the tip of an iceberg. We learn vogue theories, we fly in astonishingly safe aircraft, we marvel at the virtuosity of experts. But beneath the surface of success – outside our view, often outside our awareness – is a mountain of necessary failure.”
- p57: note to self
“Failure is inevitable in a complex world. This is precisely why learning from mistakes is so imperative.”
- p58, just underscoring what many still don’t see: “Failure is inevitable in a complex world. This is precisely why learning from mistakes is so imperative.”
- p61, with a fun/terrible fact, quoting Dr. Michael Gillam, director of the Microsoft Medical Media Lab: “The total time from Lancaster’s definitive demonstration of how to prevent scurvy to adoption across the British empire was 264 years.”
- p73: “We cannot learn if we close our eyes to inconvenient truths, but we will see that this is precisely what the human mind is wired up to do, often in astonishing ways.”
- p78-79 tells the story of When Prophecy Fails to illustrate cognitive dissonance
- p80: “When we are confronted with evidence that challenges our deeply held beliefs we are more likely to reframe the evidence than we are to alter our beliefs. We simply invent new reasons, new justifications, new explanations. Sometimes we ignore the evidence altogether.”
“The difficulty with (accepting our original judgements as being wrong) is simple: it is threatening. It requires us to accept that we are not as smart as we like to think. It forces us to acknowledge that we can sometimes be wrong, even on issues on which we have staked a great deal.”
- p81: “The difficulty with (accepting our original judgements as being wrong) is simple: it is threatening. It requires us to accept that we are not as smart as we like to think. It forces us to acknowledge that we can sometimes be wrong, even on issues on which we have staked a great deal.” In my own experience, I observe that people would rather cling to a known and provable falsehood than be of sufficient character to genuinely pursue truth. This phenomenon baffles me.
- p83 clarifies and extends, a bit, on the circumstances in which we suffer from dissonance: “It is only when we have staked our ego that our mistakes of judgement become threatening. That is when we build defensive walls and deploy cognitive filters.” So there’s a small lesson here: if you want to change someone’s mind, then help them decouple their ego from the decision.
“Lying to oneself destroys the very possibility of learning. How can one learn from failure if one has convinced oneself that a failure didn’t actually occur?”
- p95, on internal and external deception: “Self-justification is more insidious. Lying to oneself destroys the very possibility of learning. How can one learn from failure if one has convinced oneself – through the endlessly subtle means of self-justification, narrative manipulation, and the wider psychological arsenal of dissonance-reduction – that a failure didn’t actually occur?”
- p109: “In his seminal book, Why Smart Executives Fail: And What You Can Learn from Their Mistakes, Sydney Finkelstein, a management professor at Dartmouth College, investigated major failures at over fifty institutions. He found that error-denial increases as you go up the pecking order… The reason should by now be obvious. It is those at the top of business who are responsible for strategy and therefore have the most to lose if things go wrong. They are far more likely to cling on to the idea that the strategy is wise, even as it is falling apart, and to reframe any evidence that says otherwise. Blinded by dissonance, they are also the least likely to learn the lessons.”
- p111 has a neat example, the important lesson of which is that to avoid confirmation bias one needs to consciously try to falsify a hypothesis, rather than just test for validity.
- p112, including a wonderful passage from Karl Popper: “It provides another reason why the scientific mindset, with a healthy emphasis on falsification, is so vital. It acts as a corrective to our tendency to spend our time confirming what we think we already know, rather than seeking to discover what we don’t know. As the philosopher Karl Popper wrote: ‘For if we are uncritical we shall always find what we want: we shall look for, and find, confirmations, and we shall look away from, and not see, whatever might be dangerous to our pet theories. In this way it is only too easy to obtain … overwhelming evidence in favour of a theory which, if approached critically, would have been refuted.'”
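To make the falsification point concrete, here's a minimal sketch of my own (a hypothetical rule-guessing game, not the book's p111 example): the hidden rule is “any ascending triple”, but our pet theory is “numbers ascending by two”. Tests chosen to confirm the theory all pass and prove nothing; only a test designed to fail under the theory reveals that it is too narrow.

```python
# A hypothetical rule-guessing game (my illustration, not Syed's exact example).
# Hidden rule: any strictly ascending triple of numbers.
def hidden_rule(a, b, c):
    return a < b < c

# Our pet theory: the numbers ascend by exactly 2.
def pet_theory(a, b, c):
    return b - a == 2 and c - b == 2

# Confirmation strategy: only test triples our theory predicts will pass.
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
print(all(hidden_rule(*t) for t in confirming_tests))  # True -- the theory looks "proven"

# Falsification strategy: test triples our theory predicts will FAIL.
disconfirming_tests = [(1, 2, 3), (5, 10, 20)]
for t in disconfirming_tests:
    print(t, "theory says:", pet_theory(*t), "reality says:", hidden_rule(*t))
# Theory says False, reality says True -- the mismatch shows the theory is too narrow.
```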
“Intelligence and seniority, when allied to cognitive dissonance and ego, is one of the most formidable barriers to progress in the world today.”
- p116, in no uncertain terms: “Intelligence and seniority, when allied to cognitive dissonance and ego, is one of the most formidable barriers to progress in the world today.”
- p127: note to self
- p135-137 have a terrific example from Unilever involving the design of a detergent nozzle
- p144, reminiscent of some of the evidence in Antifragile: “Tinkering, tweaking, learning from practical mistakes: all have speed on their side. Theoretical leaps, while prodigious, are far less frequent. Ultimately, technological progress is a complex interplay between theoretical and practical knowledge, each informing the other in an upward spiral. But we often neglect the messy, iterative, bottom-up aspect of this change because it is easy to regard the world, so to speak, in a top-down way. We try to comprehend it from above rather than discovering it from below.”
“We are so eager to impose patterns upon what we see, so hardwired to provide explanations, that we are capable of ‘explaining’ opposite outcomes with the same cause without noticing the inconsistency.”
- p147, on narrative fallacies (a human propensity to create clear and compelling stories/explanations after an event): “We are so eager to impose patterns upon what we see, so hardwired to provide explanations, that we are capable of ‘explaining’ opposite outcomes with the same cause without noticing the inconsistency.”
- p147, quoting Daniel Kahneman: “Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen. Any recent salient event is a candidate to become the kernel of a causal narrative.”
- In the margins of p152 I’ve scrawled, “Ugh, how many times have I said this???”, alongside: “The desire for perfection rests upon two fallacies. The first resides in the miscalculation that you can create the optimal solution sitting in a bedroom or ivory tower and thinking things through rather than getting out into the real world and testing assumptions, thus finding their flaws. It is the problem of top-down over bottom-up. The second fallacy is the fear of failure. Earlier on we looked at situations where people fail and then proceed to either ignore or conceal those failures. Perfectionism is, in many ways, more extreme. You spend so much time designing and strategising that you don’t get a chance to fail at all, at least until it is too late… You are so worried about messing up that you never even get on the field of play.” It’s worth noting that these problems are exactly the ones that can be avoided by implementing Lean Startup methodologies.
- p158 contrasts the ballistic model of success with the guided missile model, because what happens after you pull the trigger matters a lot: “The key is to adjust the flight of the bullet, to integrate this new information into the ongoing trajectory. Success is not just dependent on before-the-event reasoning, it is also about after-the-trigger adaptation. The more you can detect failure (i.e., deviation from the target), the more you can finesse the path of the bullet onto the right track. And this, of course, is the story of aviation, of biological evolution and well-functioning markets.”
- p158 continues: “It is by getting the balance right between top-down strategy and a rigorous adaptation process that you hit the target. It is fusing what we already know, and what we can still learn.”
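To ground the guided-missile metaphor, here's a toy sketch of my own (made-up numbers, not from the book): the “ballistic” run commits to a slightly wrong initial aim and never adjusts, while the “guided” run measures its remaining error after every step and feeds a proportional correction back into its velocity.

```python
# Toy illustration (mine, not the book's): ballistic vs. guided trajectories in 1D.
target = 100.0
velocity = 9.0          # slightly wrong initial aim: 10.0 per step would be "perfect"
steps = 10

# Ballistic: decide everything up front, never adjust after the trigger is pulled.
ballistic = sum(velocity for _ in range(steps))

# Guided: after each step, measure the remaining error and nudge the velocity.
position = 0.0
v = velocity
for step in range(steps):
    position += v
    remaining = target - position
    steps_left = steps - step - 1
    if steps_left:
        v += 0.5 * (remaining / steps_left - v)  # proportional correction toward what's needed

print(f"ballistic miss: {abs(target - ballistic):.1f}")  # off by 10.0 with these numbers
print(f"guided miss:    {abs(target - position):.1f}")   # much closer to the target
```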
“Clinging to cherished ideas because you are personally associated with them is tantamount to ossification. As the great British economist John Maynard Keynes put it: ‘When my information changes, I alter my conclusions. What do you do, sir?'”
- p159: “Clinging to cherished ideas because you are personally associated with them is tantamount to ossification. As the great British economist John Maynard Keynes put it: ‘When my information changes, I alter my conclusions. What do you do, sir?'”
- p178, after explaining how the widely adopted Scared Straight programs are actually counterproductive: “Often, you need to test the counterfactual… And this is really the point. It doesn’t require people to be actively deceitful or negligent for mistakes to be perpetuated. Sometimes it can happen in plain view of the evidence, because people either don’t know how to, or are subconsciously unwilling to, interrogate the data. But how often do we actually test our policies and strategies? How often do we probe our assumptions, in life or at work?”
- p190, on the ethics of randomized clinical trials: “Critics of randomised clinical trials often worry about the morality of ‘experimenting on people’. Why should one group get X while another is getting Y? Shouldn’t everyone have access to the best possible treatment? Put like this, RCTs may seem unethical. But now think about it in a different way. If you are genuinely unsure which policy is the most effective, it is only by running a trial that you can find out. The alternative is not morally neutral, it simply means that you never learn. In the long run this helps nobody.”
- p190 goes on, quoting French-born economist Esther Duflo on the importance of incremental/marginal gains: “It is very easy to sit back and come up with grand theories about how to change the world. But often our intuitions are wrong. The world is too complex to figure everything out from your armchair. The only way to be sure is to go out and test your ideas and programme, and to realise that you will often be wrong. But that is not a bad thing. It leads to progress.”
“Marginal gains is not about making small changes and hoping they fly. Rather, it is about breaking down a big problem into small parts in order to rigorously establish what works and what doesn’t.”
- p192: “Marginal gains is not about making small changes and hoping they fly. Rather, it is about breaking down a big problem into small parts in order to rigorously establish what works and what doesn’t.”
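As a concrete (and entirely illustrative) sketch of what “rigorously establish what works” can look like for a single small change: isolate one tweak, run it against a control, and use a permutation test to check whether the observed lift is bigger than chance alone would produce.

```python
import random

# A minimal sketch (my own numbers, not from the book) of testing one marginal change:
# did variant B's conversion rate beat variant A's by more than chance would explain?
random.seed(0)
a = [1 if random.random() < 0.10 else 0 for _ in range(2000)]  # control, ~10% success
b = [1 if random.random() < 0.12 else 0 for _ in range(2000)]  # one small tweak, ~12%

observed = sum(b) / len(b) - sum(a) / len(a)

# Permutation test: shuffle the labels many times and count how often chance alone
# produces a difference at least as large as the one we observed.
pooled = a + b
extreme = 0
for _ in range(5000):
    random.shuffle(pooled)
    diff = sum(pooled[len(a):]) / len(b) - sum(pooled[:len(a)]) / len(a)
    if diff >= observed:
        extreme += 1

print(f"observed lift: {observed:.3f}, p-value ~ {extreme / 5000:.3f}")
# A small p-value suggests the tweak genuinely works; a large one says "keep iterating".
```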
- p195, I just liked this quote: “Creativity not guided by a feedback mechanism is little more than white noise. Success is a complex interplay between creativity and measurement, the two operating together, the two sides of the optimisation loop.”
“Creativity not guided by a feedback mechanism is little more than white noise. Success is a complex interplay between creativity and measurement, the two operating together, the two sides of the optimisation loop.”
- p203 talks a little bit about local maxima, which is something for which I’m always on guard due to my engineering background. I most recently cautioned a former colleague about local maxima as a potential negative side-effect of optimization during a conversation about digital marketing. Syed references Eric Ries’ post, Lessons Learned: Learning is better than optimization (the local maximum problem).
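For anyone who hasn't bumped into the local-maximum problem before, here's a tiny sketch of my own (an invented objective function, not from Syed or Ries): greedy hill climbing dutifully polishes the nearby peak and never discovers the taller one, whereas adding some exploration, such as random restarts, can escape the trap.

```python
import random

# My own toy objective: a modest peak near x=2 and a much better one near x=8.
def score(x):
    return max(0.0, 3 - (x - 2) ** 2) + max(0.0, 6 - (x - 8) ** 2)

# Greedy hill climbing: keep taking small steps as long as they improve the score.
def hill_climb(x, step=0.1, iters=200):
    for _ in range(iters):
        best = max([x - step, x, x + step], key=score)
        if best == x:
            break
        x = best
    return x

stuck = hill_climb(1.0)  # starts near the small peak and stays there
print(f"greedy from x=1.0: x={stuck:.1f}, score={score(stuck):.1f}")

# Escaping the trap requires exploration, e.g. restarting from many random points.
random.seed(0)
starts = [random.uniform(0, 10) for _ in range(10)]
best = max((hill_climb(s) for s in starts), key=score)
print(f"best of 10 random restarts: x={best:.1f}, score={score(best):.1f}")
```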
- p207, quoting Sir James Dyson: “People think of creativity as a mystical process. The idea that creative insights emerge from the ether, through pure contemplation. This model conceives of innovation as something that happens to people, normally geniuses. But this could not be more wrong. Creativity is something that has to be worked at, and it has specific characteristics. Unless we understand how it happens, we will not improve our creativity, as a society or as a world.”
- p212, another short, punchy line: “Removing failure from innovation is like removing oxygen from a fire.”
- p219, again quoting Dyson: “If insight is about the big picture, development is about the small picture. The trick is to sustain both perspectives at the same time.”
- p220, continuing the Dyson story: “Dyson was not the first to come up with the idea of a cyclone vacuum cleaner. He was not even the second, or the third. But he was the only one with the stamina to ‘fail’ his concept into a workable solution. And he had the rigour to create an efficient manufacturing process, so he could sell a consistent product.”
- p221, quoting Jim Collins: “The great task, rarely achieved, is to blend creative intensity with relentless discipline so as to amplify the creativity rather than destroy it.”
- p227, with a nice summary: “And this is invariably how progress happens. It is an interplay between the practical and the theoretical, between top-down and bottom-up, between creativity and discipline, between the small picture and the big picture. The crucial point – and the one that is most dramatically overlooked in our culture – is that in all these things, failure is a blessing, not a curse. It is the jolt that inspires creativity and the selection test that drives evolution.”
- p245, on the risks of blame defeating a culture of openness: “The question, according to Sidney Dekker, is not: who is to blame? It is not even: where, precisely, is the line between justifiable blame and an honest mistake? because this can never be determined in the abstract. Rather, the question is: do those within the organization trust the people who are tasked with drawing that line? It is only when people trust those sitting in judgement that they will be open and diligent.”
- p249 lists how projects often flow: “The six phases of a project: 1. Enthusiasm 2. Disillusionment 3. Panic 4. Search for the guilty 5. Punishment of the innocent 6. Rewards for the uninvolved”
- p250, once again quoting Karl Popper: “True ignorance is not the absence of knowledge, but the refusal to acquire it.”
- Hey hey, I can relate to this, from p286: “This hints at one of the great paradoxes about school and life. Often it is those who are the most successful who are also the most vulnerable. They have won so many plaudits, been praised so lavishly for their flawless performances, that they haven’t learned to deal with the setbacks that confront us all.”
- We owe this eloquent quote warning against confirmation bias, from p298, to Francis Bacon: “The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects, in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate.”
- p310: “Another practical issue when it comes to harnessing the power of failure is to do so while minimising the costs. One way to achieve this for corporations and governments is with pilot schemes. These provide an opportunity to learn on a small scale. But what is vital is that pilots are designed to test assumptions rather than confirm them.”
- p311, on the concept of a ‘pre-mortem’: “Another ‘failure-based’ technique, which has come into vogue in recent years, is the so-called pre-mortem. With this method a team is invited to consider why a plan has gone wrong before it has even been put into action. It is the ultimate ‘fail fast’ technique. The idea is to encourage people to be open about their concerns, rather than hiding them out of fear of sounding negative. The pre-mortem is crucially different from considering what might go wrong. With a pre-mortem, the team is told, in effect, that ‘the patient is dead’: the project has failed; the objectives have not been met; the plans have bombed. Team members are then asked to generate plausible reasons why. By making the failure concrete rather than abstract, it alters the way the mind thinks about the problem.” I’ll add that this approach would also be a handy way to determine metrics and milestones that might be used while a project is in-flight; it’s much better to think of these things ahead of time than to try to shoehorn them in later on.
“True ignorance is not the absence of knowledge, but the refusal to acquire it.” – Karl Popper