Books : reviews

Alina Tugend.
Better by Mistake: the unexpected benefits of being wrong.
Riverhead Books. 2011

rating : 3.5 : worth reading
review : 10 August 2018

New York Times columnist Alina Tugend delivers an eye-opening big idea: embracing mistakes can make us smarter, healthier, and happier in every facet of our lives.

In this persuasive book, journalist Alina Tugend examines the delicate tension between what we’re told—we must make mistakes in order to learn—and the reality—we often get punished for them. She shows us that mistakes are everywhere, and when we acknowledge and identify them correctly, we can improve not only ourselves, but our families, our work, and the world around us as well. Bold and dynamic, insightful and provocative, Better by Mistake turns our cultural wisdom on its head to illustrate the downside of striving for perfection and the rewards of acknowledging and accepting mistakes and embracing the imperfection in all of us.

We often hear that we should learn from our mistakes (or, preferably, from the mistakes of others). Here Tugend digs down into this aphorism, and shows how we are actually given mixed messages: if mistakes are such a wonderful learning opportunity, why are we ashamed of them, punished for them, and why do we try so hard to avoid them, to the extent of over-constraining our ambitions?

Tugend recounts some work showing that mistakes while learning can be valuable, for some people at least, and should be allowed, even encouraged, to enable the deeper learning that results.

[p93] Gully and his colleagues found that not everyone learned better by making mistakes, but those with certain types of personality traits—who are good at processing information, open to learning, and not overly conscientious—were more effectively trained by being encouraged to make mistakes rather than avoid them. In other research, Gully and colleagues found that telling people to perform well during training resulted in higher immediate performance; but it also resulted in shallower processing of information, more superficial learning, and less confidence. In contrast, those people who were told to learn—and not worry about mistakes—during training did more poorly initially but ended up with deeper processing of information, more complex learning, and more confidence about performance. Those in the latter group also showed higher performance when faced with a really challenging version of the task they were trained to perform.

Of course, mistakes while learning are somewhat different from mistakes that happen during operational use. But we are actually learning all the time, and so these other mistakes should also be exploited as learning opportunities, both to educate the one who errs, and to make the overall system more robust to inevitable mistakes.

[p112] “when an adverse event occurs, the important issue is not who blundered, but how and why the defenses failed,” […] In life-or-death situations, it is important to set up a system in which, to whatever degree possible, one person’s error cannot sink the ship. […] focussing on active errors lets the latent errors remain in the system, and “their accumulation actually makes the system more prone to future failure.”

The key point here is that a systems approach to error does not look just at the “active error”, the immediate surface error that seems the cause of the problem, but also digs down to find the “latent errors”, the systemic issues that allowed the active error both to occur and to have such poor consequences. Just fixing active errors results in a patchwork of arbitrary rules and regulations that can make the system more complex and fragile.

Using errors productively during learning requires a supportive culture. Tugend gives an example of the culture in Japanese schooling.

[p191] Making a mistake, therefore, isn’t a reflection of your lack of ability or intelligence, but simply that you haven’t learned something yet. “You have to show you’re trying hard—they have this expression for ‘facing the desk,’ ” […]

So there are two aspects. First, the learner has to be trying: their mistake is a good-faith error, not the result of carelessness, or even deliberate sabotage; the learner genuinely thought that their mistake was the right thing to do. Second, the mistake can then be recognised as a signal that the learner needs to learn something, maybe a new fact (surface error), or a revised mental model of the problem space (latent error). Occasionally, it is the teacher who has to change: the learner’s mental model might be more advanced than the teacher had realised, and their “mistake” is due to tackling an over-simplistic problem in their more sophisticated manner. A systems approach to diagnosing the latent error should be able to distinguish these cases.

Of course, not all mistakes occur when we are trying our hardest: we might be tired, or distracted, or careless, or selfish. A systems approach can help diagnose these problems as well: here we need to learn not new procedures, but a new attitude. Although Tugend doesn’t explore this aspect (other than in the chapter on apologising), this links to the rationale behind punishment of errors: the assumption that the error is in some sense “deliberate”, and that the way to learn the required new attitude is through some sort of pain. But a culture change is needed, to distinguish good-faith learning-opportunity errors from sloppy, poor-attitude, unnecessary mistakes – and to treat them differently.

There is more in the book, including how the air transport industry learns from mistakes and its use of checklists, how men and women react differently to their mistakes, and apologising for mistakes. There is a lot of food for thought here. I would have liked more guidance on the productive exploitation of mistakes, but maybe that is for another book.