In this persuasive book, journalist Alina Tugend examines the delicate tension between what we’re told—we must make mistakes in order to learn—and the reality—we often get punished for them. She shows us that mistakes are everywhere, and when we acknowledge and identify them correctly, we can improve not only ourselves, but our families, our work, and the world around us as well. Bold and dynamic, insightful and provocative, Better by Mistake turns our cultural wisdom on its head to illustrate the downside of striving for perfection and the rewards of acknowledging and accepting mistakes and embracing the imperfection in all of us.
We often hear that we should learn from our mistakes (or, preferably, from the mistakes of others). Here Tugend digs down into this aphorism and shows how we are actually given mixed messages: if mistakes are such a wonderful learning opportunity, why are we ashamed of them, punished for them, and why do we try so hard to avoid them, to the extent of over-constraining our ambitions?
Tugend recounts some work showing that mistakes while learning can be valuable, for some people at least, and should be allowed, even encouraged, to enable the deeper learning that results.
Of course, mistakes while learning are somewhat different from mistakes that happen during operational use. But we are actually learning all the time, and so these other mistakes should also be exploited as learning opportunities, both to educate the one who errs, and to make the overall system more robust to inevitable mistakes.
The key point here is that a systems approach to error does not look just at the “active error”, the immediate surface error that seems the cause of the problem, but also digs down to find the “latent errors”, the systemic issues that allowed the active error both to occur and to have such poor consequences. Just fixing active errors results in a patchwork of arbitrary rules and regulations that can make the system more complex and fragile.
Using errors productively during learning requires a supportive culture. Tugend gives an example of the culture in Japanese schooling.
So there are two aspects. First, the learner has to be trying: their mistake is a good-faith error, not the result of carelessness or deliberate sabotage; the learner genuinely thought that what they did was the right thing to do. Second, the mistake can then be recognised as a signal that the learner needs to learn something, maybe a new fact (surface error), or a revised mental model of the problem space (latent error). Occasionally, it is the teacher who has to change: the learner’s mental model might be more advanced than the teacher had realised, and their “mistake” is due to tackling an over-simplistic problem in their more sophisticated manner. A systems approach to diagnosing the latent error should be able to distinguish these cases.
Of course, not all mistakes occur when we are trying our hardest: we might be tired, or distracted, or careless, or selfish. A systems approach can help diagnose these problems as well: we don’t need to learn new procedures, but rather a new attitude. Although Tugend doesn’t explore this aspect (other than in the chapter on apologising), it links to the rationale behind punishing errors: the assumption that the error is in some sense “deliberate”, and that the way to learn the required new attitude is through some sort of pain. But a culture change is needed to distinguish good-faith errors that are learning opportunities from sloppy, unnecessary mistakes born of a poor attitude – and to treat them differently.
There is more in the book, including how the air transport industry learns from mistakes and its use of checklists, how men and women react differently to their mistakes, and apologising for mistakes. There is a lot of food for thought here. I would have liked more guidance on the productive exploitation of mistakes, but maybe that is for another book.