The author's thesis is that we all have 'mental blind spots' or 'mental tunnels': areas where we systematically make grave errors and get sidetracked into the wrong answer on certain kinds of problems. He claims these tunnels are the cognitive equivalent of optical illusions: they are a result of the way our minds are built. He illustrates them with fairly simple examples: the reader is invited to fall into various traps and, once inside, has each trap lucidly explained.
One example I particularly liked was being offered two options, A and B (to do with the efficacy of different medical therapies), which, according to probability theory, have the same 'expectation value'. So, by probability theory, there should be no preference for one over the other. However, most people, myself included, choose A. The author stresses that it is not necessarily irrational to prefer A: we might simply weight the chance of certain survival differently. But then comes the crunch: we are now offered C and D, which are equivalent to A and B respectively; they have simply been phrased negatively. Now most people, myself again included, choose D! He points out that this is irrational: whatever weighting justified choosing A over B should equally have justified choosing C over D, so the preference has been reversed purely by the wording.
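To make the point concrete, here is a small sketch of the kind of choice involved. The numbers are my own illustration, in the spirit of the classic framing experiments; they are not the book's exact figures.

```python
# Illustrative sketch (not the book's exact figures): four framings of
# the same treatment choice for a group of 600 patients.
#   A: exactly 200 survive.
#   B: 1/3 chance all 600 survive, 2/3 chance none survive.
#   C: exactly 400 die.                                  (same policy as A)
#   D: 1/3 chance nobody dies, 2/3 chance all 600 die.   (same policy as B)

POPULATION = 600

def expected_survivors(outcomes):
    """Expected number of survivors, given (probability, survivors) pairs."""
    return sum(p * s for p, s in outcomes)

options = {
    "A (positive frame, certain)": [(1.0, 200)],
    "B (positive frame, gamble)":  [(1/3, 600), (2/3, 0)],
    "C (negative frame, certain)": [(1.0, POPULATION - 400)],
    "D (negative frame, gamble)":  [(1/3, POPULATION - 0), (2/3, POPULATION - 600)],
}

for name, outcomes in options.items():
    print(f"{name}: expected survivors = {expected_survivors(outcomes):.0f}")

# All four options print 200: A and C are the same policy, as are B and D,
# so preferring A over B while also preferring D over C is inconsistent.
```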
I am probably rather more numerate than the target audience, but even so I managed to fall into about half of the offered traps. The ones I managed to avoid were mostly well-known probability paradoxes ('paradoxical' only in that the solution conflicts with naive intuition). For example, consider a 'reliable' test, one with a low false-positive rate. We tend to think that, if we get a positive result, then the subject is very likely to have the trait being tested for. But, if we have applied the test to a population with a very low probability of having the trait, then most of the positive results will actually be false ones! This is simply because the test is applied to so many subjects who do not have the trait that false positives are bound to turn up among them. This kind of mental tunnel can have serious consequences: consider DNA-fingerprint evidence.
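The arithmetic is easy to check. Again the figures below are my own illustration, not taken from the book:

```python
# Illustrative numbers only: a test with a 5% false-positive rate and a
# 99% detection rate, applied to a population in which only 1 in 1000
# subjects actually has the trait.

prevalence     = 0.001   # P(trait)
detection_rate = 0.99    # P(positive | trait)
false_positive = 0.05    # P(positive | no trait)

# Bayes' theorem: P(trait | positive)
p_positive = detection_rate * prevalence + false_positive * (1 - prevalence)
p_trait_given_positive = detection_rate * prevalence / p_positive

print(f"P(positive result)         = {p_positive:.4f}")
print(f"P(trait | positive result) = {p_trait_given_positive:.3f}")
# Roughly 0.019: about 98% of the positives come from the trait-free
# majority, even though the test is 'reliable' in the everyday sense.
```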
I have only one quibble. Since the author claims that these traps can be avoided with suitable training, and that this training is difficult to transfer to even slightly different problem areas, I would have hoped for a wider range of examples to help prime my 'intuition pump'.
But no matter; the examples presented are well thought out, and well explained. I am now on my guard in a few more areas.