There are a few books which were very influential on “The Unaccountability Machine”, but which I ended up not writing about, for a variety of reasons. Some were just off-topic rabbit holes, and some were too technical to summarise usefully. But there were a couple which I ended up putting back on the shelf, thinking that I was just not the right person to tell that story.
As you might have noticed, I’m a fundamentally whimsical person. It doesn’t suit me to adopt the kind of tone that’s appropriate to writing about things like the experience of Chile under Pinochet. Consequently, although the CYBERSYN experience is incredibly important in the development of Stafford Beer and management cybernetics, it’s dealt with in very circumscribed and technical terms in my book – I tried, more than once, to write about the wider context, and was left with something that definitely had to be destroyed before it caused serious offence to millions of people.
The other big one was Gill Kernick’s book “Catastrophe and Systemic Change: Learning from the Grenfell Tower Fire and Other Disasters”. After the Chile fiascos, I was aware that there was no way I was going to set out on the project of trying to do justice to Grenfell, but this is an extremely good and important book, which deals with questions of accountability, complex systems and disastrous failure in exactly the sensitive and perceptive way that I was scared I wouldn’t manage.
Bicycles are more my speed – here’s an edited excerpt from “The Brompton”:
A folding bike is, by its very nature, a massive compromise. Placing hinges in a bicycle frame is exactly the wrong thing to do in terms of the properties you want a bicycle to have. So having made that initial design compromise, your scope for making further compromises is greatly limited. And tolerances tend to add up; a fraction of a millimetre in one component can be compounded by a similar amount in another, and before long you can end up three or four millimetres out at the other end of a long tube. Three or four millimetres is definitely enough to affect the ride quality; if the floor underneath you right now moved by half a centimetre, it would feel like there had been a small earthquake.
One of Andrew Ritchie’s key insights was that the Brompton would have to be manufactured to much greater precision than non-folding bikes; in the factory at Greenford in West London, things are generally calibrated to a tolerance of +/- 0.2mm. This means that the final tolerance in the overall wheel alignment, once all the frame parts are put together, is no more than 2mm. A factory making a non-folding bike, of course, only has one frame part to worry about, so they can work to 2mm tolerances. If we did the same, the wheels could be out by plus or minus two centimetres.
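To make the arithmetic concrete, here’s a back-of-envelope sketch. Assume around ten stacked frame components – an illustrative number implied by the figures above, not the actual parts count:

```python
# Worst-case tolerance stack-up: every part is out by its full
# tolerance, all in the same direction.
PARTS = 10  # illustrative count of stacked components, not the real figure

def worst_case_stack(per_part_tolerance_mm: float, parts: int = PARTS) -> float:
    return per_part_tolerance_mm * parts

print(worst_case_stack(0.2))  # +/-0.2mm parts -> +/-2.0mm at the wheels
print(worst_case_stack(2.0))  # +/-2mm parts  -> +/-20mm, i.e. two centimetres
```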
The problem of “stacking tolerances” is a staple of engineering courses; you have to be prepared for the case where Sod’s Law applies and every single component is bent to the maximum extent in the same direction. For any individual unit this will happen only rarely, and that means that in mass production contexts it will inevitably happen again and again.
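A toy simulation shows why. This is a sketch only, assuming the same ten illustrative parts with independent, uniformly distributed deviations (real factory errors won’t be so tidy): a stack-up that is vanishingly rare for any one bike still turns up repeatedly across a production run.

```python
import random

PARTS = 10          # illustrative, as above
TOL_MM = 0.2        # +/- tolerance per part
UNITS = 1_000_000   # one mass-production run
BAD_MM = 1.4        # "badly out": most of the way to the 2mm worst case

def frame_error_mm() -> float:
    # Each part deviates by an independent, uniform amount within tolerance.
    return sum(random.uniform(-TOL_MM, TOL_MM) for _ in range(PARTS))

bad = sum(abs(frame_error_mm()) > BAD_MM for _ in range(UNITS))
print(f"{bad} badly-out frames in {UNITS:,} units "
      f"(about 1 in {UNITS // max(bad, 1):,} per unit)")
```

Rare per bike, routine per factory.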
One of the lessons of Gill Kernick’s book, though (I don’t think she puts it in exactly these words), is that in regulatory contexts, the equivalent phenomenon to tolerance stacking will happen very frequently. That’s because although the engineering concept of a “tolerance” is a symmetrical variation around a central ideal, regulatory tolerances don’t work that way.
Kernick’s book is written a little defensively, as it was one of the first analyses to come out after the inquiry and a lot of questions of legal liability hadn’t been settled, but she paints a pretty clear picture of how the deadly cladding came into use. The manufacturer tested it in the most favourable conditions allowed under the relevant standards, and it barely passed. However, “barely passed” is just another term meaning “passed”, and so the cladding was certified for use in a much wider variety of conditions.
Then the question arose of how to set up a safety standard for a different kind of application (retrofitting rather than new building), and decisions were made based on the normal kind of insulation that passed under the standards. But of course, the normal kind wasn’t the one that was used; the cheapest kind was. And so the interaction between the cladding and the fixings became another tolerance, stacked on top of the original one.
It went on from there (the firefighting and evacuation principles were also, in a sense, designed for the central case), but the general issue is clear. Certification is an information-saving technology (I’ve been on about this since “Lying For Money”). It attenuates information; rather than checking the exact properties of a building material and doing a load of engineering calculations, you look at the certificate saying that it’s fit for a particular purpose.
But in a lot of contexts, you have to assume that the information which has been attenuated is as unfavourable as it could possibly be, because lots of systems set up incentives which ensure that rules will only just be obeyed, to the letter. That causes stacked-tolerance problems if the rest of the system is built on the assumption that they will be obeyed in spirit, with a margin of safety.
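To bolt this onto the bicycle numbers (a minimal sketch, reusing the hypothetical ten-part example; regulations aren’t literally millimetre tolerances, but the analogy is the point): when deviations scatter symmetrically around the ideal, a worst-case stack almost never happens; when incentives push every component to the barely-legal edge of its band, the worst-case stack is what you get every single time.

```python
import random

PARTS, TOL = 10, 0.2  # same illustrative numbers as the bicycle example

def engineering_unit() -> float:
    # Engineering: deviations scatter symmetrically around the nominal value.
    return sum(random.uniform(-TOL, TOL) for _ in range(PARTS))

def letter_of_the_law_unit() -> float:
    # Regulatory: every component sits at, or just inside, its permitted limit.
    return sum(random.uniform(0.9 * TOL, TOL) for _ in range(PARTS))

trials = 100_000
eng = sum(abs(engineering_unit()) > 1.8 for _ in range(trials))
reg = sum(abs(letter_of_the_law_unit()) > 1.8 for _ in range(trials))
print(f"near-worst-case stacks: engineering {eng:,}/{trials:,}, "
      f"barely-compliant {reg:,}/{trials:,}")
```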
Anticipating a sensible and legitimate comment: yes, the victims of the Boeing 737 MAX are just as dead as those of Pinochet and Grenfell, and were also real human beings with relatives and loved ones. All I can say is that it didn't seem as visceral to me as Chile and Grenfell, and I tried to keep it tasteful in the book.
If you wanted to encapsulate it as a rule, I guess it would be "all regulations must be set as if all the other regulations are being obeyed at their *worst case*".
In fact, this would probably force some of the worst cases to be scaled right back (e.g. the cladding wouldn't have passed, because all the other certifications would otherwise have had to be a lot more rigorous), and in situations where nobody was barrel-scraping, you'd get a bit of tolerance for the unexpected.
Of course, then you get grumbling about stuff being over-engineered, not to mention that "worst-case" is not always trivial to calculate.