Anticipating a sensible and legitimate comment - yes, the victims of the Boeing 737MAX are just as dead as those of Pinochet and Grenfell, and were also real human beings with relatives and loved ones. All I can say is that it didn't seem as visceral to me as Chile and Grenfell, and I tried to keep it tasteful in the book.
If you wanted to encapsulate it as a rule, I guess the way would be "all regulations must be set as if all the other regulations are being obeyed at their *worst case*"
In fact, this would probably force some of the worst cases to be scaled right back (e.g. the cladding wouldn't have passed, because otherwise all the other certifications would have had to be a lot more rigorous), and in situations where they weren't barrel-scraping, you'd get a bit of tolerance for the unexpected.
Of course, then you get grumbling about stuff being over-engineered, not to mention that "worst-case" is not always trivial to calculate.
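A toy numeric sketch of the rule above (all the component names and figures here are invented, purely for illustration): if each layer of a stack is certified against the *nominal* behaviour of the others, the combined worst case can quietly blow past the overall limit; certifying each against the others' worst case catches it.

```python
# Toy illustration (all figures invented): a "system" passes only if the
# combined load stays under an overall limit. Each component has a nominal
# value and a worst-case value that its own regulation still permits.

OVERALL_LIMIT = 100.0

components = {
    "cladding":   {"nominal": 30.0, "worst_case": 55.0},
    "insulation": {"nominal": 25.0, "worst_case": 40.0},
    "fixings":    {"nominal": 20.0, "worst_case": 30.0},
}

def passes(assume_worst: bool) -> bool:
    """Does the combined load stay under the limit, under the given assumption?"""
    key = "worst_case" if assume_worst else "nominal"
    return sum(c[key] for c in components.values()) <= OVERALL_LIMIT

print(passes(assume_worst=False))  # True: nominal total is 75, looks fine
print(passes(assume_worst=True))   # False: worst-case total is 125, over the limit
```

Each component passes its own check individually; it's only when you ask "what if everyone else is also at their permitted worst?" that the stack fails.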
When we mistake a random process for an adversarial one, we tend to form conspiracy theories. What about when we mistake an adversarial process for a random one? What is that? A complacency theory?
I very much like and intend to steal "complacency theory". I guess it's related to Robert Anton Wilson's point that there ought to be a short descriptive phrase for "the *rational* belief that the political system is corrupt and fixed"
I've had conversations with people who complain about the local building department (or the health department) being too strict and inflexible, and the best defense I have is: "this is their one chance to set standards. Once they've completed the permit/inspection and given approval, they're out of the picture."
Thanks for the post, Dan. Something these issues raise is not just how strict regulation is and should be, but the extent to which what happens in a regulatory structure can reflect critical thought all along the way. Right now, a great deal of the training one gets at uni in many, many disciplines is not in how to navigate a complex pathway of thought, staying critical throughout; rather, it's a direct induction into bureaucratic systems.
We saw how disastrous this was during COVID, particularly looking at the US CDC, whose performance was just extraordinarily bad, and self-evidently so. For instance, in holding up testing, for chrissake.
Somewhat relatedly, when I screw something up (which happens often) and my backup system works as intended, I always regard this as unmerited good luck rather than the natural course of things. That way, when the backup system fails (which, as you say, happens rarely but inevitably), I don't have to rage against the injustice of the universe.
It's almost as if one of the few practical differences between Bayesian and frequentist statistics shows up here: when you're doing quality control, you really need to be clear that what you're saying is that 3 out of every 100 parts are defective, not that you're 97% confident in every single one of them.
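A rough sketch of why that distinction bites (the 3% rate and the batch size of 100 are just the illustrative figures from the comment, and independence between parts is assumed):

```python
# Sketch: a 3% per-part defect rate is not the same reassurance as being
# "97% confident" in each individual part. Assumes defects are independent.

def prob_at_least_one_defect(defect_rate: float, batch_size: int) -> float:
    """P(batch contains at least one defective part)."""
    return 1.0 - (1.0 - defect_rate) ** batch_size

# Each part individually looks fine at 97%...
per_part_ok = 1.0 - 0.03

# ...but across a batch of 100, at least one defect is close to certain.
batch_risk = prob_at_least_one_defect(0.03, 100)

print(f"P(any given part is fine)    = {per_part_ok:.2f}")   # 0.97
print(f"P(batch of 100 has a defect) = {batch_risk:.2f}")    # about 0.95
```

The per-part confidence and the batch-level statement describe the same process, but only the latter tells you what a whole production run will actually look like.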
One of the people who was very much in my mind when I decided to give up on writing about Chile was a very wonderful guy who's an architect; he claimed that whenever there was a big earthquake, he would drive around town to look at his buildings and work out whether he needed to make a quick relocation to Peru.
I tried to speak to some of these things here: https://clubtroppo.com.au/2020/07/09/thinking-keep-it-adaptive-stupid/