I’m trying to write a very short summary of the key concepts of management cybernetics, mainly for my own use because I’m going to a conference where I might have to talk about it to some very clever people who haven’t necessarily come across it before. Obviously, being the precise kind of nut I am, the first step in this process was to come up with about a zillion digressions and side-issues that distract from the central project. Of which, this is one.

The grandfather of management cybernetics was “Operations Research”, which originally meant “applied mathematics, in a military context”, then after the Second World War was developed to mean “applied mathematics, in an industrial context”, before further developing to “structured thinking about management, in general” and then disappearing into a death spiral in which it eventually came to mean “arguing about the technical merits of simulation languages”.

But in the early days, it gave me one of my favourite examples of a case where structured and mathematical thinking can give a straightforward and irrefutable answer to an otherwise troublesome question. This is taken from “The Pleasures of Counting” by Thomas William Körner (the book is out of print, and in sufficient demand to only be available at silly prices – my copy turned up in a charity shop).

The question is one of naval strategy that was incredibly important in the Battle of the Atlantic. Should the Allies ship goods and weapons from America to England in lots of small convoys, or in a smaller number of big ones? Small convoys are faster and can make more trips, and have more chance of going undetected, so there are a lot of intuitions pointing you in this direction. But if you consider the formula for the area of a circle, πr², then you immediately realise that the number of merchant ships in a convoy is a matter of area, while the number of warships needed to escort them safely depends on the perimeter (2πr). Consequently, big convoys are much more efficient than small ones, particularly when you take into account that it’s much more than twice as difficult for a U-Boat to get away with sinking four boats as it is two. And finally, a little estimation exercise shows you that even a big convoy is really small compared to the Atlantic Ocean; it is not *that* much more easily detected.
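If you want to see the scaling argument with numbers attached, here’s a back-of-the-envelope sketch. The density and spacing figures are pure assumptions on my part, nothing from Körner – the point is the ratio, which doesn’t depend on them:

```python
import math

# Illustrative figures only (assumed, not historical): the scaling
# argument works whatever the actual densities are.
SHIPS_PER_SQ_NM = 1.0   # merchant ships packed per square nautical mile
ESCORTS_PER_NM = 0.5    # escort warships needed per nautical mile of perimeter

def convoy(radius_nm):
    """Merchants scale with area (pi r^2); escorts with perimeter (2 pi r)."""
    merchants = math.pi * radius_nm ** 2 * SHIPS_PER_SQ_NM
    escorts = 2 * math.pi * radius_nm * ESCORTS_PER_NM
    return merchants, escorts

for r in (2, 4, 8):
    m, e = convoy(r)
    print(f"radius {r} nm: {m:5.0f} merchants, {e:4.1f} escorts, "
          f"{e / m:.3f} escorts per merchant")
```

Doubling the radius quadruples the cargo but only doubles the escort requirement, so the escorts-per-merchant ratio halves every time – which is the whole case for big convoys in three lines of arithmetic.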

Körner records that the War Office recruited professors from all of Britain’s top universities, and that the Operations Research effort made a huge difference to the fighting of the war. He then points out that almost none of these contributions required anything even approaching undergraduate mathematics; they were almost all based on arithmetic, elementary geometry and things that the admirals would certainly have learned in school.

But, he argues, it’s not so much a question of needing to be a professor to know the answer – you needed to be a professor to know that it *was* the right answer, to be sufficiently confident to maintain it in a situation where the stakes were unimaginably high, and to be sure that objections from admirals whose experience and intuition went the other way were wrong.

Something like this is very much the case with management cybernetics. It’s based on information theory, as invented by Shannon and Wiener. The basic principle is Ashby’s Law of Requisite Variety – the information-handling capability of a management system has to be greater than or equal to the amount of variation in the system it’s trying to manage. Making sure the informational balance sheet balances is the fundamental “go” of the thing – as Stafford Beer said, it stands to management science in the same relationship that Newton’s Third Law does to rocket science.
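Ashby’s point can be made concrete with a toy regulation game – everything here (the outcome rule, the numbers) is an illustrative assumption of mine, not anything from Ashby or Beer. A regulator can hold the outcome steady only if it has at least one response available per disturbance:

```python
def can_regulate(n_disturbances, n_responses, n_outcomes):
    """Toy Ashby game: disturbances and responses are integers, and the
    outcome of disturbance d met by response r is (d + r) % n_outcomes.
    The regulator succeeds if every disturbance can be steered to
    outcome 0 by some available response."""
    return all(
        any((d + r) % n_outcomes == 0 for r in range(n_responses))
        for d in range(n_disturbances)
    )

print(can_regulate(6, 6, 6))  # True  - variety matches, regulation possible
print(can_regulate(6, 3, 6))  # False - regulator has too little variety
```

With six responses to six disturbances the regulator always wins; cut its repertoire to three and some disturbance gets through, no matter how cleverly the responses are chosen. That’s the informational balance sheet failing to balance.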

But it’s information theory applied to a context of “extremely connected, extremely complex” systems, where the idea of *measuring* the amount of information, as a number of bits, is ridiculous. The numbers are absurdly huge, because connective complexity multiplies up rather than adding up. So the fundamental law just becomes “make sure that one unimaginably huge number that you have no way of measuring is greater than or equal to another unimaginably huge number that you have no way of measuring”. Which isn’t very helpful.
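To see why the numbers get silly, here is a two-line sketch of how variety multiplies – the “firm” of 300 on/off elements is a made-up illustration, not a measurement of anything:

```python
import math

def variety_bits(n_components, states_each):
    # Joint states multiply across connected components
    # (states_each ** n_components), so the bit-counts add.
    return n_components * math.log2(states_each)

# A made-up "firm" of 300 on/off elements: its joint state space already
# outnumbers the oft-quoted ~10^80 atoms in the observable universe.
print(variety_bits(300, 2))   # -> 300.0 (bits)
print(2 ** 300 > 10 ** 80)    # -> True
```

Three hundred bits sounds manageable until you remember it labels 2³⁰⁰ distinct states, which is why nobody seriously proposes to measure the variety of an actual organisation.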

Consequently, you have the corpus of work known as “all the rest of management cybernetics”. Which is a set of rules of thumb and analogies, all based on experience, and on the fact that although you can’t measure the amounts *ex ante*, you can find out *ex post* which one was the larger, because the system will either have blown up and become unregulated, or not done that.

And so, although in order to understand management cybernetics you have to make at least an effort at information theory, the mathematical understanding is in some way a spiritual exercise. It doesn’t really give you formulas you can use; it’s something that you put yourself through so that you can be genuinely confident that the rules of thumb and analogies work. Having convinced yourself, you might as well forget the entropy formula and all the rest, because you aren’t going to be using it much any more.

In the book, I advance a few theories about why it is that economists, rather than any other kind of social scientist, are so influential on public policy, but this might be a better reason than any of them. A lot of policy economics is basic adding-up stuff – balance sheet identities, dividing output by employment to get productivity and the like, what is pejoratively (and in my view stupidly) called “chartblogging”. It doesn’t have much to do with the sort of thing you find in a graduate textbook like Mas-Colell *except* that going through the fixed point theorems and real analysis has a similar effect to spending a week contemplating episodes in the life of Jesus for five hours at a stretch; it gives you the inner strength to stick to the sacred doctrine.

Ah yes, the stations of the Keynesian cross.
