I was asked by a friend to expand on this short comment, made in an obituary for the political thinker and author of “Seeing Like A State”, James C Scott:
What I meant by it is that Scott is an example of what I’ve called elsewhere “intellectual carcinization” – the phenomenon whereby people from other fields reinvent some of the important principles of management cybernetics, simply because they’re studying the same problems and the underlying mathematical structures are there to be found. I broadly agree with Brad DeLong that “Seeing Like A State” is operating in the same space as FA Hayek. And notoriously, Stafford Beer thought that Hayek was a kindred spirit, and was very surprised indeed to find out in the 1970s that he wasn’t.
So, inaugurating a new series called “If I Had Been Present At The Creation (I could have offered some useful advice)”, I will try in this post to explain how Seeing Like A State could have addressed many of the objections later made by Brad and by Henry Farrell in the post linked above, if only its author had ever heard of management cybernetics.
To start with: the central problem of Seeing Like A State is recognisable as what I’ve argued in the past is the central problem of all management theory – that of “getting a drink from a firehose”, or of attenuating the flood of information coming out of the system in order to, literally, “make it manageable”.
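The attenuation idea can be sketched in miniature. Here is a toy example (all names and fields are hypothetical, invented purely for illustration): a “centre” that cannot read every transaction keeps only a few aggregates and throws the rest of the detail away.

```python
# Toy sketch of information attenuation (all names hypothetical):
# the centre keeps a few summary statistics and discards everything else.

from statistics import mean

def attenuate(transactions):
    """Reduce a flood of per-transaction records to a manageable summary.

    Everything not captured by these aggregates is lost -- and that loss
    is the point: attenuation is a choice about what to ignore.
    """
    amounts = [t["amount"] for t in transactions]
    return {
        "count": len(amounts),
        "total": sum(amounts),
        "mean": mean(amounts),
    }

records = [
    {"amount": 120, "note": "paid late, seller was a cousin"},
    {"amount": 80,  "note": "barter settled in grain"},
    {"amount": 200, "note": "price includes an old debt"},
]

summary = attenuate(records)
print(summary)  # the "note" field -- the local context -- never reaches the centre
```

The summary is genuinely more manageable than the records, which is why states collect it; the question the book keeps asking is what happens to the information in the discarded fields.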
States do this by engaging in “The Politics Of Large Numbers”, to take the title of Alain Desrosières’s excellent book. They collect numbers. As always, collecting numbers is never an innocent or technical business: you only get into the expensive and tedious business of data tabulation if you want to do something with the numbers, and the way you collect and tabulate will very much depend on what you want to do.
Furthermore, the act of tabulation and collection is always also an exercise in deciding what you’re not going to collect information about – the way that you attenuate information is by throwing some away. In Scott’s book, this is the distinction between “techne” and “metis”[1], and, oversimplifying mightily, he argues that lots of things go to crap because the bureaucracy isn’t able to handle “metis”-type information and so ends up trampling over populations and institutions that it doesn’t understand. To be honest, although I’m sure that the publisher was pleased with “How Certain Schemes to Improve the Human Condition Have Failed” as a subtitle for Seeing Like A State, I’d have gone with the greatly superior “Why Big Systems Make Terrible Decisions - and How The World Lost its Mind”.
And the schemes have indeed often failed. There’s an absolutely horrifying anecdote (from Susan Greenhalgh’s book, but told to me by Dan Wang) about how the Chinese one-child policy basically began when one of the country’s best engineers was first introduced to the idea of using applied mathematics and control technology for the purposes of rational administration, plugged some population and production numbers into an IBM machine, reached the Malthusian conclusion, and proceeded from there to imposing a demographic disaster on the strength of the infallible logic of the model. Scott’s describing a real and important phenomenon that happens in the world.
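The trap in that anecdote is easy to reproduce. Here is a hedged sketch – the parameters are invented for illustration and this is emphatically not the actual model used – of the kind of naive exponential projection that, fed a couple of plausible-looking rates, produces an alarming number with total confidence:

```python
# A naive Malthusian projection of the kind the anecdote describes.
# All parameters are invented for illustration; the model's only idea
# is to compound a fixed growth rate forward, forever.

def project_population(initial, growth_rate, years):
    """Compound a fixed annual growth rate over a number of years."""
    return initial * (1 + growth_rate) ** years

# Hypothetical inputs: 1 billion people, 2% annual growth, 100 years out.
projected = project_population(1.0e9, 0.02, 100)
print(f"{projected:.2e}")  # about 7.24e9 -- the model is certain, the world is not
```

The arithmetic is impeccable and the conclusion is precise; what the model cannot represent is everything it left out, which is exactly the attenuation problem again.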
But … it’s a big step from noting that things can go off the rails in this way, to presuming that it’s an intrinsic failing of the bureaucratic state, and that it’s all about techne/metis and therefore blah blah anarchism. Couldn’t it just be a design failure? Early rockets and steam engines blew up, a lot, but that didn’t mean that they were intrinsically bound to fail as a means of propulsion – it just meant that when you’re designing something based on the use of heat to expand gases, you need to spend much more time and effort on ensuring that the gas expansion happens in a controlled fashion with ways to vent excess pressure, than on the comparatively easy problem of pushing a piston or directing an exhaust.
In other words – and as I think Scott would have seen if he’d come at big systems from a cybernetic perspective, as designed objects, rather than an anthropological one – although plumbing isn’t architecture, big systems are very dependent on the “plumbing” of their information. In particular, big corporations and states need to have information channels going from the bottom to the top – the “red handle signals”, as I call them in my book, that can bypass the normal hierarchy and get information to the decision-making centre, in time and in a form which can be understood. It’s the lack of that which has caused so many schemes to fail, and its continued presence, albeit in highly unsatisfactory and attenuated form, which accounts for the fact that the democratic industrialised world still does kind of work, a bit.
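The “red handle” idea can also be sketched as a toy (the structure here is invented for illustration, not taken from any real reporting system): ordinary reports lose detail at every layer of a hierarchy, while a bypass channel delivers a flagged signal to the top intact.

```python
# Toy sketch (all structure hypothetical): each layer of a reporting
# hierarchy summarises what it passes upward, so detail attenuates --
# but a "red handle" message bypasses the layers and arrives intact.

def pass_through_hierarchy(message, layers):
    """Crude stand-in for summarisation: each layer keeps only the
    first half of what it receives."""
    for _ in range(layers):
        message = message[: max(1, len(message) // 2)]
    return message

def red_handle(message):
    """Bypass channel: the message reaches the centre unattenuated."""
    return message

report = "dam upstream is cracking and the village well has run dry"
print(pass_through_hierarchy(report, 3))  # a fragment of the original signal
print(red_handle(report))                 # the full signal, in time and in form
```

Halving the message is obviously a caricature of what middle layers do, but the design point survives the caricature: without a channel that skips the summarisers, the centre only ever hears the first half of the first half.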
[1] One of the reasons I know Scott would have been an excellent cybernetician is that he shares their passion for coining words from Greek roots in the erroneous belief that it makes things easier to understand.
I am wondering if there isn’t a link to be made between cybernetics (especially nested cybernetic systems) and anarchism? Anarchism is about situating control at appropriate levels, which may or may not be the very bottom layer – i.e. if we were to design an anarchist approach to climate change, it might look very much like a set of cybernetic systems. They needn’t necessarily be in conflict, and a cybernetic ‘state’ might look very much like an anarchist one?
But Hayek **was** a kindred spirit. But he was also a right-wing nutjob. And the right-wing nutjobbery caused him to get stuck in a local minimum with respect to the issues he really wanted to address... Yours, Brad