17 Comments
Trevor Austin

Does “muddy boxes” have a negative connotation in these sources? It sounds like it fits a pattern I generally see called a best practice in commercial software development.

You give some team a project, typically that will last for a few weeks to a few months. You treat the team as a black box while the project is ongoing; they don’t need to get higher approval for the steps they take along the way. At the project’s conclusion, whether success or failure, you crack the box open and examine the decisions made along the way. You might call this a retrospective, after action report, postmortem, murder board, etc.

That lets the team move quickly in the moment, while preserving the ability to justify actions after the fact that were positive expected value but didn’t pan out. Sometimes they’re “blameless postmortems” with deliberate friction between the retrospective analysis and personal performance reviews.

This has always felt in practice like a really good model; is there a hidden downside I’m not considering?

Louise Ankers/Pixel Sisterhood

Agile software development along the actual principles of the manifesto itself has very few downsides in my view. If you review the work with the team in shippable increments of a few weeks to a few months, then it's really useful.

https://agilemanifesto.org/

EMANUEL DERMAN

Coarse graining in physics is a sort of recursive process: when you look into a coarse grain there are smaller coarse grains, and when you zoom out there are bigger ones. The recursive process in physics leads to a mathematical group, the renormalization group.

Wikipedia first line: "In theoretical physics, the renormalization group (RG) is a formal apparatus that allows systematic investigation of the changes of a physical system as viewed at different scales."

In physics, people first started using renormalization in quantum electrodynamic field theory -- Feynman, Schwinger, and Tomonaga were the first to do it successfully -- because some of the finer grains produced infinities, and they had to absorb those infinities into the observed properties of the coarse grain and not worry about them except insofar as they produced little changes in the coarse grain. I think they didn't extend the single renormalization into a group. Later Leo Kadanoff and then Ken Wilson extended that idea into solid state physics, and now it's become a standard way of modeling and theorizing. It led to great advances in solid state physics in examining phase transitions between different states.

Kadanoff had introduced the idea of "block spins" in aggregating arrays of spinning (magnetic) molecules.

"The blocking idea is a way to define the components of the theory at large distances as aggregates of components at shorter distances." -- Wikipedia.

Doing all this recursively produces a group. It's led to great advances in quantum field theory and solid state physics.
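The blocking idea described above can be sketched numerically. Below is a minimal, illustrative majority-rule decimation on a random lattice of ±1 spins -- a toy picture of Kadanoff's block spins, not a real RG calculation (the function name, lattice size, and block size are all made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def block_spin(lattice, b=3):
    """One Kadanoff-style blocking step: replace each b x b block of
    +/-1 spins with a single spin via the majority rule (odd b avoids ties)."""
    n = lattice.shape[0]
    assert n % b == 0, "lattice side must be divisible by the block size"
    blocks = lattice.reshape(n // b, b, n // b, b)
    sums = blocks.sum(axis=(1, 3))       # net magnetisation of each block
    return np.where(sums >= 0, 1, -1)    # the majority sign becomes the coarse spin

# A 27 x 27 lattice of random spins, coarse-grained recursively:
spins = rng.choice([-1, 1], size=(27, 27))
level1 = block_spin(spins)   # 9 x 9 lattice of block spins
level2 = block_spin(level1)  # 3 x 3 lattice of blocks of blocks
print(spins.shape, level1.shape, level2.shape)  # -> (27, 27) (9, 9) (3, 3)
```

Applying `block_spin` repeatedly is the recursion the comment describes: each coarse lattice is itself a valid input, so the blocking map composes with itself, which is the sense in which the procedure generates a (semi)group of scale transformations.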

Kent

I was only familiar with "career risk" as a concept among investment advisors, where it is a very useful concept indeed. The idea goes something like this: all investors are subject to a variety of risks: risk of loss from an investment; inflation risk; concentration risk (i.e. having too much of your portfolio betting on one thing, something that can be obvious or non-obvious to the investor); sequence-of-returns risk (even if the average returns over time of your portfolio are fine, if you happen to retire right before a massive drawdown then your required spending from a much smaller portfolio may blow up your retirement plan); and so on. A lot of investors don't want to deal with all these risks themselves, and so are tempted to hire an investment advisor to help them out. But if you're going to do that, you want to be aware that you're adding another risk to your portfolio: career risk, which in this case refers to the fact that your advisor may not steer you toward the investments that are best for you on a risk-adjusted basis, but rather will steer you (and all of his/her clients) toward those investments that can be almost guaranteed not to go so badly that your advisor loses her job.

It's the old saw <<if I recommend my clients buy something they've never heard of and it goes down, they'll say "what's wrong with my advisor?" but if I recommend buying IBM and it goes down, they'll say "what's wrong with IBM?">>

Anyway, your expansion of the concept of career risk, and connecting it to the concept of black boxes and professionalism, is all very useful to me. Obviously it connects deeply with the idea of accountability sinks: black boxes are just people who don't have recourse to the accountability sink excuse.

Love it! Thanks.

Dan Davies

Yes, this was one of the things that got me started - it's an obviously generalizable concept that has been studied to death in the context of portfolio managers and, as far as I can tell, not at all anywhere else.

alkali

"... 'career risk'. It’s one of those things that everyone instinctively knows what it is ..."

The term is not used in the referenced essay and I must admit I literally don't know what it means. Is there at least a naive definition of it somewhere?

Dan Davies

you're probably overthinking it - just "someone doing something (or stopping something being done) because of worries about the effect on their own career rather than anything else".

alkali

Me? Overthinking? No risk of that surely

I actually had guessed something more existential, like "the risk a career track disappears in view of technological change"

Dan Davies

in the book, the chapter begins with the helpful advice that if you're ever stuck for something to say in a meeting, "of course, the actual decision is all going to boil down to career risk" tends to sound very wise and attractively cynical

RedRover

Or political change

RedRover

I understood it all wrong then. I heard “the risk that you’ve chosen a wrong career path” because you’ve mismatched your skills, the requirements of your profession have changed and you no longer meet them, or your field has ceased to exist.

ETA: System-instance problem. If we’re just talking about risk to my advancement/employment at wherever I am now, that’s not quite the same as risk to “will I continue to be able to monetize my brainwork.”

Jim Grafton

I think black boxes are actually essential to effective organisational design. No one can cognitively hold the full context of an entire enterprise, so scalable systems require abstraction and delegated autonomy.

The failure mode is when organisations psychologically resist that reality. Managers remain accountable for outcomes, but instead of fully delegating across boundaries, they maintain semi-transparent control into the internals of the “box”.

At that point the boundary stops being coherent. Responsibility, authority, and visibility become blurred. Everyone holds partial context of everyone else’s work, coordination costs explode, and the organisation compensates through meetings, reporting, and governance layers.

The problem isn’t black boxes. It’s incoherent boundaries around them.

Dan Davies

Not really, no - it's just the phrase Stafford Beer initially used; other people call them dark boxes or something else indicating that you can look into them if you want to, but it's not as easy as if they were clear.

Doug Clow

My intuition is that it might be fruitful to hammer away a bit at what you're meaning by 'career risk' here. It's quite a stretch to have the same concept cover all of: the jobsworth who won't make an exception to the rules they are required to apply (who is not a black box); the academic who reluctantly passes weak students because failing them will blight their advancement (who is a black box); and the trader who decides against deploying their alpha because, while they do expect to get credit for beating the market, their career will end if one of their (correct in expectation) bets goes south against the conventional wisdom (sometimes a black box, sometimes muddy).

roger daventry

You are missing Stafford Beer’s anastomotic reticulum formulation et al

Alex Tolley

Isn't "career risk" at heart about people realizing that they have reached their Peter Principle level and that losing either the position or a whole career implies that they shouldn't have been in that position at all? In a true meritocracy, losing a position due to a "mistake" should just mean getting another without much problem. However, we all know of senior managers who could hardly walk and chew gum at the same time, and who make nonsensical arguments at meetings. How they got to that position may have been by brown-nosing or clever politicking. [Just look at some of the UK politicians vying for the PM's job. As for the US cabinet...]

One effect is to create a "herd instinct," so that any adverse decision was also made by many others and you should be safe from being singled out. Think banking, funds management, and, dare I say, CEOs chasing AI for "productivity gains."

Michael Pollak

I love the 2nd paragraph explanation of why black boxes are black. It keeps getting better.