Knightian Uncertainty
Peter G. Klein | 6 May 2011 at 10:02 am | 9 comments
| Peter Klein |
A nice post from good-twin Brayden. He quotes Chuck Perrow:
In what should be considered a classic case of the failure to take a possibilistic approach, consider this statement by Tsuneo Futami, a nuclear engineer who was the director of Fukushima Daiichi in the late 1990s: “We can only work on precedent, and there was no precedent. When I headed the plant, the thought of a tsunami never crossed my mind.”
Futami was not alone in his thinking. Experts throughout the nuclear industry and government regulatory agencies not only failed to predict the likelihood of a giant earthquake and tsunami, but also failed to examine the vulnerabilities of Fukushima Daiichi’s design to a natural disaster of this scale. Instead, they relied on a history of successful operation as an assurance of future safety. As a result, they ignored or underestimated a number of major risks that have since doomed the plant.
In other words, decision makers do not enumerate possible outcomes, assign probabilities to each, compute expected values, and act accordingly, à la Luce and Raiffa (1957). Instead, they use heuristics, follow precedent, update priors, and so on. In Knightian terms, they use judgment.
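To make the contrast concrete, here is a minimal sketch in Python (my illustration, not anything from Luce and Raiffa or from the post; the outcome names, probabilities, and payoffs are invented) of the textbook procedure: enumerate the outcomes of each action, attach probabilities and payoffs, and choose the action with the highest expected value. The Fukushima point is visible in the code: an outcome that never makes it into the enumeration, such as a tsunami, contributes nothing to the calculation no matter how costly it would be.

# Editor's sketch, not from the post: a textbook Luce-Raiffa style choice.
# Outcome names, probabilities, and payoffs below are hypothetical.

def expected_value(outcomes):
    """Expected value of {outcome: (probability, payoff)}."""
    return sum(p * payoff for p, payoff in outcomes.values())

# Each design is an enumerated set of outcomes with probabilities and payoffs.
design_a = {
    "normal operation": (0.95, 100.0),
    "minor incident":   (0.05, -50.0),
    # No "tsunami" entry: an outcome that is never enumerated contributes
    # nothing to the expected value, however catastrophic it would be.
}
design_b = {
    "normal operation": (0.95, 90.0),
    "minor incident":   (0.05, -20.0),
}

best = max([("A", design_a), ("B", design_b)],
           key=lambda pair: expected_value(pair[1]))
print(best[0], expected_value(best[1]))  # -> A 92.5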
My forthcoming book with Nicolai explores these aspects of Knightian judgment in much greater detail. Look for some excerpts to follow.
1. srp | 6 May 2011 at 11:45 am
The quotation from Futami supports exactly the idea that people enumerate possible outcomes, assign probabilities to each, compute expected values, and act accordingly. The issue is at the first step–they missed one of the possibilities. It is fair to say that no mechanistic approach to decision making can conquer a failure of imagination.
I read that Burt Rutan, who developed the White Knight/SpaceShipOne combination that won the Ansari X Prize for suborbital spaceflight (the technology later licensed to Virgin Galactic), never tried to prove that his design was safe. Rather, he and his team tried to think of ways it could be unsafe, and then dealt with each possibility. So all he could say was, in effect, “We are unable to prove that it is unsafe, but we are continuing to look at it.”
2. Peter Klein | 6 May 2011 at 11:52 am
Um, well, you’ve just defined all approaches to decision-making under uncertainty as identical, which kind of begs the question, doesn’t it? If I close my eyes and choose a course of action randomly, with no thought whatsoever, you’d explain that as following the standard Luce-Raiffa model, except I ignored all forks of the decision tree but one?
3. Bo | 6 May 2011 at 2:19 pm
If you close your eyes and choose a course of action randomly, will it really involve no thought whatsoever? Choice theory/modeling would predict otherwise, as would free will and rational behavioral theory.
Some people may indeed close their eyes and make what appear to be sound decisions, whereas others (I will not name names here…) may decide with eyes wide open, all the information available, etc., and still miss important cues… but is that judgment?
In my humble opinion, the example you give above is more about reducing costs by assigning little or no probability to natural disasters – a perfectly rational economic behavior which ONLY IN RETROSPECT seems odd, foolish, and wrong.
My point here is that it is very easy to point a finger at people ex post for missing what at the time seemed unrealistic, improbable, and, more importantly, too costly to control for… Similarly, prior to 9/11 nobody thought it necessary to secure a tall building against an airplane, and so on. Decision making under uncertainty is precisely an exercise in judgment; but judgment does not exclude (in fact, it includes) enumeration of possible outcomes, assignment of probabilities, computation of expected values, etc.
4. Peter Klein | 6 May 2011 at 2:34 pm
Bo, yes, I agree. Perhaps I was unclear in quoting Perrow (actually, this may be the first time I’ve ever quoted him favorably). Perrow seems to take such examples as criticisms of “rationality” more generally. Not me. I’m simply contrasting Knightian judgment with the standard economics treatment of decision-making under uncertainty. Indeed, the very definition of Knightian judgment is action under uncertainty when the actor can’t specify the set of possible outcomes and assign probabilities to them. That doesn’t make judgment “irrational” — far from it. No finger pointing here.
5. srp | 7 May 2011 at 5:40 pm
You shoot, you score! There is no difference among these theories of uncertainty. It’s all probabilistic decision-making based on imperfect data. Whether an event never occurs to you at all, or appears as a glimmer in your mind only to be dismissed as too unlikely to bother with, or gets explicitly estimated with a low probability and then rationally ignored is pretty much immaterial.
Should the Japanese engineers have worried about an earthquake/tsunami combination in designing the plant? I await an argument that proves that they should have. But their behavior is consistent with the idea that the risk seemed too low to waste time on, that it crossed people’s mind fleetingly and was dismissed as ridiculously unlikely, or that it was not considered at all.
6. Peter Klein | 7 May 2011 at 10:12 pm
Awesome, a theory of decision-making that’s non-falsifiable and observationally equivalent to every other theory of decision making! Where have you been all my life?
7. Mike Sykuta | 12 May 2011 at 1:50 pm
I remember reading shortly after the earthquake that the Fukushima plant was built to withstand the largest known (historical) earthquake/tsunami in the region, with data going back hundreds of years. Although the true probability distribution may be unknown, that report would suggest the engineers who built the plant did use the best estimate of the probability distribution available from historical records and built to that distribution.
What is not clear is whether their estimate of the probability function was simply wrong, or whether it really was just one of those 1-in-10,000,000,000,000-type events. Just because the probability of an outcome is approaching zero doesn’t mean it cannot happen.
Anyhow, carry on debating the relative merits of alternate theories of decision-making under uncertainty. But I’m not sure the present case is a good illustration for a pure Knightian argument.
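To illustrate Mike’s point about building to the largest event in a finite historical record, here is a small simulation sketch (an editor’s illustration with invented numbers, not a claim about the actual Fukushima design data): under a heavy-tailed process, an event beyond the historical maximum can still occur with non-trivial probability over a plant’s lifetime.

# Editor's illustration with invented numbers: designing to the largest event
# in a finite historical record still leaves residual tail risk.
import random

random.seed(0)

def simulated_wave_height():
    # Hypothetical heavy-tailed model of annual maximum wave height (meters).
    return 5.0 * random.paretovariate(3.0)

historical_record = [simulated_wave_height() for _ in range(400)]  # "400 years"
design_basis = max(historical_record)  # build to the largest known event

# Estimate the chance that at least one year in the next century exceeds it.
trials = 10_000
exceedances = sum(
    any(simulated_wave_height() > design_basis for _ in range(100))
    for _ in range(trials)
)
print(f"design basis ~ {design_basis:.1f} m, "
      f"P(exceeded within 100 years) ~ {exceedances / trials:.1%}")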
8. srp | 14 May 2011 at 9:05 pm
Don’t hate the player, hate the game. It is simply a fact that these theories are observationally equivalent in real-world examples. We can deploy psychological ideas such as ambiguity and distinguish them in lab settings, at least as far as certain behaviors go, but you can’t look at a decision process where a low-probability event was ignored and use it to support one theory over another. Ignoring low-probability events is a feature of almost any realistic economic theory.
9. A Systematic Approach to Sensitivity Analysis: Knightian Uncertainty | GroundwaterGo Blog | 8 July 2012 at 7:12 pm
[…] Part 1 I discussed irreducible uncertainty, which is somewhat related to the idea of Knightian uncertainty. Frank Knight was an economist who, in the 1920s, drew a distinction between risk and […]