In considering contingencies that are likely to have unacceptable consequences, we might want to be firefighters. We'd need fire insurance maps, underwriting rules, and risk prevention services. Or we might want to have a standing army positioned just about everywhere, the problem then being that the resource demands are very substantial and your imagination of potential contingencies may not be adequate. You might want greater flexibility, adjusting your investments and deployments to match contingencies as they actually occur, although there will surely be contingencies you have ignored that do happen and have even more fatal consequences.
In any case, you will want to bound uncertainty. You will need some way of estimating your capacity to do such bounding (both in imagining untoward possibilities and in using resources to restrict the actions of the source of negative contingencies). You need a capacity for constant vigilance, and then a capacity for prudent judgment with the products of that vigilance: You will have to learn how to do lessons-learned exercises. You will have to learn from imagined scary scenarios, so that you react out of prudence rather than fear or the desire to prevent everything. You will have to learn to think in terms of what others will do, and how others think of what you will do. Those others may be agents or they may be organizations or they may be physical/biological systems.
You will also want to purchase insurance to avoid some problems, lessen their consequences, and compensate for unanticipated losses. Even if you convince yourself that the economy will provide signals that encourage invention and technology for replacing depleting materials, and so finding new resources, there will be disruptions and shocks. So you need to anticipate the need to bounce back, catch up when you've gone down a wrong path, and change direction. Perhaps this is a matter of resilience and redundancy, but neither is free. For what you are doing is looking at the variance of your probability estimates, and considering situations where probability estimates do not help you. So Wall St. firms have daily reports of Value at Risk, roughly 4-6 sigma estimates of losses, but they realize that beyond that point the probabilities they estimate are unreliable (covariances and non-independent behavior by agents can lead to very big losses). And of course, there are those uncertainties that cannot be assigned useful probabilities.
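To make the point concrete, here is a small sketch in Python of how a one-day Value at Risk figure is computed under a normal-returns assumption, and how little that assumption tells you 4-6 sigma out. The portfolio size, volatility, and the fat-tailed alternative are all my own illustrative assumptions, not anyone's actual book.

```python
# A sketch of a one-day Value at Risk figure, and of why estimates beyond
# a few sigma are unreliable. All numbers are illustrative assumptions.
import math
import numpy as np

rng = np.random.default_rng(0)
portfolio = 100e6   # hypothetical $100M position
sigma = 0.01        # hypothetical 1% daily volatility

# Parametric 99% VaR under a normal-returns assumption.
z99 = 2.326         # 99th percentile of the standard normal
print(f"99% one-day VaR (normal assumption): ${portfolio * sigma * z99 / 1e6:.1f}M")

# Now give returns fat tails (Student t, 3 degrees of freedom) with the SAME
# volatility, and compare tail probabilities with the normal model.
t = rng.standard_t(df=3, size=1_000_000)
t *= sigma / t.std()
for k in (2, 4, 6):
    normal_p = 0.5 * math.erfc(k / math.sqrt(2))   # P(loss beyond k sigma), normal
    fat_p = float(np.mean(t < -k * sigma))         # same, under the fat-tailed model
    print(f"loss beyond {k} sigma: normal {normal_p:.1e} vs fat-tailed {fat_p:.1e}")
```

The particular numbers matter less than the divergence: at 2 sigma the two models nearly agree, but the further out you go, the less the estimated probabilities mean.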
You can make investments, real options, that give you the chance to make choices in the future, buying greater flexibility later at the cost of commitments now.
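A toy calculation of what that flexibility is worth (the figures are mine, and I ignore discounting): compare committing now with holding the option to decide after the uncertainty resolves.

```python
# A toy real-options calculation (numbers hypothetical, discounting ignored):
# next year a project is worth either 150 or 60, each with probability 0.5,
# and acting costs 100.
p_up, v_up, v_down, cost = 0.5, 150.0, 60.0, 100.0

# Commit now: you take the expected payoff, good state or bad.
commit_now = p_up * v_up + (1 - p_up) * v_down - cost   # = 5.0

# Keep the option: act only in the good state, walk away in the bad one.
keep_option = p_up * max(v_up - cost, 0) + (1 - p_up) * max(v_down - cost, 0)  # = 25.0

print(f"commit now: {commit_now:.0f}   keep the option: {keep_option:.0f}")
print(f"so the flexibility itself is worth about {keep_option - commit_now:.0f} today")
```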
Moreover, you can develop methods of deterring untoward action by others, in part by linking their actions to yours (alliances). What you want to avoid is others' going for broke: if they feel that many small moves will surely grind them down (by the central limit theorem), and that the situation is all-or-nothing, they are sensible to make one very large bet. You will also need a sense of acceptable losses, in the sense that military leaders realize that substantial casualties will attend some strategies.
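The going-for-broke logic shows up in a small gambler's-ruin simulation (the odds and stakes are made up): an actor that must double its resources against unfavorable odds does far better with one large bet than with many small ones, since the small moves average out against it.

```python
# Gambler's ruin with unfavorable odds (all figures made up): start with 100,
# need 200, win each even-money bet with probability 0.45. Many small bets
# almost surely grind you down; one large bet preserves a 45% chance.
import random

random.seed(0)
P_WIN, START, TARGET, TRIALS = 0.45, 100, 200, 10_000

def chance_of_reaching_target(stake):
    """Estimate the probability of hitting TARGET before ruin, betting a fixed stake."""
    successes = 0
    for _ in range(TRIALS):
        wealth = START
        while 0 < wealth < TARGET:
            wealth += stake if random.random() < P_WIN else -stake
        successes += wealth >= TARGET
    return successes / TRIALS

print("one bet of 100:", chance_of_reaching_target(100))   # about 0.45
print("bets of 10    :", chance_of_reaching_target(10))    # about 0.12
print("bets of 1     :", chance_of_reaching_target(1))     # essentially 0
```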
In the end you can pool some risk (or find a diverse set of insurers, who collectively believe that the premiums paid are adequate), but rare events will be beyond insurance and perhaps even self-insurance.
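A sketch of why pooling works, and where it stops working (the policy counts and probabilities are invented): independent risks average out across the pool, but a rare shock that hits every policy at once is exactly the kind of event that remains beyond insurance.

```python
# Pooling independent risks vs. facing a rare common shock (numbers invented).
# Each of 1,000 policies has a 1% chance of a loss of 100 in a given year.
import random
import statistics

random.seed(1)
N_POLICIES, YEARS = 1_000, 5_000
P_CLAIM, CLAIM = 0.01, 100.0

def per_policy_losses(common_shock_prob=0.0):
    losses = []
    for _ in range(YEARS):
        if random.random() < common_shock_prob:
            losses.append(CLAIM)  # the rare shock: every policy claims at once
        else:
            claims = sum(random.random() < P_CLAIM for _ in range(N_POLICIES))
            losses.append(claims * CLAIM / N_POLICIES)
    return losses

independent = per_policy_losses()
shocked = per_policy_losses(common_shock_prob=0.005)
print("independent risks : mean %.2f, stdev %.2f"
      % (statistics.mean(independent), statistics.stdev(independent)))
print("with a rare shock : mean %.2f, stdev %.2f"
      % (statistics.mean(shocked), statistics.stdev(shocked)))
```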
In any case, you will do sensitivity analysis to get a sense of residual uncertainties, uninsured contingencies, and likely deductibles you have incurred.
In the background (or the foreground!) is your forward presence and readiness. You have active reserves of resources and responses stationed throughout the environment, and you have a substantial investment in current, immediately available agents and methods. What you want to avoid is having those forward-presence and readiness capacities activated by events--you want to choose which events are worthy of defense, and which might be allowed to play themselves out.
Also in the background are your Red Teams, constantly challenging your current plans and contingency responses, your bureaucratic processes, and your imagination. Presumably there are Blue Teams to challenge the Red Teams. And you have Intelligence capacities, in part to inform your decisionmaking with perspective and comparison, in part to be on the lookout. When those Red Teams and Intelligence capacities show their limitations, what are sometimes called "intelligence failures," you will need good judgment about whether those limitations were really avoidable or are part of what it means to work well outside probabilities and conventional imagination.
Finally, will the survivors envy the dead? That is the question Herman Kahn asked about nuclear war. Put differently, life goes on. Or so it would seem from all the devastations we have visited upon ourselves or that have been visited upon us. Yes, the dinosaurs were extinguished by a meteorite striking the earth and by its environmental consequences, but it is likely that meteorites are not what we should be worrying about (although others disagree with my position).