Fukushima + Air France 447: Risk and The Deadly Human Factor
The coincidence of reports assessing these two catastrophes, and the analysis they offer, suggests many lessons as we move at Moore's-Law exponential speed into ever-closer relationships between human flourishing and our management of complex technology systems. The immediate lesson from Fukushima and AF447 is that we are in deep trouble.
Humans and technology interface in several distinct ways. We design systems. They shape our experience. We depend on them. We control them, one way and another; and although we are building in ever-more autonomous capacities, the buck still stops with Homo sapiens. In these cases, that meant Air France pilots, their trainers, Japanese nuclear engineers, and their government regulators. Fail, fail, fail, fail.
Fukushima suggests a case of revolving doors; AF447, a case of unanticipated risk. What's scary is that we are not dealing with installations and airplanes in resource-poor parts of the planet, though there are plenty of those. Japan is one of the most technologically advanced of human societies, and its democracy is stable and professional in its institutions. (Ahem, think Wall Street and 2008, if you dare.) And the same goes for Air France. So why did no one in the relevant roles anticipate the disasters that unfolded? This set of questions should preoccupy us – us, that is, the species, not just "risk" professionals – as we look ahead up the curve to increasingly autonomous systems with growing built-in risk potentials.