The causes: a breakdown in communication; unclarified assumptions about team roles; deskilling of the workforce; and arrogance about previous success.
Sound familiar? Whether it is a highly complex, state-of-the-art airplane or a medical technology designed, honed, and manufactured under highly regulated conditions, we can become our own worst enemies.
William Langewiesche analyzed the tragedy of Flight 447.* Early in his article, he states his thesis clearly:
Over the years, “automation has made it more and more unlikely that ordinary airplane pilots will ever have to face a raw crisis in flight – but also more and more unlikely that they will be able to cope with such a crisis if one arises.”
Three hours and 41 minutes into the flight from Rio to Paris, ice crystals clogged three air-pressure tubes, knocking out the cockpit’s three airspeed indicators. This did not materially affect the performance of the aircraft, but it startled the pilots. “The episode should have been a non-event, and one that would not last long.” Instead, it set off a chain of reactions in the cockpit. It became unclear who was in charge. Assigned roles were ignored. There was confusion about who had done what and why. As a result, the plane went into a stall – clearly announced by the computer, but at first virtually ignored in the pilots’ actions. In less than five minutes, the plane hit the surface of the ocean at a descent rate of 11,000 feet per minute.
As in the case of the West Africa Ebola epidemic, we see catastrophic consequences of admirable and generally successful efforts. Langewiesche is clear about this in the article: “the new airplanes deliver smoother, more accurate, and more efficient rides – and safer ones, too. . . . Since the 1980s . . . the safety record has improved fivefold.”
But the resultant “deskilling” of airplane pilots means that for today’s pilots “most of their experience had consisted of sitting in a cockpit seat and watching the machine work.”
The hard-won wisdom of project management can help us better use – and protect ourselves from – advanced technologies.
Risk assessment: “What can go wrong?”
Think about the big picture. “How will this piece of technology fit into the human experience of x?”
Surface the assumptions that underlie this project – and proceed to “perfect” the technology cautiously.
View people – veteran, resistant pilots; newer, less experienced pilots; regulators; clients; naysayers – as allies to learn from, not as adversaries to beat down.
*Vanity Fair, October 2014, “The Human Factor: Anatomy of an Airliner Crash”