Dava Sobel tells the story of the British navy’s loss of four ships and nearly 2,000 sailors when they struck the rocks of the Scilly Isles on October 22, 1707.* The disaster, Sobel writes, could have been avoided had Admiral Sir Clowdisley Shovell listened to an ordinary seaman who warned that the fleet was off course. Because only officers could address navigation issues, Shovell hanged the “impudent” man, ignoring the warning that could have saved the fleet. Could there be a more blatant example of an officious bureaucracy failing because of its rigid structure?
Not all warnings that come from unexpected sources are correct, but there’s some sense in heeding them, even skeptically. Julius Caesar learned that the hard way, but I think he got the point, especially from Brutus. However apocryphal the soothsayer’s warning about the Ides of March might be as a classical story, such a warning would have warranted Caesar’s outright skepticism: unless the soothsayer knew what the senators had planned, he based his warning on a mere feeling. (Supposedly, President Kennedy received a similar warning, though this story, like Caesar’s, might be apocryphal.)
Do you find it interesting that some people heed warnings based on nothing more than feelings or myth yet reject warnings based on quantifiable measurements? Some heeded the warnings of those who looked at a Mayan calendar and declared that the END was nigh in 2012. (It wasn’t.) What, other than myth, was the source of that warning? But the warning of the British sailor doesn’t fall into that category. He had kept a careful navigational record, and in matters quantitative, it matters to check the quantities.
That British sailor does fall into the same category as Bob Ebeling, a Morton Thiokol engineer who, with other engineers, warned in 1986 that launching the Space Shuttle on a cold January morning could end in disaster because the O-rings were temperature sensitive. Yet no one in the bureaucracies of NASA or Morton Thiokol took their warnings seriously enough to postpone the launch, and seven astronauts died. Could there be a more blatant example of an officious bureaucracy failing because of its rigid structure?
Obviously, someday someone will intuitively say, “The world is going to end tonight,” or “The volcano is going to erupt tomorrow,” and be correct by sheer coincidence. But there are differences in warnings. Some, like the soothsayer’s, might be based on feelings; others might be based on near-certain probabilities. Volcanologists learned much from the eruptions of Mount St. Helens in 1980 and Mount Pinatubo in 1991. A volcanic eruption has precursors: before the 1991 eruption, a magnitude-seven-plus earthquake and thousands of smaller quakes shook Luzon. Heeding those specific warnings, military and civil authorities saved thousands of lives. They did not act on intuition; they had data associated with a high probability of an eruption. In contrast, NASA’s and Morton Thiokol’s upper management dismissed the warnings about the effect of cold temperatures on O-rings.
Is there a defense for not taking advice or heeding warnings that might save lives? Certainly, Admiral Shovell might have considered that an experienced sailor’s reckoning just might be accurate in spite of the rules against sailors questioning officers’ opinions. But then, if you let one sailor question your decision, you open that Pandora’s bottle.** Nevertheless, the officers had to deal with darkness and fog at a time when longitude was still just a guess (the subject of Sobel’s book), so maybe, just in this one instance…. But what about the NASA managers? Surely a group composed largely of engineers and scientists would consider matters rationally. Isn’t testing part of science? They had never really tested the O-rings under such freezing conditions. Were they thinking, “The absence of evidence permits us to decide on the basis of an announced schedule”?
Just about all of us get advice and warnings that force us to ask a fundamental question: Is there something measurable in them? If there is, then such warnings are worth heeding regardless of the stature of the person carrying the message. But many warnings and bits of advice come from common sense: “Don’t overeat. Exercise. Study.” What are we to do with those? When matters are personal, other matters don’t matter.
No one stands above the need for warnings and advice. Yet we encounter dilemmas regularly, and the Valentine’s Day massacre at a Florida school is one of those instances. The students who thought the shooter exhibited aberrant behavior might be akin to sailors who feared the consequences of speaking up, of telling the authorities they were off course. Look what happened to the one sailor who did speak up. In Florida, if any adults saw warning signs, they seem to have ignored them or failed to act on them. The ship was headed for the rocks, and no one either offered a warning or was willing to accept one. What defense? Well, it wasn’t a quantitative matter, unless one counted suspensions from school. And who knows what transpires in the mind of another, unless, that is, that mind posted a number of telling images on social media.
Maybe we’re all just a bit too hardheaded to think that others can give us advice or warnings we should follow. Those 2,000 British sailors and those seven astronauts would not have died if those in power had kept their minds open to warnings. And maybe those who died in a Florida school would simply have gone on to turn in their homework because they, unlike the sailors and the astronauts, could have safely returned home.
*Sobel, Dava. Longitude: The True Story of a Lone Genius Who Solved the Greatest Scientific Problem of His Time. New York: Penguin Books, 1995, pp. 11–13.
**Yes, bottle: μπουκάλι